
<sarcasm> In other words, your code validates past computation and adds error-correction codes to every file you write to the HDD? After all, if you don't do that, it will often produce buggy output and crash. (http://www.pcguide.com/ref/hdd/perf/qual/specRates-c.html) </sarcasm>

At best, software running on commodity hardware works most of the time, and acceptably broken really is good enough. Granted, we want to minimize failures, but there is a wide range between good enough to be useful and perfect.



Yeah, that's why we don't have anyone fixing bugs encountered in software we use.

Please.

PS. Who cares about perfect? You can't have perfect, and that wasn't my point. Please reread.

PPS. I'm happy to see you're invalidating the entire subject of my current work (carrier-grade systems) based on what's good enough on commodity systems. Extrapolating is bad.

PPPS. Sarcasm tags are so 17th century Protestant.


That mindset often results in over-engineered crap.

I once wrote a program to parse source code in ways that are provably limited. However, given how the code it targeted was written, it worked often enough to be extremely useful. Just because an approach can never cover all cases does not mean it must be ignored.
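To make the idea concrete, here is a hypothetical sketch (not the actual program, which isn't described in detail) of a provably limited "parser": a regex that extracts Python function names. It misses plenty of legal cases, yet on typically formatted code it's useful.

```python
import re

# A deliberately limited heuristic: find Python function definitions
# with a regex. It fails on edge cases (defs inside strings or
# comments, exotic formatting) but handles typical code well.
DEF_RE = re.compile(r"^\s*def\s+([A-Za-z_]\w*)\s*\(", re.MULTILINE)

def find_function_names(source):
    """Return function names found by the heuristic, in source order."""
    return DEF_RE.findall(source)

sample = '''
def alpha(x):
    return x

def beta():
    pass
'''
print(find_function_names(sample))  # ['alpha', 'beta']
```

A real parser would need a full grammar; the point is that the cheap version covers the cases that actually occur.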

Sniper bullets have tighter tolerances than the average bullet, but this increases costs and becomes useless at high rates of fire. Suggesting all systems need to be as accurate as possible is ignorant.

PS: I have written and used plenty of code sitting on DoD computers, and believe me, it’s not all golden. Walk around the Pentagon and guess what: Windows is sitting on the vast majority of people's desks.


Again, you're extrapolating just to prove me wrong. I admit it may seem 'extreme' to some, but advanced tools aren't a threat to us. However, our current culture as programmers shows that we don't even trust other people's source code, let alone tools, to do our job. In today's context, a tool that "writes code" (imperfect, of course) is not to be trusted any more than me, you, or our colleagues.

But do not downplay the importance of not producing bugs, and of fixing and avoiding them. That's just wrong.


There are basically two ways to write software: you can either completely understand the problem and create a completely understood solution, or you can approximate a reasonable solution and then patch it. When it's possible, the first approach is far better; unfortunately, some of the worst systems were created when people tried to use the first approach on problems that were simply too complex for any one person to understand.

If you have ever copied a function, changed the name, and modified the code, you are on the second path. Plenty of people have gone down that path and know it's likely to produce bugs, but as long as you have reasonable mitigation strategies it can still be a good idea. Building a tool that does similar things with larger sections of code would be dangerous, but with a little care it could still be useful.

In the early days of Fortran, compilers tended to be buggy, so people would often monkey patch the output (http://en.wikipedia.org/wiki/Monkey_patch). While error prone, this was significantly faster than hand-coding in assembler from the start. The practice died out as compilers improved, but plenty of good software was written before then.
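The Fortran-era practice was literally hand-editing compiler output, but the linked technique survives in dynamic languages as runtime patching. A minimal Python sketch, with a made-up bug and fix for illustration:

```python
import math

# Pretend math.sqrt misbehaves for our use case: tiny negative inputs
# produced by floating-point rounding raise ValueError. Rather than
# fixing the "compiler" (here, the library), patch over it at runtime.
_original_sqrt = math.sqrt

def safe_sqrt(x):
    # Hypothetical workaround: clamp tiny negative rounding noise to zero.
    if -1e-9 < x < 0:
        x = 0.0
    return _original_sqrt(x)

math.sqrt = safe_sqrt  # the monkey patch: replace the attribute in place

print(math.sqrt(-1e-12))  # 0.0 instead of a ValueError
print(math.sqrt(4.0))     # 2.0, unchanged behavior elsewhere
```

Like patching compiler output, this is fast and fragile: the fix lives outside the thing it fixes and silently breaks when the underlying code changes.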

PS: I wish most software could be written using the first approach. I just don't think that's possible.




