Why is there no (great) improvement in debuggers? In the last few decades we have seen huge improvements in languages aimed at increasing development speed, but very little when it comes to debugging. The time it takes to go from an observed “badness” in the program (a suspected bug) to understanding how it happened does not seem to have decreased. Instead it keeps growing as the complexity of the entire system increases.

The only “new” leap I can see in the last couple of decades is in memory, threading and locking tools such as Valgrind, but nothing on the UI side of things (why is this checkbox 5 pixels to the right? How did that pixel become grey?) or in stepping the program backwards.

Is there some widely held belief, which I don’t share, that modern systems are built from fully debugged components and that this is therefore not a problem? Am I wrong, and there has been serious improvement? Am I just using a bad methodology when debugging, asking for fancy tools to hide my own lack of skill? Or am I programming a kind of software that is more complex than most, and therefore suffering more than most?

Admittedly, there is still no proper debugger available for the language I mostly maintain, so perhaps I shouldn’t be throwing stones from my glass house.
