This list is interesting and certainly contains some valid points, but I’d argue that some of the symptoms may also be a sign that you are simply maintaining legacy code that has long outlived its life expectancy and perhaps didn’t have a sound architecture in the first place.
Besides, my objections already start with symptom one, which claims that
initializing variables that are never used
is a bad thing. Excuse me? Since when? Certainly there are differences between programming languages here, but I’d rather have deterministic behavior in my code by initializing variables with a known value than random behavior that depends on whatever happened to be in memory before my code ran. Frankly, in this very case I’m of the exact opposite opinion and think that you are a bad programmer if you don’t initialize variables when you declare them (for example in C). Compilers are smart enough to figure out what is actually needed: they will drop redundant initializations and eliminate unused variables entirely. So perhaps not initializing variables is a sign that you are an incompetent programmer who doesn’t know his/her compiler?! 😐
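To illustrate what I mean, here is a minimal C sketch (the function and values are made up, and I’m assuming a typical optimizing compiler such as GCC or Clang at -O2):

```c
#include <stdio.h>

/* Defensive initialization: the "= 0" may be redundant on some paths, but it
 * is free. With optimizations enabled, the compiler removes the dead store
 * anyway, so the deterministic default costs nothing at runtime. */
static int read_value(int have_sensor)
{
    int value = 0;              /* known starting value in every case */

    if (have_sensor) {
        value = 42;             /* normal path overwrites the default */
    }

    return value;               /* never returns garbage from the stack */
}

int main(void)
{
    printf("%d\n", read_value(0));   /* prints 0, not whatever was in memory */
    printf("%d\n", read_value(1));   /* prints 42 */
    return 0;
}
```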
Either way, except for the first section, “Inability to reason about code”, this self-test is certainly worth a read.
// Oliver
PS: Don’t get me wrong about the initialization of variables, though. If they are unused, they should of course be removed from the source code as well, but more often than not their presence is a sign of improperly set #ifdefs, or of code changes where the programmer forgot to remove them (or did not notice that they had become redundant). For this task, however, there are tools like PCLINT, or the compilers themselves, which are much more efficient at spotting these.
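As a hypothetical example of how such leftovers arise (the macro name and values are made up): a variable that is only consumed inside a conditionally compiled block quietly becomes dead once that block is compiled out, and a tool like PCLINT or a compiler warning such as GCC’s -Wunused-but-set-variable will flag it immediately:

```c
#include <stdio.h>

int process(int input)
{
    int debug_count = 0;            /* only consumed when DEBUG_STATS is set */

#ifdef DEBUG_STATS
    debug_count++;
    printf("processed item #%d\n", debug_count);
#endif

    /* Without DEBUG_STATS defined, debug_count is set but never read:
     * exactly the kind of leftover a static analyzer points out. */
    return input * 2;
}

int main(void)
{
    printf("%d\n", process(21));    /* prints 42 */
    return 0;
}
```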
ACK
3rd-party library: default value changed after an update -> nice debugging session -> never ever again without setting my own default values
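In that spirit, a minimal C sketch of the defensive pattern (the struct and field names are purely hypothetical, not any real library’s API):

```c
#include <stdio.h>

/* Hypothetical third-party configuration struct; all names are illustrative. */
struct lib_config {
    int timeout_ms;
    int retries;
};

/* Risky: relying on whatever defaults the library ships with; an update can
 * silently change them -- and the behavior of your program with them.
 *
 *     struct lib_config cfg = lib_default_config();
 *
 * Safer: spell out every value yourself, so a library update cannot change
 * the behavior behind your back. */
int main(void)
{
    struct lib_config cfg = {
        .timeout_ms = 5000,
        .retries    = 3,
    };

    printf("timeout=%d ms, retries=%d\n", cfg.timeout_ms, cfg.retries);
    return 0;
}
```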