105

Over the course of my career, I've noticed that some developers don't use debugging tools; instead, they spot-check erroneous code to figure out what the problem is.

While being able to quickly find errors in code without a debugger is often a good skill to have, it seems less productive to spend a lot of time looking for issues when a debugger would easily find little mistakes like typos.

Is it possible to manage a complex project without a debugger? Is it advisable? What benefits are there to be had by using "psychic debugging"?

Jonathan DS
  • 460
  • 2
  • 6
  • 15
  • 19
    Hi jonathan, I've revised your question to avoid the trappings of a rant and keep the question open: I think—as worded now—it's a decent enough, answerable question. –  Jan 23 '12 at 21:02
  • Example: consider the code `a = 6/3`, where a typo turns it into `a = 6/2`. If you search at the mnemonic level, stepping through the ADD and JMP instructions, you eventually find there is one extra iteration instead of two, and only then realise the divisor has a typo. From this you can infer how ridiculous it is to always use a debugger. – EAGER_STUDENT Dec 10 '13 at 16:35

21 Answers

155

What looks like guessing from the outside often turns out to be what I call "debugging in your mind". In a way, this is similar to grandmasters' ability to play chess without looking at a chess board.

It is by far the most efficient debugging technique I know, because it does not require a debugger at all. Your brain explores multiple code paths at the same time, yielding better turnaround than you could possibly get with a debugger.

I was not conscious of this technique before briefly entering the world of competitive programming, where using a debugger meant losing precious seconds. After about a year of competing, I started using this technique almost exclusively as my initial line of defense, followed by debug logging, with an actual debugger in distant third place. One useful side effect of this practice was that I started introducing new bugs at a slower pace, because "debugging in my mind" did not stop as I wrote new code.

Of course this method has its limitations, due mostly to the limits of one's mind in visualizing multiple paths through the code. I learned to respect these limitations, turning to a debugger to fix bugs in more advanced algorithms.

Sergey Kalinichenko
  • 17,393
  • 4
  • 57
  • 73
  • 27
    +1 I find "programming by guessing" to be a loaded phrase. There is no substitute for thinking. What the OP does not explain is how effective the "guessing" is. I doubt that it's purely guessing (i.e., the spaghetti-on-the-wall approach); more likely it's deductive reasoning. Debuggers have their place, but they are no substitute for deductive reasoning and simply understanding the code. – Bill Jan 23 '12 at 18:06
  • 3
    +1, this was how I learned to write code in the first place. Being able to notice bugs as you're typing in the code means it won't come back to bite you later. – Izkata Jan 23 '12 at 19:06
  • "by far the most efficient debugging technique I know, because it does not require a debugger". That is true only if you don't have a debugger. If you have a debugger it is no more efficient not to use it than to use it. – DJClayworth Jan 23 '12 at 19:55
  • 9
    @DJClayworth That is not entirely accurate: sometimes trying to use a debugger is a poor choice, even if you have a good debugger at your disposal: you end up wasting a lot of time without accomplishing much. One case that immediately comes to my mind is solving concurrency issues; the other ones are debugging recursive algorithms with high branching factors, some dynamic programming algorithms, and hardware interrupt service routines. Of course it is silly not to use a debugger when you genuinely need one, but deciding when you start needing a debugger is a highly individual choice. – Sergey Kalinichenko Jan 23 '12 at 20:23
  • @dasblinkenlight Your answer says "debugging in your mind is by far the most efficient debugging technique I know". If you meant "there are a few occasions when not using a debugger is better", then I don't disagree with that. If you edited your answer to say that I would consider voting it up. – DJClayworth Jan 23 '12 at 20:27
  • This depends. A debugger can be extremely helpful in running down a crash or something in a large codebase you haven't visited in a few months. Not saying that one always needs or should go straight to a debugger, but any programmer should know how to at least use one. – Billy ONeal Jan 23 '12 at 20:39
  • 9
    +1 although I find a debugger invaluable for certain types of bug (particularly in more complex algorithms), there really is no substitute for simply having a good understanding of the code – Chris Browne Jan 23 '12 at 20:44
  • 7
    @DJClayworth I deliberately went for a stronger statement than "a few occasions when not using a debugger is better": my brief encounter with competitive programming taught me that instinctively reaching for a debugger is not the most efficient behavior *for me*. These days I start by (1) quickly re-reading the code, and (2) examining the debug trace (when available) before (3) going for a debugger. In many cases, the third step is unnecessary, because I spot the problem in steps (1) or (2), write a unit test that reproduces the issue, and code a fix, all without using a debugger. – Sergey Kalinichenko Jan 23 '12 at 20:51
  • 2
    +1, couldn't agree more. I can only guess that people who always use debuggers (before even taking a quick glance at the code) cannot read code. – user281377 Jan 23 '12 at 21:03
  • 11
    I think what you really mean is that a programmer should have a *debugging sequence*, instead of clicking the magic "find bug" button. A debugger is an extremely powerful tool, but you don't fire up the chainsaw to trim the hedges. – Spencer Rathbun Jan 23 '12 at 21:29
  • +1 could not agree more. Having the ability to follow multiple paths of execution in one's head is imperative. – V_P Jan 26 '12 at 18:55
  • 1
    +1 for "It is by far the most efficient debugging technique I know, because it does not require a debugger at all. Your brain explores multiple code paths at the same time, yielding better turnaround than you could possibly get with a debugger." – Bjarke Freund-Hansen Feb 27 '12 at 14:09
  • How is this different from what you would do anyway when dealing with code? Isn't the implied assumption in the question that your ability to consider multiple execution paths failed (if you are the code author), or that you are not familiar enough with the code to have a proper mental "map" of it to look at possible code paths? – EpsilonVector Jun 03 '13 at 13:47
  • @EpsilonVector I see nothing in the question to support your assumption. If you are the author of the code with the bug, you may have missed something the previous time, perhaps because you didn't pay enough attention. Sometimes, knowing the value that causes the failure is enough to re-evaluate your mental map of the code. For example, someone telling you that the code crashes when he enters a negative number may be enough to recall that you considered only positive entries before. If you are looking at other people's code, you can map it out as you read it, especially shorter fragments of it. – Sergey Kalinichenko Jun 03 '13 at 14:06
  • @dasblinkenlight Again, you do not start thinking about your code only when there is a problem. You think about it before, during, and after you write it, so if a debugger is even considered, it means your ability to think about it failed. This is like: a car is used to travel long distances fast, someone comes and asks what's the benefit of not using a car, so you answer that there's benefits when you don't need to travel long distances fast. If you don't need to travel long distances fast, the question is moot, the interesting part is why would you not use a car when you do need to do that. – EpsilonVector Jun 04 '13 at 10:05
  • @EpsilonVector Again, sometimes people simply do not pay attention: humans are quite unlike computers at that. So reaching for a debugger means nothing at all. Continuing your car analogy, consider jumping into your car to go to a corner store three blocks away. I know people who do it, and I know people who do not do it; the "need to travel long distances fast" is quite irrelevant to their decision. – Sergey Kalinichenko Jun 04 '13 at 10:25
  • @dasblinkenlight OK fair enough. I did not consider that some would use a debugger as a crutch for simple situations as well. – EpsilonVector Jun 04 '13 at 11:42
  • What does it mean, exactly, to not use the debugger? To write code in Notepad instead of Xcode? Or does it mean somehow turning off the compiler errors/warnings Xcode gives you and having only runtime errors enabled? – Honey Jun 12 '17 at 16:10
  • @Honey Not using a debugger has nothing to do with not using an IDE. You write, compile, and run your code as usual, fixing all problems Xcode reports to you as you type. However, you do not reach for the debugging facilities (breakpoints, stepping through code, examining variables) until you absolutely have to. – Sergey Kalinichenko Jun 12 '17 at 16:29
41

The more I know a code base, the less I need a debugger (though I'd still check the reported error; it is an important clue in any reasoning).

It is a fine tool for understanding dynamic behavior of small to medium complexity, but I often find that it focuses me on the details instead of the bigger picture. And after a while, that's where the problems are: in wider-scope interactions whose dynamic behavior tends to be more easily understood with other tools (logging of inputs and outputs at module boundaries, for instance, as sketched below).
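As a minimal sketch of that kind of boundary logging, using `java.util.logging` (the `OrderService` class and its methods are hypothetical, purely for illustration):

```java
import java.util.logging.Logger;

// Hypothetical module boundary: record what goes in and what comes out,
// so wider-scope interactions can be reconstructed from the log later.
public class OrderService {
    private static final Logger log = Logger.getLogger(OrderService.class.getName());

    public double quote(String sku, int quantity) {
        log.info(() -> "quote() in: sku=" + sku + ", quantity=" + quantity);
        double price = quantity * lookupUnitPrice(sku); // the module's actual work
        log.info(() -> "quote() out: price=" + price);
        return price;
    }

    private double lookupUnitPrice(String sku) {
        return 9.99; // stand-in for a real lookup
    }
}
```

With every module's inputs and outputs on record, the bigger-picture interaction between modules can be read off the log instead of stepped through live.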

AProgrammer
  • 10,404
  • 1
  • 30
  • 45
36

They may not be bad programmers, but they are probably terribly inefficient troubleshooters.

I tend to follow the advice from Debugging: The 9 Indispensable Rules for Finding Even the Most Elusive Software and Hardware Problems (David Agans), and this one falls squarely under the rule "Quit thinking and look."

JohnFx
  • 19,052
  • 8
  • 65
  • 112
  • 12
    I disagree, though I won't downvote. As delnan says, if you can understand what the code is doing, it **can** be faster to spot what it is doing wrong than to step through the debugger and try to find when it goes wrong. That said, a developer who refuses to use a debugger when they **can't** identify the problem from reading the code is making a big mistake. –  Jan 23 '12 at 19:14
  • @Mark plus the added bonus of misdiagnosing the problem and plugging in a new defect. – Keith Brings Jan 23 '12 at 19:17
  • 13
    @Mark Bannister - I see what you are saying. Let me amend that to, if you have been looking for the problem in code for more than 15 minutes, give up and use the debugger and don't be stubborn. – JohnFx Jan 23 '12 at 19:35
  • @JohnFx: OK, plus 1. :) –  Jan 23 '12 at 20:00
  • 9
    I think a good programmer should not be _dependent_ on the debugger. This should not keep him from using one immediately (where available), once his insight fails -- or periodically, to make sure his insight is still on track... – comingstorm Jan 23 '12 at 23:41
  • 1
    @mark unless you are working on a very small code base I think it is impossible to understand every line of code. 95% of my current bugs are solved in the way you describe, but the trickier ones are where you need the debugger. – wobbily_col Mar 03 '14 at 10:12
32

Any job requires using the right tools the right way. If you have a debugger then use it to see what is actually happening. Most bugs are caused by assumptions.

I've worked with developers who refuse to use debuggers because they knew better. The classic response I got once was 'the crash isn't being caused by me, I spent all day inspecting the code [where it was crashing] and there's nothing wrong'. (What about that null value that was read in from the db?) The boss seemed to think it was a great reply but the customer didn't.

I got off that team as fast as I could. Their purpose was to featherbed the job, turning a simple ten-minute problem into an all-day exercise in looking busy.

jqa
  • 1,410
  • 10
  • 13
14

Your best guide to the practice of debugging is Steve McConnell's book Code Complete. Chapter 23 covers debugging in detail, and I will distill a few points from it.

  1. Understanding the problem is important, and use of the debugger is not a substitute for it.
  2. Guessing is a bad approach to debugging. If your colleagues are really using guesswork, rather than thinking about the problem, then they are doing a bad job. Guesswork means sticking random print statements in the code and hoping to find something useful.
  3. If your colleagues genuinely don't know how to use a debugger (rather than choosing not to use one) then yes, they are incompetent, just like someone who doesn't know the syntax of the language they are supposed to be using.
DJClayworth
  • 538
  • 4
  • 9
  • 2
    Whilst I agree with you on most of your post, I think "incompetent" is unfair. It's possible to develop without the use of a debugger; it's just inefficient. Some people learn about debuggers before others! – ChrisFletcher Jan 23 '12 at 21:49
  • I wouldn't casually throw around words like "incompetent". I know somebody who debugs entirely with print statements, and nobody else comes close to making the contribution he does. – Mike Dunlavey Jan 24 '12 at 15:55
  • 2
    @MikeDunlavey Does that person *know* how to use a debugger and choose not to use it? Fine. If they don't know, then I stand by my statement. – DJClayworth Jan 24 '12 at 16:52
  • 3
    Stand as you like, a time could easily come when that adjective might be applied to you. Then you'll understand - it's schoolyard stuff. – Mike Dunlavey Jan 24 '12 at 19:42
10

I'm surprised that the discussion on this topic has not mentioned "unit testing".

Because I do test-driven development, I don't spend a lot of time in the debugger. Ten years ago, I used to dutifully step through code in the debugger:

  1. After writing a piece of code to ensure that it worked and
  2. When I received a bug report to try to diagnose the problem

What I've found after 10 years of test-driven development is that I'm a lot more productive as a programmer if:

  1. I write unit tests before I write the code to ensure that I wrote it correctly
  2. I write unit tests immediately upon receiving a bug report, to attempt to duplicate and drill down on the problem (see the sketch below).

Allowing the computer to run through the code and validate the result is thousands of times faster than thinking or stepping through the code to validate the results mentally, and it doesn't make mistakes.
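As a sketch of point 2 above, assuming JUnit 5 is on the classpath; the `PriceParser` class and the bug report it reproduces are made up for illustration:

```java
import org.junit.jupiter.api.Test;
import static org.junit.jupiter.api.Assertions.assertEquals;

// Hypothetical production class, reduced to the method under test.
class PriceParser {
    static double parsePrice(String text) {
        return Double.parseDouble(text.trim());
    }
}

// Regression test transcribed straight from a (made-up) bug report:
// "parsePrice(\" -3.50 \") returned 3.50 instead of -3.50".
class PriceParserTest {
    @Test
    void parsePriceKeepsTheSignOfNegativeAmounts() {
        assertEquals(-3.50, PriceParser.parsePrice(" -3.50 "), 1e-9);
    }
}
```

Once such a test fails for the right reason, it both pinpoints the defect and guards against its return.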

I still have to step through in the debugger occasionally, and I'm still engaged in mentally analyzing the code... but only rarely, and mostly for very tricky code.

  • +1 It's often faster to add a print statement and rerun the test than to use a debugger. – Winston Ewert Jan 24 '12 at 16:14
  • @Winston - it's often quicker to fire up the debugger than to write multiple print statements until you find the location of the problematic code. It all depends. Simple problems are usually resolved more quickly the way you describe, but complex problems are where you need the debugger. Being able to use both is better than strictly adhering to any absolute principle. – wobbily_col Mar 03 '14 at 10:03
9

Hard to tell. Debugging by guessing might work if you already have an idea about what the bug is (incorrect value passed to a library function, possibly invalid SQL, etc). I admit I do it sometimes when the error itself seems small or obvious, such as "character buffer too small" - the stack trace shows me the line it failed and I don't need a debugger to solve that one.

Doing this all the time can be counterproductive, and if the first few "guesses" fail, guessing is probably the wrong problem-solving strategy and a real debugger should be called in. Normally, I'd say there's absolutely nothing wrong with using the debugger.

That being said, I've worked with tools and environments where the debugger was so difficult to get working, or so minimal and useless, that guessing was unfortunately often a better approach. I've worked with some proprietary tools that didn't even have proper debuggers. I suppose it's possible that if a person worked in such environments too long they'd eventually lose their trust in debuggers and rely solely on the guessing approach.

FrustratedWithFormsDesigner
  • 46,105
  • 7
  • 126
  • 176
8

Personally, I try to minimize the use of a debugger by:

  • using static checkers and similar compiler options which hint at possible sources of bugs just by analyzing the code
  • writing code with as few side effects as possible, in the most functional style possible, eliminating mutable state where possible
  • writing unit tests with the minimal reasonable granularity
  • not swallowing exceptions (see the sketch below)

Of course, everyone makes errors, so even when composing programs this way, if a test fails I use the debugger to inspect the value of an intermediate expression. But by adhering to the above principles, the defect is easier to locate, and debugging doesn't become a painful, nondeterministic process.
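Here is a minimal sketch of the "not swallowing exceptions" point, contrasting a catch block that hides a failure with one that preserves it (the `ConfigStore` class is hypothetical):

```java
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class ConfigStore {

    // Anti-pattern: the exception is swallowed, so the caller sees an empty
    // config and the defect surfaces far away from its real cause.
    public static String loadSwallowing(Path path) {
        try {
            return Files.readString(path);
        } catch (IOException e) {
            return ""; // silently hides the real problem
        }
    }

    // Better: translate and rethrow, keeping the original exception as the
    // cause, so the stack trace still points at the defect.
    public static String load(Path path) {
        try {
            return Files.readString(path);
        } catch (IOException e) {
            throw new UncheckedIOException("cannot read config " + path, e);
        }
    }
}
```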

thSoft
  • 109
  • 5
6

Use the debugger whenever possible. The debugger will either simply nail the issue (oh look, we didn't check for this value), or provide a great deal of context that is useful when analyzing the relevant code (wow, the stack is totally messed up; I'll bet it's a buffer overflow issue).

Kevin Hsu
  • 1,613
  • 10
  • 11
5

A debugger is a very useful tool for inspecting the state of the objects and variables in your code at run time.

As mentioned in the answers above, debugging is extremely helpful, but there are some cases where its usefulness is limited.

In my experience, I find using the debugger to be very useful because it helps to reveal false assumptions that I was making about the state of my code. Some people aren't as astute at reading through the code to find a bug, so debugging can help in revealing false assumptions that you or another developer made about the state of the code.

Maybe you expect that a parameter will never be null when passed to a method, so you never check for that case and carry on in the method as if it can never be null. The reality is that the parameter will end up being null at some point, even if you set a precondition on the method that it should never be null. It always happens eventually.
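A minimal sketch of making that precondition explicit with `java.util.Objects.requireNonNull`, so the failure happens at the boundary where the bad value arrives (the `applyDiscount` method is hypothetical):

```java
import java.util.Objects;

public class Billing {

    // Failing fast here turns a mysterious NullPointerException somewhere
    // deep in the call chain into an immediate, well-labeled one at the
    // method boundary where the null actually arrived.
    public static double applyDiscount(Double price, Double discount) {
        Objects.requireNonNull(price, "price must not be null");
        Objects.requireNonNull(discount, "discount must not be null");
        return price * (1.0 - discount);
    }
}
```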

In contrast to the debugger's usefulness in the aforementioned examples, I find it difficult and not especially useful when multi-threading (i.e., concurrency, asynchronous processing) is involved. It can help, but it is easy to lose your orientation in the multi-threaded fog when the debugger's breakpoints are being hit in one thread at point A and in a completely separate thread at point B. The developer is forced to push the new breakpoint's "thought process" onto the top of his brain's "stack" and orient himself to the code at the new breakpoint. After the relevance of breakpoint B decreases, he or she switches back to the first breakpoint and has to recall what they were looking for before breakpoint B fired. I know this may be a confusing explanation, but my point is that debugging where concurrency is involved can be a very A.D.D. (Attention Deficit Disorder) process, making it harder to stay productive in your debugging thought pattern.

Also the unpredictability of concurrent code can further distract the developer in debugging concurrent code.

In conclusion, in my honest opinion:

  • Debugging when concurrency is used = increased tendency to lose focus of "debugging thought pattern"

and

  • any other time = increased debugging productivity, because your attention isn't interrupted by unexpected breakpoints (unexpected due to race conditions).
Jarrod Nettles
  • 6,125
  • 2
  • 41
  • 45
BigSauce
  • 121
  • 1
  • 2
    +1 for bringing up the issue of debugging in concurrent environments, where the usefulness of traditional debuggers often diminishes to near zero. – Sergey Kalinichenko Jan 23 '12 at 19:27
4

I think they're being a bit too hardcore. Personally, when I run into a bug, I recheck the code and try to trace it in my mind from the program logic, because that sometimes helps me uncover other problems or side effects more easily than just using the debugger and fixing the bug where it manifests itself.

Even when I think I've nailed it, I usually debug it to make sure I'm right. When the problem is a bit more complex, I believe debugging is absolutely essential.

Also... just my opinion, but there is no excuse for not taking decent advantage of the tools a modern IDE can bring to the table. If they help you complete your job faster and more reliably, you should use them.

pcalcao
  • 191
  • 4
4

Hate to generalize, but many programmers I have met think there is only one way to solve a problem (their way). It is easy to assume that every possible test has been thought of. A different perspective can be very valuable.

Programming by trial and error can come up with some great new approaches, and catch things others have missed.

The downside: it usually takes much longer.

4

Erm, it depends on the person. Personally, I don't use debuggers much myself. When I program microcontrollers, I basically use LEDs or write data to EEPROMs to "debug" the code on them. I don't use JTAG.

When I program software for PCs or servers, I tend to use logging and lots of console output. For C-style languages, I use preprocessor directives, and in Java I use log levels.
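As a sketch of the log-level approach with `java.util.logging`: the verbose output stays in the code, but it is filtered out unless the level is turned up, much like a compile-time DEBUG flag in C:

```java
import java.util.logging.ConsoleHandler;
import java.util.logging.Level;
import java.util.logging.Logger;

public class LevelDemo {
    private static final Logger log = Logger.getLogger(LevelDemo.class.getName());

    public static void main(String[] args) {
        // Flip this level (e.g., to Level.INFO) to silence the diagnostic
        // output without touching any of the call sites below.
        ConsoleHandler handler = new ConsoleHandler();
        handler.setLevel(Level.FINE);
        log.addHandler(handler);
        log.setLevel(Level.FINE);
        log.setUseParentHandlers(false); // avoid duplicate console lines

        log.fine("diagnostic detail, shown only when FINE is enabled");
        log.info("normal progress message");
        log.warning("something looks off");
    }
}
```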

Since I don't use debuggers, would you say I'm doing something wrong? It's the editor's job to show me where I have syntax errors, and when there's a logical error, I just have to run tests.

polemon
  • 337
  • 3
  • 12
4

There is a difference between not needing to use a debugger and not knowing how to (or refusing to) use a debugger. The debugger is just one of many tools to use in tracking and fixing bugs. I've worked with developers who can puzzle it out in their head and others who think they can.

The best mix is to write your code so that it is easy to test via unit tests, and to log errors. Then you hope you never need to look at the logs or use the debugger. It is kind of like buying insurance: you hopefully never need it, but once you run into a bug that can't be solved by rechecking the code, it is too late to add proper error handling/logging and unit tests, or to learn to use a debugger.

Different tools/platforms favor different debugging techniques (debugger, logging, unit tests, etc.). As long as a developer is familiar with a few of the techniques for their platform/tool, in addition to just rechecking the code, they may be a skilled developer; but if they have only one trick when it comes to debugging, they will eventually run into a bug they cannot find or fix.

Jim McKeeth
  • 2,126
  • 15
  • 30
4

Many answers, but no mention of the Heisenbug?!

Heisenbugs occur because common attempts to debug a program, such as inserting output statements or running it in a debugger, usually modify the code, change the memory addresses of variables and the timing of its execution.

I use a debugger only in the worst cases (for hard-to-find bugs). Also, as per the best practices that many acclaimed developers/testers have advocated, it's good to unit test the code thoroughly. That way, you can cover most of the problems, and hence there would be no need to use the debugger.

bchetty
  • 101
  • 3
3

With good unit tests, and exceptions that provide you with a backtrace, you rarely have to use a debugger.

The last time I used a debugger was when I got a core file in some legacy application.

Am I being a "debugger minion" or are these guys being "too hardcore"?

Neither. They are just the kind of people who like to make their lives harder than they should be.

Dynamic
  • 5,746
  • 9
  • 45
  • 73
BЈовић
  • 13,981
  • 8
  • 61
  • 81
3

I read an argument against debugger-based debugging here recently (or was it on StackOverflow?). You should have test cases against your code. If your tests pass, your debugging probably isn't going to exercise the bug (assumption: you will debug with data similar to your test data).

On the other hand, logging is mandatory. If you pass your tests and deploy to production, you may find that you have a bug. The evidence of the bug is something that happened in the past, i.e., someone says, "How did that get in there?" If you don't have good logs, you'll never find the cause. Even a debugger may be of no use at that point, because you don't know what the data looked like that actually exercised the bug. You need to be able to debug the application from the logs.

Unfortunately, I'm paraphrasing quite a bit, and may be doing the original argument a disservice. In particular, the position "there are important debugging aids worth spending development time to support" might be orthogonal to the importance of debuggers. But the point about the difficulty of setting system state to a configuration that makes debugging useful for finding bugs struck me as something to think about.

ccoakley
  • 1,126
  • 1
  • 7
  • 6
2

A debugger is just a tool that a good developer should be able to use proficiently.

Certainly, sometimes you can know by heart where the bug might be if you know the code base. But you can also lose an entire day or week trying to find a pesky bug just by looking at the code.

In dynamically typed languages, without some kind of debugging (even if it's just dumping values to the console), guessing sometimes becomes impossible.

So to answer your question: maybe they are brilliant programmers, but their troubleshooting skills and their proficiency at hunting bugs are poor.

Christian P
  • 1,954
  • 2
  • 19
  • 24
2

It depends on the scope of the problem. If the program is small and well-divided, you can probably figure it out by looking. If the program is 4.5 million lines of code developed by a team of 100+ people over the course of several years, certain bugs will be impossible to spot.

The bug in question in said program (written in C) was a memory overwrite. A debugger with a memory breakpoint identified the offending line of code as soon as the bug appeared. But in this case there is no way someone could have read and retained all 4.5 million lines of code to identify the one spot where someone wrote past their array (plus they'd have had to know the runtime layout of the gargantuan program's memory state about ten minutes into the long run of inputs that got it to that point).

Point being: in small programs or highly modularized code, you can get away without a debugger. If the program is really big and complex, the debugger can save you lots of time. As others have said, it's a tool, and it has situations where it excels above any other method, and others where it isn't the best choice.

anon
  • 1,474
  • 8
  • 8
0

If the bug occurs on a client's computer, or on a computer whose environment is much different from yours, then setting up a debugger / remote debugger is cumbersome. So, for the cold day when you get a bug from the field, the response of 'but ... I don't have a debugger' doesn't help. Therefore, you need to develop the skill set of troubleshooting and finding the bug purely through an understanding of the code and log files.

yarony
  • 1
  • 1
0

What a bunch of nonsense: "Real programmers don't need debuggers." You might as well say that a real programmer doesn't need any IDE, just a notepad and a dull pencil. The debugger is a tool like any other that aids productivity.

Also, consider that not everyone tasked with debugging code is familiar with the code in question. Many times contractors come into an environment where they have only a general idea of what is happening. They may even be given a detailed description of an environment, or a 20-year-old schema map and a guide to arcane naming conventions (try understanding the difference between table X1234 and table X4312 with fields F1, F2, and F3 [yes, garbage like this exists] when you are new), but many times that description is wrong; otherwise, why is there a "mystery" error?

As someone new to an environment, you can spend hours or days mapping out and getting to "know" a large database for a problem area that you may fix and then never need to look at again. This is a huge waste of time and money. If you have access to the debugger, you look, see what is happening, correct it, and are gone in a matter of minutes. All of this "you don't need debuggers" hooey is just elitist puffery.

  • 2
    this [straw man](http://en.wikipedia.org/wiki/Straw_man) rant doesn't answer the question asked; nowhere is there a statement that "Real Programmers don't need Debuggers" – gnat Nov 14 '13 at 14:10