As a C# programmer I use the debugger every single day, since it's as natural and easy to use as just running the application. I've grown spoiled, actually; when I program in Go or Rust I really miss the "it just works" debugger.
Same here. The Visual Studio debugger is excellent, and there's never a day that goes by without me using it.
I can't imagine programming without regularly pausing execution to inspect intermediate variables, run some quick checks in the immediate window, or set conditional breakpoints. I'm always a bit surprised when I remember there are people who don't work like that.
I use debuggers all day, every day. If I'm running something in development, there's a very good chance I have it connected to a debugger. I also use one whenever I encounter unexpected behavior in production (we use our own product for work, too).
The profiler is a lot more specific and I haven't used it in a while.
I have a tendency to just use console logging, and only use debuggers when things are starting to get hairy.
Really often, ever since Turbo Debugger in the '90s. There's no other way to trace your code, step over/into, watch variables, etc. For compiled programs it's necessary. For JavaScript I use print, lol.
Don't forget being able to watch the stack in realtime, and run your code backwards to roll back its state!
I've used a debugger only a handful of times in the last decade or so. The projects I work on have complex stacks, are distributed, etc. The effort to get all that running in a debugger is simply not worth it; logging and testing will do 99.9% of the time. Profiling, on the other hand, now that's useful, especially on prod or under prod load.
I seldom use profilers because I seldom need to. It's only useful to run a profiler if your program has a well-defined performance issue (like "the request should have an average response time of X ms but has one of Y ms" or "90% of the requests should have a response after X ms but only Y% actually do").
On the other hand, I use a debugger all the time. I rarely start any program I work on without a debugger attached. Even if I'm just running a smoke test, if it fails I want to be able to dig right into the issue without having to restart the program in debug mode. The only situation where I routinely run code without a debugger is the red-green-refactor cycle with unit tests, because I'll need to re-run those multiple times with a debugger anyway if there are unexpected reds...
What enables me? Well, there's this prominent bug-shaped icon in my IDE right beside the play button, and there's DevTools in Chrome, which comes with the debugger for JS...
Running your code without a debugger is only useful if you want to actually use it, or if you're so sure there aren't any issues that you might as well skip running the code altogether...
I find debuggers are used a lot more on confusing legacy code.
Lately, monitoring tools such as OpenTelemetry have replaced a lot of my use of profilers.
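For reference, the tracing side of that is only a few lines in Python with the opentelemetry-api package. A minimal sketch, with the exporter/SDK setup omitted (without it the tracer is a no-op) and the tracer/span names made up:

```python
from opentelemetry import trace  # pip install opentelemetry-api

tracer = trace.get_tracer("myapp")  # "myapp" is a placeholder name

def handle_request(user_id: int) -> None:
    # Each span is a timed unit of work; nested spans form a trace,
    # which gives the "where did the time go" view a profiler would.
    with tracer.start_as_current_span("handle_request") as span:
        span.set_attribute("user.id", user_id)
        with tracer.start_as_current_span("load_data"):
            pass  # the slow part you want to see on the timeline
```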
At my last job, doing firmware for datacenter devices, almost never. JTAG debugging can be useful if you can figure out how to reproduce the problem on the bench, but (a) it's really only useful if the relevant question is "what is the state of the system" and (b) it often isn't possible outside of the lab. My experience with firmware is that most bugs end up being solved by poring over the code or datasheets/errata and having a good long think (which is exactly as effective as it sounds -- one of the reasons I left that job). The cases I've encountered where a debugger would be genuinely useful are almost always more practically served by printf debugging.
Profilers aren't really a thing when you have kilobytes of RAM. It can be done, but you're building all the infrastructure by hand (the same is true of debugger support for things like threads). Just like with printf debugging, it's generally more practical to instrument the interesting bits manually.
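For comparison, the hosted-language version of instrumenting the interesting bits by hand can be this small. A Python sketch purely for illustration (on firmware you'd read a cycle counter or toggle a GPIO rather than call print):

```python
import time

def timed(label: str):
    """Hand-rolled instrumentation: wrap only the interesting bits."""
    def wrap(fn):
        def inner(*args, **kwargs):
            start = time.perf_counter_ns()
            try:
                return fn(*args, **kwargs)
            finally:
                elapsed_ms = (time.perf_counter_ns() - start) / 1e6
                print(f"{label}: {elapsed_ms:.3f} ms")
        return inner
    return wrap

@timed("parse_frame")  # made-up function, just to show the pattern
def parse_frame(buf: bytes) -> int:
    return sum(buf)

parse_frame(bytes(range(256)))
```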
I recently started doing xeyes debugging. We have so many debug logs that trying to find your log for a background task takes a non-zero amount of time. So just inserting system("xeyes"); is actually way easier for instant feedback, and you can use system("xmessage msg") if you need a message.
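In Python the same trick is one line per checkpoint. A rough sketch, assuming an X11 session with xeyes and xmessage installed:

```python
import subprocess

def suspicious_branch() -> None:
    # If the eyes pop up, this code path ran; no log spelunking needed.
    subprocess.Popen(["xeyes"])
    # Or with a message attached:
    subprocess.Popen(["xmessage", "hit the retry path"])

suspicious_branch()
```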
That makes me so happy.
The PyCharm debugger has been great for me recently. I love the feature where you can drop into an IPython REPL and interact with your program state.
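For anyone without PyCharm, a rough stdlib equivalent is pausing with breakpoint() and typing interact at the pdb prompt, which opens a plain Python REPL over the paused frame's locals. A small sketch with made-up data:

```python
def reconcile(balances: dict[str, float]) -> float:
    total = sum(balances.values())
    # Execution pauses here; at the (Pdb) prompt, type `interact`
    # to get a Python REPL with `balances` and `total` in scope.
    breakpoint()
    return total

reconcile({"alice": 10.0, "bob": -2.5})
```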
All the time. I deal with a lot of C# code that makes and responds to HTTP API requests, and being able to check whether requests and responses are properly formed without having to slap print statements everywhere is a godsend.
One of the things I've learned in programming is that if you have to use debuggers too often, it may be a good time to re-evaluate how you develop a project.
- Did you misunderstand the design pattern?
- Was there something you didn't understand?
- Maybe it's an indication that you need to document more and do some project design before committing to an implementation?
- Is the way you write code more prone to bugs?
- Are there any libraries or tools that can help you alleviate this?
By fixing your practices and making them less prone to bugs, you won't have to resort to a debugger as often.
As for profilers, it really depends. Generally, if you're conservative about concurrency (applying lock-free techniques where possible and resorting to a mutex/semaphore where needed), you should be OK. As for overall performance, the rule of thumb is: the less code you run to do the work, the better. You can see what a program will actually do in a language like C, but you'll have a harder time making that evaluation in higher-level languages. So the general wisdom is to defer heavy computation to a low-level language like C and have the high-level language call into that C function for the performance-intensive operations.
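For instance, here's a minimal sketch of that call-into-C pattern in Python using ctypes, assuming a Unix-like system where the C math library can be located. The real win comes from your own compiled hot loops, but the mechanics are the same:

```python
import ctypes
import ctypes.util

# Locate and load the system C math library (platform-dependent).
libm = ctypes.CDLL(ctypes.util.find_library("m"))
libm.sqrt.argtypes = [ctypes.c_double]
libm.sqrt.restype = ctypes.c_double

print(libm.sqrt(2.0))  # 1.4142135623730951, computed in C
```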
That's an interesting point about depending too heavily on a debugger. I haven't run into anyone too dependent on it, but I could see that happening.
To me, debuggers offer a tighter dev loop when there's something you're stuck on. They also let you 'grok' a call stack in an unfamiliar codebase. "Did this function get called?" "What's in this variable?" etc.
I agree with that. I always see it as a critical necessity to document everything when more than one developer works on a project. It's like making a trade:
Spend time and effort debugging
Or
Spend time documenting and maintaining the docs with the help of ChatGPT
With ChatGPT, documenting seems to get cheaper, while the same can't be said for debugging.
These days? Never, but I'm mostly writing Ansible and Terraform at work.
When I was writing any code at previous jobs? Also never. It was one part being highly restricted in what we were allowed to use (I didn't feel like trying to get gdb through the approval process; it was far easier to just use print statements inside conditionals) and one part the languages all being scripting languages.
For microcontrollers, quite often. Mainly because visibility is quite poor, you're often trying to do stupid things, problems tend to be localized, and JTAG is easier than a firmware upload.
For other applications, rarely. Debuggers help when you don't understand what's going on at a micro level, which is more common with less experience or when the code is more complex due to other constraints.
Applications running on full-fledged operating systems often have plenty of log output, and it's trivial to add more, formatted however you need. You can view a broad slice of the application with printouts and iteratively tune those prints to what you need, vs. a debugger, which is better suited to observing a small slice of the application.
This usually depends on which industry you work in and what language you're using :)
I work in gamedev, C++, and I ALWAYS use a debugger. There's no running the game, or even the editor, without the debugger connected, whether you currently need it or not. You always launch the project through the debugger, so if anything comes up you can investigate immediately.
The profiler is used any time there's a performance problem.
What enables me to use them is probably that this is very much true for the whole industry, so software is built with that in mind.
For example, we use special "print" statements for some errors that, if a debugger is attached, automatically stop the program so you can investigate. Without a debugger, they just write the error to the log.
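A rough Python analogue of that pattern, purely to illustrate the idea (the real thing is engine-specific C++): trace-based debuggers such as pdb or PyCharm's install a trace function, so sys.gettrace() works as a crude attached-debugger check:

```python
import logging
import sys

log = logging.getLogger(__name__)

def soft_error(msg: str) -> None:
    """Log the error; if a trace-based debugger is attached, stop here."""
    log.error(msg)
    if sys.gettrace() is not None:  # pdb/PyCharm set a trace function
        breakpoint()
```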
There is no Docker; the app usually runs on your local hardware. Consoles are also built with debugger support that you connect to from your PC, so it's very easy to use. Even connecting to another PC on the local network, for example an artist's or tester's hardware, is possible from your computer without a problem. We have all the tools prepared for that.
I'm starting to get into the habit of reaching for debuggers more and more as opposed to just print()ing everything and hoping for the best.
Profilers, on the other hand, I still have no idea how to apply (and, more importantly, how to read the results of) properly, so that's something I'll need to learn.
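For anyone in the same boat, about the smallest useful profiler run in Python is the stdlib cProfile; the function here is a made-up stand-in:

```python
import cProfile
import pstats

def busy_work(n: int) -> int:  # stand-in for real code
    return sum(i * i for i in range(n))

with cProfile.Profile() as prof:  # context-manager form, Python 3.8+
    busy_work(1_000_000)

# Sort by cumulative time and read top-down: the first rows are
# where the program spent its life, including time in callees.
pstats.Stats(prof).sort_stats("cumulative").print_stats(10)
```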
I used to just use print statements and avoided debuggers because I didn't understand them. But as I've gotten more experienced, the debugger has become my first choice for debugging (go figure, haha).
Always, but I’m a former Googler, so performance was always a huge concern with each and every frontend change we made.
Rendering something to a page without errors should be the starting goal, after which you shift focus to readability, accessibility, maintainability, interoperability - all the other stuff that actually matters more but is opaque to users. In most cases, though, it's treated as the end goal, and all that other stuff isn't considered at all.
IMO, the web would be a lot better if frontend devs spent more time learning how to use their tools instead of logging everything to the console.
My primary languages are Java (for work), JavaScript (for work), and C/C++ (for hobbies). Earlier in my career, I used the debugger a lot to help figure out what was going on while my applications were running, but I really don't reach for it as a tool anymore. Now I'll typically gravitate towards either logging things (at a debug level that I can turn on and off at runtime) or writing tests to help me organize my thoughts and expectations.
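The runtime-toggle part is plain stdlib machinery in Python; a small sketch with a made-up logger name:

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("myapp.orders")  # placeholder logger name

def place_order(payload: dict) -> None:
    log.debug("raw payload: %s", payload)  # silent until DEBUG is on
    log.info("order placed for %s", payload.get("sku"))

place_order({"sku": "A-1"})  # only the INFO line prints
# Flip one logger to DEBUG at runtime, e.g. from a signal handler
# or an admin endpoint, without restarting the app:
logging.getLogger("myapp.orders").setLevel(logging.DEBUG)
place_order({"sku": "B-2"})  # now the DEBUG line prints too
```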
I don't remember when, or if ever, I made the deliberate decision to switch my methodology, but I feel like doing things in logging or tests gives me two things. Six months later, I can look back and remind myself that I was having trouble in that area, and it can remind me how I fixed it too. Those things can also serve as a sanity check: if I'm changing something in that area, I don't end up breaking it again (at least not in the same way).
With that said, I will reach for the debugger as a prototyping tool. IntelliJ IDEA (and probably other Java debuggers) lets you execute statements on the fly. I'll run my app up to the point where I know what I want to do but don't know exactly how to do it, then pause it and try different statements to see what comes back; that has sped up my development quite a bit. Logging and testing don't really apply to that kind of exploration, and pausing to run arbitrary statements beats the other options in how quickly and minimally the exploration can be done.
Every single day. They’re built into the IDE. It’s easier to use them than to not use them.