I was fascinated with debuggers a while back precisely because they were so mysterious to me. I then wrote a ptrace debugger myself [1]. It features pretty simple implementations of the most common features you'd expect in a debugger, though I made the grave mistake of formatting all the code in GNU style.
[1]: https://github.com/thass0/spray/tree/main/src
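In case anyone wants to see what the core of such a ptrace debugger boils down to, here is a rough sketch (Linux assumed, error handling omitted; this is just the general shape, not the code from [1]): the parent forks, the child requests tracing and execs the target, and the parent drives it with waitpid and PTRACE_CONT.

    #include <stdio.h>
    #include <sys/ptrace.h>
    #include <sys/types.h>
    #include <sys/wait.h>
    #include <unistd.h>

    int main(int argc, char **argv) {
        if (argc < 2) {
            fprintf(stderr, "usage: %s <program> [args...]\n", argv[0]);
            return 1;
        }
        pid_t child = fork();
        if (child == 0) {
            /* Child: ask to be traced, then exec the debuggee. The exec
               delivers a SIGTRAP, stopping the child before it runs. */
            ptrace(PTRACE_TRACEME, 0, NULL, NULL);
            execvp(argv[1], &argv[1]);
            return 1; /* only reached if exec failed */
        }
        /* Parent: a real debugger would plant breakpoints here
           (PTRACE_POKETEXT with an 0xCC byte) and inspect registers
           (PTRACE_GETREGS). This sketch just resumes until exit. */
        int status;
        waitpid(child, &status, 0);
        while (WIFSTOPPED(status)) {
            printf("child stopped by signal %d\n", WSTOPSIG(status));
            ptrace(PTRACE_CONT, child, NULL, NULL);
            waitpid(child, &status, 0);
        }
        printf("child exited\n");
        return 0;
    }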
And there is a brand new book of more than 700 pages about how to build a debugger by the same author (see https://nostarch.com/building-a-debugger); I recently bought it and can highly recommend it.
nyarlathotep_ 19 hours ago [-]
It's really top-shelf. Haven't enjoyed a book of this kind since 'Crafting Interpreters.'
There are added bonuses too--the author is evidently quite a skilled C++ programmer, so there are all sorts of little modern-cppisms in there that I was ignorant of as someone who's only ever written the language as a hobbyist.
ddelnano 18 hours ago [-]
I'll also second that the book is a fantastic read. I work in the eBPF profiling space (on CNCF Pixie), and this has helped me understand DWARF concepts that I previously took at face value in our project's implementation.
I bought the book months back, and I think I recently received an email that the final version was released.
Anyway, I can recommend it even though I'm not finished with it yet.
remexre 19 hours ago [-]
Could you check what version of DWARF it covers?
ddelnano 18 hours ago [-]
It covers version 4, but it explains differences with v5 as they come up.
remexre 18 hours ago [-]
Okay, thanks!
ddelnano 18 hours ago [-]
Even if you have experience with DWARF, I think you will learn something new from the book.
I work on CNCF Pixie, which uses DWARF for eBPF uprobes and dynamic instrumentation. While I understood how our project uses DWARF, the book made many details much clearer for me.
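To make that concrete, here is a rough sketch of the kind of DWARF walk involved, written against elfutils' libdw (just an illustration of the idea, not Pixie's actual implementation): enumerate each compilation unit, visit its top-level DIEs, and print the entry address of every DW_TAG_subprogram, which is roughly the information you need before placing a uprobe on a function.

    /* Build with: cc list_funcs.c -ldw */
    #include <dwarf.h>
    #include <elfutils/libdw.h>
    #include <fcntl.h>
    #include <stdio.h>

    int main(int argc, char **argv) {
        if (argc < 2) return 1;
        int fd = open(argv[1], O_RDONLY);
        Dwarf *dw = dwarf_begin(fd, DWARF_C_READ);
        if (!dw) { fprintf(stderr, "no DWARF info found\n"); return 1; }

        Dwarf_Off off = 0, next;
        size_t hdr_size;
        /* Walk every compilation unit ... */
        while (dwarf_nextcu(dw, off, &next, &hdr_size, NULL, NULL, NULL) == 0) {
            Dwarf_Die cu, die;
            if (dwarf_offdie(dw, off + hdr_size, &cu) && dwarf_child(&cu, &die) == 0) {
                /* ... and every top-level DIE in it, looking for functions. */
                do {
                    Dwarf_Addr entry;
                    const char *name = dwarf_diename(&die);
                    if (dwarf_tag(&die) == DW_TAG_subprogram && name &&
                        dwarf_lowpc(&die, &entry) == 0)
                        printf("%-40s 0x%llx\n", name, (unsigned long long)entry);
                } while (dwarf_siblingof(&die, &die) == 0);
            }
            off = next;
        }
        dwarf_end(dw);
        return 0;
    }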
tilne 1 day ago [-]
> But perhaps most importantly, debuggers are an intricate piece of the puzzle of the design of a development platform—a future I become more interested in every day, given the undeniable decay infecting modern computing devices and their software ecosystems.
I agree with this sentiment, yet I still wonder whether it’s fully justified. There has never been more bad software than right now, but there has never been more good software either, no?
It’s not super relevant to the main contents of the article; just a bit that caught my attention and made me think.
nyarlathotep_ 19 hours ago [-]
I think there are two 'poles' here--'good' in terms of being feature-rich, sure, there's loads of it.
'good' as in performant--an area that game dev types (rightly, IMO) criticize and harp on? There's far less of it, video games aside.
Think of the perceptible slowness of many web applications you use daily, Windows 11's, well, everything UI-related, etc.
Hell, my 3-year-old iPhone can't scroll Uber Eats at 60fps consistently. Is Uber Eats 'good'? From a functionality standpoint, yeah, of course. But is displaying a list of images and text and expecting it to scroll smoothly too much to ask?
Software can be 'good' in terms of functionality offered and 'bad' at the same time, depending on your perspective.
IIRC, Mr Fleury has a background in game dev, so his perspective is totally understandable. Modern games are remarkable feats of software.
chillfox 14 hours ago [-]
Games are not immune to bad code.
Many gamers are unhappy with the performance of modern games as fewer and fewer of them can manage 60 fps at release on even high-end hardware.
indy 1 day ago [-]
Thinking selfishly, the absolute quantity of good/bad software isn't as important as the software you have to interact with on a day-to-day basis. Good software is invisible and under-appreciated: you use it for its purpose and move on. Bad software really sticks out.
tilne 24 hours ago [-]
Good point. The amount of bad software I’m forced to interact with regularly has gone up, mostly because there are so many systems cobbled together in workflows now.
dahart 23 hours ago [-]
It’s a good question: what exactly is this decay, and why is it called undeniable? Is that even true? If I think back on what programming was like thirty years ago up through today, everything about computing has steadily improved on average: it has gotten easier, more reliable, and higher quality. All operating systems crash a lot less often than they used to. Computing devices, from desktops & laptops to phones, routers & NASes, and household appliances, have all become faster, better, and cheaper, with more features and higher utility.
There are some ways I could see the author being somewhat justified, especially when it comes to the need for debuggers. Software is getting more layers, and both the amount of it and its complexity are going up. Debuggers are super useful for helping me understand the libraries I use that I didn’t write, and how my own code interacts with them. There are also a lot more people writing code than there used to be, and because that number keeps growing, the distribution skews toward beginners. I feel like the number of languages in popular use is also going up, and the diversity of coding environments is increasing. I don’t know that I would frame all this as ‘decay’, but it does mean that we’re exposed to higher volumes of iffy code and abstractions over time.
jxjnskkzxxhx 1 day ago [-]
> there has never been more good software
Right. It's incredible that something like Linux is free. For a more recent example, look at VS Code. For an even more recent example, look at how many open-weight LLMs there are out there.
bigstrat2003 22 hours ago [-]
I definitely would not call VS Code good software, at least not overall. It's good in that it's not buggy, but it uses an absurdly high amount of system resources without any actual benefit. It is not OK that it uses 1-2 GB of memory just to open a handful of small text files.
jxjnskkzxxhx 17 hours ago [-]
Memory used is not the only metric by which to evaluate software.
My machine has enough RAM that this doesn't matter to me, and VS Code allows me to be very productive compared to, e.g., vim.
tilne 1 day ago [-]
Yeah, VS Code was one of the first examples I thought of as well. It has its own set of issues for sure, but even as a former vim fanatic, it’s amazing both from a default-experience perspective and from that of a power user.
jesse__ 19 hours ago [-]
> For a more recent example, look at VS Code
HA! Comparing VSCode to Linux is like comparing an overweight, acne-ridden drug addict that lives in his mom's basement to an astronaut with 3 PhDs. They're barely even the same species.
apples_oranges 1 day ago [-]
Very interesting topic. Once you know how they work, the next fun thing is writing code that can detect or prevent debugging (and thus stop people from circumventing your DRM or copy protection...) ;)
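The textbook Linux trick (easily defeated, but a fun starting point) relies on a process only being allowed one tracer: if ptrace(PTRACE_TRACEME) fails, something is already attached. A toy sketch:

    #include <stdio.h>
    #include <sys/ptrace.h>

    int main(void) {
        /* Only one tracer is allowed per process, so if this call fails,
           a debugger (or anything else using ptrace) is already attached. */
        if (ptrace(PTRACE_TRACEME, 0, NULL, NULL) == -1) {
            puts("debugger detected, bailing out");
            return 1;
        }
        /* Another common check is reading /proc/self/status and looking
           for a non-zero TracerPid. Both are trivial to bypass, e.g. by
           preloading a stub ptrace(), which is half the fun. */
        puts("no debugger attached, carrying on");
        return 0;
    }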
Amazing! I'll follow. For what it's worth, I owe my career to the Eclipse debugger. At some point I started using it so much that my friends started calling me "debugger". I find writing code together with a debugger extremely educational.
wiz21c 1 day ago [-]
I've used debuggers now and then. What's the state of the art nowadays (in terms of cool functionality)? (Too lazy to google or GPT it.)
zeusk 1 day ago [-]
I've used kd/windbg at Microsoft, lldb at Apple, gdb at Intel; WinDbg(Next) and kd are still my absolute favorites.
RemedyBG might also be worth looking at. It's my go-to debugger on Windows, and it's made to look and feel like the debugger that comes with Visual Studio, except that it works way faster and you can hold the "Go to next line" key and see the watch window update in real time. Unfortunately you have to pay for it and it doesn't work on Linux, but oh well.
wiz21c 7 hours ago [-]
Cool, never heard of WinDbg, looks real good.
nh23423fefe 1 day ago [-]
Yeah, WinDbg is pretty amazing. Loved writing debugger scripts to do repros and get the state just right.
https://blog.tartanllama.xyz/writing-a-linux-debugger-setup/
Eli Bendersky also wrote about debuggers (I think his post is a great place to start):
https://eli.thegreenplace.net/2011/01/23/how-debuggers-work-...
https://github.com/EpicGamesExt/raddebugger/
The coolest feature of WinDbg is time travel debugging - https://learn.microsoft.com/en-us/windows-hardware/drivers/d...
(Warning: contains me trying to play Doom :)