> we talk about programming like it is about writing code, but the code ends up being less important than the architecture, and the architecture ends up being less important than social issues.
A thousand times this! This puts into words something that's been lurking in the back of my mind for a very long time.
nuclearnice3 4 days ago [-]
Strongly agree. Peopleware 1987 [1]
> The first chapter of the book claims, "The major problems of our work are not so much technological as sociological in nature". The book approaches sociological or 'political' problems such as group chemistry and team jelling, "flow time" and quiet in the work environment, and the high cost of turnover
[1] https://en.wikipedia.org/wiki/Peopleware:_Productive_Project...
I’ve been drumming this for so long now, even before I heard of (let alone read) this book.
I feel that the development of psychology and sociology has been lost on the workplace, where it isn’t well applied. Executives want everyone to be widgets except themselves, even when study after study shows that for companies to perform optimally their workers must feel well compensated and well valued, and have balanced freedom in the workplace, chances for advancement, etc.
In many respects you could apply psychology and sociology to how products should/could behave as well, which I’m sure some companies have taken seriously (due to the monetary component) at least in some periods of their lifecycle, like Apple under Steve Jobs after his comeback.
pydry 4 days ago [-]
>Executives want everyone to be widgets except themselves
Of course. This maximizes their relative power within the company.
Some executives are focused on the health of a company as a whole but not many. To most of them the pie can be assumed to be a fixed size and their job is to take as much of it as possible.
zemvpferreira 4 days ago [-]
For businesses or business areas where excellent isn’t necessary and good will do, this attitude can even be considered to be in the best interest of the company. The more fungible employees are made, the less bargaining power they have.
nuclearnice3 2 days ago [-]
Fair point. You could also imagine it's an easier management task to have fungible employees. Sam quits. No risk to the company. Our employees are fungible. Sarah can step right in.
zemvpferreira 14 hours ago [-]
I guess so. As a business owner/administrator it’s easy to see how your interests align in this way. As a worker your priority should be the opposite of course.
BOOSTERHIDROGEN 4 days ago [-]
What if the company has significant constraints on its financial health?
lmm 4 days ago [-]
Then it's all the more important to avoid unnecessary employee turnover.
mst 4 days ago [-]
People tend to vastly underestimate how much the time needed for a new hire to come up to speed costs the employer.
This is true even of (theoretically simple) things like retail jobs, because even if you're proficient in the basic skill set on day one, coming up to speed on the rhythm of a specific workplace still takes time.
I'm buggered if I can remember where I saw it, but there was a study once that showed that (in that specific instance, I have no clue as to whether or not it generalises) a minimum wage increase actually *saved* retail/service employers in the area money overall, just because the reduced churn meant that over the lifetime of an employee with the company the fact that said lifetime was longer meant they were getting enough more value per hour out of each employee to more than compensate for the higher cost per hour.
Of course the study could always have been wrong, but it didn't seem obviously so back when I looked at it and it at the very least seems plausible to me.
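(The arithmetic behind that claim is easy to play with. A toy back-of-the-envelope model, with every number invented purely for illustration:)

```python
# Toy model of the churn trade-off described above. Every number here is
# invented for illustration; the real study's figures are not known to me.

def cost_per_productive_hour(wage, tenure_months, ramp_months,
                             hiring_cost, hours_per_month=160):
    """Average cost per productive hour over one employee's tenure.

    Assumes productivity ramps linearly from 0 to 100% over ramp_months
    (so half the ramp-up hours are "lost"), plus a one-off hiring cost.
    """
    total_hours = tenure_months * hours_per_month
    productive_hours = total_hours - 0.5 * ramp_months * hours_per_month
    total_cost = wage * total_hours + hiring_cost
    return total_cost / productive_hours

# Low wage, high churn: $10/h, people leave after 6 months.
low_wage = cost_per_productive_hour(10, tenure_months=6, ramp_months=2, hiring_cost=3000)
# Higher wage, low churn: $12/h, people stay 18 months.
high_wage = cost_per_productive_hour(12, tenure_months=18, ramp_months=2, hiring_cost=3000)
print(f"{low_wage:.2f} vs {high_wage:.2f}")  # the higher wage comes out cheaper per productive hour
```

With these made-up inputs the $12/h employer pays less per productive hour than the $10/h one, which is the shape of the effect the study would have found.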
mihaaly 4 days ago [-]
Considering that programming and the tools used for it are for humans, not computers, and that apart from the most trivial things more than one person is necessary to make something that works on/with computer(s), it is no surprise that SE is much more a social science than many would like to admit or feel comfortable with, over-emphasizing its natural-science part to the level of eventual failure (on the product level, which is aimed at addressing the needs of people). Probably because the social sciences are very fluid and much less reliable than the natural sciences, we have an inner tendency to avoid the social bit, or to handle it on a very primitive level? I do not know; this is a feeling. So much focus on the atomic details of technology, yet the group effort of the product is still rubbish too many times.
> Organizations which design systems (in the broad sense used here) are constrained to produce designs which are copies of the communication structures of these organizations. — Melvin E. Conway, How Do Committees Invent?
transpute 4 days ago [-]
Source code repos could have USER.md and DEVELOPER.md files to record social context.
lou1306 4 days ago [-]
But again, that is at best infrastructure documentation, not code. Unless you dilute the term "code" until it loses nearly all utility.
transpute 3 days ago [-]
User (social) org structure and Developer (social) org structure are unavoidable requirements which constrain code implementation, as much as processor speed and memory capacity.
Swizec 4 days ago [-]
In my experience roughly 80% of technical issues are because 2 people (or teams) didn’t want to just sit down together and talk it out.
Agingcoder 1 days ago [-]
Yes, because ‘they’ (the other team) are not doing it right, so not talking is best.
zelphirkalt 4 days ago [-]
It is not a dichotomy though, as a good architecture manages to fulfill the requirements people have for the system _and_ keeps it understandable for human beings.
IgorPartola 4 days ago [-]
This precisely describes why Google Glass failed.
mattigames 4 days ago [-]
Elaborate?
IgorPartola 4 days ago [-]
Doesn’t matter how good the platform was, it wasn’t a socially acceptable product.
mst 4 days ago [-]
I *really* wanted basically "google glass without the frigging camera."
Being able to overlay an 80x24 terminal over one of my eyes (and drive it with a bluetooth keyboard or whatever) would've been fantastic for me.
Unfortunately for me, this is enough of an outlier desire that it doesn't seem likely anybody will ever want to sell me that at a price point I can convince myself of.
https://www.youtube.com/watch?v=bckifBIPlHI&t=136s
Not sure I can convince myself to spend the money to find out, but they're definitely going in the direction I was thinking of.
Ta.
The Success and Failure of Ninja - https://news.ycombinator.com/item?id=23157783 - May 2020 (38 comments)
(Reposts are fine after a year or so! links to past threads are just to satisfy extra-curious readers)
defer 4 days ago [-]
This is hilarious to me:
> Android, which uses it for some large component of the system that I've never quite understood
Ninja is really a huge part of AOSP. The build system initially used makefiles; things got complex really fast, with a custom declarative build system (soong) and a failed/aborted migration to Bazel. Google developed kati (https://github.com/google/kati), which converts makefiles to ninja build files (or should I say file), which really is huge.
Going from makefiles/soong to ninja is painful, taking several minutes even on a modern machine, but it simply flies once ninja picks it up.
zelphirkalt 4 days ago [-]
As someone who has not used Ninja: what advantage is there compared to Makefiles? And is it worth introducing yet another tool to translate one to the other, especially when the Ninja files are that huge and possibly human-unreadable?
defer 3 days ago [-]
They are huge because android has hundreds of smallish makefiles but the generated ninja file is a single flat file.
The advantage in android is that the different build systems will generate ninja, so they can interoperate.
flqn 3 days ago [-]
The Ninja files being that huge is likely more to do with the Android build environment or the tool that generates them. The main advantages of Ninja as a build executor are that the language is simple and it processes the build graph very quickly.
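For anyone who hasn't seen one: a hand-written Ninja file is deliberately tiny, just rules (command templates) and build statements (graph edges). A minimal sketch, with hypothetical file names:

```ninja
# A rule is a command template; each build statement instantiates it
# for one edge of the dependency graph.
rule cc
  command = gcc -MMD -MF $out.d -c $in -o $out
  depfile = $out.d

rule link
  command = gcc $in -o $out

build foo.o: cc foo.c
build bar.o: cc bar.c
build app: link foo.o bar.o
```

There's almost no logic in the language itself (no conditionals, no loops), which is exactly what makes parsing and graph loading so fast.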
high_priest 4 days ago [-]
> I also believe that programmers feel latency and it affects their mood even if they don't notice it. (Google has recently done some research in this area that kinda confirmed my belief, here's hoping they'll publish it publicly!)
Anyone knows if it happened? Has the google research on latency been published?
I don't think anybody talking about "latency" without a qualifier is thinking about build latency.
But it's a nice article. The idea that giving up on waiting for a delay follows a simple exponential distribution is something I had never thought about. (And now I'm fixated on understanding why... Something must have biased me against it.)
rendaw 3 days ago [-]
It's an article about a fast build system, why wouldn't it be about build latency?
jacobgorm 3 days ago [-]
The 400ms Doherty Threshold applies to builds too.
forrestthewoods 4 days ago [-]
Ninja is pretty popular with gamedevs.
I was amused by this line:
> But Windows is still a huge platform in terms of developers, and those developers are starved for tools.
As a primarily Windows dev I feel that it is poor Linux devs who are starved for tools! Living life without a good debugger (Visual Studio) or profiler (Superluminal) is so tragic. ;(
It does feel like in recent years the gap between the two platforms is increasingly minimal. I definitely like all the Rust utilities that generally work crossplatform for example.
marcosdumay 4 days ago [-]
Every time somebody puts "Visual Studio" in the same sentence as "good" I get that strange feeling that the Universe is a simulation and the other people interacting here come from a different one.
I can't come up with any other explanation. I can't think of any interaction with it that I would describe as "good". I can think of a few "minimally ok" ones, but debugging isn't one of them. (But at least in the 2022 version the debugger isn't full of bugs anymore. Maybe that's what this is about.)
dahart 4 days ago [-]
I don’t like working in Visual Studio much, and I am a big fan of gdb too (and of Chrome’s debugger when working in JavaScript), but for C++ debugging, the Visual Studio debugger is excellent, and has been near the top of the class for a long time, compared to other debuggers. That is the explanation.
I don’t doubt there are warts, but for you, what’s missing or sub-par in VS that is better elsewhere? What debuggers do you consider better? Gdb is also excellent, but in a different way. Gdb is programmable, and that maybe makes it more powerful. (I don’t know if VS debugging is scriptable; I think it wasn’t last time I tried.) But gdb’s learning curve, lack of UI (even with tui), and lack of discoverability is a major impediment to its use. You mentioned interaction, and interaction is what holds back gdb.
marcosdumay 3 days ago [-]
For a start, more speed would be great. It's slow to start, slow to stop, and it slows down the code so much that it has semantic implications.
Also, it can fail loudly if it loses control of the target process, or if the target process fails before it finishes connecting. And it should not keep running after the UI reports that it has finished.
dahart 3 days ago [-]
So I’m still left curious which debuggers are much better than VS?
What do you mean about so slow there are semantic implications? How does execution speed change meaning? Can you give an example? And are you talking about VS specifically, or just debugging in general? Gdb can be extremely slow when debugging too, and besides that, simply turning on symbols and turning off optimizations can be a major reason for slowdowns.
For the connection issues, I rarely if ever see that with VS. Usually I’m launching my executable from the debugger. I’m not generally doing remote debugging, or attach-to-process debugging — is that what you’re talking about? Certainly all debuggers have those kinds of issues with remote debugging or attaching to processes. Are these issues better in some other debugger you use? If so, I’m certainly curious to hear about it, I would love to learn & use something that’s superior.
Agingcoder 1 days ago [-]
Pernosco
https://pernos.co/
Is vastly better than any debugger I’ve ever used.
Gdb is horrible; the VS debugger is quite good but has strange limitations.
forrestthewoods 4 days ago [-]
Visual Studio debugger for C++ is still best in class. It’s far from perfect. But Linux doesn’t even have anything that attempts to compete. Command line GDB and LLDB are not comparable.
3836293648 4 days ago [-]
In what world do you live where the Visual Studio debugger is considered good? Or have they finally got around to fixing it? Last I tried, it was unbearably slow, like seconds to step a single line.
forrestthewoods 4 days ago [-]
The world where I’ve used it professionally to debug C++ for almost 20 years?
It’s certainly not perfect. But “seconds to step a single line” is not normal. Certainly not what I experience. Even when debugging very large code bases like Unreal Engine.
71bw 4 days ago [-]
Sounds like a hardware issue, works fine on my machine. No speed issues at all.
rnewme 4 days ago [-]
I do admit I haven't used it for almost a decade but wasn't vs debugger (at least for cpp) considered top notch and unrivaled? What's better nowadays?
pjmlp 4 days ago [-]
There is a community that thinks UNIX is the be-all, end-all of developer tools, and then they miss the forest for the trees.
I know UNIX pretty well, since being introduced to Xenix in 1993, used plenty of variants, and yet my main use of WSL is to run Linux docker containers and nothing else.
mgaunard 4 days ago [-]
I switched to samurai for the few things I have that still used ninja; it's an improvement in every possible way.
But regardless, I think those kinds of build systems are just wrong. What I want from a build system is to hash the content of all the transitive inputs and look up if it exists or not in a registry.
dikei 4 days ago [-]
Yes, basically any build system that supports distributed caching uses digests instead of timestamps when checking for modifications: Bazel, Pants, Buck, etc.
They're all hugely complex though.
For local-only builds, I think SCons and Waf both use hashes for change detection.
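The digest idea is simple to sketch. A toy illustration of content-based dirtiness checking (not how Bazel et al. are actually implemented; just the core idea):

```python
# Toy illustration of digest-based dirtiness checking: an input is "dirty"
# when its content hash differs from the last recorded one, so a mere
# timestamp change (touch) triggers no rebuild.
import hashlib
import json

def digest(path):
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def dirty_inputs(inputs, manifest_path):
    """Return inputs whose content changed since the manifest was written."""
    try:
        with open(manifest_path) as f:
            recorded = json.load(f)
    except FileNotFoundError:
        recorded = {}  # no previous build: everything is dirty
    return [p for p in inputs if recorded.get(p) != digest(p)]

def record_inputs(inputs, manifest_path):
    """Snapshot the current digests after a successful build."""
    with open(manifest_path, "w") as f:
        json.dump({p: digest(p) for p in inputs}, f)
```

The timestamp-based equivalent would spuriously rebuild after a `touch`; the hash-based check doesn't, at the cost of reading every input.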
mgaunard 4 days ago [-]
Any build system is overly generic and it's up to the user to define how things should be built. So what happens is that at the end of the day every project ends up with a poorly made build system layered on top of a third-party generic tool but without abstracting away its complexity or abstractions.
My opinion is that a build system should figure out on its own how to build files, that is its job. The last thing I want to do is to define targets or dependencies. All of this is already implicit from the code itself and is useless busywork. I should just point it to a file or directory and that is it.
I prefer to just build my own build systems, bespoke to each project or environment, that just does what it should, no more and no less, leveraging the conventions in place and neatly integrating with team workflows (debugging, sanitizers, continuous integration, release, packaging, deployment, etc.)
I find that when you do that, there isn't much value in using any of the tools, they just add noise or make things slow. Running a graph of compiler and linker commands in parallel is fairly trivial and can be done in 20 lines of Python. The hard part is figuring out where the dependencies live, which versions to pick, and how the code implies those dependencies; for which the tools do nothing.
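He's not far off on the line count. A sketch of such a parallel graph runner (illustrative only: no cycle detection, no incremental skipping, minimal error handling):

```python
# Minimal parallel task-graph runner: run each command once all of its
# dependencies have finished. A sketch of the idea, not production code.
from concurrent.futures import ThreadPoolExecutor, wait, FIRST_COMPLETED
import subprocess

def run_graph(tasks, deps, workers=4):
    """tasks: {name: argv list}; deps: {name: [names it depends on]}."""
    done, pending = set(), {}
    with ThreadPoolExecutor(max_workers=workers) as pool:
        while len(done) < len(tasks):
            # Submit every task whose dependencies are all done.
            for name in tasks:
                if name not in done and name not in pending \
                        and all(d in done for d in deps.get(name, [])):
                    pending[name] = pool.submit(subprocess.run, tasks[name], check=True)
            # Block until at least one running task finishes.
            finished, _ = wait(pending.values(), return_when=FIRST_COMPLETED)
            for name in [n for n, f in pending.items() if f in finished]:
                pending.pop(name).result()  # re-raises if the command failed
                done.add(name)
    return done
```

The hard parts he mentions (discovering dependencies, picking versions) are indeed absent here; this is only the scheduling half of the job.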
dikei 4 days ago [-]
The problem with a handcrafted build system is that only the author can effectively maintain it. When they move on, someone has to spend the time ripping it out and replacing it with something more standard.
I've been on both ends of this situation and would rather not do it again, so I'll use whatever is the de-facto standard, but you do you.
mgaunard 4 days ago [-]
Any project effectively has a handcrafted build system, whether it's built on top of CMake, Bazel, Scons or built from scratch doesn't really affect that.
And if it's doing everything from scratch, it's more likely to be simple and self-contained, making it easier to maintain.
Sesse__ 4 days ago [-]
You might be interested in n2, from the author of ninja.
TOGoS 4 days ago [-]
I think that was the idea behind NetKernel.
I've built something similar, a Deno library called "TDAR"[1], and it works well, but it takes some work to wrap up all the command-line tools that expect to work in some mutable filesystem so that you can pretend you're calling pure functions.
[1] I haven't got around to pulling it out of the parent project[2], but I talked about it in this youtube video: https://youtu.be/sty29o8sUKI
[2] If you're interested in this kind of thing you could poke me to open up the source for that thing. togos zero zero at gee mail dot comb
chubot 4 days ago [-]
What’s better about Samurai? I thought it was a compatible subset of ninja
Also, “not the thing I wanted” doesn’t mean “wrong”, simply because there are other people in the world with different preferences
mgaunard 4 days ago [-]
One thing in particular that's always been a problem with Ninja is the output. It does too much buffering, removes colors without being able to force them back, and in general leads to an experience where for me it's not usable since I want to pipe its output to a pager. When I used ninja I needed to maintain builds with all sorts of patches to fix it. With samurai it just did the right thing out of the box.
bonzini 4 days ago [-]
Is Samurai still alive? I have sent a pull request to improve signal handling but it has been sitting ignored for over half a year.
ccache is just a hack to make traditional build systems less stupid.
Good build systems have native support for these things.
pjmlp 4 days ago [-]
Given that ninja is required for C++20 modules when using CMake, it is going to stay around for quite a bit.
edflsafoiewq 4 days ago [-]
Most interesting point to me
> You must often compromise between correctness and convenience or performance and you should be intentional when you choose a point along that continuum. I find some programmers are inflexible when considering this dynamic, where it's somehow obvious that one of those concerns dominates, but in my experience the interplay is pretty subtle; for example, a tool that trades off correctness for convenience might overall produce a more correct ecosystem than a more correct but less convenient alternative, if programmers end up avoiding the latter.
bakudanen 4 days ago [-]
This is a goldmine. This is why Python, Go, and TypeScript/JavaScript are way more popular than Haskell/OCaml.
zX41ZdbW 4 days ago [-]
> Relatedly, please forgive me for the embarrassing name.
The name is great!
PS. It's possible to make it even faster if we implement this: https://github.com/ninja-build/ninja/issues/2157 But you explained in the article that the tool intentionally lacks state, even tiny hints from previous runs.
grobibi 4 days ago [-]
I thought this was going to be about people buying fewer air fryers.
Krastan 4 days ago [-]
I thought this was going to be about the Fortnite streamer
morning-coffee 2 days ago [-]
I thought it was going to be about the lead singer of Die Antwoord.
airstrike 4 days ago [-]
I thought of the smoothie blenders first too, but I can't see how they would ever have failed given how great they are. My life has changed since buying the first such blender about 4 months ago
firesteelrain 4 days ago [-]
Oh don’t call the Ninja a blender - there is a giant thread on one of the main FB groups. OP is getting ripped
Spivak 4 days ago [-]
Is there some fun tea here? Ninja themselves describe them as blenders, has the community mythologized them into something else?
firesteelrain 4 days ago [-]
More like an ice shaver that adds air is what the community likes to call it
Because blenders don’t turn things into an ice cream texture
ultrafez 4 days ago [-]
We're conflating the Ninja Creami with Ninja's smoothie makers and blenders - they are separate product lines
firesteelrain 4 days ago [-]
Ok
bakudanen 4 days ago [-]
I had my stint with build systems: Nx and Bazel, to name a few. In the past I was always the go-to guy to configure this stuff.
OP said that ninja is small enough to be implemented in your favorite programming language. I wonder if there is a step-by-step tutorial on creating your own build system?
emmanueloga_ 3 days ago [-]
Short answer: write a ninja configuration generator instead.
> ... Where other build systems are high-level languages, Ninja aims to be an assembler.
> ... Ninja is intended to be used with a separate program generating its input files.
> ... Ninja is pretty easy to implement for the fun 20% of it and the remaining 80% is "just" some fiddly details.
There are many ninja generators out there already [1] but writing a simple, custom one shouldn't be too hard [2] and could make sense for some projects.
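To make "shouldn't be too hard" concrete, here's a minimal, hypothetical generator: it walks a directory of C sources and emits one build.ninja for them. Names and layout are invented; real generators (CMake, Meson, gn) do vastly more:

```python
# A minimal, hypothetical Ninja-file generator: one compile rule per .c file
# plus a final link step. Purely illustrative.
import os

def write_ninja(src_dir, out_path, cc="gcc", exe="app"):
    sources = sorted(f for f in os.listdir(src_dir) if f.endswith(".c"))
    objs = [s[:-2] + ".o" for s in sources]
    with open(out_path, "w") as f:
        f.write(f"cc = {cc}\n\n")
        f.write("rule cc\n"
                "  command = $cc -MMD -MF $out.d -c $in -o $out\n"
                "  depfile = $out.d\n\n")
        f.write("rule link\n  command = $cc $in -o $out\n\n")
        for src, obj in zip(sources, objs):
            f.write(f"build {obj}: cc {os.path.join(src_dir, src)}\n")
        f.write(f"build {exe}: link {' '.join(objs)}\n")
    return out_path
```

Run `ninja -f build.ninja` on the result and Ninja handles parallelism, header dependencies (via the depfiles), and incrementality for you.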
BTW, ninja is great but I wish the configuration file had used a more standard format, easier to parse and generate from any language. JSON would have been a better option I think, given the abundance of tooling around it.
Man, I was so afraid this was going to be about Fortnite. Turns out it was a fantastic read. I feel really sad but unsurprised about his description of what it's like to be an Open Source maintainer.
einpoklum 4 days ago [-]
### Statistics ###
ninja has ~26 kloc and ~3,100 commits, only a quarter of them by the original author (although weighted by lines changed, his share is higher). Interesting!
> users of ninja ... all Meson projects, which appears to increasingly be the build system used in the free software world;
So, AFAICT, that hasn't turned out to be the case.
> the code ends up being less important than the architecture, and the architecture ends up being less important than social issues.
Well... sometimes. Other times, the fact that there's good code that does something goes a very long way, and people live with the architectural faults. And as for the social issues - they rarely stand in opposition to the code itself.
> Some pieces of Ninja took struggle to get to and then are obvious in retrospect. I think this is true of much of math
Yup. And some of the rest of math becomes obvious when someone re-derives it using alternative, more convenient/powerful techniques.
> I think the reason so few succeed at this is that it's just too tempting to mix the layers.
As an author of a library that also focuses on being a "layer" of sorts (https://github.com/eyalroz/cuda-api-wrappers/), I struggle with this temptation a lot! Especially when, like the author says, the boundaries of the layers are not as clear as one might imagine.
> I strongly believe that iteration time has a huge impact on programmer satisfaction
I'm pretty certain that the vast majority of developers perform 10x more incremental builds than full builds. So it's not just satisfaction; incremental builds are most of what we do. They're also the builds we wait out rather than possibly going off to look for some distraction.
OTOH, the article doesn't mention interaction with build-artifact caching schemes, which lessen the difference between building from scratch and building incrementally.
> Peter Collingbourne found Ninja and did the work to plug it into the much more popular CMake ... If anyone is responsible for making Ninja succeed out there in the real world, Peter is due the credit.
It is so gratifying when a person you didn't know makes your software project that much more impactful! Makes you really feel optimistic again about humanity and socialism and stuff.
a_t48 4 days ago [-]
I’m going to have to give your CUDA wrapper a look later. :)
einpoklum 4 days ago [-]
I should say that, unlike the author of ninja, I am _very_ interested in user complaints and criticism, even if they're not fully articulated and respectful. I _need_ contradiction and opposition to go beyond the bounds of my own conceptions as an almost-always-sole developer and sole maintainer of the library. I may not accept/agree with everything, but I'll at least know to take the concerns into consideration. And I've already refactored quite a bit over the years based on use cases users have pointed out to me.
a_t48 4 days ago [-]
Same :) I started down the rabbit hole of abstracting CUDA for our robotics framework, but it’s not really something I want to maintain right now.
mst 4 days ago [-]
Some people are wired to find disrespectful complaints and unconstructive criticism genuinely upsetting (which is unfortunate in a bunch of ways, but OTOH the same personality traits often also make for somebody who's fantastic at handholding polite newbies through learning something).
I am excellent at finding such things either hilarious or grounds to say "well, if you're going to be like that, I can't say I care about your opinion, piss off" and moving on to the next complaint in the hopes I can get useful feedback out of that one.
But there's a fair swathe of newbies where I have to step back and let other people help them instead, because if I try I'll end up accidentally driving them off and feeling like a dickhead afterwards :D
(I have tried and failed repeatedly at "Not Being a Bastard," so I've settled for leveling up in "Being a Self Aware Bastard" instead; at least that reduces how often I end up causing *un*intentional offence ;)
burrish 4 days ago [-]
Damn and here I was expecting real Ninjas
4 days ago [-]
Rendered at 06:18:42 GMT+0000 (Coordinated Universal Time) with Vercel.
A thousand times this! This puts into words something that's been lurking in the back of my mind for a very long time.
> The first chapter of the book claims, "The major problems of our work are not so much technological as sociological in nature". The book approaches sociological or 'political' problems such as group chemistry and team jelling, "flow time" and quiet in the work environment, and the high cost of turnover
[1] https://en.wikipedia.org/wiki/Peopleware:_Productive_Project...
I feel that the development of psychology and sociology has been lost on the workplace and it isn’t well applied. Executives want everyone to be widgets except themselves, even when study after study shows that for companies to perform optimally their workers must feel well compensated, well valued, balanced freedom in the workplace, chances for advancement etc.
In many respects you could apply psychology and sociology to how products should / could behave etc. as well, which I’m sure due to the monetary component some companies have taken seriously at least in some periods of their lifecycle, like Apple under Steve Jobs in his comeback
Of course. This maximizes their relative power within the company.
Some executives are focused on the health of a company as a whole but not many. To most of them the pie can be assumed to be a fixed size and their job is to take as much of it as possible.
This is true even of (theoretically simple) things like retail jobs, because even if you're proficient in the basic skill set on day one, coming up to speed on the rhythm of a specific workplace still takes time.
I'm buggered if I can remember where I saw it, but there was a study once that showed that (in that specific instance, I have no clue as to whether or not it generalises) a minimum wage increase actually *saved* retail/service employers in the area money overall, just because the reduced churn meant that over the lifetime of an employee with the company the fact that said lifetime was longer meant they were getting enough more value per hour out of each employee to more than compensate for the higher cost per hour.
Of course the study could always have been wrong, but it didn't seem obviously so back when I looked at it and it at the very least seems plausible to me.
> Organizations which design systems (in the broad sense used here) are constrained to produce designs which are copies of the communication structures of these organizations. — Melvin E. Conway, How Do Committees Invent?
Being able to overlay an 80x24 terminal over one of my eyes (and drive it with a bluetooth keyboard or whatever) would've been fantastic for me.
Unfortunately for me, this is enough of an outlier desire that it doesn't seem likely anybody will ever want to sell me that at a price point I can convince myself of.
https://www.youtube.com/watch?v=bckifBIPlHI&t=136s
Not sure I can convince myself to spend the money to find out, but they're definitely going in the direction I was thinking of.
Ta.
The Success and Failure of Ninja - https://news.ycombinator.com/item?id=23157783 - May 2020 (38 comments)
(Reposts are fine after a year or so! links to past threads are just to satisfy extra-curious readers)
The advantage in android is that the different build systems will generate ninja, so they can interoperate.
Anyone knows if it happened? Has the google research on latency been published?
But it's a nice article. The idea that giving-up on waiting for a delay has a simple exponential distribution is something that I never thought. (And now I'm fixed on understanding why... Something must have biased me against it.)
I was amused by this line:
> But Windows is still a huge platform in terms of developers, and those developers are starved for tools.
As a primarily Windows dev I feel that it is poor Linux devs who are starved for tools! Living life without a good debugger (Visual Studio) or profiler (Superluminal) is so tragic. ;(
It does feel like in recent years the gap between the two platforms is increasingly minimal. I definitely like all the Rust utilities that generally work crossplatform for example.
I can't make for any other explanation. I can't think on any interaction with it that I would describe as "good". I can think of a few "minimally ok", but debugging isn't one of them. (But at least on the 2022 the debugger isn't full of bugs anymore. Maybe that's what this is about.)
I don’t doubt there are warts, but for you what’s missing or sub-par from VS that is better elsewhere? What debuggers do you consider better? Gdb is also excellent, but in a different way. Gdb is programmable and that maybe makes it more powerful. (I don’t know if VS debugging is scriptable, I think it wasn’t last time I tried.) But gdb’s learning curve, lack of UI (even with tui), and lack of discoverability is a major impediment to it’s use. You mentioned interaction, and interaction is what holds back gdb.
Also, it can fail loudly if it loses the control of the target process or if the target process fails before it finishes connecting. Also, it should not run after the UI reports that it finished.
What do you mean about so slow there are semantic implications? How does execution speed change meaning? Can you give an example? And are you talking about VS specifically, or just debugging in general? Gdb can be extremely slow when debugging too, and besides that, simply turning on symbols and turning off optimizations can be a major reason for slowdowns.
For the connection issues, I rarely if ever see that with VS. Usually I’m launching my executable from the debugger. I’m not generally doing remote debugging, or attach-to-process debugging — is that what you’re talking about? Certainly all debuggers have those kinds of issues with remote debugging or attaching to processes. Are these issues better in some other debugger you use? If so, I’m certainly curious to hear about it, I would love to learn & use something that’s superior.
Gdb is horrible, vs debugger is quite good but has strange limitations
It’s certainly not perfect. But “seconds to step a single line” is not normal. Certainly not what I experience. Even when debugging very large code bases like Unreal Engine.
I know UNIX pretty well, since being introduced to Xenix in 1993, used plenty of variants, and yet my main use of WSL is to run Linux docker containers and nothing else.
But regardless, I think those kinds of build systems are just wrong. What I want from a build system is to hash the content of all the transitive inputs and look up if it exists or not in a registry.
They're all hugely complex though.
For local build only, I think SCons and Waf both use hash for changes detection.
My opinion is that a build system should figure out on its own how to build files, that is its job. The last thing I want to do is to define targets or dependencies. All of this is already implicit from the code itself and is useless busywork. I should just point it to a file or directory and that is it.
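As an illustration of "implicit from the code itself": for C-like languages, the local dependency set can be recovered just by walking quoted `#include` lines. A rough sketch (my own, not a real tool; it only handles `#include "..."` relative to one root):

```python
import re
from pathlib import Path

INCLUDE_RE = re.compile(r'^\s*#include\s+"([^"]+)"', re.M)

def implicit_deps(source: str, root: Path) -> set[Path]:
    """Collect the transitive local dependencies of `source` by
    following quoted #include lines."""
    seen: set[Path] = set()
    stack = [root / source]
    while stack:
        f = stack.pop()
        if f in seen or not f.exists():
            continue
        seen.add(f)
        for inc in INCLUDE_RE.findall(f.read_text()):
            stack.append(root / inc)
    return seen
```

A real implementation would also handle `<...>` includes, include paths, and generated headers, but the point stands: none of this needs to be declared by hand.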
I prefer to just build my own build systems, bespoke to each project or environment, that do exactly what they should, no more and no less, leveraging the conventions in place and integrating neatly with team workflows (debugging, sanitizers, continuous integration, release, packaging, deployment, etc.).
I find that when you do that, there isn't much value in using any of the tools, they just add noise or make things slow. Running a graph of compiler and linker commands in parallel is fairly trivial and can be done in 20 lines of Python. The hard part is figuring out where the dependencies live, which versions to pick, and how the code implies those dependencies; for which the tools do nothing.
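For what it's worth, here is roughly what those "20 lines of Python" might look like (the `run_graph` name and graph shape are my own invention; it assumes an acyclic graph, and a deep graph could exhaust the thread pool since blocked jobs hold worker threads):

```python
import subprocess
from concurrent.futures import ThreadPoolExecutor, Future

def run_graph(graph: dict[str, tuple[list[str], list[str]]]) -> None:
    """graph maps target -> (dependency targets, command argv).
    Each command runs as soon as its dependencies have finished."""
    done: dict[str, Future] = {}
    pool = ThreadPoolExecutor()

    def schedule(target: str) -> Future:
        if target in done:
            return done[target]
        deps, argv = graph[target]
        futures = [schedule(d) for d in deps]
        def job():
            for f in futures:            # wait for dependencies first
                f.result()
            subprocess.run(argv, check=True)
        done[target] = pool.submit(job)
        return done[target]

    for t in graph:
        schedule(t)
    for f in done.values():
        f.result()                       # propagate any failure
    pool.shutdown()
```

The scheduling really is the trivial part; everything this sketch leaves out (finding the dependencies, picking versions) is the part the comment says the tools don't help with anyway.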
I've been on both ends of this situation and would rather not do it again, so I'll use whatever is the de-facto standard, but you do you.
And if it's doing everything from scratch, it's more likely to be simple and self-contained, making it easier to maintain.
I've built something similar, a Deno library called "TDAR"[1], and it works well, but it takes some work to wrap up all the command-line tools that expect to work in some mutable filesystem so that you can pretend you're calling pure functions.
[1] I haven't got around to pulling it out of the parent project[2], but I talked about it in this youtube video: https://youtu.be/sty29o8sUKI
[2] If you're interested in this kind of thing you could poke me to open up the source for that thing. togos zero zero at gee mail dot comb
Also, “not the thing I wanted” doesn’t mean “wrong”, simply because there are other people in the world with different preferences.
Good build systems have native support for these things.
> You must often compromise between correctness and convenience or performance and you should be intentional when you choose a point along that continuum. I find some programmers are inflexible when considering this dynamic, where it's somehow obvious that one of those concerns dominates, but in my experience the interplay is pretty subtle; for example, a tool that trades off correctness for convenience might overall produce a more correct ecosystem than a more correct but less convenient alternative, if programmers end up avoiding the latter.
The name is great!
PS. It's possible to make it even faster if we implement this: https://github.com/ninja-build/ninja/issues/2157 But you explained in the article that the tool intentionally lacks state, even tiny hints from previous runs.
Because blenders don’t turn things into an ice cream texture
OP said that ninja is small enough to be implemented in your favorite programming language. I wonder if there is a step-by-step tutorial for creating your own build system?
> ... Where other build systems are high-level languages, Ninja aims to be an assembler.
> ... Ninja is intended to be used with a separate program generating its input files.
> ... Ninja is pretty easy to implement for the fun 20% of it and the remaining 80% is "just" some fiddly details.
There are many ninja generators out there already [1] but writing a simple, custom one shouldn't be too hard [2] and could make sense for some projects.
BTW, ninja is great but I wish the configuration file had used a more standard format, easier to parse and generate from any language. JSON would have been a better option I think, given the abundance of tooling around it.
--
1: https://github.com/ninja-build/ninja/wiki/List-of-generators...
2: https://ninja-build.org/manual.html#ref_ninja_file
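For context, here is roughly what a hand-written Ninja file looks like (a made-up sketch, not from any real project), which shows the custom line-oriented format in question:

```ninja
# build.ninja -- variables, one rule, and build statements
cflags = -O2 -Wall

rule cc
  command = gcc $cflags -c $in -o $out
  description = CC $out

build foo.o: cc foo.c
build bar.o: cc bar.c
```

The format is easy enough to emit from a generator, but parsing it (indentation-scoped rule bodies, `$`-variable expansion, escaping) needs a custom parser in every language, which is presumably the JSON argument.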
ninja has ~26 kloc, ~3,100 commits, and only a quarter of them by the original author (although weighted by lines of code changed, his share is higher). Interesting!
https://github.com/ninja-build/ninja/graphs/contributors
### Bunch of other comments ###
> users of ninja ... all Meson projects, which appears to increasingly be the build system used in the free software world;
So, AFAICT, that hasn't turned out to be the case.
> the code ends up being less important than the architecture, and the architecture ends up being less important than social issues.
Well... sometimes. Other times, the fact that there's good code that does something goes a very long way, and people live with the architectural faults. And as for the social issues - they rarely stand in opposition to the code itself.
> Some pieces of Ninja took struggle to get to and then are obvious in retrospect. I think this is true of much of math
Yup. And some of the rest of math becomes obvious when someone re-derives it using alternative, more convenient/powerful techniques.
> I think the reason so few succeed at this is that it's just too tempting to mix the layers.
As an author of a library that also focuses on being a "layer" of sorts (https://github.com/eyalroz/cuda-api-wrappers/), I struggle with this temptation a lot! Especially when, like the author says, the boundaries of the layers are not as clear as one might imagine.
> I strongly believe that iteration time has a huge impact on programmer satisfaction
I'm pretty certain that the vast majority of developers perform 10x more incremental builds than full builds. So it's not just satisfaction; incremental building is most of what we do. Those are also the builds we wait out at our desks rather than wandering off to find some distraction:
https://xkcd.com/303/
OTOH, the article doesn't mention interaction with build artifact caching schemes, which lessen the difference between building from scratch and building incrementally.
> Peter Collingbourne found Ninja and did the work to plug it into the much more popular CMake ... If anyone is responsible for making Ninja succeed out there in the real world, Peter is due the credit.
It is so gratifying when a person you didn't know makes your software project that much more impactful! Makes you really feel optimistic again about humanity and socialism and stuff.
I am excellent at finding such things either hilarious or grounds to say "well, if you're going to be like that, I can't say I care about your opinion, piss off" and moving on to the next complaint in the hopes I can get useful feedback out of that one.
But there's a fair swathe of newbies where I have to step back and let other people help them instead, because if I try I'll end up accidentally driving them off and feeling like a dickhead afterwards :D
(I have tried and failed repeatedly at "Not Being a Bastard," so I've settled for leveling up in "Being a Self Aware Bastard" instead; at least that reduces how often I end up causing *un*intentional offence ;)