Tog's Paradox (votito.com)
nine_k 7 hours ago [-]
It looks almost as if humans have a nearly infinite backlog of things they would do if they only had time and capability, and a limit on the amount of effort they are capable of exerting per day. Then, once new tools increase their productivity and free up a bit of resources, they pick more desiderata from the backlog, and try to also accomplish that. Naturally they seek more tools for the newly-possible activities, and the loop closes.

This applies to any activity, leisure emphatically included. Travel became simpler → more vacations now involve flying, and thus booking tickets online, and thus comparison-shopping, aggregating reviews of faraway places, etc → omg, vacation travel is complex again. It just lets us fulfill more of the dream.

TheJoeMan 7 hours ago [-]
I like to apply a similar lesson taught to me about content to consume - with the internet, there is a nearly infinite stream of entertainment and news, and it can feel overwhelming. In the past, our predecessors could read their 1 local printed newspaper and be "finished". So you have to change your thinking: we are able to curate a high-quality stream that constantly flows by, and when we desire, we can dip in and scoop up 1 serving.

To your comment about vacations, the issue is people subconsciously want to ensure their trip value is "maximized" - oh no, do I have time to see all 10 best spots in the city? Or some historical building is closed, and you read online how it's a life-changing experience to see, and now you feel left out. So you have to push that aside, follow the 80/20 rule, and appreciate what you ARE able to do on your trip.

bioxept 4 hours ago [-]
How do you curate the content you consume? And how do you prevent yourself from consuming non-curated content and losing yourself in it?
mfro 16 minutes ago [-]
Feed readers and self-discipline, I would guess. I don't want to pay for or host a feed reader right now, and I'm bad at self-discipline, so I just limit what social media I use to HN and some blogs.
MichaelZuo 4 hours ago [-]
The interesting question is, why do so many people value spending time ‘maximizing’ with uncertain prospects more than extra time to enjoy the trip?
Eisenstein 3 hours ago [-]
I think people are different that way. When I visit a city on my own, I tend to just wander around and find a nice spot or meet some interesting people and do whatever flows from that. Whereas when I am with certain friends or family, there is always a schedule and a destination.
gukov 2 hours ago [-]
Well, travelling as a group of people almost always demands having a plan, otherwise “we’ll just wander around until we find something interesting” is a hard sell to get everyone on board.
Vedor 2 hours ago [-]
Fair point. And that's why I prefer to travel alone or just with my fiancee. It's just much easier to, well, wander as you please.
Terr_ 1 hours ago [-]
That makes me think of Dune:

> Mankind has ah only one mm-m-m science," the Count said as they picked up their parade of followers and emerged from the hall into the waiting room - a narrow space with high windows and floor of patterned white and purple tile.

> "And what science is that?" the Baron asked.

> "It's the um-m-m-ah-h science of ah-h-h discontent," the Count said.

There are various ways to interpret that, but I prefer a more Stoic or Buddhist view, where it's a bad habit but we can be better at it. (As opposed to a more god-worm-totalitarian one, where humans are dissatisfied cattle to be managed.)

a_c 1 hours ago [-]
Humans as an aggregate, yes. Individually, not so much. I've seen way too many people getting lost in life when traditional values are no longer applicable to them. They still have desire, but lose all their purpose.
stocknoob 4 hours ago [-]
One trick is to hold your desires relatively constant (remind yourself that just X years ago, you dreamed of doing Y, which you can do now for much less effort). We somehow let the cost involved in a task influence how much we can enjoy it.
delichon 7 hours ago [-]
The nearly infinite backlog also means that there is nearly infinite demand for labor, and that Luddite-adjacent arguments that labor-saving technology causes persistent underemployment are invalid.
falcor84 7 hours ago [-]
Even if we shouldn't be concerned about "persistent" underemployment, I still think that rapid "transient" unemployment due to rapidly evolving tech over the coming decades may cause significant societal upheaval that we should be concerned about - even if it's "just luddites" coming to burn our data centers.
nine_k 7 hours ago [-]
It mostly means that human desires are insatiable by construction, so humans always feel they are somehow missing out and wanting more.

Check out the works of S. Gautama on the topic; it's enlightening! :)

Epa095 7 hours ago [-]
Friendly reminder that things ended up quite shit for the actual Luddites, and the advantages only 'trickled down' after a generation or two. So I will keep being worried for everyone who works now, and their kids.
delichon 6 hours ago [-]
I don't know anyone who disputes that economic progress necessarily has winners and losers in the near term. Or that there is much we can do to cushion the blow to the losers. But to prevent the blow altogether would mean preventing the rise of powered looms and other machines that have done much for those later generations. It would be an example of ruinous empathy.
asoneth 3 hours ago [-]
> It would be an example of ruinous empathy.

Setting aside empathy, giving some thought to how we can slow the rate of change and/or cushion the fall for those affected is also in our self-interest.

As the number of people who have little left to lose grows, it destabilizes society and sets the stage for populism and revolution. Are cheap goods really so important that we're willing to leave our children to deal with another round of communism vs fascism?

lupire 7 hours ago [-]
Indeed. People who use Luddite as a slur are ignorant of history and (possibly unwittingly) repeating capitalist propaganda.
mrguyorama 3 hours ago [-]
It's just classic "Ends justify the means" thinking. It doesn't matter that 60k people will be jobless and eventually homeless because we are not "limiting" the "advancement" of society. It's okay if people suffer today because we are reducing the global suffering of tomorrow!

Never mind that there don't have to be any cross purposes between those two sides! We don't have to get our clogs out and beat up the AI machines, we just have to "take care of" the people whose jobs the AI machines made redundant!

Adequate social welfare and safety nets, significant opportunities to retrain in new (and otherwise expensive) fields, funding for re-homing people and entire towns that have been made redundant.

And also a willingness to agree that "tech advancement" isn't morally neutral by default.

posix86 8 hours ago [-]
Tog's paradox is the main reason why I suspect that generative AI will never destroy art, it will enhance it. It allows you to create artworks within minutes that until recently required hours to create and years to master. This will cause new art to emerge that pushes these new tools to the limit, again with years of study and mastery, and they will look like nothing we've been able to produce so far.
rifty 54 minutes ago [-]
My opinion: art meant to capture and communicate the emotions and truth of an exact moment will always have a place. But also, as the time cost of representing a single frame of an idea drops to fractions of a second, what it unlocks is the ability to represent ideas that are best expressed through longer time sequences. What we're waiting on is tools that better allow us to constrain, guide and sculpt the generated sequence as it evolves.

As someone who has always loved fractal and Mandelbrot zooms, infinite AI zooms are already a cool new art experience made feasible by the reduced time cost. https://www.youtube.com/watch?v=L1vrPpM4eyM

psychoslave 7 hours ago [-]
We don’t align completely with the part on mastering, at least as stated here.

That is, yes, we can make a large amount of images/videos/texts with generative AI that we would never have been able to produce otherwise, because we didn't dedicate enough time to mastering the corresponding arts. But mastering an art is only marginally about the objects you can craft. The main change it brings is in how we perceive objects and how we imagine that we can transform the world (well, at least a tiny piece of it) through that new perspective.

Of course "mastering generative AI" can be an interesting journey of its own.

Drakim 8 hours ago [-]
This is exactly what happened when digital tools like Photoshop became mainstream, where you can copy-paste, recolor, adjust, stretch and transform. It didn't obsolete the manual creation of art, but instead enhanced it. It's common for artists to sketch on paper (or tablet) and later digitize and color on their computer, achieving results faster and better than what was possible in the past.
psd1 6 hours ago [-]
I agree but also don't.

I crave authenticity. I recognise the creativity and talent in digital painting, but it lacks authenticity. I hardly feel I'll like AI art more.

Not all art needs to be high art, of course. I've bought prints of digital paintings and woodblock prints. Nonetheless, /r/ArtPorn today is like going to the cinema and being shown a compilation of TV adverts. AI art is probably not going to improve that.

Drakim 6 hours ago [-]
I totally get that, but do consider that there were probably people in the past who felt that non-analog art wasn't authentic. That it's not a real piece of art on a real piece of paper or canvas, but a mocking grid of pixels digitized to mimic the authentic but with a jagged plastic aftertaste.

Personally, I love pixel art and think it a very legitimate medium to create art in. I can understand why somebody wants art to be something physical and real, unique and non-digital, but I feel much more strongly that the advent of digital art gave more than it took.

My hope is that the same will be true for AI art.

Eisenstein 3 hours ago [-]
A.I. will not create 'art' because art is at its essence an expression of the human condition. However, it will create a lot of what is now commoditized craft that resembles certain kinds of art, like advertising, corporate design, a lot of architecture, and graphic design.
k__ 4 hours ago [-]
Fair.

However, it seems to me that most people just think they are some kind of Rick Rubin who just needs the right tools to be finally appreciated for their taste, and I don't think even a fraction of them has taste.

marcosdumay 6 hours ago [-]
The same way that photography didn't kill painting. But it killed some specific forms of it.

What we have today isn't very useful. But once it gets good, gen AI will probably have a similar impact.

bitwize 3 hours ago [-]
It depends on how the AI is used. If it's too high-level or abstract, it will produce "slop". Solving for AI generated content to be non-slop is probably very close to solving for AGI. But the statistical tools have proven useful in streamlining or automating what once were challenging processes. For example, generating an animated character still yields slop, but you can take a hand-crafted character and have an AI model analyze a live actor's movements and then rotoscope them onto the character. This makes life easier for the animator AND the actor: the actor can give a more natural performance without having to wear cumbersome motion capture gear; and the animator can apply those movements directly to the character without having to clean up motion capture data, let alone rotoscope the movements by hand as was done in the classic Disney animation days.
jodacola 8 hours ago [-]
First I've seen this, but also: this feels like a slightly long-winded explanation of what we're actually trying to achieve by improving efficiency through software, right?

Make things easier and improve productivity, because we humans can do more with technology. Especially relevant in the current AI dialogue around what it's going to do to different industries.

> Consider an HR platform that automates payroll and performance management, freeing up HR staff from routine tasks. HR teams will need to justify what they do the rest of the time...

This quote, though, is one I'd like to further mull: added software complexity that is the result of job justification.

ChrisMarshallNY 8 hours ago [-]
> added software complexity that is the result of job justification.

I have found that some folks like to be "high priest gatekeepers." They want to be The Only One That Understands The System, so they are indispensable, and it also strokes their own ego.

If possible, they might customize the system, so they are the only ones that can comprehend it, and they can often be extremely rude to folks that don't have their prowess.

I suspect that we've all run into this, at one time or another. It's fairly prevalent, in tech.

jodacola 8 hours ago [-]
> high priest gatekeepers

I like that! I'll be adding that to my back pocket for an appropriate conversation in the future.

I've absolutely experienced this, and, to a degree, I'm dealing with it now in supporting a huge enterprise platform that's a few decades old.

The really interesting (frustrating?) piece is that the "high priest gatekeepers" are on both sides of the equation - the people who have used the system for years and know the appropriate incantations and the people who have developed it for years and understand the convoluted systems on the backend.

This dynamic (along with other things, because organizations are complex) has led to a very bureaucratic organization that could be far more efficient.

ChrisMarshallNY 8 hours ago [-]
I remember an xkcd that was about releasing a version that fixed a keyboard mapping bug, and a user complaining because they had learned to compensate for the mapping error.

You can't please everyone.

psd1 8 hours ago [-]
Worse than that: https://xkcd.com/1172/
ChrisMarshallNY 7 hours ago [-]
I think that's the one I had in mind.
Suppafly 4 hours ago [-]
>I have found that some folks like to be "high priest gatekeepers." They want to be The Only One That Understands The System, so they are indispensable, and it also strokes their own ego.

I agree that that happens, but I suspect a lot of times it's not a conscious decision by the person who is doing the gatekeeping. The end result is more or less the same, but often those people feel like they are the only one that understands, not that they intentionally want to be the only one that understands.

It seems like a trivial difference, but having some empathy for these people and finding out which is which makes it possible to deal with at least a subset of these people.

Terr_ 1 hours ago [-]
> I suspect a lot of times it's not a conscious decision by the person who is doing the gatekeeping

Also, it might not always/only be about seeking status but also a safety/trauma situation, where the high-priest has a lonely duty to prevent some danger that others don't truly understand.

ChrisMarshallNY 1 hours ago [-]
That’s a really good point, and I’ll try to keep that PoV in mind, the next time I run into it.
psychoslave 8 hours ago [-]
I don't know, I tend to prefer honing my skill at crafting simpler solutions. And if some colleague comes up with something simpler than my proposal, I will rather be pleased and honored to be able to work with bright minds that can cast more light for me on the path to more elegant patterns.
lupire 6 hours ago [-]
Considering your choice of metaphor, it's clear that the phenomenon existed long before "tech". It is a hallmark of bureaucracy through the ages.
ChrisMarshallNY 6 hours ago [-]
Oh, yeah. Basic human nature.
oersted 8 hours ago [-]
There's a flip side to this that I think is quite positive.

When you build a tool that improves efficiency, the users either do more with the same effort or do the same with less effort. The former might be more constructive, both are good.

When the tool is particularly effective, it enables use cases that were not even considered before because they just took too much effort. That's fantastic, but I suppose that's the paradox described here: the new use case will come with new requirements, and now there are new things to make more efficient. That's what progress is all about, isn't it?

thuridas 2 hours ago [-]
As a developer, that is a heartwarming thought.
silvestrov 5 hours ago [-]
It is somewhat similar to the Jevons paradox: technological progress increases the efficiency with which a resource is used, but the falling cost of use induces enough extra demand that resource use increases rather than decreases.

E.g. people who purchase cars with improved fuel economy end up driving so much more that they use even more fuel than they would have with a less efficient car.

https://en.wikipedia.org/wiki/Jevons_paradox
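A quick back-of-the-envelope illustration of how the rebound can more than offset the efficiency gain, with entirely made-up numbers (a sketch, not data from the article or this thread):

    # Hypothetical numbers only: a 25% efficiency gain wiped out by 50% more driving.
    def fuel_used(km_per_year: float, litres_per_100km: float) -> float:
        """Total litres of fuel burned in a year."""
        return km_per_year * litres_per_100km / 100

    before = fuel_used(km_per_year=10_000, litres_per_100km=8.0)  # 800 L with the old car
    after = fuel_used(km_per_year=15_000, litres_per_100km=6.0)   # 900 L with the "efficient" car

    print(before, after)  # 800.0 900.0 -- more efficient car, more total fuel

Whether the rebound actually exceeds the savings depends on how elastic demand is; the numbers above are chosen purely to show that it can.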

Eisenstein 3 hours ago [-]
That 'paradox' is pretty much just basic economics when dealing with an elastic product, though. 'When efficiency gains reduce the price of a good that people would buy more of if it were cheaper, consumption of that good will rise'.
eesmith 2 hours ago [-]
Yes, and the provided Wikipedia link goes into detail about the mechanisms behind the rebound effect.
ChrisMarshallNY 8 hours ago [-]
This has been my experience.

There's a friction between delivering the highest reasonable Quality and also allowing the initial users to provide feedback and help us adjust the UX.

I deal with that, by using what I call "Constant Beta." My projects are designed to reach "beta" (or betta), as quickly as possible, so people can start actually using them, even if incomplete. Since I write Apple stuff, I can use Apple's TestFlight. I tend to start using it very early in the project, and there's often hundreds of releases, by the time it actually ships.

I have found that users will almost never provide feedback, no matter how easy I make it, or how much I beg, so I need to infer their usage, via requests for help, or features that I can tell are being used/not used.

The stuff I write has extremely strict privacy requirements, so I often can't collect metrics, or have to anonymize the shit out of the ones I do collect, so there's a lot of tea-leaves-reading, in my work.

eesmith 2 hours ago [-]
Back in the 1990s we called it 'evolutionary prototyping'. https://en.wikipedia.org/wiki/Software_prototyping#Evolution...
kayo_20211030 8 hours ago [-]
Really good piece, with which I agree.

Parkinson's law seems off to me w.r.t. Tog's paradox. Were it true, Tog would be silent because nothing would ever get more complex.

> that work expands so to fill the time available for its completion

If it's restated as "that the worker expands time spent so as to fill the time available to them", it comes in line. And is more in line with my observational experience. People like to do things in their job. If the "job" gets easier, people invent "job+", and Tog's on the money.

marcosdumay 6 hours ago [-]
Parkinson's work expansion very often takes the form of new features, better finishing, higher quality, etc.

It doesn't necessarily imply people creating bureaucracy out of thin air to justify their existence. It just means that people don't leave extra time being "extra".

The busy-work explanation isn't even consistent, because people mostly can't create busy-work in a project scope. It's something that comes from the overall processes.

kayo_20211030 2 hours ago [-]
Possibly we're talking past each other. Persons with a focus on a narrow work goal, when given extra time, will probably come up with extra, productive goals using that extra time. It's not bureaucracy (at least for the honest), and it's not busy-work. It's just better work; but it's outside the parameters of the originally defined task. Often, they'll ask the tool to help them. Not for the piece-work element, but for a slightly expanded purpose; maybe simply for organizational goals that aren't in any handbook. Sometimes it's vanity, sometimes it's lulz; it just seems pretty human.
konstruction 4 hours ago [-]
I miss a mention of Parkinson's law, stated in 1955 by the naval historian Cyril Northcote Parkinson in "The Economist":

"work expands so as to fill the time available for its completion"

https://en.wikipedia.org/wiki/Parkinson%27s_law

analog31 7 hours ago [-]
Ironically, programmers could also be "users" in this paradox, since we use complex software tools too. As tools for making software get simpler and more productive, programmers demand more complex features (languages, IDEs, paradigms, frameworks, etc).
nonce42 6 hours ago [-]
A similar paradox applies to interpersonal relationships: in many cases, if you try to reduce someone's workload, they will take on more tasks and end up where they started. E.g. Wife: "I'm too busy; you need to do more". Husband: cleans more. Wife: takes on the PTA fundraising auction. "I'm too busy; you need to do more."
ncruces 5 hours ago [-]
That stops when husband can legitimately claim he does more than wife.
LaundroMat 5 hours ago [-]
Does the increase in complexity also bring a comparable increase in value?

Do the gains of increased complexity justify the investments they require?

Even if they don't, we don't often dare _reduce_ complexity, marginally decreasing gains while massively decreasing cost.

un1970ix 6 hours ago [-]
We have created many innovations to speed up tasks and simplify certain jobs. These improvements are always marketed as ways to create more time for family, leisure, and personal interests. But they didn't actually free up time for these purposes. Instead, the extra time is often filled with even more work.
rcarmo 5 hours ago [-]
Oh man. I still have Tog On Interface in a prominent place in my bookshelf.

Sometimes I take it out and wonder at how thoughtful he was about UX and how messy and inconsistent things have become since then.

d--b 8 hours ago [-]
The examples he gives aren't very clear. Let's just state one that's fairly obvious to me:

Back in the day, someone introduced tabs in browsers, which made it possible to browse several websites in a single browser window. People loved it so much that they started running browsers with dozens of open tabs. But then this caused more pain, because now people had too many tabs to navigate. And this sparked the creation of tab managers, which introduce more complexity into how people browse the web than there used to be.

falcor84 7 hours ago [-]
A couple of decades ago, browsing the web was considered a specific "activity" that you do on a computer for a specific need, and then you close the browser window when you're done.

A few decades earlier, using a personal computer at all was considered to be a specific activity, and people didn't really "know they needed" to have multiple applications running at the same time.

Tog's paradox seems to explain this evolution really well.

MatthiasPortzel 4 hours ago [-]
It can also be a result of the XY problem. A person wants to do Y, they imagine software that handles the hardest part of Y (call that part X), and they commission software that does X. They then commission a ton of other small parts of Y to be added to the software over the course of years. Whereas an all-inclusive piece of software to do Y from the beginning would have been simpler.

This issue can be avoided by product leads with vision for the entire problem.

lupire 6 hours ago [-]
This is the weightlifter's paradox.

Lifting weights never gets easier. Lifting weights and getting strong makes the weights heavier.

psychoslave 8 hours ago [-]
Great, now I'm fully confident that humanity is on the road to maintaining its existence through all the struggles cosmological challenges might throw at it. It might do so in the form of an intergalactic bureaucracy, though.

Now, please don't disappoint this bright new hope; go back to your work while I sit on my sofa watching that prophecy happen.

m3kw9 5 hours ago [-]
A lot of the time, adding new features or complexity is due to competitors filling a new need. Essentially the same thing, but driven differently.
pphysch 6 hours ago [-]
This is why correctness-oriented programming methods, while popular among academics, have always struggled and always will struggle with mainstream adoption.

A corollary of Tog's Paradox is that the definition of "correct" in a given program is always changing (as requirements evolve).

There are exceptions, like rocket science.

fsflover 7 hours ago [-]
Doesn't it contradict the Unix philosophy of "Make each program do one thing well"? I don't see how cp and ls are getting infinitely more complex with time.
pphysch 6 hours ago [-]
Dan Luu: The growth of command line options, 1979-2017

https://danluu.com/cli-complexity/

tightbookkeeper 5 hours ago [-]
They demand more commands
crazygringo 8 hours ago [-]
Not really sure what makes this a "paradox"?

Seems like a lot of words to say that, when you deliver the features users want, then they will continue to want more features. (And all these features keep making users more productive/efficient, so it's a good thing.)

And, of course, more features means more software complexity.

But I'm struggling to see a paradox here, or even what's supposed to be the novel observation.

ChrisMarshallNY 8 hours ago [-]
What makes it a "paradox," is the classic Waterfall model that most companies (even ones that say they are "agile") use for development.

In Waterfall, the design and requirements are "one and done." They are not supposed to be revisited and iterated.

Once we have gone past "thresholds," we are not supposed to go back, without many staff meetings and begging to Higher Ups.

I have found that I need to make my entire product lifecycle iterable. I need to have a "done" state, so that I can get something out, and that needs to be extremely high Quality, but I also design my projects to be re-entered, and re-implemented, with the expectation that I'll be rapidly jumping back in, and making fairly significant changes (not just bug fixing).

crazygringo 8 hours ago [-]
> In Waterfall, the design and requirements are "one and done." They are not supposed to be revisited and iterated.

The article doesn't seem to be about waterfall though? But even if it were, I don't see what's novel here. In waterfall, the design and requirements are "one and done" for version 1.0. But then you plan a version 2.0 in response to the new features desired, and then 3.0, and so forth. In any case, the article doesn't even mention waterfall or agile, so I don't think it's about that.

ChrisMarshallNY 8 hours ago [-]
The article isn't really about any particular model. It's about product development, in general.

> ...But then you plan a version...

Yeah, but these are painful. I know of which I speak, as I worked for decades in Waterfall companies.

Rapid iteration at High Quality is really difficult, but it's also the only way I've found that delivers truly useful software (the products that I write). It's a great deal more difficult to do this with hardware, though.

I worked for hardware companies, for most of my career, and suffered hardware development methodologies forced upon software. It was painful.

Since working on my own, I have developed what I call "Evolutionary Design" techniques, and they seem to be working, but I also work at a much more humble scale, than I used to.

crazygringo 8 hours ago [-]
Sure, I totally agree with you. That's why waterfall gets a bad name. It just seems like waterfall vs. agile is a totally separate topic from the article.
ChrisMarshallNY 8 hours ago [-]
But the prevalence of Waterfall is the elephant in the room that interferes with rapid iteration, which is what the article is about.

We can ignore it, if we like, but it's still there, making big giant ploppers on the coffee table.

adzicg 8 hours ago [-]
It's a paradox in the sense that reducing complexity actually ends up increasing complexity; Tognazzini originally proposed it as a complaint against Tesler's Law ("conservation of complexity"). Tesler observed that complexity stays the same when people try to reduce it. Tognazzini suggested that the complexity doesn't stay the same, but actually increases.
crazygringo 8 hours ago [-]
I can understand the concept of conservation of complexity -- that you can reduce steps ("complexity") for the user by automating those steps in the software and making the software more complex.

But then you don't need to build more features. The "conservation of complexity" obviously assumes that the feature set is static. Once you allow the feature set to grow, obviously complexity will increase.

So I still not only don't see the paradox, I continue to just see common sense. I don't see what's supposed to be new here.

sharpshadow 7 hours ago [-]
Technically, the complexity is taken on by somebody to reduce the complexity for somebody else. If it were 1:1 it would stay the same, but since one solution can be copied to many, overall complexity is reduced. But the reduced complexity gets filled again. So reducing complexity increases complexity. That's the paradox.
crazygringo 2 hours ago [-]
> But the reduced complexity gets filled again. So reducing complexity increases complexity. That's the paradox.

But it doesn't increase if you just don't add new features. Nobody is forcing you to.

Reducing complexity doesn't add complexity. It simply doesn't. It's the adding further features that does. Which you have a choice over.

sharpshadow 1 hours ago [-]
Software is built upon and ideas are spread; when something exists, it will be extended. In this context it will get more complex, that's how I understand it at least.

I guess to some extent it's in human nature to never be sated.

elijahjohnston 8 hours ago [-]
I think the paradox comes from the assumption that one of the goals of a software product is to simplify a human task. Where the paradox comes into play is that it actually increases the complexity of that human task.

I could be wrong!

falcor84 7 hours ago [-]
As I see it, the paradox is that while users might say that they want the software to be "simpler" to use, what they actually want is more complex software where each particular action is simpler.

For us developers it may not look like a paradox, since we're so used to constantly adding levels of abstraction, but to me it does seem like a paradox on the UX front.

marcosdumay 6 hours ago [-]
Those famous "paradoxes" that are actually true are only paradoxes in the context of other widely believed ideas. And, of course, those other ideas are false.

This one is a paradox in the context of the idea that the way to create software is to make a complete specification first, then implement it.

tokai 8 hours ago [-]
Yeah, it's really not a paradox. There's nothing contradictory in the observation. Increased efficiency making it possible to introduce more complex tasks is not inexplicable at all.
lupire 6 hours ago [-]
It's a "veridical paradox": a statement that seems impossible to a naive mind.