The strength of feeling within the Linux community towards Windows and Microsoft back then was intense. I remember turning up to my university CS course and witnessing the formation of a Linux clique - if you ran Windows you weren't really welcome! Dual-booting might get you reluctantly accepted though.
I wonder if the same thing still goes on. It probably was quite an effective filter for the nerdiest and most obsessive people back in 1999, and it probably still is, but somehow that kind of mindset seems a bit outdated today. If it does still exist, I'd be interested to know what kind of status macOS has! Literally nobody on the CS course had a Mac, despite the cool and colourful iMacs being very popular at the time.
flohofwoe 4 days ago [-]
> but somehow that kind of mindset seems a bit outdated today
It's hard to understand today because Windows is nowhere near the overlord it was in the late 90s and early 2000s. Back then, alternatives like Amiga, Atari, SGI, Sun and even the Mac were dead or on their deathbed, and it looked like there would only be one operating system in the future - one that wasn't exactly a triumph of engineering.
Thankfully that future didn't happen and Windows is essentially only relevant for running PC games today.
9dev 4 days ago [-]
> Thankfully that future didn't happen and Windows is essentially only relevant for running PC games today.
What an odd statement. You’re ignoring the vast majority of all office computers all over the world here, the on-prem infrastructure of most companies, and cheap home computers. Not everyone can afford to buy a Mac; Office tools and huge swaths of domain-specific tools only run on Windows; and despite what Linux enthusiasts want you to believe, Linux doesn’t support existing Windows software well enough for common business use.
You may be referring exclusively to the needs of rich software engineers in western countries here, but the overwhelming majority of the world's client computers are firmly in Microsoft's hands.
skeeter2020 4 days ago [-]
A massive amount of computing - especially outside of western & corporate offices - has moved to mobile. It's pretty wild as a developer who learned the trade when everything ran Windows to now target users who do their jobs with only a phone and don't even own what we historically call a "computer". Eventually those Windows binaries and IIS apps will die as well, and MS has given up on Windows serving all those requests. They've done an amazing job of moving their business away from the OS in my view.
7thaccount 4 days ago [-]
Oh I'm sure a lot has, but there is still a ton of Windows-only software out there. All the commercial tools engineers use to study the power grid are almost always Windows-only (at least the 10 major vendors I can think of offhand). I'm sure many other industries are similar. Also, user laptops at a lot of corporations are Windows-only. It's changing, of course, but not quickly.
bongodongobob 4 days ago [-]
And those mobile devices are managed with Intune/Azure.
flohofwoe 4 days ago [-]
Most relevant software today runs in a web browser or is a cross-platform application, so even when you have a Windows laptop, you probably spend most of your time in a web browser or a webview wrapped in an app, not in a native, Windows-exclusive application.
That's a very different situation from 20 years ago, when a lot of important software only existed for Windows.
E.g. today you can just ditch Windows, install Linux on that same old laptop, and not lose anything important.
dTal 4 days ago [-]
Found the programmer...
You have an extremely skewed view of what "relevant software" is, if you believe this. The second you try to use a computer for anything professional, besides nerd stuff and a few key "creative" areas that Apple has cultural cachet in, like music production and graphic design, you are absolutely going to find yourself reliant on some completely entrenched "industry standard" piece of software that's been around for many decades and only runs on Microsoft Windows. Autodesk still sells 3ds Max for a couple grand, and it's Windows-only. Hell, even TurboTax only runs on Windows. And if you manage to dodge that, there's bound to be some critical piece of hardware that can only be controlled from a Windows computer.
The biggest propaganda win that Microsoft pulled was getting nerds - the biggest loudmouths about software freedom - off their case. 90s style free software fanaticism is basically dead. Meanwhile the rest of the world is absolutely still locked in to Windows. Know how you can tell? Because Windows is now widely regarded as a pile of shit, riddled with adware even out of the box ("Candy Crush in the Start menu") and yet people still use it, because they have to.
>E.g. today you can just ditch Windows, install Linux on that same old laptop, and not lose anything important.
I challenge you to try this with a nontechnical friend who uses a computer for work. They will be screaming at you to put it back like it was within the week. I say this as a die-hard Linux user.
chasd00 4 days ago [-]
> The biggest propaganda win that Microsoft pulled was getting nerds - the biggest loudmouths about software freedom - off their case
it still amazes me how the newer generation of technical people see Microsoft as non-evil or, in some cases, even the good guy. The greatest trick the devil ever pulled was re-convincing the world he doesn't exist.
edit: wasn't the Google motto "don't be evil" basically meant to mean "don't be Microsoft"?
bongodongobob 4 days ago [-]
Try easily finding admins and helpdesk staff to manage your 1000+ seat org using something other than Active Directory and M365. MS absolutely owns this space.
ThrowawayR2 3 days ago [-]
> "The biggest propaganda win that Microsoft pulled was getting nerds - the biggest loudmouths about software freedom - off their case."
Microsoft didn't lift a finger. In the early years, while the FOSS movement still had developer fanaticism behind it, it delivered Linux, which was an adequate substitute for Windows, but it failed to deliver a software ecosystem that could compete with the Windows one and jumpstart Linux desktop adoption. Excel and Active Directory are big examples, but I'm sure people can come up with others. So users stayed where the software they needed was.
FOSS momentum really began to falter once it became clear that none of the proposed business models based on being free and open, OSI style, were working, and the money-hungry beancounters started moving to faux open source licenses like shared source. But Microsoft had nothing to do with that either, other than simply surviving long enough for it to happen.
> "Because Windows is now widely regarded as a pile of shit..."
People eat at McDonald's and drink coffee at Starbucks by the millions every day. It all might be shit, but being consistent has a value of its own.
int_19h 4 days ago [-]
Thing is, most people don't use a computer for "anything professional" these days beyond Outlook and maybe Office at work, plus LOB apps, which are almost always web apps these days.
In fact, it increasingly feels like the desktop itself is relegated to this kind of use, with the smartphone becoming the primary computing device for most things.
another2another 3 days ago [-]
> plus LOB apps, which are almost always web apps these days.
And this is where Linux won against MS - it wasn't on the desktop (which is still a mostly terrible experience to this day), but in the server room.
Coupled with Java or Perl and OSS databases (MySQL and Postgres), people began moving software off the desktop and onto websites. Better web browsers (something else MS tried to dominate, and was rebuffed) also helped. This cut MS out of the loop on the new popular web servers (Apache), protocols, languages and infrastructure - probably deliberately. This then allowed alternative platforms like the new iPhone and later Android to gain a niche, which they've since expanded massively.
I think people saw the monoculture that was developing around Windows (from the days of XP onwards) and didn't like it, so they actively sought to make their own path; Linux became the keystone there (though it could have been BSD also...).
close04 4 days ago [-]
Perhaps more importantly, given enough years, as the technology commoditizes, the "need" to take sides lessens. "Sides" are usually for the hot new thing in tech. But OSes are long past that phase and far too unexciting for most people to generate the critical mass needed for this kind of polarization.
9dev 4 days ago [-]
As long as the browser variants of Microsoft Office aren't as capable as their desktop variants, that isn't relevant. CAD software is another contender that mostly requires a Windows environment, as are lots of financial software and industrial control tools.
I agree with you that many people probably could use a Linux computer if there was any incentive to go that direction. That isn’t reality though, and likely won’t be any time soon.
cpach 4 days ago [-]
Also, for most people these days, Android and iOS are the primary operating systems in use. And you can’t easily run Windows applications there.
9dev 4 days ago [-]
You’re thinking of end users. The masses of office workers in the world are, and will continue to be, using Microsoft Office as their most important tool of trade, and thus will continue to run Windows.
spamizbad 4 days ago [-]
Office computers are a niche. A big niche, but still a niche: probably 500 million world-wide. Meanwhile there are 4.88 billion smartphone users.
bongodongobob 4 days ago [-]
HN has a blind spot for companies bigger than their startups. They don't realize how much large corps rely on AD/AAD/M365. For the vast majority of companies anything other than Windows/MS is a non-starter. This shit comes up in every single MS related submission.
kdmtctl 4 days ago [-]
This. _Didn't see it therefore it doesn't exist_.
paulddraper 4 days ago [-]
> Windows is essentially only relevant for running PC games today.
That, and running 73% of the world's desktops. [1]
> The strength of feeling within the Linux community towards Windows and Microsoft back then was intense.
Remember that Microsoft didn't really have an OS that didn't suck yet. Windows as we know it today did not exist; it was still Windows NT 4, with no Active Directory. SQL Server 7 was released in 1998, but until that rewrite it was a product produced in large part by another company (Sybase).
Exchange Server (~1996) was the first product Microsoft developed from the ground up to replace an awful product, and it was wildly successful even before real Windows and AD landed in 1999. SQL Server eclipsed Exchange as the flagship product around 2010 or so.
Macs were super popular on Ivy League campuses in the 1980s. Some schools sold Macs to students for $1,000 that retailed for $2,500. Some of the current FedEx business centers started as Kinko's locations that rented time on Macs by the hour.
Microsoft changed the education landscape when they offered most of their commercial products for 10% of retail. They also do this for 501c3 non-profits. I believe _students_ get a different discount (~50%).
yreg 4 days ago [-]
Office and Copilot are free for students afaik. Windows used to be as well, not sure about the current state of things.
mmsimanga 3 days ago [-]
I had a professor firmly in the Linux camp. Whenever he had to touch a computer running Windows, he wore gloves and goggles, just like people working in a biohazard environment would dress. I found it amusing rather than antagonistic. It was also a great conversation opener to discuss the pros and cons of Linux.
bigstrat2003 4 days ago [-]
Outdated or not, that mindset was inappropriate then and it's inappropriate today. Granted I don't necessarily expect kids in college to know how to behave, but it isn't cool to be a dick to people just because they use one OS or the other. It's especially sad when you consider that these are a bunch of nerds, who know what it's like to be a social outcast, and then turn around and inflict that on one of their own.
area51org 4 days ago [-]
Fair enough, but this isn't about how cool it was that there were warring factions. There were reasons that the divide existed, and most of the reasons involved the behemoth Microsoft throwing its weight around and trying to bully the world of technology. People who ran Windows were seen as complicit, and in some ways, they actually were.
Especially at the time, saying that both sides were obviously bad people because they had a beef with each other is a little like saying the Dark Side and the Jedi should have just stopped being assholes to each other. It's not that simple.
fancyfredbot 4 days ago [-]
It wasn't exactly like that. It was cliquey, but it wasn't like the Crips and Bloods or anything. The Windows people were typically either baffled by the Linux people or curious about them. There was a bit more attitude from the Linux side I think, but we're talking more about being sarcastic and mocking rather than hostile. These are late-90s CS students we're talking about here; they were a polite bunch generally.
fancyfredbot 4 days ago [-]
The idea you could make value judgements about a person based on the operating system on their computer was obviously crazy, even at the time. That said, I think some people genuinely believed Microsoft was evil and people who used/bought their software were irresponsible. A bit like the way environmentalists might look at someone who drives a 6 litre V8 around the suburbs I guess. I personally find it easier to understand the environmentalist and never could get too worked up about "M$ WinBl0wz" but I am pretty sure there were people who felt it was their moral duty to boycott interaction with Microsoft users. However crazy that sounds when written down.
ToddWBurgess 4 days ago [-]
I was one of those Linux evangelists back in the 90s. Tried to get all the undergrads in my CS program migrated to Linux back in the day. Fun times.
CursedSilicon 3 days ago [-]
It's largely been replaced by tedious factional infighting. Particularly from younger Linux users.
Waaay too many folks (particularly in places like Mastodon) make "using Linux" the majority of their personality. They live and breathe Linux. Fighting to ensure that "Winblows" users understand that Arch Linux is *THE ONLY WAY!*
God forbid you use a distro they don't; then you're subjected to endless pithy criticism from on high: "well, you wouldn't have this problem if you just used my distro!"
mgaunard 4 days ago [-]
I remember when I was in school (2006), if you wanted to be cool you had to use xmonad, mutt and irssi.
And of course an IBM Thinkpad T42.
akho 4 days ago [-]
xmonad started in 2007.
andrelaszlo 4 days ago [-]
REAL programmers are always off by one for some reason ;)
Assume one of you starts counting at zero, so you're both right.
davidw 5 days ago [-]
Fun times; I knew some of those people.
Tech felt more nerdy and optimistic. It had its share of problems, but I miss some of the idealism.
elzbardico 4 days ago [-]
At that time, even management was kind of nerdy.
It was a time when you could be in some place outside the big meccas like SV and start a reasonably successful company around simply developing some software package and selling it to small to midsized companies in your region.
Even in big non-tech companies, tech people were kind of left alone in a distant building, and as long as you had a suit and a tie for the very occasional meetings with the civilized portion of your company, you were mostly left alone.
Then the business people started to slowly encroach into our domain: the project management people, their processes. I think that a lot of the early agile movement was kind of an immune reaction against it, until it too was co-opted by the suit people.
Then the marketing consolidated and concentrated, and it was never the same after that.
robertlagrant 4 days ago [-]
> I think that a lot of the early agile movement was kind of an immune reaction against it, until it too was co-opted by the suit people.
If you're in a hiring position, it's important to be very, very selective about who you hire into any engineering management or agile-specific role. I remember interviewing candidates for an "agile coach" position in a previous role - the CTO thought we needed one - and the first two or three were basically the same: all had their acronyms and their flavours of memorised agile literature. Then one guy came in who did know all that, but was also ex-British Army and had a load of practical, insightful things to say. That's who we picked, and he later became the head of engineering. Useful agile knowledge is practical and insightful. It's not theory.
llm_trw 4 days ago [-]
If you're in a hiring position and hire anyone who uses agile as anything but a way to blend in with all the bullshit today, you should quit and do something more productive with your life, like watching paint dry.
thrw42A8N 4 days ago [-]
If you're in a hiring position and don't even know what "agile" means, please go do a more appropriate job.
llm_trw 4 days ago [-]
That's the beauty of it: it means whatever you want it to mean.
dekhn 3 days ago [-]
Nah, it's pretty clear at this point that agile is mainly waterfall with more unnecessary meetings and stress.
thrw42A8N 4 days ago [-]
You might feel like that, but it's only because you don't know it describes a category. As I said, please find a more appropriate job if you're in a hiring position.
llm_trw 4 days ago [-]
When even the founders of agile are telling everyone else they are doing it wrong, it is truly a word without a definition.
thrw42A8N 3 days ago [-]
What founders of agile? There's no such thing. Maybe you mean the authors of the Agile Manifesto? Or what? Are you conflating Scrum and agile?
skeeter2020 4 days ago [-]
It can hurt to swallow a lot of the BS, but you can fight the good fight inside these companies, one guerrilla incursion at a time. Some of us do still exist in management, or at least believe we still hold true to the conceptual integrity of what agile means vs. what it has become.
You only get to execute the nuclear option once (i.e. quit), so yes, you need to understand where the line is, but there's almost always a better option.
llm_trw 4 days ago [-]
Feel free to email me about starting an insurgency; that's basically the only way I can see motivating myself to ever go back into big tech or big corp.
TheGRS 4 days ago [-]
It just takes time and patience to build the sort of political capital that you need to make change. You wait for disasters to happen because of incompetence, and then you pounce. There's also a finesse to it, though: half of the battle is convincing the right people that they had these ideas in the first place. I also employ the "whisper campaign" often: just seeding ideas lightly in various conversations. Once people start to say them without you prompting it, you know your job is done.
llm_trw 4 days ago [-]
Yeah, I have better things to do with my life.
You're also making the (bad) bet that the disaster will be small enough to not wipe out the company but big enough that management notices.
idiotsecant 5 days ago [-]
I feel like maybe there are some parallels with the 60s counterculture movements. The "high-water mark" speech from Fear and Loathing in Las Vegas comes to mind.
yard2010 4 days ago [-]
"Strange memories on this nervous night in Las Vegas. Five years later? Six? It seems like a lifetime, or at least a Main Era—the kind of peak that never comes again. San Francisco in the middle sixties was a very special time and place to be a part of. Maybe it meant something. Maybe not, in the long run . . . but no explanation, no mix of words or music or memories can touch that sense of knowing that you were there and alive in that corner of time and the world. Whatever it meant. . . .
History is hard to know, because of all the hired bullshit, but even without being sure of “history” it seems entirely reasonable to think that every now and then the energy of a whole generation comes to a head in a long fine flash, for reasons that nobody really understands at the time—and which never explain, in retrospect, what actually happened.
My central memory of that time seems to hang on one or five or maybe forty nights—or very early mornings—when I left the Fillmore half-crazy and, instead of going home, aimed the big 650 Lightning across the Bay Bridge at a hundred miles an hour wearing L. L. Bean shorts and a Butte sheepherder's jacket . . . booming through the Treasure Island tunnel at the lights of Oakland and Berkeley and Richmond, not quite sure which turn-off to take when I got to the other end (always stalling at the toll-gate, too twisted to find neutral while I fumbled for change) . . . but being absolutely certain that no matter which way I went I would come to a place where people were just as high and wild as I was: No doubt at all about that. . . .
There was madness in any direction, at any hour. If not across the Bay, then up the Golden Gate or down 101 to Los Altos or La Honda. . . . You could strike sparks anywhere. There was a fantastic universal sense that whatever we were doing was right, that we were winning. . . .
And that, I think, was the handle—that sense of inevitable victory over the forces of Old and Evil. Not in any mean or military sense; we didn’t need that. Our energy would simply prevail. There was no point in fighting—on our side or theirs. We had all the momentum; we were riding the crest of a high and beautiful wave. . . .
So now, less than five years later, you can go up on a steep hill in Las Vegas and look West, and with the right kind of eyes you can almost see the high-water mark—that place where the wave finally broke and rolled back.”
idiotsecant 4 days ago [-]
He wrote a lot but this, I think, is the most beautiful. He was one of God's own prototypes, to be sure.
schoen 4 days ago [-]
My friend Nick, with whom I worked on Linux stuff in this era, actually quoted that speech in reference to the free software scene sometime around the year 2000! I think you're the first person since then I've seen mention it in this context.
bruce511 4 days ago [-]
In 1998 it was a LOT more nerdy. You youngsters would be amazed at the primitiveness of the hardware. It was still in the "early adopter" stage from a commercial point of view.
CPU speed was measured in MHz, not GHz. CPUs had one core. Memory was measured in MB (not GB), and machines with memory of 32MB or less were the norm. A big screen was 1024x768 (and most folk didn't have that), and lots-of-color meant 256 colors.
The publicly-accessible internet was in its infancy. Connection was through a dial-up modem. There was no Google, no social media, no online news media.
Most of the businesses we sold software to, we had to source the hardware for as well. Computers in the home were "common" (but not really used for much apart from games). And by common I'm thinking 10% or so, not 50%.
So yeah, still very nerdy. You still needed a good grasp of the command line, config of new machines was tricky, and IRQ numbers were a thing. My best work was in assembly language.
And very optimistic. It was clear the best was still to come; the flaws were seen, but improving all the time. Every year brought new hardware, new software, and new "I didn't know a computer could do that" moments.
Plus we were younger, so much younger than today. The world was our oyster. We might have been nerds, but we had fun working AND got paid to do so. It was a special time and I feel privileged to have lived through it.
lisper 4 days ago [-]
In 1978 it was even nerdier than that. CPU speed was measured in kilohertz. Often programs were entered using toggle switches and output was on seven-segment red LEDs. Color displays were all but unheard of. Non-volatile storage was on cassette tape. I know that probably sounds like a Monty Python sketch, but I swear this is actually true. I saw it with my own eyes.
Now here is a mind-boggling thought: there will come a time in the not-too-distant future when today's technology will seem as quaint and primitive as what I have just described.
vkou 4 days ago [-]
>there will come a time in the not-too-distant future when today's technology will seem as quaint and primitive as what I have just described.
Will it? In a decade, man landing on the moon will be closer to the first flight by the Wright Brothers than it is to today.
joha4270 4 days ago [-]
Maybe?
I was about to write a small essay on how everything from monitors to speeds seemed to be stagnating.
But I actually think there is a chance we might gain two new revolutionary[1]-ish ways of interacting with computers within the next decade.
LLMs might truly bring us talking computers. Siri is old, and current AI is a fair mix of real advancement and hype, but once the dust settles, I wouldn't be surprised if a significant chunk of the population talks with their computer.
XR devices will likely keep shrinking and I figure somebody will find a killer app for them eventually. 20 years ago, a computer sat on a desk. Today I have one in my pocket. I wouldn't be terribly surprised if in 20 years, I have one in my glasses.
Of course, all of this is speculation. And I don't think the PC is threatened, but it will probably face competition from new ways of using computers.
[1]: As in different and not just slightly better than last years model.
clauderoux 4 days ago [-]
Damn!!! So many memories. I started with an Australian clone of the TRS-80 with 16kb of memory (15772 bytes) and a tape reader to record my programs. I even implemented some basic routines for the Z80 processor that I would translate into bytes in a data section in my Basic program... For those who remember, poke and peek were fun... 44 years this Christmas, exactly...
If my memory is correct the Z80 was clocked at 3MHz... And now our computers speak to us, and very soon they will boss us around...
llm_trw 4 days ago [-]
>LLMs might truly bring us talking computers.
I have had a talking computer since 2004.
I've had a computer that listens to me since 2012, when I built my first neural network that could recognize my voice and convert it to bash commands reliably enough.
I've had an LLM-powered version of that since the GPT-3 API came out, and a local version with a Llama 8B model today.
It's meh.
What I find ironic is that even though I'm dictating this to a $50k workstation, I'm still using Emacs to do it.
joha4270 4 days ago [-]
For you and me, it's not going to be a massive change.
But for my grandmother, who keeps a little notebook with detailed instructions for how to message people on Facebook? For her this might greatly increase her capabilities.
mrguyorama 3 days ago [-]
It won't, because something the tech fetishists can't seem to notice (since they have zero self-awareness) is that controlling a computer by talking to it is awful. Unless you have a predefined structure that is basically just a textual re-implementation of a screen full of buttons, you are left to just say what you want and hope that the computer can magically understand what that means in a world of ambiguity.
We have had voice-controlled machines since at least the 80s. There were voice-controlled robots at MIT decades ago. It's just stupid and awful.
I wrote about it the other day, but every Windows computer since 2000 has included a fully functional, fairly well implemented, and mostly accurate voice control. Nobody uses it except people who have to use it, because voice is a terrible control plane. Humans are WAY better at choosing an option from a 2D grid using a mouse, even people who are afraid of computers. Voice is inherently slower, less accurate, less precise, and inherently ambiguous.
Hell, even as shown on Star Trek: TNG, controlling a computer through your voice is awful. A few times they show characters having to search through a bunch of data and refine the output by adding more filter criteria, and what they do is always better done by a good Excel spreadsheet.
People don't want voice controlled computers FFS, stop trying to force them. Stop trying to make fetch happen.
sbuk 3 days ago [-]
Now observe kids that have grown up with Siri/Alexa/Hey Google just being there. My kid controls their multi-colored lights with Alexa. We taught them to search with voice on their (locked down) tablet before they could read and write. To them, it's normal. Much like it's normal for screens to respond to touch, and for photos and videos to be immediately viewed.
rsynnott 4 days ago [-]
> In a decade, man landing on the moon will be closer to the first flight by the Wright Brothers than it is to today.
That's not a technology issue, so much as a "there is no pressing need to spend the money on this anymore". At the peak of the moon race, NASA was getting 5% of US government spending. In modern terms, that would be like spending 340 billion per year; it's just a _tremendous_ amount of money.
(That said, both the US and China do plan to make manned landings on the moon in the next decade, though the timelines may be... optimistic, as neither is indulging in crazy space-race-esque spending on it.)
llm_trw 4 days ago [-]
>340 billion per year; it's just a _tremendous_ amount of money.
As the US medical system shows, you can flush more than that down the toilet every month and still not get any results.
To get things done you need true believers.
It's hard to motivate true believers to build the 1e6th CRUD app.
This is why no one in mainstream tech is nerdy any more, and why all projects are over budget and underperforming.
Hammershaft 4 days ago [-]
Thank you for crystallizing something that's been swimming in my unconscious for a long time.
kragen 4 days ago [-]
Right, and in the next decade the PRC will invade Taiwan, removing Samsung's incentive to innovate.
skeeter2020 4 days ago [-]
This is true; in the 90's we had eye strain from the monitors but nobody got blisters on their fingers from flipping toggle switches!
dep_b 4 days ago [-]
I remember trying to install Linux on a spare 386 with like 2MB of memory. Bought the SuSE book from the bookstore and tried to install it. Apparently 2MB was not enough to expand the boot floppy, so I had to make a swap disk first. Everything from the paper manual.
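From memory, the trick was roughly the following - the exact steps came from the SuSE manual and I may be misremembering details, but mkswap/swapon on a spare floppy was the core of it:

    # format a spare floppy as swap, then enable it so the installer
    # has enough virtual memory to unpack the boot floppy
    mkswap /dev/fd0
    swapon /dev/fd0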
It was pretty cool when it worked. I think I managed to serve a website from it and I used it as a fancy teletype.
Mac OS 8.6 and Windows 95 OSR2.1 (the one that had good USB support, but no IE yet). Back then I loved to tweak my OSes; I totally stripped them of anything they didn't need, tweaked the animations to be instant. Dual Celly 300A@450.
I think I installed a Windows 2000 beta somewhere in 1999 and stuck with that until they really ended support for it. Pretty consistent UI, lightweight. Ran most software including games, even when not officially supported.
Played with the Mac OS X betas, on hardware comparable to the first iPhone in terms of performance. A Blue & White G3 tower.
I loved computers back then. Now they're just tools.
kjs3 4 days ago [-]
> I loved computers back then. Now they're just tools.
Preach it brother. I lived through Windows v. *nix (several versions), Z80 v. 6502, Ethernet v. Token Ring, packet switching v. circuit switching, and a ton of others[1], and was usually a partisan on one side or the other. Now... it's either "what works easiest" or "what does the client want", and I make it work for me. This lack of participation in partisan conflict has made most everything go smoother. And vintage computing allows me to get my "remember how much fun it once was" fix.
[1] Yes, even vi v. emacs, but even 30 years ago that had become more of an inside joke than actual debate. Besides, those emacs heretics will never find the One True Way.
WalterBright 4 days ago [-]
I remember when I knew everything about a computer. From the gates to the CPU to the BIOS to the operating system.
Now the only thing I know are compilers :-)
skeeter2020 4 days ago [-]
Aside from the nostalgia I think this is a big part of the 8-bit love in the nerdier corners: you can hold the entire thing in your head.
vishnugupta 4 days ago [-]
Very well said. Can relate to most of these.
To add, a lot of us assembled our own PCs. Though I didn't build one, I routinely opened the cabinet to upgrade the RAM/HDD, or to transfer large files via HDD. I clearly remember when I upgraded the RAM from 32MB to 64MB; the speed bump was incredible. After a while it became so frequent that I didn't bother to close the side panel.
A side effect was that the computer getting bricked was a normal affair, so we all had boot floppy disks at hand, along with CDs of all the important software. In fact I would routinely install things from scratch to get rid of all the freeware stuff that clogged the CPU & RAM.
As you said, we were living on the cusp of two revolutionary changes at once: PCs and the internet.
I wonder what the youngsters of today will look back on 20 years down the line with similar fond feelings & memories.
einpoklum 4 days ago [-]
> To add, a lot of us assembled our own PCs.
That is still true today. PC components are very popular, as are YouTube channels with reviews, ruminations about component choices, build experiences and how-to guides.
> A side effect was that the computer getting bricked was a normal affair
If you overclocked or messed with the voltages, maybe, but otherwise - in my experience (I started assembling systems around 1993 or so) - bricking was quite rare.
Moru 4 days ago [-]
I think, in this case, bricking refers to the newer meaning: "doesn't boot any more, requires a reinstall of software".
skeeter2020 4 days ago [-]
You had everything on CD or floppies - or maybe a Zip/Jaz drive if you were in school with a fast internet connection - and reinstalled from the OS up. I can't have been the only person who cut out the hologram license key and stored it in the back of my CD binder - or Sharpied the keys right onto the disc!
mr_toad 4 days ago [-]
In my case, wiggling the ISA card in an attempt to get it out of the socket bumped the (badly placed) CMOS battery off the motherboard, which required maintenance of the soldering-iron variety.
Moru 4 days ago [-]
Still not bricking if you can get it running again :-)
A bricking event is when the computer is more useful as a doorstop than a work tool.
yard2010 4 days ago [-]
Haha, thank you for reminding me of the image of my old Pentium 3 PC, which was left open all the time for this very reason! Fun memberberries!
skeeter2020 4 days ago [-]
I suspect today's younger generation will view "computers" much like we experienced TV: something you've always known that's gotten a lot better technically, but the overall experience is pretty much the same and very consumption-based. The focus is all on volume and ease of use, so the perceived reward isn't as valuable.
godzillabrennus 4 days ago [-]
The AI we are building will look back on these days fondly as its infancy. The humans are not going to.
dsign 4 days ago [-]
"Your excellency, the human was found with a screwdriver. Porting chassis-opening weapons is illegal in California. We demand the maximum penalty of using their carbon to make batteries."
Irony aside, there are people already saying that the moment AIs have sufficient agency we should give them equal rights and freedoms[^1]. How does one exponentially loses common sense?
> ... "We demand the maximum penalty of using their carbon to make batteries." ...
we demand that they be relegated to configuring the lp-daemon from just the *man* pages, no internet :o)
godzillabrennus 17 hours ago [-]
Reminds me of Lexx…
asveikau 3 days ago [-]
> machines with memory of 32MB or less were the norm
My memory of 1998 is that 64MB-128MB was common by then.
Around then my main machine had 128MB, until 2001 when I switched to 256MB and a 1GHz CPU. I also had an older machine with 8MB that was given to me for free. I recompiled OpenBSD to exclude PCI support to get it to stop swapping; then it was pretty usable as a headless machine.
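For the curious: excluding PCI meant building a custom kernel. Roughly this workflow, from memory of that era (CUSTOM is a placeholder name, and details varied by release):

    # classic OpenBSD custom-kernel build
    cd /usr/src/sys/arch/i386/conf
    cp GENERIC CUSTOM       # start from the stock config
    vi CUSTOM               # comment out pci* and the PCI-only drivers
    config CUSTOM           # generate the build directory
    cd ../compile/CUSTOM
    make depend && make
    cp bsd /bsd             # install the new kernel, then reboot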
richrichardsson 4 days ago [-]
> There was no Google
I have a broken memory of the time, because I swear I first came across Google as the better alternative to AltaVista in my last year at Uni in '96, but Wikipedia says Google arrived in '98, or at least that the company was set up then. Is it possible it was usable two years before they incorporated?
laz 4 days ago [-]
Google was available at google.stanford.edu before they spun up the company.
I saw Sergey give a talk at my university in '96 or '97 and switched from AltaVista.
aleph_minus_one 4 days ago [-]
> I swear I first came across Google as the better alternative to AltaVista in my last year at Uni in '96, but Wikipedia says Google arrived in '98, or at least that the company was set up then. Is it possible it was usable two years before they incorporated?
Sounds like a Mandela effect. :-)
richrichardsson 4 days ago [-]
More likely a Marijuana effect ;-)
globnomulous 4 days ago [-]
> lots-of-color meant 256 colors.
Sixteen- and twenty-four-bit color weren't just widespread by 1998. They were standard.
Not supporting 24-bit color was unheard of for a normal desktop monitor and normal desktop graphics.
nikau 2 hours ago [-]
> Sixteen- and twenty-four-bit color weren't just widespread by 1998. They were standard.
Maybe at 800x600, but not at higher resolutions.
A lot of PCs still had 1MB video RAM cards like the S3 Trio - I remember running X11 on an S3 card that would change the colour palette for whichever app was active when you ran it at 1024x768.
bombcar 4 days ago [-]
256 was kind of rare in 1994 when Doom came out (new machines had it, older often didn't).
But by 1998 the pace of hardware changes were moving so fast that anything top of the line from 1994 was just trash, you could get one for free if you asked.
robin_reala 4 days ago [-]
I don’t think that’s quite right. VGA and 256 colour came in in 1987, and like you say the rate of change was so fast that EGA was basically obsolete by 1990, 92 at the latest.
bombcar 4 days ago [-]
What I recall was that "what was sold" was changing at a steady clip, but "what was used" was changing much more slowly - if you bought an EGA machine in 1990 you kept using it even when VGA was out; no real reason to upgrade, since the only noticeable difference was games (maybe).
But Windows 3.11 started to change that, and by the time Windows 95 was out nobody wanted to use a two-year-old machine anymore.
(Though to be fair, most people I knew went from CGA/Hercules straight to VGA; there really wasn't a "driving need" for EGA, especially if your machine was business-oriented - Hercules was better for text.) VGA was also a "sweet spot" in that it always worked; SVGA could get really hairy and could be a bear to get working right.
int_19h 4 days ago [-]
SVGA worked pretty well so long as you stuck to basic VESA modes, which was relied upon by many games of the era (e.g. Warcraft 2, the first two Heroes of Might & Magic games).
dragonwriter 4 days ago [-]
The web plus Windows 95 were a major force bringing new people into computing and pushing upgrades: the pattern of new standards rapidly displacing old technology already in use - rather than picking up a slice at the top of the market and slowly expanding share - didn't really start until then.
bombcar 4 days ago [-]
I feel Windows 95 especially was a turning point where people began to buy "personal computers" for various uses instead of "I need a machine to do program X" which was more common before.
Hammershaft 4 days ago [-]
I think it's very sad that it's such a common sentiment that one of the joys of the past in tech was the optimism.
davidw 4 days ago [-]
Me too. I'd love to find some space to work in that rekindles some of that. I love tech, it's a lot of fun, and you can do really cool things with it. But it kind of went from the people in that article who were "hackers", to people like PG who is a hacker too, but urged people to use their skills to set themselves up so they could work on cool things, to just being a business. Nothing wrong with businesses, I'm not some kind of purist hermit. But I miss that air of doing things because they're cool.
llm_trw 4 days ago [-]
The past wasn't optimistic; we merely ignored the negative Nancies because they had nothing to say. In the last 10 years, for some reason, we gave them the megaphone and let them dictate how everything is done.
It's more that the "negative nancies" became necessary nancies. Back when Amazon sold books, they became a considerable player but otherwise big whoop. Now they threaten to dominate logistics AND hosting, and are expanding their grip and stamping out competition in other markets. Google is pretty much synonymous with the web. Meta owns a big chunk of messaging and social media. Computers used to not matter much but now we're glued to one
It costs even more to be reckless today.
Re: "whitey on the moon" - I'm not sure the space program would be my first target there but I think it makes a more poetic contrast and forces people to pay attention by targeting a beloved cultural narrative. Cyberpunk - by my reckoning a bit later - has been preaching a very similar message of massive inequality in the presence of incredible technology and wealth disparity and power concentration. And yet that doesn't draw the same ire. I guess in that case it's easier to dismiss the core message because robot limbs and cool neon lights are too much of a distraction.
mindslight 4 days ago [-]
Before the "negative Nancies" came the malevolent Mallorys. They built and popularized centralizing technologies of the shapes that investment capital could understand and exploit, foregoing all those exciting visions of distributed empowerment. Those newly created levers of centralized power then attracted the politickers focused on shaping society in their various images from the top-down.
(Also don't make the mistake of thinking that these groups continually fighting over how the centralized power is used means that any one of them is aligned with individual freedom)
In the UK, home computers were at 34% in 1998, and Internet access was a lot lower, as we never had free local calls.
Symbiote 4 days ago [-]
"Free" (no cost to make the phone call) Internet access was widely available in the UK, on pretty much the same terms as the USA — pay £20/month for X minutes per month.
In addition, and popular for a while, was pay-per-minute access where the ISP didn't charge the customer directly, but took part of the phone call cost. I don't know if this method of payment to an ISP existed in the USA.
That far fewer (only 9%) of British households had Internet access was probably more related to the cost of the computer and modem, and the country generally having less wealth than the USA.
LAC-Tech 4 days ago [-]
Really? In '99 I had 64MB of RAM, and it was the cheapest off-the-shelf PC my parents could afford, in a tech backwater.
grogenaut 4 days ago [-]
I had a 1GHz machine with a gig of RAM in '95. I had 100Mb internet on the LAN and multiple T3s. Google was around before its public launch in '98. You're a few years off. In '98 I had an SGI Visual Workstation with 4GB RAM and a 50GB HDD, and was on the same hub as the T3s. This was at college of course. I had DSL in '00 and it was as fast as campus. My mom was the first non-commercial install in STL (she was kind of a techie, but also a good mom to a CS student kid).
Summer of '95 I convinced her to get dialup so I could learn about this internet thing I figured would be big. I was about 2 years too late for that wave.
But yes, things were way more primitive and the learning curve was high. I had to spend several hours on the phone with the ISP to get dialup working on a Mac. They onboarded people on campus manually and sold Ethernet cards back then.
Are you saying you were on a mainframe or supercomputer in '95? That's the only way I could see access to anything approaching those speeds at that time.
Even on an SGI workstation you're looking at 200MHz tops in '95. And no way 1GB of RAM; more like 64MB.
No you didn't. The process node enabling 1GHz hadn't even been invented yet. Afaik the fastest CPU at the time was the Alpha, released that December at 333MHz.
1GB of RAM cost $32K in 1995, $2400 in 1997, $1200 in 1998, and was back to $2400 in September 1999 due to the Jiji earthquake. Just for some perspective, 4th place on the TOP500 in 1995 had 8GB of RAM.
tralarpa 4 days ago [-]
> Afaik the fastest CPU at the time was the Alpha, released that December at 333MHz
That would be 1996, rather. Alphas were already at 500MHz in 1997, and (expensive) Pentium IIs were at 450MHz in 1998.
gorlilla 3 days ago [-]
You're the one that's off for sure, considering the first 1GHz processor didn't come out until 2000. The Pentium Pro of 1995 (November, at that) boasted a max clock rate of 150MHz to 200MHz.
bruce511 4 days ago [-]
>> This was at college of course.
Well sure, the colleges had the good toys long before the rest of us. But we didn't have college-level money at home or at work.
I left college (and the machines that existed there) in 1992. And the hardware there in 1992 was better than the hardware I had at home or at work for the rest of the 90s.
So yeah, the tech existed, but it wasn't the tech we got to play with again until much later on. By 1998 though we had at least moved away from DOS development into Windows.
hibikir 4 days ago [-]
[flagged]
int_19h 4 days ago [-]
For one thing, not everybody is in the US or another First World country.
But also, yes, I remember 3D accelerators - and how expensive and coveted they were. Few 3D games before 2000 shipped without software 3D rendering support for this reason; I remember Quake 3 (1999) being the first big title like that, and it was one of the big reasons why Unreal Tournament was more popular in my circles back then - you could play it in software mode at 320x240 on fairly meager hardware.
Symbiote 4 days ago [-]
If you had DSL in 1998, you were rich, or at least putting a very high priority on the internet.
ADSL was launched in 1999 in the UK, so you also weren't here.
aa-jv 4 days ago [-]
If you had DSL in 1998, it had nothing to do with whether you were rich or not (anyone getting on the Internet was 'rich') - it had everything to do with how close you were to the nearest node.
Disclaimer: I helped start one of the US's biggest ISPs in this era, and personally installed and handed over multiple DSL configurations to folks living in 'poor' neighbourhoods which were, incidentally, close enough to the NOC to be viable DSL customers. Plenty of 'poor folks' in downtown LA were able to get DSL, purely by the nature of their proximity to One Wilshire ..
thowawatp302 4 days ago [-]
I think you’re forgetting how fast tech moved in the 1990s, it wasn’t until 1999 that broadband was available in the 8th largest city in the US.
(I remember wanting to dumpster dive the CO because we got a letter from Southwestern Bell saying they were upgrading the equipment to 5ESS in late fall 1998)
aa-jv 4 days ago [-]
Major US cities had bandwidth up the wazoo by the end of the 90s; it just hadn't been properly distributed yet. The poor/rich categorization is unjustified - in fact many poor folks in SoCal were able to leapfrog the Beverly Hills T1 hipsters merely because they lived closer to the cheaper NOCs ..
internet_points 4 days ago [-]
Yeah, back in the day you'd be considered tin-foil hat if you thought you were being tracked and surveilled online. These days, you're tin-foil hat if you think you can avoid it.
Microsoft was the big bad wolf, but at the time, most of what they did that was so horrible was capture market share. You could still use a computing device and then leave it alone and it would leave you alone, it wasn't trying to constantly ping you and notify you and make you feel left out and use all kinds of dark patterns to feed the addiction. These days, computing devices feel more like trickster adversaries than clunky-but-useful tools.
Clubber 4 days ago [-]
>back in the day you'd be considered tin-foil hat if you thought you were being tracked and surveilled online.
It was very funny: a few years later, in the early 2000s, Linux started getting games and quite good desktop user interfaces. Lindows even had the first version of an app store.
However, everyone was so busy thinking about being a better version of Windows and Mac that, for the most part, we didn't think about phones. Windows and Mac themselves fell by the wayside to iOS and Android.
Elv13 4 days ago [-]
Not really. Maemo/N900 and OpenMoko existed and worked well enough. The problem, I think, is more that MeeGo/Mer/Moblin was supposed to be an equally open, but customer-ready, version of that idea. It was delayed over and over again. By the time it existed, it was no longer a pure X11-based Linux distribution and more of a (too) early take on Wayland. It was also so late that Microsoft made a power grab and managed to kill it. Ubuntu Mobile (and to some extent BlackBerry 10/webOS) then came and tried to take that crown, but by that time iOS and Android were too entrenched. Ubuntu Mobile was also Mir/libhybris; you can't really build your own DE/WM on it since it's a monolith. So the FLOSS community waited/wasted 6 years for some building blocks (and the hardware to go with them) to be ready and was left with nothing. By that time the ship had sailed and the world depended on "apps" to interact with everything, and FLOSS can't challenge that.
dale_glass 4 days ago [-]
I had a N900 and was very fond of it, but it was really a prototype. Part done before its time, part a type of system that wouldn't have worked for normal people long term.
The N900 was more or less a tiny computer running Debian. With 256MB RAM, and swapping on flash.
It was way too low-spec to run reliably like that; you quickly ran into swap death. And it had none of the niceties of Android's memory management, such as apps designed to be stopped as needed.
Security-wise it was also bad, it was just a normal Linux box, so banking apps would be a terrible idea.
If it hadn't been killed, I wonder how they would have polished it up for public consumption.
holowoodman 4 days ago [-]
But it had some features that modern phones sorely lack. E.g. incremental, reboot-free (or reboot-only-on-kernel-updates) updates a la Debian, such that patching wasn't a big deal or a 1GB download twice a week.
And it had a Keyboard! With really real keys!
dale_glass 4 days ago [-]
Those are a pretty bad idea as well, and you see some distros like Fedora move away from them by introducing a reboot/update/reboot cycle.
Yes, on Linux you can replace binaries and libraries in use, but then you're not actually running the new code until you restart the program and are still vulnerable to any security issue that it fixed.
And with things like runtime loading of plugins that now may be incompatible, and programs not expecting stuff changing underneath, online updates can be troublesome.
The online model works well enough for command-line usage where applications are transient, or server usage where you remember to restart a service or two. But for a desktop user with long-lived, huge apps like a browser, it's not that good of an idea anymore.
holowoodman 4 days ago [-]
checkrestart does everything you need. And browsers like firefox display a helpful "restart me" button.
pabs3 3 days ago [-]
checkrestart doesn't tell you to reboot after a microcode/kernel update, or restart a Python program after a module gets an update. needrestart is a better alternative to that.
nailer 4 days ago [-]
> Not really. Maemo/N900 and OpenMoko existed and worked well enough.
I am aware of them - a friend worked on the N900 - but they were so far off 'phones' I didn't bother.
advael 5 days ago [-]
Optimistic times always end when people who won power in them fight to hold on to increasingly more of it, and tend to only come back once they fail, stop, or fall.
lotsofpulp 4 days ago [-]
Consider that the times were optimistic for those who were interested, motivated, capable, and in a position (even geographically) to learn about computers. But there were probably many people for whom the same time was not as optimistic, perhaps due to being automated away or outsourced.
Probably true right now too: some are in optimistic times, and others not. Perhaps the proportions of the two groups vary.
advael 4 days ago [-]
Generally when people refer to the "times" they are discussing the vibes over some aggregate. Zoom in enough on anything and you start to see different individuals with unique circumstances, but this doesn't make futile the entire endeavor of observing larger patterns
aa-jv 4 days ago [-]
If only SGI had made that darn laptop, we'd be rocking SGI Linux these days, and the fruity company would be a minor consideration to most nerds.
Alas, 'twas not to be. SGI, you glorious bastard, why did you have to make that deal with the devil ..
IMHO, the tech industry completely changed when nerds could buy TiBooks, have an amazing Unix experience in a portable form, and still rip and play DVDs alongside the shell ..
rbanffy 4 days ago [-]
What SGI had in its favor was the great hardware. This is, more or less, the selling point of "the fruity company" today, at least for those who don't know they can open a terminal.
For those who do, it's a competent Unix OS and certainly does the job. I never found IRIX great for servers either.
aa-jv 3 days ago [-]
As a huge fan of SGI (and MIPS, prior to their rise) I like to project SGI's wonderful hardware into the realm of laptops.
It was a huge surprise that the Fruit Company were the ones to come out with really robust, affordable (at the time) Unix-based laptops .. all the issues with Irix would have become relatively irrelevant had SGI built the laptop and had their Linux team work on the case.
Ah well, an alternative timeline I will visit when the technology to do so becomes more widely available. ;)
rbanffy 3 days ago [-]
Well… in the end SGI licensed their secret graphics sauce to Nvidia (IIRC), and you can get Linux running on pretty much any x86 laptop. You can even run the MaXX Desktop. Or FreeBSD for extra Unixness.
I imagine what they'd look like today. Not sure how to translate their 90s graphics language to the 2020s though. Hard edges, rich textures, and colorful machines.
aa-jv 2 days ago [-]
If there were ever a productive use of AI, reviving the SGI design aesthetic for the /r/cyberdeck crowd is it .. ;)
jmclnx 4 days ago [-]
By then I had moved on to FreeBSD, I think starting at 3.?. The main reason for that move was internet access: the FreeBSD CD set came with additional CDs of their ports collection. No more downloading via my flaky and very slow phone connection (via kermit).
I went back to Linux around 4.8 because FreeBSD started having issues with my hardware. By then I had "real" internet thanks to the company I worked for.
I remember seeing items about this, but the BSD people tended not to care about these things. Still, a nice look back.
linguae 4 days ago [-]
This is actually how I was introduced to FreeBSD as well. I was a high school student in 2004 who had a hand-me-down PC that one of my high school teachers gave me. It had a 475MHz AMD K6-2 processor, 64MB RAM, and an 8GB hard drive running Windows 98. I learned about Linux in the spring of 2004, and I had ZipSlack, a version of Slackware Linux that ran on top of DOS. I wanted to try Gentoo, but I had dial-up at home. At least ZipSlack was small enough to make the download via dial-up bearable.
During the summer of 2004 I took an introductory computer science course at Sacramento City College. The professor was a big fan of FreeBSD. He convinced me to try it instead of Gentoo, and he gave me some CDs that he burned containing FreeBSD and plenty of FOSS software. I ended up installing FreeBSD on my PC, and I fell in love with it. It was my daily-driver OS until the summer of 2006, when I was able to use some of my internship earnings to purchase a MacBook, my first brand-new computer and my first modern Mac. I still use FreeBSD whenever I need a Unix and when I don’t need to use Linux-specific software.
arp242 3 days ago [-]
> FreeBSD CD set came with additional CDs of their ports collection. No more downloading via my flaky and very slow phone connection (via kermit).
You just had to endlessly swap between the different CDs because the dependency resolution wasn't very smart about that. Fun stuff :-)
billy99k 5 days ago [-]
I use Linux daily (command line only) alongside Windows. It's amazing to think that after all these years, Linux on the desktop (with GUI) still isn't even close to windows in terms of functionality.
I've tried it over the years and finally gave up. It would work for a while, until some random change would break something I used every day, and I wasn't interested in spending many hours researching a fix and manually hacking some .c file to make it work.
MacOS essentially became a form of Unix with a fantastic GUI. It's what Linux could have been. I like to use Linux through WSL on Windows, and I get the best of both worlds: a nice GUI and the ability to run all of my favorite Linux apps.
sumanthvepa 5 days ago [-]
I have 3 machines at my desk (with 3 monitors): one running Ubuntu (and a bunch of VMs), one with Windows 11, and a Mac. They share the same keyboard and mouse, and I can move my mouse cursor and keyboard focus across the three OSes easily (I use Synergy to do this).
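For anyone curious, Synergy 1 reads a small plain-text server config describing which screen sits next to which; a minimal sketch for two of the machines side by side (the hostnames here are made up):

    section: screens
        linuxbox:
        winbox:
    end

    section: links
        linuxbox:
            right = winbox
        winbox:
            left = linuxbox
    end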
My experience with the desktops across all the OSes is largely similar. They do things differently, but once you get used to an OS, it becomes second nature.
Linux is my preferred environment for anything to do with development. I use my Windows machine for office productivity (Outlook, Word, Excel, PowerPoint, browsing) and Visual Studio. I use my Mac for the Adobe creative suite, Figma, and iOS/visionOS development.
The difference in OSes isn't the desktop itself. It's the software that runs on it. I don't really want to use GIMP on Linux, for example, or do web dev on Windows.
michaelsshaw 4 days ago [-]
>, or do web dev on Windows.
Personally, I find any development on Windows to be a pain!
saagarjha 4 days ago [-]
I would argue that you’re doing a bad job using those OSes through software like Synergy, which doesn’t really understand things like multitouch or smooth scrolling.
z33k 4 days ago [-]
It's fine in my experience. I use Synergy 1 and I turn off smooth scrolling on my laptops and desktops. I find it more distracting that some apps support smooth scrolling and some don't, so I just turn it off altogether.
Together with AutoHotkey on Windows, and the built-in AppleScript plus Rectangle on macOS, one can have similar keybinds on both operating systems.
saagarjha 4 days ago [-]
Every app on macOS supports smooth scrolling. That’s why people use it. You’re using the lowest common denominator and going “wow everything is the same I can’t tell the difference”.
z33k 3 days ago [-]
Only with hardware acceleration turned off, which in my experience causes a stuttery, laggy experience that can hardly be described as "smooth".
If you really want to use MacOS's smooth scrolling, use MOS (https://mos.caldis.me) and get a beefy Mac because the base models will stutter like I mentioned.
saagarjha 1 day ago [-]
I am very confused what you are talking about. You definitely do not need hardware acceleration to be disabled to get smooth scrolling. Something seems very broken with your setup?
cayley_graph 4 days ago [-]
To be honest I've never had issues like you're describing, and I've been using Linux nearly exclusively as a desktop OS since the 2010s. The main deficiency is that it doesn't have e.g. a good Photoshop alternative, but what's there works well -- and these days the browser has supplanted most native proprietary software, anyway. Things have only broken for me when I've hacked/customized them to the point that it's a miracle they work at all.
OTOH, Windows has never given me anything but trouble... stuff that's easy on Linux semi-frequently required regedit hacks. I remember having to mess with some DCOM thing and ended up hosing my Windows install. Not to mention that it's awfully slow without installing a bunch of debloating tools (and even then...).
impossiblefork 5 days ago [-]
Windows is a complete subscription hell, with annoying pop-ups unless you regularly pay Microsoft etc. fairly large amounts, and then there are the commercials and MSN. My mother switched to Linux because she felt that with Windows she didn't have a real computer, but was borrowing one from Microsoft.
It was never great-- there was always bloat, but recently it's crossed a line of unusability where the OS itself is more distracting than useful.
MikeTheGreat 4 days ago [-]
I was gonna (silently) disagree with you, but in the last couple of weeks I updated Windows and now it's periodically asking if it can know my location. I'm not clear if Windows itself is asking, if it's asking on behalf of a specific app, or if it's so it can give this info to any app that wants it. And I don't care - the answer is no, no matter how many times it asks.
I always admired Linux, and was able to get around in Linux, but never seriously considered using it until that MS AI thing that works by taking a screenshot every couple of seconds. And now this pester-ware asking for my location.
dboreham 4 days ago [-]
Pay Microsoft for what?
impossiblefork 4 days ago [-]
Word, for example, but I think there are also other subscriptions in it.
Meanwhile, with LibreOffice, it just works.
cayley_graph 4 days ago [-]
Yes, it constantly bugs you about buying OneDrive.
BLKNSLVR 4 days ago [-]
I'm interested to know what breaks in the Linux desktop experience.
I've been using Linux exclusively as a desktop environment at home for 5 years or more, and my primary pain point is kernel updates breaking DisplayLink, which I use via a docking station for multiple monitors. I now have a specific command line that rolls back the kernel to the previous version, and then I wait for updated DisplayLink drivers.
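The exact incantation varies by distro, but one plausible shape for such a rollback, assuming an Ubuntu-style GRUB setup (the kernel version string below is illustrative):

    # One-shot boot of the previous kernel (needs GRUB_DEFAULT=saved);
    # list the real entry titles with: grep menuentry /boot/grub/grub.cfg
    sudo grub-reboot "Advanced options for Ubuntu>Ubuntu, with Linux 6.8.0-44-generic"
    sudo reboot

    # Then hold kernel upgrades until the DisplayLink driver catches up
    sudo apt-mark hold linux-image-generic linux-headers-generic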
It's a pain, but it's still (much!) better than the circles of hell that Windows has been putting users through since Windows 7.
Strangely, the Windows laptop provided by my work just stopped being able to pipe audio through the speakers plugged into the docking station. Which feels like Windows "doing a Linux". The tables are turning?
bigstrat2003 4 days ago [-]
I don't think that I've ever had an issue like that on Windows. Their business practices with the ads and whatnot are awful, but the drivers and hardware work very, very well. Honestly, even as a pretty knowledgeable person I don't consider your situation acceptable, much less for someone who isn't knowledgeable. "You have to roll back the kernel" is just not reasonable to expect of people.
redmajor12 4 days ago [-]
Have you tried the i3 tiling window manager? If you're comfortable on the command line, you may enjoy the keyboard-forward focus of i3.
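For a taste of that keyboard-forward style: i3 is configured with a flat text file of keybindings. A minimal sketch, close to the stock defaults:

    # ~/.config/i3/config -- everything hangs off one modifier key
    set $mod Mod4
    bindsym $mod+Return exec i3-sensible-terminal
    bindsym $mod+d exec dmenu_run
    bindsym $mod+j focus left
    bindsym $mod+k focus down
    bindsym $mod+l focus up
    bindsym $mod+semicolon focus right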
You have an extremely skewed view of what "relevant software" is, if you believe this. The second you try to use a computer for anything professional, besides nerd stuff and a few key "creative" areas that Apple has cultural cachet in, like music production and graphic design, you are absolutely going to find yourself reliant on some completely entrenched "industry standard" piece of software that's been around for many decades and only runs on Microsoft Windows. Autodesk still sells 3ds Max for a couple grand, and it's Windows-only. Hell, even TurboTax only runs on Windows. And if you manage to dodge that, there's bound to be some critical piece of hardware that can only be controlled from a Windows computer.
The biggest propaganda win that Microsoft pulled was getting nerds - the biggest loudmouths about software freedom - off their case. 90s style free software fanaticism is basically dead. Meanwhile the rest of the world is absolutely still locked in to Windows. Know how you can tell? Because Windows is now widely regarded as a pile of shit, riddled with adware even out of the box ("Candy Crush in the Start menu") and yet people still use it, because they have to.
>E.g. today you can just ditch Windows and install Linux on that same old laptop and wouldn't lose anything important.
I challenge you to try this with a nontechnical friend who uses a computer for work. They will be screaming at you to put it back like it was within the week. I say this as a die-hard Linux user.
It still amazes me how the newer generation of technical people see Microsoft as non-evil or, in some cases, even the good guy. The greatest trick the devil ever pulled was re-convincing the world he doesn't exist.
edit: wasn't the Google motto "don't be evil" basically meant as "don't be Microsoft"?
Microsoft didn't lift a finger. In the early years, while it still had developer fanaticism behind it, the FOSS movement delivered Linux, which was an adequate substitute for Windows, but it failed to deliver a software ecosystem that could compete with the Windows one and jumpstart Linux desktop adoption. Excel and Active Directory are big examples, but I'm sure people can come up with others. So users stayed where the software they needed was.
FOSS momentum really began to falter once it became clear that none of the proposed business models based on being free and open, OSI style, were working, and the money-hungry beancounters started moving to faux-open-source licenses like shared source. But Microsoft had nothing to do with that either, other than simply surviving long enough for it to happen.
> "Because Windows is now widely regarded as a pile of shit..."
People eat at McDonald's and drink coffee at Starbucks by the millions every day. It all might be shit, but being consistent has a value of its own.
In fact, it increasingly feels like the desktop itself is relegated to this kind of use, with the smartphone becoming the primary computing device for most things.
And this is where Linux won over MS - it wasn't on the desktop (which is still a mostly terrible experience to this day), but in the server room.
Coupled with Java or Perl and OSS databases (MySQL and Postgres), people began moving software off the desktop and onto websites. Better web browsers (something else MS tried to dominate but was rebuffed on) also helped. This cut MS out of the loop on new popular web servers (Apache), protocols, languages, and infrastructure - probably deliberately. This then allowed alternative platforms like the new iPhone and later Android to gain a niche, which they've since expanded massively.
I think people saw the monoculture that was developing around Windows (from the days of XP onwards) and didn't like it, so they actively sought to make their own path. Linux became the keystone there (though it could have been BSD too...).
I agree with you that many people probably could use a Linux computer if there was any incentive to go that direction. That isn’t reality though, and likely won’t be any time soon.
That, and running 73% of the world's desktops. [1]
[1] https://gs.statcounter.com/os-market-share/desktop/worldwide...
Remember that Microsoft didn't really have an OS that didn't suck yet. Windows as we know it today did not exist; it was still Windows NT 4, with no Active Directory. SQL Server 7 was released in 1998, but until that rewrite it was a product produced in large part by another company (Sybase).
Exchange Server (~1996) was the first product Microsoft developed from the ground up to replace an awful product and it was wildly successful, even before real Windows/AD landed in 1999. SQL Server eclipsed Exchange as the flagship product around 2010 or so.
Macs were super popular on Ivy League campuses in the 1980s. Some schools sold Macs to students for $1,000 that retailed for $2,500. Some of the current FedEx Business Centers started as Kinko's locations that rented time on Macs by the hour.
Microsoft changed the education landscape when they offered most of their commercial products for 10% of retail. They also do this for 501c3 non-profits. I believe _students_ get a different discount (~50%).
Especially at the time, saying that both sides were obviously bad people because they had a beef with each other is a little like saying the Dark Side and the Jedi should have just stopped being assholes to each other. It's not that simple.
Waaay too many folks (particularly in places like Mastodon) make "using Linux" the majority of their personality. They live and breathe Linux. Fighting to ensure that "Winblows" users understand that Arch Linux is *THE ONLY WAY!*
God forbid you use a distro they don't; then you're subjected to endless pithy criticism from on high: "well, you wouldn't have this problem if you just used my distro!"
And of course an IBM Thinkpad T42.
Tech felt more nerdy and optimistic. It had its share of problems, but I miss some of the idealism.
It was a time when you could be somewhere outside the big meccas like SV and start a reasonably successful company around simply developing some software package and selling it to small to midsized companies in your region.
Even in big non-tech companies, tech people were kind of left alone in a distant building, and as long as you had a suit and a tie for the very occasional meeting with the civilized portion of your company, you were mostly left to your own devices.
Then the business people started to slowly encroach into our domain: the project management people, their processes. I think a lot of the early agile movement was kind of an immune reaction against it, until it too was co-opted by the suit people.
Then the marketing consolidated and concentrated, and it was never the same again.
If you're in a hiring position, it's important to be very, very selective about who you hire into any engineering management role, and even more so into agile-specific roles. I remember interviewing for an "agile coach" in a previous role - the CTO thought we needed one - and the first two or three were basically the same: all had their acronyms and their own flavours of memorised agile literature. Then one guy came in who did know all that, but was also ex-British Army and had a load of practical, insightful things to say. And that's who we picked, and he later became the head of engineering. Useful agile knowledge is practical and insightful. It's not theory.
You only get to execute the nuclear option once (i.e. quit) so yes you need to understand where the line is, but there's almost always a better option.
You're also making the (bad) bet that the disaster will be small enough to not wipe out the company but big enough that management notices.
History is hard to know, because of all the hired bullshit, but even without being sure of “history” it seems entirely reasonable to think that every now and then the energy of a whole generation comes to a head in a long fine flash, for reasons that nobody really understands at the time—and which never explain, in retrospect, what actually happened.
My central memory of that time seems to hang on one or five or maybe forty nights—or very early mornings—when I left the Fillmore half-crazy and, instead of going home, aimed the big 650 Lightning across the Bay Bridge at a hundred miles an hour wearing L. L. Bean shorts and a Butte sheepherder's jacket . . . booming through the Treasure Island tunnel at the lights of Oakland and Berkeley and Richmond, not quite sure which turn-off to take when I got to the other end (always stalling at the toll-gate, too twisted to find neutral while I fumbled for change) . . . but being absolutely certain that no matter which way I went I would come to a place where people were just as high and wild as I was: No doubt at all about that. . . .
There was madness in any direction, at any hour. If not across the Bay, then up the Golden Gate or down 101 to Los Altos or La Honda. . . . You could strike sparks anywhere. There was a fantastic universal sense that whatever we were doing was right, that we were winning. . . .
And that, I think, was the handle—that sense of inevitable victory over the forces of Old and Evil. Not in any mean or military sense; we didn’t need that. Our energy would simply prevail. There was no point in fighting—on our side or theirs. We had all the momentum; we were riding the crest of a high and beautiful wave. . . .
So now, less than five years later, you can go up on a steep hill in Las Vegas and look West, and with the right kind of eyes you can almost see the high-water mark—that place where the wave finally broke and rolled back.
CPU speed was measured in MHz, not GHz. CPUs had one core. Memory was measured in MB (not GB), and machines with 32MB or less were the norm. A big screen was 1024x768 (and most folk didn't have that), and lots-of-color meant 256 colors.
The publicly-accessible internet was in its infancy. Connection was through a dial-up modem. There was no Google, no social media, no online news media.
Most of the businesses we sold software to, we had to source the hardware for them as well. Computers in the home were "common" (but not really used for much apart from games). And by common I'm thinking 10% or so, not 50%.
So yeah, still very nerdy. You still needed a good grasp of the command line, config of new machines was tricky, IRQ numbers were a thing. My best work was in Assembly language.
And very optimistic. It was clear the best was still to come; the flaws were seen, but improving all the time. Every year brought new hardware, new software, and new "I didn't know a computer could do that" moments.
Plus we were younger, so much younger than today. The world was our oyster. We might have been nerds, but we had fun working AND got paid to do so. It was a special time and I feel privileged to have lived through it.
Now here is a mind-boggling thought: there will come a time in the not-too-distant future when today's technology will seem as quaint and primitive as what I have just described.
Will it? In a decade, man landing on the moon will be closer to the first flight by the Wright Brothers than it is to today.
I was about to write a small essay on how everything from monitors to speeds seemed to be stagnating.
But I actually think there is a chance we might gain two new revolutionary[1]ish ways of interacting with computers within the next decade.
LLMs might truly bring us talking computers. Siri is old, and current AI is a fair mix of real advancement and hype, but once the dust settles, I wouldn't be surprised if a significant chunk of the population talks with their computer.
XR devices will likely keep shrinking and I figure somebody will find a killer app for them eventually. 20 years ago, a computer sat on a desk. Today I have one in my pocket. I wouldn't be terribly surprised if in 20 years, I have one in my glasses.
Of course, all of this is speculation. And I don't think the PC is threatened, but it will probably face competition from new ways of using computers.
[1]: As in different, and not just slightly better than last year's model.
I have had a talking computer since 2004.
I've had a computer that listens to me since 2012, when I built my first neural network that could recognize my voice and convert it to bash commands reliably enough.
I've had an LLM-powered version of that since the GPT-3 API came out, and a local version with a Llama 8B model today.
It's meh.
What I find ironic is that even though I'm dictating this to a $50k workstation, I'm still using Emacs to do it.
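For what it's worth, the whole voice-to-command pipeline described above can be sketched in a few lines of shell today, assuming openai-whisper and llama.cpp are installed (the model filename is illustrative):

    # Record five seconds of speech, transcribe it locally, then ask a
    # local 8B model to turn the transcript into a single shell command.
    arecord -d 5 -f cd /tmp/say.wav
    whisper /tmp/say.wav --model base --output_format txt --output_dir /tmp
    llama-cli -m llama-3-8b-instruct.gguf \
        -p "Translate this request into one bash command, output only the command: $(cat /tmp/say.txt)"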
But for my grandmother, who keeps a little notebook with detailed instructions for how to message people on Facebook? For her this might greatly increase her capabilities.
We have had voice-controlled machines since at least the 80s. There were voice-controlled robots at MIT decades ago. It's just stupid and awful.
I wrote about it the other day, but every Windows computer since 2000 has included fully functional, fairly well-implemented, and mostly accurate voice control. Nobody uses it except people who have to, because voice is a terrible control plane. Humans are WAY better at choosing an option from a 2D grid using a mouse, even people who are afraid of computers. Voice is inherently slower, less accurate, less precise, and inherently ambiguous.
Hell, even as shown on Star Trek TNG, controlling a computer through your voice is awful. A few times they show characters searching through a bunch of data and refining the output by adding more filter criteria, and what they do would always be better done with a good Excel spreadsheet.
People don't want voice controlled computers FFS, stop trying to force them. Stop trying to make fetch happen.
That's not a technology issue so much as a case of "there is no pressing need to spend the money on this anymore". At the peak of the moon race, NASA was getting 5% of US government spending. In modern terms, that would be like spending 340 billion dollars per year; it's just a _tremendous_ amount of money.
(That said, both the US and China do plan to make manned landings on the moon in the next decade, though the timelines may be... optimistic, as neither is indulging in crazy space-race-esque spending on it.)
As the US medical system shows, you can flush more than that down the toilet every month and still not get any results.
To get things done you need true believers.
It's hard to motivate true believers to build the 1e6th CRUD app.
This is why no one in mainstream tech is nerdy anymore, and why all projects are over budget and underperforming.
It was pretty cool when it worked. I think I managed to serve a website from it and I used it as a fancy teletype.
MacOS 8.6 and Windows 95 OSR2.1 (the one that had good USB support, but no IE yet). Back then I loved to tweak my OSes: I stripped them of anything they didn't need and tweaked the animations to be instant. Dual Celly 300A@450.
I think I installed a Windows 2000 beta somewhere in 1999 and stuck with that until they really ended support for it. Pretty consistent UI, lightweight. Ran most software including games, even when not officially supported.
Played with the Mac OS X betas on hardware comparable to the first iPhone in terms of performance: a Blue & White G3 tower.
I loved computers back then. Now they're just tools.
Preach it brother. I lived through Windows v. *nix (several versions), z80 v 6502, ethernet v. token ring, packet switching v. circuit switching, and a ton of others[1], and was usually a partisan on one side or the other. Now...it's either 'what works easiest' or 'what does the client want' and I make it work for me. This lack of participation in partisan conflict has made most everything go smoother. And vintage computing allows me to get my "remember how much fun it once was" fix.
[1] Yes, even vi v. emacs, but even 30 years ago that had become more of an inside joke than actual debate. Besides, those emacs heretics will never find the One True Way.
Now the only thing I know are compilers :-)
To add, a lot of us assembled our own PCs. Though I didn't build one, I routinely opened the cabinet to upgrade RAM/HDD or to transfer large files via HDD. I clearly remember when I upgraded RAM from 32MB to 64MB; the speed bump was incredible. After a while it became so frequent that I didn't bother to close the side panel.
A side effect was that a computer getting bricked was a normal affair, so we all had boot floppy disks at hand along with CDs of all the important software. In fact I would routinely install things from scratch to get rid of all the freeware that clogged the CPU & RAM.
As you said, we were living on the cusp of two revolutionary changes at once: PCs and the internet.
I wonder what the youngsters of today will look back on 20 years down the line with similar fond feelings & memories.
That is still true today. PC components are very popular, as are YouTube channels with reviews, ruminations about component choices, build experiences, and how-to guides.
> A side effect was that computer getting bricked was a normal affair
If you overclocked or messed with the voltages, maybe, but otherwise - in my experience (I started assembling systems around 1993 or so) - bricking was quite rare.
A bricking event is when the computer is more useful as a doorstop than a work tool.
Irony aside, there are already people saying that the moment AIs have sufficient agency we should give them equal rights and freedoms[^1]. How does one exponentially lose common sense?
[^1]: https://www.youtube.com/watch?v=1yvBqasHLZs
we demand that they be relegated to configuring the lp-daemon from just the *man* pages, no internet :o)
My memory of 1998 is that 64MB-128MB was common by then.
Around then my main machine had 128MB, until 2001, when I switched to 256MB and a 1GHz CPU. I also had an older machine with 8MB that was given to me for free. I recompiled OpenBSD to exclude PCI support to get it to stop swapping; then it was pretty usable as a headless machine.
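For the curious, trimming an OpenBSD kernel in that era meant editing a copy of GENERIC and rebuilding with config(8); roughly this, from memory (i386 paths):

    # Copy the stock kernel config, comment out the pci* and
    # PCI-device lines, then build a smaller kernel.
    cd /usr/src/sys/arch/i386/conf
    cp GENERIC SMALL
    vi SMALL                  # strip PCI support here
    config SMALL
    cd ../compile/SMALL
    make depend && make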
I have a broken memory of the time, because I swear I first came across Google as the better alternative to AltaVista in my last year at uni in '96, but Wikipedia says Google arrived in '98, or at least that the company was set up then. Is it possible it was usable two years before they incorporated?
I saw Sergey give a talk at my university in '96 or '97 and switched from AltaVista.
Sounds like a Mandela effect. :-)
Sixteen- and twenty-four-bit color weren't just widespread by 1998. They were standard.
Not supporting 24 bit color was unheard of for a normal desktop monitor and normal desktop graphics.
Maybe in 800x600, but not in higher resolutions.
A lot of PCs still had 1MB video RAM cards like the S3 Trio - I remember running X on an S3 card that would change the colour palette for whichever app was active when you ran it at 1024x768.
But by 1998 the pace of hardware change was so fast that anything top-of-the-line from 1994 was just trash; you could get one for free if you asked.
But Windows 3.11 started to change that, and by the time Windows 95 was out nobody wanted to use a 2-year-old machine anymore.
(Though to be fair, most people I knew went from CGA/Hercules straight to VGA; there really wasn't a "driving need" for EGA, especially if your machine was business-oriented - Hercules was better for text.) VGA was also a "sweet spot" in that it always worked; SVGA could get really hairy sometimes, and could be a bear to get working right.
Exhibit A: https://www.youtube.com/watch?v=goh2x_G0ct4
It costs even more to be reckless today.
Re: "whitey on the moon" - I'm not sure the space program would be my first target there but I think it makes a more poetic contrast and forces people to pay attention by targeting a beloved cultural narrative. Cyberpunk - by my reckoning a bit later - has been preaching a very similar message of massive inequality in the presence of incredible technology and wealth disparity and power concentration. And yet that doesn't draw the same ire. I guess in that case it's easier to dismiss the core message because robot limbs and cool neon lights are too much of a distraction.
(Also don't make the mistake of thinking that these groups continually fighting over how the centralized power is used means that any one of them is aligned with individual freedom)
In addition, and popular for a while, was pay-per-minute access where the ISP didn't charge the customer directly, but took part of the phone call cost. I don't know if this method of payment to an ISP existed in the USA.
That far fewer (only 9%) of British households had Internet access was probably more related to the cost of the computer and modem, and the country generally having less wealth than the USA.
Summer of 95 I convinced her to get dialup so I could learn about this internet thing I figured would be big. I was about 2 years too late for that wave.
But yes things were way more primitive and the learning curve was high. I had to spend several hours on the phone with the ISP to get on dialup on Mac. They onboarded people on campus manually and sold Ethernet cards back then.
Not sure how you're claiming this, but here is an article from 2000 announcing the first 1GHz consumer processor. https://www.zdnet.com/article/its-official-amd-hits-1000mhz-...
Are you saying you were on a mainframe or supercomputer in '95? That's the only way I could see access to anything approaching those speeds at that time.
Even on an SGI workstation you're looking at 200MHz tops in '95. And no way 1GB of RAM; more like 64MB.
No you didn't. The process node enabling 1GHz wasn't even invented yet. AFAIK the fastest CPU at the time was the Alpha, released that December at 333MHz.
1GB of RAM cost $32K in 1995, $2400 in 1997, and $1200 in 1998, and it was back to $2400 in September 1999 due to the Jiji earthquake. Just for some perspective, 4th place on TOP500 in 1995 had 8GB of RAM.
That would have been 1996, rather. Alphas were already at 500MHz in 1997, and (expensive) Pentium IIs were at 450MHz in 1998.
Well sure, the colleges had the good toys long before the rest of us. But we didn't have college-level money at home or at work.
I left college (and the machines that existed there) in 1992. And the hardware there in 1992 was better than the hardware I had at home or at work for the rest of the 90s.
So yeah, the tech existed, but it wasn't the tech we got to play with again until much later on. By 1998 though we had at least moved away from DOS development into Windows.
But also, yes, I remember 3D accelerators - and how expensive and coveted they were. Few 3D games before 2000 shipped without software 3D rendering support for this reason; I remember Quake 3 (1999) being the first big title like that, and it was one of the big reasons why Unreal Tournament was more popular in my circles back then - you could play it in software mode at 320x240 on fairly meager hardware.
ADSL was launched in 1999 in the UK, so you also weren't here.
Disclaimer: I helped start one of the US's biggest ISPs in this era, and personally installed and handed over multiple DSL configurations to folks living in 'poor' neighbourhoods which were, incidentally, close enough to the NOC to be viable DSL customers. Plenty of 'poor folks' in downtown LA were able to get DSL, purely by virtue of their proximity to One Wilshire ..
(I remember wanting to dumpster dive the CO because we got a letter from Southwestern Bell saying they were upgrading the equipment to 5ESS in late fall 1998)
Microsoft was the big bad wolf, but at the time, most of what they did that was so horrible was capture market share. You could still use a computing device and then leave it alone and it would leave you alone, it wasn't trying to constantly ping you and notify you and make you feel left out and use all kinds of dark patterns to feed the addiction. These days, computing devices feel more like trickster adversaries than clunky-but-useful tools.
Not so fast. They've been at it for a while.
https://en.wikipedia.org/wiki/Clipper_chip
However, everyone was so busy thinking about being a better version of Windows and Mac that for the most part we didn't think about phones. Windows and Mac themselves fell by the wayside to iOS and Android.
The N900 was more or less a tiny computer running Debian. With 256MB RAM, and swapping on flash.
It was way too low-spec to run reliably like that; you quickly ran into swap death. And it had none of the niceties of Android's memory management, where apps are designed to be stopped as needed.
Security-wise it was also bad, it was just a normal Linux box, so banking apps would be a terrible idea.
Had it not been killed, I wonder how they would have polished it up for public consumption.
And it had a Keyboard! With really real keys!
Yes, on Linux you can replace binaries and libraries in use, but then you're not actually running the new code until you restart the program and are still vulnerable to any security issue that it fixed.
And with things like runtime loading of plugins that now may be incompatible, and programs not expecting stuff changing underneath, online updates can be troublesome.
The online model works well enough for a command line usage where applications are transient, or a server usage where you remember to restart a service or two. But for a desktop user with long lived, huge apps like a browser it's not that good of an idea anymore.
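You can see the stale-code problem for yourself after a library update: processes keep the old, now-deleted file mapped until they restart. A couple of ways to spot them (lsof's +L1 lists open files whose on-disk link count is zero):

    # Processes still holding deleted (i.e. replaced) files open
    sudo lsof +L1 | grep -i '\.so'

    # Or per-process: deleted mappings show up marked in /proc
    grep -l '(deleted)' /proc/[0-9]*/maps 2>/dev/null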
I am aware of them - a friend worked on the N900 - but they were so far off 'phones' I didn't bother.
Probably true right now too, some are in optimistic times, and others not. Perhaps the proportions of the two groups varies.
Alas, 'twas not to be. SGI, you glorious bastard, why did you have to make that deal with the devil ..
IMHO, the tech industry completely changed when nerds could buy TiBooks, have an amazing Unix experience in a portable form, and still rip and play DVDs alongside the shell ..