I feel super fortunate to be a part of that generation where screwing around at home could lead directly to employment. I taught myself Atari BASIC on my 800 and took a 286 IBM compatible to college where I was a music major. I dropped out and landed working for an industrial automation company because I knew how to write dumb little programs. A couple years later I was the sole guy programming robots for them in a structured BASIC language.
acka 181 days ago [-]
Whenever I read replies like these, I feel jealous of people who dropped out of college yet still managed to land a job in tech.
In my country, the Netherlands, it was almost impossible in the late 1980s to land a tech job other than a low-level service technician (read: roadie or cable guy) if you did not have at least a bachelor's degree or higher in a tech subject or a degree from a technical community college. College dropouts were turned away before even getting an interview, and bankers would rather commit suicide than finance tech startups founded by anyone without an appropriate college degree.
Times sure have changed.
bluedino 181 days ago [-]
It still works both ways. I work for a very large company with no degree, doing HPC/AI
I used to work for another very large company doing the same thing, but as a contractor. An FTE position opened on our team, but I was told by HR that I wasn't qualified for the role (even though I had been doing it for a few years on the same team...) because I didn't have a degree (not a requirement for a contractor).
BehindBlueEyes 180 days ago [-]
Could you get your experience approved as equivalent to a degree, if there is such a thing - like the VAE in France?
My brother had to get a degree with evening classes for the same reason but since he was already doing the work, it was fairly easy - assuming the cost of studying isn't prohibitive where you live.
That said it is such BS. The whole contractor vs. FTE thing is.
Where I work, FTEs get laid off before contingency staff is fired. What is the point of having contingency staff if they're more permanent than FTEs?
Contractors who do the work for years can't get interviews because they're overqualified for the FTE position they apply for but the same hiring managers are happy to string them along doing the same work they're overqualified for, on the same team but as a contractor with less pay and none of the benefits.
I understand contractors applying to a junior role to even get a foot in the door when it is the only FTE role that opened for over a year... But you'd have better odds landing an FTE role straight out of graduating from college than with a track record of doing the work well for years as a contractor.
And they're "cool" so they let contractors attend a bunch of FTE meetings which has the primary effect of rubbing in all the great diversity and inclusion initiatives they are excluded from due to their second class citizen status.
At some point those companies don't deserve to have you. But even if you get paid half what the FTEs make, it's still a gilded cage with a six-figure salary, so it's hard to just give it the finger and move on.
177 days ago [-]
HeyLaughingBoy 180 days ago [-]
Not really. IME, the person who gets a tech job without a degree is by far the exception. If it were common, no one would bother mentioning it.
adastra22 181 days ago [-]
Have they? I thought this was a uniquely Silicon Valley thing.
chrismcb 182 days ago [-]
So... The current generation? Between mobile devices, Raspberry Pis, web pages, Linux and even Windows, there is plenty of stuff you can do just futzing around in your basement. Yeah, it might be impossible to create your own AAA game, but you can still create your own software. Plenty of open source opportunities out there as well.
raincole 182 days ago [-]
I suppose the parent comment was referring to the job market, not technology accessibility.
lukan 181 days ago [-]
I guess the equivalent would be people getting a job via their github profile?
thrw42A8N 181 days ago [-]
Don't ask for a million dollars per year and you'll have plenty of opportunities. There are tens of thousands of unfilled software jobs for higher than average wages.
iteria 181 days ago [-]
But are they willing to even talk to someone who doesn't have a degree or experience? I've never worked at jobs that were super high paying. I've never seen a fresh self-taught person on a job in the last 5 years. And I've done consulting and gotten exposure to a lot of different companies. I've also done scrappy startups. And boring small companies no one has ever heard of.
Running into a self-taught person at all was rare, and when I did, their story usually involved transferring from another career and leveraging some SME knowledge to get started. They already had training or a degree, just not in this field.
I'm not sure screwing around at home will actually land you a job. Not anymore.
maccard 181 days ago [-]
Yes.
There are definitely places that won’t talk to you without a degree, but many, many places will take a degree or equivalent.
> screwing around at home will actually land you a job. Not anymore
I don’t think “screwing around” will land you a job, whether it’s at home or at college/uni. But a degree tells me that you can stick with something for longer than a few months, of your own volition, even when you don’t always feel like it.
Someone who has spent a year on and off learning to code hasn’t shown they can code or that they have any sort of consistency - both of which are equally important in a workplace. Someone with a degree in marine biology and a handful of GitHub projects who can pass a programming test? They’re probably my first choice. Someone with 3 years’ experience of writing code on their own? Absolutely. Show me those candidates and I’ll interview every one of them for a junior role.
BehindBlueEyes 180 days ago [-]
> show me those candidates
Not speaking for where you work but they might not even pass the automated resume filters anymore unfortunately.
jonfw 181 days ago [-]
I was a self taught programmer who at one point dropped out of college to try and get into the industry earlier. I spent about a year sending out applications and got absolutely zero response.
I go back to school for the remaining 2 years, and when I graduated I had 5 competing offers with salaries starting at double what I would have accepted when I had not finished school. This huge reversal in outcomes was purely the college degree as far as I can tell- I had less time to send out applications, no internships, and no new personal projects of any substance.
My experience is that there are too many college grads and boot campers with github profiles to get into the industry off of some basic home tinkering.
If you're going to do it, I imagine you've got to go one step up and stand out.
bluecheese452 181 days ago [-]
No there aren’t.
thrw42A8N 178 days ago [-]
Yes there are, I can provide you with one.
dyauspitr 181 days ago [-]
You’re going to need a very impressive portfolio of personal projects to get a job without a degree or experience today.
kdjdndnsn 180 days ago [-]
That's really not true. You just have to be good
dyauspitr 179 days ago [-]
How do you prove you’re good?
kindeyoowee 181 days ago [-]
[dead]
sandworm101 182 days ago [-]
>> might be impossible to create your own AAA game
Like Minecraft? Factorio? Modern tools allow a very small team to quickly produce very AAA-like games. Eye candy is still an issue, but AI is quickly creeping into that space. I would not be surprised if within the next decade we have the tools for a single person to generate what we would today call a AAA game.
lawik 181 days ago [-]
"Very AAA" games and Minecraft/Factorio are not related.
Minecraft and Factorio are both simpler productions in terms of visual fidelity that lean on captivating gameplay. AAA is not a label for the quality of a game; it's more of a style/level of execution.
Both Minecraft and Factorio started indie to my knowledge which is a separate path and approach from AAA games. Unrelated to good/bad.
BehindBlueEyes 180 days ago [-]
Neither minecraft or factorio are AAA.
AAA requires not just using but creating the latest visual and audio innovations, creating a huge surface area prone to bugs which all need to be polished out, creating tools to manage your version of that complexity, and optimizing everything so it runs smoothly and doesn't take an unreasonable amount of disk space.
Even with AI, anything an individual could do, hundreds to thousands of people are also doing at AAA studios. An individual might innovate in a few aspects, but never clear the AAA bar, as AAA is a constantly moving goalpost, and most tools the individual can use are likely contributed back by AAA studios to popular AAA engines like Unreal.
It's like racing in a hamster wheel against the person making the wheels...
Fire-Dragon-DoL 180 days ago [-]
Both Factorio and Minecraft used their own proprietary engines, built in-house, ad hoc for their games, as far as I remember? Minecraft was pioneering voxels, while Factorio was the first one dealing with that massive number of objects running at all times.
So by definition, they did not use modern tools.
To be clear, there are plenty of games that do that, I just think those 2 are terrible examples.
eterm 181 days ago [-]
Neither are AAA.
Also, Factorio was crowdfunded via a kickstarter-like platform.
Also both are around 15 years old. They are both closer in age to 1995 than today.
ChrisMarshallNY 182 days ago [-]
Same here. My first programming job was a "crossover" from a hardware technician job. It both got me into software, and introduced me to the title of "Engineer." (I was originally a Technician, then, an Electrical Engineer, even though I mostly did software, but in those days, I also designed the hardware the software ran on).
I got my first Apple programming job, because I had a Mac Plus at home, and learned to program it in ASM and Pascal.
I've only taken some non-matriculated math courses. All the rest was pretty much OJT and home study (and a lot of seminars and short classes). My original education was High School Dropout/GED.
rmbyrro 181 days ago [-]
pretty much my personal experience in a newer generation, just without the Atari, IBM, and basic
a lot of employers actually like engineers who come from a personal hacking background more than traditional paths, because we're truly passionate and care deeply. we're not in for 8-5 and a paycheck.
zelphirkalt 180 days ago [-]
I am from "traditional background" but I do lots of programming in my free time, so I think it is fair to say I care deeply as well. Please tell me how to find such an employer.
xattt 182 days ago [-]
I’m tech-literate in a fairly tech-illiterate field. My co-workers think I’m some sort of wizard when I show them basic Excel skills.
Still waiting for my breakthrough.
qup 181 days ago [-]
I was a witness in court last week and it was said that I was a computer whiz for knowing how to play the mp4 file on the thumb drive.
And then later that was used against me to accuse me of lying about not knowing how to check the voicemail on my landline.
182 days ago [-]
dghlsakjg 182 days ago [-]
I’m not entirely sure we’re past those days.
Up until the current hiring lull, it was very possible to get a programming position with just a self taught background.
When the need for juniors comes back around, I’m sure we’ll start to see it again.
hn_throwaway_99 181 days ago [-]
> When the need for juniors comes back around, I’m sure we’ll start to see it again.
Man, I'm skeptical, at least in the US. Since the pandemic, I've seen an absolute explosion in offshoring, which makes perfect sense when so many people are working remotely anyway. I've worked with lots of excellent engineers from Argentina to Poland and many places in between. It's tough for me to see how an American "tinkerer" will be able to find a job in that world if he wants an American-level salary.
Also, I know the adage about "this time it's different" being the most dangerous phrase in the language, but, at least in one example, something really is different. In the early 00s, after the dot com bust, there was a ton of fear about outsourcing the bulk of software work to India. That turned out not to happen, of course, because (a) remote meeting software was nowhere close to where it is today, (b) remote work in general wasn't common, and (c) the timezone issues between the US and India were an absolute productivity killer. These days, though, everyone is used to remote work, and US companies have realized there are enough lower cost locales with plenty of timezone overlap to make offshoring the norm.
musicale 182 days ago [-]
I hope this is still true. There are certainly lots of opportunities for self-taught software and hardware development. And university lectures and course material (much of which is very good) that used to be locked inside physical campuses with expensive tuition fees are often freely available to anyone on the internet.
You can definitely build a nice portfolio of open source software (and even hardware) on github. I would hope that is enough to get a job, but it might not be, especially in the current era of AI-fueled employment pressure.
dyauspitr 181 days ago [-]
Juniors aren’t coming back, not with all this AI around.
globalnode 182 days ago [-]
the keys to employability have been captured by salesmen
2b3a51 181 days ago [-]
Quote from OA
"About 15 years later I ran into that manager again, and he was close to dying from a kidney ailment. I spent a day with him, driving him around so he could take some photographs, and having lunch. We didn't talk much about work, mostly he wanted to get out of being in bed and see the world a bit."
Just deLurking to say that was an excellent thing to do. This small paragraph tucked away at the end of the anecdote shifted the whole experience for me.
rekabis 181 days ago [-]
Agreed. This is what Genuinely Good People do.
Tempest1981 181 days ago [-]
And what started as a tech connection, became a deeper human connection.
Is generative art just AI, or is there something else out there that was called that before the emergence of AI? Genuinely curious.
swiftcoder 181 days ago [-]
It's all hand-coded. Most folks in the generative art community are pretty upset about "generative AI" preempting the name.
erichocean 182 days ago [-]
Generative art pre-AI was art created with code.
vajrabum 182 days ago [-]
And he says on his about page "This art is primarily non-objective and abstract, focusing on complex shapes and colors. I use my math, programming, and digital manipulation knowledge to produce highly unique art." It's not AI generated.
indigoabstract 181 days ago [-]
It's made by people, but using code instead of brushes. Andrew Wulf's art is really beautiful, both in colors and patterns.
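To make the "code instead of brushes" idea concrete: a minimal, hypothetical sketch in Python (not Andrew Wulf's actual technique) that renders a seeded random walk as colors in a plain-text PPM image, using nothing but the standard library.

```python
import random

def generative_ppm(path, size=64, seed=42):
    """Write a small procedural image as plain-text PPM (P3).

    Colors are derived from a seeded random walk, so the same seed
    always reproduces the same "artwork" - no AI involved, just code
    and controlled randomness.
    """
    rng = random.Random(seed)
    x = y = 0.0
    rows = []
    for j in range(size):
        row = []
        for i in range(size):
            # Advance the random walk and map its position to RGB.
            x += rng.uniform(-1, 1)
            y += rng.uniform(-1, 1)
            r = int(abs(x) * 40) % 256
            g = int(abs(y) * 40) % 256
            b = (i * j) % 256
            row.append(f"{r} {g} {b}")
        rows.append(" ".join(row))
    with open(path, "w") as f:
        f.write(f"P3\n{size} {size}\n255\n")
        f.write("\n".join(rows) + "\n")
```

Swapping the seed, the step distribution, or the color mapping gives a different piece each time, which is the generative part: the artist designs the process, not each pixel.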
> Today I make generative art, see it on my website
I do love the ways random events can change folks’ lives. Would the author have ended up doing art at all without this happening?
Scubabear68 181 days ago [-]
I had my interest in computing cemented when I was about 16 in the early 1980’s I volunteered that summer at a local hospital where my cousin worked running one of the test labs.
They had an Apple II and some equipment they figured might be able to interface with it. Like the article author, I basically got a few manuals and just figured out how to make it work. I was able to do some simple control of one of the new instruments via the RS-232 serial connection, plus print out some of its internal state.
I was very proud that all the other volunteers were basically candy stripers, and I was doing software development in a lab environment.
I would have gone bonkers with everything available today. But then again, in the 80s the sorting function was very strong - you were either one of those who could figure this stuff out, or you weren't. Then of course, there were further delineations. Case in point, my Apple II BASIC and 6502 assembly hacks in the lab sound like child's play compared to what this dude did!
fcatalan 181 days ago [-]
I was studying Physics, not out of particular interest, just because it was challenging, so I was doing badly.
I then discovered a small room that had two unsupervised computers hooked up to some mysterious world-spanning network, made friends there, and ended up leaving Physics for Computer Science.
My first job and every job in my 20s came from people I met in that room getting jobs themselves and calling me to see if I would go work with them, or someone from the previous jobs calling me back. I've never done a real job interview or sent a CV.
But then I formed a family and my social life plummeted. I'm also bad at really nurturing relationships that don't self sustain, so in retrospect I can see how my career ossified since then.
I don't totally regret it because even if I'm now underpaid and underemployed, I earn more than enough for my lifestyle and have loads of free time, so it balances the pang for greater things.
But yeah, networking is very important.
yapyap 182 days ago [-]
Yeah, networking can give you the world.
Often networking is seen as this robot-like bleep bloop hello, here's my business card thing, and at the dedicated events it very well could be. But networking in the most basic sense is just making friends and shooting the shit; the only difference is that you can leverage those friends for opportunities in the workplace and vice versa.
glitchc 182 days ago [-]
If there's mutual interest, certainly, but in most cases networking feels shallow and forced. If the only thing in common between us is the weather, I tune out quickly. Networking is mainly for those who truly like people.
otteromkram 181 days ago [-]
> So when they asked who could write 6502 assembly on an Apple II, I raised my hand figuring everyone here was a programmer — and found only my hand had been raised!
Pure comedy. I can just imagine the initial indifference when raising his hand, only to look around and start lowering his hand slowly when one of the bosses just looks at him and says, "You. No, not him. YOU. You stay, everyone else can leave."
cranberryturkey 181 days ago [-]
I was late to programming. I had a computer in junior high but didn’t start programming until the mid 90s when I got internet access at college.
oceanparkway 183 days ago [-]
Metal desks!
sjf 182 days ago [-]
My forearms are getting cold just thinking about it.
AlienRobot 181 days ago [-]
>I raised my hand figuring everyone here was a programmer—and found only my hand had been raised
I guess some things never change.
commandersaki 182 days ago [-]
Eh, I read the article and I still don't know what it means to "talk over a wall".
vajrabum 182 days ago [-]
If you worked in a cubicle farm you'd know. The cubicles were generally divided by low portable walls. There were different setups but generally you don't see people when you're seated but if you stand you can see your neighbors.
dghlsakjg 182 days ago [-]
I think he means literally talking to someone on the other side of his cubicle wall.
GrumpyNl 181 days ago [-]
For me it boils down to: communication is key - talk to each other, exchange ideas.
tantalor 182 days ago [-]
I think it means taking to people outside your team, about your personal interest areas.
aaron695 182 days ago [-]
[dead]
wlindley 182 days ago [-]
It's a program. "App" is a word, short for "Application Program," publicized by Apple for its handheld computers that masquerade as (and are euphemistically called) "telephones." "App" effectively means "proprietary closed-source program that talks to proprietary walled-garden programs running on someone else's computer, and acts as a spy sending all your sensitive data to who-knows-where."
No-one ever called a real program an "app" before that, did they?
Smeevy 182 days ago [-]
I've been programming professionally for over 30 years and "app", "application", and "program" have been interchangeable for me and the people I worked with as far back as I can remember.
peterfirefly 182 days ago [-]
Operating systems are not apps. Embedded controller programs are not apps.
happytoexplain 182 days ago [-]
"Application" has been a common general term for an end-user program for a very long time, and "app" is just an obvious abbreviation that people and UIs have used to varying degrees all along. iOS apps merely mainstreamed the term, they didn't take ownership of it.
TeMPOraL 181 days ago [-]
iOS mainstreamed it, but for a long time, "app" had a different meaning. Like, "application" was the big full-featured thing you run on your PC; "app" was the toy thing you run on the phone.
Then some "genius" started calling their desktop offerings "apps" (perhaps because lazy multiplatform-via- webap development eventually extended to marketing copy"), and now everything is an "app".
0xcde4c3db 182 days ago [-]
I don't recall seeing "app" on its own that often, but there was the idiom "killer app", meaning an application that was compelling enough to drive sales of its host platform (VisiCalc on Apple II being the go-to example).
tom_ 182 days ago [-]
GEM on the Atari ST supported the .app (short for "application") extension for gui executables. One of its components was the AES, short for Application Environment Services. This stuff dates from the early to mid 1980s.
cannam 182 days ago [-]
> No-one ever called a real program an "app" before that, did they?
Yes. Apple called them apps in the 80s, at least on the Mac - this is Apple II but it's plausible they were also referred to as apps there?
For my part I read the title as "Taking over a wall changed my direction as a programmer" which had me really confused for a while. I'd like to read that article, I think.
musicale 182 days ago [-]
Apple (App-le?) certainly popularized abbreviating "applications programs" or "application software" (vs. system software, systems programs etc.) to "applications" in the 1980s, and "apps" with the advent of the App Store in 2008, but Apple was unsuccessful in trying to obtain and enforce an App Store trademark given prior uses of app, store, and app store (including, perhaps ironically given Steve Jobs' return and Apple's acquisition of NeXT, a store for NeXTSTEP apps.) "Killer App(lication)" dates to the 1980s, applying to software like VisiCalc for the Apple II.
majormajor 182 days ago [-]
"Applications" was a very common term in the classic Mac days. "Programs" was a more Windows-y term. ("Applications" vs "Program Files" in ye olden 90s world of where to put things you installed.)
koolba 182 days ago [-]
IIRC, even the default template on Windows in the early 90s with Visual Studio was MFCApp.
rahimnathwani 181 days ago [-]
And VBA stands for Visual Basic for Applications.
TeMPOraL 181 days ago [-]
In my experience, end-user programs you'd run and operate were called "applications" or "programs", and it was a specialist term anyway, because general population didn't think in terms of applications anyway - they thought in terms of "running Word" or "running Google", with the machine itself an implementation detail.
As I remember it, the term "app" came from smartphones, where it referred specifically to smartphone applications. The connotations were rather negative - an inferior facsimile of the real thing (not even a full application, just an "app") - and it also came from "app marketplace"[0][1]. And to this day, apps are inferior to their desktop counterparts (at least the surviving ones), except marketing won this one - people got used to the term, then some vendors started calling desktop software "apps", and now suddenly everything is an app.
--
[0] - To this day I roll my eyes at the mental model here. It feels unnatural and wrong, but that's maybe because I'm not used to thinking in terms of finding markets and money-making opportunities everywhere.
[1] - Or maybe it was just a huge letdown to me, and I've been soured ever since. Back when first the iPhone and then Android came out, I was hoping a full-blown computer with Linux on board would mean it'd be much easier to just write your own stuff for it. That it would feel more like a desktop in terms of functionality, capabilities, opportunities. I did not expect them to come up with "app stores" and lock the thing down, making you accept the mark of the beast and entangle yourself with the commercial universe just to be able to add your own joke app or such.
Since then, it only got worse.
Rendered at 17:46:09 GMT+0000 (Coordinated Universal Time) with Vercel.
In my country, the Netherlands, it was almost impossible in the late 1980s to land a tech job other than a low-level service technician (read: roadie or cable guy) if you did not have at least a bachelor's degree or higher in a tech subject or a degree from a technical community college. College dropouts were turned away before even getting an interview, and bankers would rather commit suicide than finance tech startups founded by anyone without an appropriate college degree.
Times sure have changed.
I used to work for another very large company doing the same thing, but as a contractor. A FTE position opened on our team but I was told by HR that I wasn't qualified for the role (even though I had been doing it for a few years on the same team...) because I didn't have a degree (not a requirement for a contractor)
My brother had to get a degree with evening classes for the same reason but since he was already doing the work, it was fairly easy - assuming the cost of studying isn't prohibitive where you live.
That said it is such BS. The whole contractor vs. FTE thing is.
Where I work, FTEs get laid off before contingency staff is fired. What is the point of having contingency staff if they're more permanent than FTEs?
Contractors who do the work for years can't get interviews because they're overqualified for the FTE position they apply for but the same hiring managers are happy to string them along doing the same work they're overqualified for, on the same team but as a contractor with less pay and none of the benefits.
I understand contractors applying to a junior role to even get a foot in the door when it is the only FTE role that opened for over a year... But you'd have better odds landing an FTE role straight out of graduating from college than with a track record of doing the work well for years as a contractor.
And they're "cool" so they let contractors attend a bunch of FTE meetings which has the primary effect of rubbing in all the great diversity and inclusion initiatives they are excluded from due to their second class citizen status.
At some point those companies don't deserve to have you. But even if you get paid half what the FTEs make, it's still a guilded cage with a 6 figures salary so it's hard to just give it the finger and move on.
Running into a self-taught person at all was rare, but when I did their story rarely involved not transferring from another career and leveraging some SME knowledge to get started. They already had training or a degree just not in this.
I'm not sure screwing around at home will actually land you a job. Not anymore.
There are definitely places that won’t talk to you without a degree, but many, many places will take a degree or equivalent.
> screwing around at home will actually land you a job. Not anymore
I don’t think “screwing around” will land you a job whether it’s at home or at college/uni. But a degree tells me that you can stick by something for longer than a few months even when you don’t always feel like it by our own volition.
Someone who has spent a year on and off learning to code hasn’t shown they can code or that they have any sort of consistency- both of which are (equally) as important as each other in a workplace. Someone with a degree in marine biology and a handful of GitHub projects and can pass a programming test? They’re probably my first choice. Someone with 3 years experience of writing code on their own? Absolutely. Show me those candidates and I’ll interview every one of them for a junior role.
Not speaking for where you work but they might not even pass the automated resume filters anymore unfortunately.
I go back to school for the remaining 2 years, and when I graduated I had 5 competing offers with salaries starting at double what I would have accepted when I had not finished school. This huge reversal in outcomes was purely the college degree as far as I can tell- I had less time to send out applications, no internships, and no new personal projects of any substance.
My experience is that there are too many college grads and boot campers with github profiles to get into the industry off of some basic home tinkering.
If you're going to do it, I imagine you've got to go one step up and stand out.
Like Minecraft? Factorio? Modern tools allow for very small team to quickly generate very AAA games. Eye candy is still an issue, but AI is quickly creeping into that space. I would not be surprised if within the next decade we have the tools for a single person to generate what we would today call a AAA game.
Minecraft and Factorio are both simpler productions in terms of visual fidelity and lean on gameplay that is captivating. AAA is not a label for the quality of game, more of a style/level of execution.
Both Minecraft and Factorio started indie to my knowledge which is a separate path and approach from AAA games. Unrelated to good/bad.
AAA requires not just using but creating the latest visual and audio innovations, creating a huge surface area prone to bugs which all need to be polished out and creating tools to manage your version of that complexity, optimize everything so it runs smoothly and doesn't take an unreasonable amount of disk space.
Even with AI, anything an individual could do, hundreds to thousands of people are also doing at AAA studios. An individual might innovate in a few aspect, but never clear the AAA bar as AAA is a constantly moving goalpost, and most tools the individual can use are likely contributed back by AAA studios to popular AAA game engines like Unreal.
It's like racing in a hamster wheel against the person making the wheels...
So by definition, they did not use modern tools.
To be clear, there are plenty of games that do that, I just think those 2 are terrible examples.
Also, Factorio was crowdfunded via a kickstarter-like platform.
Also both are around 15 years old. They are both closer in age to 1995 than today.
I got my first Apple programming job, because I had a Mac Plus at home, and learned to program it in ASM and Pascal.
I've only taken some non-matriculated math courses. All the rest was pretty much OJT and home study (and a lot of seminars and short classes). My original education was High School Dropout/GED.
a lot of employers actually like engineers who come from a personal hacking background more than traditional paths, because we're truly passionate and care deeply. we're not in for 8-5 and a paycheck.
Still waiting for my breakthrough.
And then later that was used against me to accuse me of lying about not knowing how to check the voicemail on my landline.
Up until the current hiring lull, it was very possible to get a programming position with just a self taught background.
When the need for juniors comes back around, I’m sure we’ll start to see it again.
Man, I'm skeptical, at least in the US. Since the pandemic, I've seen an absolute explosion in offshoring, which makes perfect sense when so many people are working remotely anyway. I've worked with lots of excellent engineers from Argentina to Poland and many places in between. It's tough for me to see how an American "tinkerer" will be able to find a job in that world if he wants an American-level salary.
Also, I know the adage about "this time it's different" being the most dangerous phrase in language, but, at least in one example, something really is different. In the early 00s, after the dot com bust, there was a ton of fear about outsourcing the bulk of software work to India. That turned out not to happen, of course, because (a) remote meeting software was nowhere close to where it is today, (b) remote work in general wasn't common, and (c) the timezones issues between US and India were an absolute productivity killer. These days, though, everyone is used to remote work, and US companies have realized there are enough lower cost locales with plenty of timezone overlap to make offshoring the norm these days.
You can definitely build a nice portfolio of open source software (and even hardware) on GitHub. I would hope that is enough to get a job, but it might not be, especially in the current era of AI-fueled employment pressure.
"About 15 years later I ran into that manager again, and he was close to dying from a kidney ailment. I spent a day with him, driving him around so he could take some photographs, and having lunch. We didn't talk much about work, mostly he wanted to get out of being in bed and see the world a bit."
Just de-lurking to say that was an excellent thing to do. This small paragraph tucked away at the end of the anecdote shifted the whole experience for me.
Is generative art just AI, or is there something else out there that was called that before the emergence of AI? Genuinely curious.
https://artasartist.com/what-is-generative-art/?ref=thecodis...
I do love the ways random events can change folks’ lives. Would the author have ended up doing art at all without this happening?
They had an Apple II and some equipment they figured might be able to interface with it. Like the article author, I basically got a few manuals and just figured out how to make it work. I was able to do some simple control of one of the new instruments via the RS-232 serial connection, plus print out some of its internal state.
I was very proud that all the other volunteers were basically candy stripers, and I was doing software development in a lab environment.
I would have gone bonkers with everything available today. But then again, in the 80s the sorting function was very strong - you were either one of those who could figure this stuff out, or you weren't. Then of course, there were further delineations. Case in point, my Apple II BASIC and 6502 assembly hacks in the lab sound like child's play compared to what this dude did!
I then discovered a small room that had two unsupervised computers hooked up to some mysterious world-spanning network, made friends there, and ended up leaving Physics for Computer Science.
My first job and every job in my 20s came from people I met in that room getting jobs themselves and calling me to see if I would go work with them, or someone from the previous jobs calling me back. I've never done a real job interview or sent a CV.
But then I started a family and my social life plummeted. I'm also bad at really nurturing relationships that don't self-sustain, so in retrospect I can see how my career ossified since then.
I don't totally regret it because even if I'm now underpaid and underemployed, I earn more than enough for my lifestyle and have loads of free time, so it balances the pang for greater things.
But yeah, networking is very important.
Often networking is seen as this robot-like bleep bloop, hello-here's-my-business-card thing, and at dedicated events it very well could be. But networking in the most basic sense is just making friends and shooting the shit; the only difference is that you can leverage those friends for opportunities in the workplace and vice versa.
Pre comedy. I can just imagine the initial indifference when raising his hand only to look around and start lowering his hand slowly when one of the bosses just looks at him and says, "You. No, not him. YOU. You stay, everyone else can leave."
I guess some things never change.
No-one ever called a real program an "app" before that, did they?
Then some "genius" started calling their desktop offerings "apps" (perhaps because lazy multiplatform-via-webapp development eventually extended to marketing copy), and now everything is an "app".
Yes. Apple called them apps in the 80s, at least on the Mac. This article is about the Apple II, but it's plausible they were also referred to as apps there?
For my part I read the title as "Taking over a wall changed my direction as a programmer" which had me really confused for a while. I'd like to read that article, I think.
As I remember it, the term "app" came from smartphones, where it referred specifically to smartphone applications. Connotations were rather negative - an inferior facsimile of the real thing (not even a full application - just an "app") - and it also came from "app marketplace"[0][1]. And to this day, apps are inferior to their desktop counterparts (at least the surviving ones), except marketing won this one - people got used to the term, then some vendors started calling desktop software "apps", and now suddenly everything is an app.
--
[0] - To this day I roll my eyes at the mental model here. It feels unnatural and wrong, but that's maybe because I'm not used to thinking in terms of trying to find markets and money-making opportunities everywhere.
[1] - Or maybe it was just a huge letdown to me, and I've been soured ever since. Back when first the iPhone and then Android came out, I was hoping a full-blown computer with Linux on board would mean it would be much easier to just write your own stuff for it. That it would feel more like a desktop in terms of functionality, capabilities, opportunities. I did not expect them to come up with "app stores" and lock the thing down, making you accept the mark of the beast and entangle yourself with the commercial universe just to be able to add your own joke app or such.
Since then, it only got worse.