I feel super fortunate to be a part of that generation where screwing around at home could lead directly to employment. I taught myself Atari BASIC on my 800 and took a 286 IBM compatible to college, where I was a music major. I dropped out and landed a job at an industrial automation company because I knew how to write dumb little programs. A couple of years later I was the sole guy programming robots for them in a structured BASIC language.
rmbyrro 32 minutes ago [-]
pretty much my personal experience in a newer generation, just without the Atari, IBM, and BASIC
a lot of employers actually like engineers who come from a personal hacking background more than traditional paths, because we're truly passionate and care deeply. we're not in it for the 8-to-5 and a paycheck.
chrismcb 11 hours ago [-]
So... The current generation? Between mobile devices, Raspberry Pis, web pages, Linux, and even Windows, there is plenty of stuff you can do just futzing around in your basement. Yeah, it might be impossible to create your own AAA game, but you can still create your own software. Plenty of open source opportunities out there as well
raincole 10 hours ago [-]
I suppose the parent comment was referring to the job market, not technology accessibility.
thrw42A8N 2 hours ago [-]
Don't ask for a million dollars per year and you'll have plenty of opportunities. There are tens of thousands of unfilled software jobs for higher than average wages.
iteria 1 hour ago [-]
But are they willing to even talk to someone who doesn't have a degree or experience? I've never worked at jobs that were super high paying. I've never seen a fresh self-taught person on a job in the last 5 years. And I've done consulting and gotten exposure to a lot of different companies. I've also done scrappy startups. And boring small companies no one has ever heard of.
Running into a self-taught person at all was rare, and when I did, their story almost always involved transferring from another career and leveraging some SME knowledge to get started. They already had training or a degree - just not in this field.
I'm not sure screwing around at home will actually land you a job. Not anymore.
maccard 1 hours ago [-]
Yes.
There are definitely places that won’t talk to you without a degree, but many, many places will take a degree or equivalent.
> screwing around at home will actually land you a job. Not anymore
I don’t think “screwing around” will land you a job, whether it’s at home or at college/uni. But a degree tells me that you can stick with something for longer than a few months, of your own volition, even when you don’t always feel like it.
Someone who has spent a year on and off learning to code hasn’t shown they can code or that they have any sort of consistency - both of which are equally important in a workplace. Someone with a degree in marine biology, a handful of GitHub projects, and the ability to pass a programming test? They’re probably my first choice. Someone with 3 years of experience writing code on their own? Absolutely. Show me those candidates and I’ll interview every one of them for a junior role.
dyauspitr 4 hours ago [-]
You’re going to need a very impressive portfolio of personal projects to get a job without a degree or experience today.
sandworm101 8 hours ago [-]
>> might be impossible to create your own AAA game
Like Minecraft? Factorio? Modern tools allow a very small team to quickly produce very AAA games. Eye candy is still an issue, but AI is quickly creeping into that space. I would not be surprised if within the next decade we have the tools for a single person to create what we would today call a AAA game.
lawik 4 hours ago [-]
"Very AAA" games and Minecraft/Factorio are not related.
Minecraft and Factorio are both simpler productions in terms of visual fidelity, and they lean on gameplay that is captivating. AAA is not a label for the quality of a game; it's more a style/level of execution.
Both Minecraft and Factorio started indie, to my knowledge, which is a separate path and approach from AAA games. Unrelated to good/bad.
ChrisMarshallNY 9 hours ago [-]
Same here. My first programming job was a "crossover" from a hardware technician job. It both got me into software and introduced me to the title of "Engineer." (I was originally a Technician, then an Electrical Engineer, even though I mostly did software - but in those days, I also designed the hardware the software ran on.)
I got my first Apple programming job, because I had a Mac Plus at home, and learned to program it in ASM and Pascal.
I've only taken some non-matriculated math courses. All the rest was pretty much OJT and home study (and a lot of seminars and short classes). My original education was High School Dropout/GED.
xattt 8 hours ago [-]
I’m one of the tech literati in a fairly tech-illiterate field. My co-workers think I’m some sort of wizard when I show them basic Excel skills.
Still waiting for my breakthrough.
dghlsakjg 11 hours ago [-]
I’m not entirely sure we’re past those days.
Up until the current hiring lull, it was very possible to get a programming position with just a self taught background.
When the need for juniors comes back around, I’m sure we’ll start to see it again.
hn_throwaway_99 5 hours ago [-]
> When the need for juniors comes back around, I’m sure we’ll start to see it again.
Man, I'm skeptical, at least in the US. Since the pandemic, I've seen an absolute explosion in offshoring, which makes perfect sense when so many people are working remotely anyway. I've worked with lots of excellent engineers from Argentina to Poland and many places in between. It's tough for me to see how an American "tinkerer" will be able to find a job in that world if he wants an American-level salary.
Also, I know the adage about "this time it's different" being the most dangerous phrase in the language, but, at least in this one example, something really is different. In the early 00s, after the dot-com bust, there was a ton of fear about outsourcing the bulk of software work to India. That turned out not to happen, of course, because (a) remote meeting software was nowhere close to where it is today, (b) remote work in general wasn't common, and (c) the timezone issues between the US and India were an absolute productivity killer. These days, though, everyone is used to remote work, and US companies have realized there are enough lower-cost locales with plenty of timezone overlap to make offshoring the norm.
musicale 10 hours ago [-]
I hope this is still true. There are certainly lots of opportunities for self-taught software and hardware development. And university lectures and course material (much of which is very good) that used to be locked inside physical campuses with expensive tuition fees are often freely available to anyone on the internet.
You can definitely build a nice portfolio of open source software (and even hardware) on GitHub. I would hope that is enough to get a job, but it might not be, especially in the current era of AI-fueled employment pressure.
dyauspitr 4 hours ago [-]
Juniors aren’t coming back, not with all this AI around.
globalnode 11 hours ago [-]
the keys to employability have been captured by salesmen
Is generative art just AI, or is there something else out there that was called that before the emergence of AI? Genuinely curious.
erichocean 13 hours ago [-]
Generative art pre-AI was art created with code.
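(For a concrete sense of that pre-AI meaning, here's a minimal, hypothetical sketch in Python - assuming nothing beyond the standard library. A seeded random process writes translucent circles into an SVG file; every name and parameter here is made up for illustration.)

    # Minimal generative-art sketch: a seeded random process renders
    # translucent circles into an SVG file. All parameters (canvas size,
    # shape count, file name) are illustrative, not from any real project.
    import random

    WIDTH, HEIGHT = 800, 600
    random.seed(42)  # fixed seed, so the same "artwork" is reproduced every run

    shapes = []
    for _ in range(200):
        x = random.uniform(0, WIDTH)    # random position...
        y = random.uniform(0, HEIGHT)
        r = random.uniform(5, 60)       # ...and size
        hue = random.randint(0, 360)    # random color around the hue wheel
        shapes.append(
            f'<circle cx="{x:.1f}" cy="{y:.1f}" r="{r:.1f}" '
            f'fill="hsl({hue}, 70%, 50%)" fill-opacity="0.4"/>'
        )

    svg = (f'<svg xmlns="http://www.w3.org/2000/svg" '
           f'width="{WIDTH}" height="{HEIGHT}">' + "".join(shapes) + "</svg>")

    # Open the resulting file in any browser to see the piece.
    with open("circles.svg", "w") as f:
        f.write(svg)

Change the seed or the rules and you get a different piece; the "art" is in designing the process, not drawing the output by hand.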
vajrabum 11 hours ago [-]
And he says on his about page "This art is primarily non-objective and abstract, focusing on complex shapes and colors. I use my math, programming, and digital manipulation knowledge to produce highly unique art." It's not AI generated.
I was studying Physics, not out of particular interest, just because it was challenging, so I was doing badly.
I then discovered a small room that had two unsupervised computers hooked up to some mysterious world-spanning network, made friends there, and ended up leaving Physics for Computer Science.
My first job and every job in my 20s came from people I met in that room getting jobs themselves and calling me to see if I would go work with them, or someone from the previous jobs calling me back. I've never done a real job interview or sent a CV.
But then I started a family and my social life plummeted. I'm also bad at nurturing relationships that don't self-sustain, so in retrospect I can see how my career ossified after that.
I don't totally regret it because even if I'm now underpaid and underemployed, I earn more than enough for my lifestyle and have loads of free time, so it balances the pang for greater things.
But yeah, networking is very important.
ian-g 11 hours ago [-]
> Today I make generative art, see it on my website
I do love the ways random events can change folks’ lives. Would the author have ended up doing art at all without this happening?
yapyap 12 hours ago [-]
Yeah, networking can give you the world.
Often networking is seen as this robot-like "bleep bloop hello, here's my business card" thing, and at dedicated events it very well could be. But networking in the most basic sense is just making friends and shooting the shit; the only difference is that you can leverage those friends for opportunities in the workplace and vice versa.
glitchc 12 hours ago [-]
If there's mutual interest, certainly, but in most cases networking feels shallow and forced. If the only thing in common between us is the weather, I tune out quickly. Networking is mainly for those who truly like people.
oceanparkway 1 days ago [-]
Metal desks!
sjf 14 hours ago [-]
My forearms are getting cold just thinking about it.
commandersaki 11 hours ago [-]
Eh, I read the article and I still don't know what it means to "talk over a wall".
vajrabum 11 hours ago [-]
If you worked in a cubicle farm you'd know. The cubicles were generally divided by low portable walls. There were different setups, but generally you don't see people when you're seated; if you stand, you can see your neighbors.
dghlsakjg 11 hours ago [-]
I think he means literally talking to someone on the other side of his cubicle wall.
GrumpyNl 2 hours ago [-]
For me it boils down to: communication is key - talk to each other, exchange ideas.
tantalor 11 hours ago [-]
I think it means talking to people outside your team about your personal interest areas.
wlindley 13 hours ago [-]
It's a program. "App" is a word, short for "Application Program," publicized by Apple for its handheld computers that masquerade as (and are euphemistically called) "telephones." "App" effectively means "proprietary closed-source program that talks to proprietary walled-garden programs running on someone else's computer, and acts as a spy sending all your sensitive data to who-knows-where."
No-one ever called a real program an "app" before that, did they?
happytoexplain 12 hours ago [-]
"Application" has been a common general term for an end-user program for a very long time, and "app" is just an obvious abbreviation that people and UIs have used to varying degrees all along. iOS apps merely mainstreamed the term, they didn't take ownership of it.
TeMPOraL 2 hours ago [-]
iOS mainstreamed it, but for a long time, "app" had a different meaning. Like, "application" was the big full-featured thing you run on your PC; "app" was the toy thing you run on the phone.
Then some "genius" started calling their desktop offerings "apps" (perhaps because lazy multiplatform-via-webapp development eventually extended to the marketing copy), and now everything is an "app".
Smeevy 13 hours ago [-]
I've been programming professionally for over 30 years and "app", "application", and "program" have been interchangeable for me and the people I worked with as far back as I can remember.
peterfirefly 11 hours ago [-]
Operating systems are not apps. Embedded controller programs are not apps.
0xcde4c3db 13 hours ago [-]
I don't recall seeing "app" on its own that often, but there was the idiom "killer app", meaning an application that was compelling enough to drive sales of its host platform (VisiCalc on Apple II being the go-to example).
tom_ 13 hours ago [-]
GEM on the Atari ST supported the .app (short for "application") extension for GUI executables. One of its components was the AES, short for Application Environment Services. This stuff dates from the early to mid 1980s.
cannam 13 hours ago [-]
> No-one ever called a real program an "app" before that, did they?
Yes. Apple called them apps in the 80s, at least on the Mac. I'm less sure about the Apple II, but it's plausible they were also referred to as apps there?
For my part I read the title as "Taking over a wall changed my direction as a programmer" which had me really confused for a while. I'd like to read that article, I think.
musicale 11 hours ago [-]
Apple (App-le?) certainly popularized abbreviating "applications programs" or "application software" (vs. system software, systems programs etc.) to "applications" in the 1980s, and "apps" with the advent of the App Store in 2008, but Apple was unsuccessful in trying to obtain and enforce an App Store trademark given prior uses of app, store, and app store (including, perhaps ironically given Steve Jobs' return and Apple's acquisition of NeXT, a store for NeXTSTEP apps.) "Killer App(lication)" dates to the 1980s, applying to software like VisiCalc for the Apple II.
majormajor 13 hours ago [-]
"Applications" was a very common term in the classic Mac days. "Programs" was a more Windows-y term. ("Applications" vs "Program Files" in ye olden 90s world of where to put things you installed.)
koolba 13 hours ago [-]
IIRC, even the default template on Windows in the early 90s with Visual Studio was MFCApp.
TeMPOraL 2 hours ago [-]
In my experience, end-user programs you'd run and operate were called "applications" or "programs", and it was a specialist term anyway - the general population didn't think in terms of applications; they thought in terms of "running Word" or "running Google", with the machine itself an implementation detail.
As I remember it, the term "app" came from smartphones, where it referred specifically to smartphone applications. The connotations were rather negative - an inferior facsimile of the real thing (not even a full application, just an "app") - and the term came bundled with the "app marketplace"[0][1]. And to this day, apps are inferior to their desktop counterparts (at least the surviving ones), except marketing won this one - people got used to the term, then some vendors started calling desktop software "apps", and now suddenly everything is an app.
--
[0] - To this day I roll my eyes at the mental model here. It feels unnatural and wrong, but maybe that's because I'm not used to thinking in terms of finding markets and money-making opportunities everywhere.
[1] - Or maybe it was just a huge letdown for me, and I've been soured ever since. Back when first the iPhone and then Android came out, I was hoping that a full-blown computer with Linux on board would mean it'd be much easier to just write your own stuff for it. That it would feel more like a desktop in terms of functionality, capabilities, opportunities. I did not expect them to come up with "app stores" and lock the thing down, making you accept the mark of the beast and entangle yourself with the commercial universe just to be able to add your own joke app or such.
Since then, it only got worse.