When I was in school, decades ago now, very few people went into CS compared to other majors. Everyone I knew going into it did it because they loved it. I would have done it regardless of the career opportunities because I want to build stuff.
Interviewing candidates over the years since then, my experience has been there are still very few of those passionate nerds and a lot of people who did it for other reasons, like the money or similar. There is nothing inherently wrong with this. I don’t fault people for it.
Maybe if we get very lucky, it will go back to a relatively few passionate people building stuff because it is cool?
That runs completely counter to the basics of supply and demand in a perfectly competitive market. It would be a market with far fewer (labor) suppliers, who could therefore command a higher wage, not a lower one.
“Teachers work an average of 34.5 hours per week on an annual basis (38.0 hours per week during the school year and 21.5 hours per week during the summer months).”
That’s leaving out the benefits of incredibly strong union protections, it being a state job with matched benefits, absurd job security even in the face of terrible performance, etc.
I hope not, because we don't need software developers to be "starving artist 2.0".
And on that note: I vividly remember people staying away from the video game development industry because it was deemed "passion industry", and that had a really negative connotation of long working hours for asymmetrical return, and more.
I don't look forward to every other branch of software engineering becoming like that.
I initially pursued my real passion, which was math and physics, and got a bucket of cold water to the face only after grad school.
I think we basically lost this when software/computer/internet entered the mainstream. Now, like everything else, it has to be bland, unoffensive, and a commodity.
Can you sit down with an unfamiliar domain and develop enough genuine curiosity to get good at it, without a syllabus or a credential dangling in front of you?
The kids who'll do well in a world where the field-to-security mapping keeps shifting are the ones who can self-direct — not the ones who picked the right field in 2026.
Although full disclosure I'm short humans and very long paperclips.
What a ludicrous world we live in where this is a socially acceptable view to hold.
Agreed that if someone can self-direct and is capable, they'll do better. Assuming two people who are similar in that regard, what professions may benefit from AI rather than be hurt by it?
If things play out that way, I see there being two classes of low-paid developers in a decade or so: the first being the vibe coders who earn a subsistence wage because most people can do it (not everyone; there will still be a cost of entry, paying for the tools, which will exclude some groups), the second being the more "artisanal" developers working on the things that can't (yet) be vibe coded and fixing up the problems caused by insufficient care from the vibers and those employing them. These will be low paid because, while the work is important, demand will be low and there will still be a fair few people with the skills and desire (they'll make ends meet between good jobs by taking on gig-economy vibe-coding work themselves). There will be a lucky few still making a decent living, but a much lower proportion than now.
I'm hoping to arrange retirement before things get that far… Failing that I'll do something else (I could be a sparky, though if all the youngsters are training for that perhaps that industry will gain a bad supply/demand picture from the worker's PoV too!) to pay the bills and reclaim dicking around with tech as a hobby.
But let me ask you this: has AI made life easier for illustrators, book authors, or musicians? They were affected by the technology earlier on. If they don't embrace AI, they face increased competition from cheaply-made products that the average consumer can't distinguish from the "real" thing. But if they embrace it, they can't differentiate themselves from the cheaply-produced content! In fact, for artists, the best strategy may be to speak out very vocally against AI, reject it early on, and build a following of like-minded consumers.
If you want to be in a remote, small town, get into construction and become a builder with your own GC license in a few years. Then charge people 400k to build that little dream cottage with 2 guys (you and a teammate) twice a year: 150k labor each plus 100k in materials for each house. Just a small warning: it's hard but real work, and very rewarding.
I went to the local Claude Code meetup last week, and the contrast between the first two speakers really stuck with me.
The first was an old-skool tech guy who was using teams of agents to basically duplicate what an entire old-fashioned dev team would do.
The second was a "non-technical" (she must have said this at least 20 times in her talk) product manager using the LLM to prototype code and iterate on design choices.
Both are replacing dev humans with LLMs, but there's a massive difference in the technical complexity of their use. And I've heard this before talking to other people; non-technical folks are using it to write code and are amazed with how it's going, while technical folks are next-level using skills, agents, etc to replace whole teams.
I can see how this becomes a career in its own right; not writing code any more, but wrangling agents (or whatever comes after them). The same kind of mental aptitude that gets us good code can also be used to solve these problems, too.
Especially considering that the implication is that humans just become a pair of hands with opposable thumbs. Take the electrician in the article: sure, it's a skilled job, but the barrier to entry drops massively imo if you can just take a picture of whatever issue is at hand and AI spits out what is needed, no?
1) The supply of labor will skyrocket as everyone flocks there for work
2) Demand will plummet as the white-collar people who bought these services lose their jobs and income
And of course, if robotics gets solved to an acceptable degree, most of those jobs will also get mostly automated.
When a robot can reliably do this work, I think it can reliably do any human job that requires physical ability and judgement.
Basically all that would be left of desk jobs would be those which have unfair legal powers (including via licenses and credentials) or are pure accountability plays. Like politicians, lawyers, aircraft pilots, corporate accountants... And those jobs will suck because people will be accountable for work that is not their own.
These jobs won't require any skills, because most people may be able to go through their entire career without doing any work. But they will get paid a lot just for having been selected for their position... while other people who may be more skilled than them might be broke and homeless.
1) No matter their age, they are using said AI to replace humans
2) Within the workplace, they are using AI to do their work, so they are learning nothing
3) That is it: people are using AI to replace their own work rather than improve it; people are driving themselves out of work.
There was never any value in simply the ability to invert a binary tree from memory. First, contrary to popular belief, this particular challenge is quite trivial, even easier imo than fizzbuzz. The value of testing candidates with easy problems is in quickly filtering out potentially problematic coders, not necessarily in identifying strong ones.
Second, another common take on coding challenges is that they're about memorization. Somewhat, but only to a point. Data structures and algorithms are a vocabulary. A big part of the challenge of using them "creatively" in real life is your ability to recognize that a particular subset of that vocabulary best matches a particular situation. In many novel contexts an LLM might be able to help you with implementation once the right algorithm has been identified, but only after you yourself have made that insightful connection.
Having said this I generally agree with the philosophy [0] that keeping things simple is enough 95+% of the time.
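For reference, the inversion being discussed really is only a few lines. A minimal Python sketch (the `Node` class here is a hypothetical stand-in for whatever tree representation an interviewer hands you):

```python
class Node:
    """A bare-bones binary tree node."""
    def __init__(self, value, left=None, right=None):
        self.value = value
        self.left = left
        self.right = right

def invert(node):
    """Mirror the tree by recursively swapping every node's children."""
    if node is None:
        return None
    node.left, node.right = invert(node.right), invert(node.left)
    return node
```

The whole trick is one tuple assignment plus recursion, which is why it works better as a filter for candidates who cannot code at all than as a test for strong ones.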
Come to think of it, domain knowledge should be an LLM's strong suit as long as you can provide the right documentation, which is working pretty well already.
Right now the main issue I see with AI is that it doesn't do well with scaling. It's great for building demos and examples but you have to fix its code for real production work. But for how long?
At the same time medicine, hardware design, good industrial, and specific domain knowledge (problems you solve in assembly or control loops) that are fundamentally proprietary and aren't well documented will continue to have value even when LLMs make solving the problems around them easier. Those might have increased leverage, at least for this round of LLMs. Now, maybe they succeed in World Models, but that is not today.
Really, I don't know what "kids these days" are going to do. I couldn't have predicted the influencer boom 15 years ago, but I also think there are geopolitical risks that are probably bigger than that shift, and "synergized" with the push to AI Everything, it doesn't look like a good time to be a learning/working human.
Post-LLMs, the value of this (as a differentiator) has dropped to zero. Domain knowledge (also known as business knowledge) is the obvious area to skill up on. It simply means knowledge about the area your organisation works in, whether that is yogurt delivery logistics, clothing manufacturing supply chain systems, etc. That's the real differentiator now. Anyone can invert a binary tree in 5 minutes using an LLM. But designing a software system while knowing your organisation's domain well is invaluable.
Ain't nobody gonna hire a code monkey - you are being hired based on whether or not you can reason and enable workflows via tech.
If your only claim to fame is that you can write pretty Python but cannot architect at scale or care to actually understand the bigger picture of what is being built and why, you will get offshored to someone who is also using Claude Code.
If I'm working on a fullstack for a cloud security product like Wiz, I'd rather hire an average developer who deeply understands the cloud security industry versus a NodeJS doc wiz who has zero empathy or interest in learning about cloud security. There are too many of the latter and not enough of the former in the American scene now, and especially on HN.
If HNers cry about how cut-throat the American market has become, they haven't seen it in China, India, or the CEE.
tldr; Just like knowledge work, most trade work is probably mostly repeated (i.e. very trainable) tasks with a small amount of taste and discernment applied. The repeated part will be trainable; the discernment may be trainable. I don't think the physical world is necessarily any safer than the knowledge world.
Even if we get robots who can, say, build roads start to end, there is still a HUGE gap between that and it actually being used. There is a hard floor, too. Robots are made of physical things, physical things have scarcity, and there's no way around that to our knowledge. Even if you can build the robot for 1 cent, the material cost will still exist.
People are not scarce, though, and all the folks who are no longer necessary in knowledge work are available for physical work.
That being said, the absolute focus on trades from the fed right now just reeks of the wild pendulum swing. It used to be "go to college to get a good job"; then we had too many college grads. In ten years we'll have a glut of people trained in the trades with no prospects.
It just keeps swinging back and forth and somehow Joe Regularworker keeps losing.
https://serjaimelannister.github.io/wsj-article/
and I have also uploaded the github link on archive.org for persistence/archival purposes.
https://web.archive.org/web/20260322213950/https://serjaimel...
I hope that this might help some people and I have another friendly suggestion to please donate to archive.org :-)
Cloudflare flags archive.today as "C&C/Botnet"; it no longer resolves via 1.1.1.2
related:
https://news.ycombinator.com/item?id=46843805 "Archive.today is directing a DDoS attack against my blog"
> People stop learning programming.
> Programmers become scarce.
> Programmers become valuable again.
Maybe it's wishful thinking but I'm not going to be surprised if it plays out like this. In some sense the reverse happened over the last couple of decades - everyone and their mother got into IT and the industry became saturated.
There were always unqualified people coming out of college, but the number of people in interviews who can do literally nothing these days seems higher than before.
There was always some cohort of people who somehow managed to graduate from college with a CS degree while seemingly not learning anything, or at least not learning how to write even basic code (independently).
It seems like AI is not reducing that percentage - possibly increasing it.
Anecdata, take it with a grain of salt.
Not that AI is the same as websites all going broke. But no one can see the future, and it's unlikely that deep technical knowledge will become obsolete.