The question is... is this based on existing capability of LLMs to do these jobs? Or are companies doing this on the expectation that AI is advanced enough to pick up the slack?
I have observed a disconnect in which management is typically far more optimistic about AI being capable of performing a specific task than are the workers who currently perform that task.
And to what extent is AI-related job cutting just an excuse for what management would want to do anyway?
I do not see anything in this study that accounts for the decline in economic activity. Is it AI replacing the jobs, or is it that companies are not optimistically hiring, which disproportionately impacts entry-level jobs?
Agreed. Given the high cost of full-time hires for entry-level software jobs (total comp + onboarding + mentoring), investing in AI and seeing whether that gap can be filled is the far less risky choice in the current economic climate.
If, 6-12 months in, the AI bet doesn't pay off, you just stop spending money on it: cancel or don't renew contracts and move some teams around.
For full-time entry-level hires, we typically don't see meaningful positive productivity (where they produce more than they cost) for 6-8 months. Additionally, entry-level hires take time away from senior folks, reducing their productivity. And if you need to cut payroll costs, that's far more complicated, and worse for morale, than just cutting AI spend.
So given the above, plus an economy that seems pre-recession (or already there, according to some leading indicators), it seems best to wait or hire very cautiously for the next 6-8 months at least.
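A back-of-the-envelope sketch of the ramp-up math being described, showing when cumulative output catches up with cumulative cost. All the numbers (monthly cost, mentoring drag, ramp time, output value) are purely hypothetical illustrations, not figures from the comment above:

```python
# Rough break-even sketch for an entry-level hire (all numbers hypothetical).
monthly_cost = 12_000          # salary + benefits + overhead, per month
mentoring_drag = 3_000         # senior time diverted to mentoring, per month
ramp_months = 8                # months until the hire is fully productive
full_output_value = 18_000     # value produced per month once fully ramped

cumulative = 0.0
for month in range(1, 25):
    # Assume output ramps linearly from zero to full over the ramp period.
    ramp = min(month / ramp_months, 1.0)
    output = full_output_value * ramp
    cumulative += output - monthly_cost - (mentoring_drag if ramp < 1.0 else 0)
    if cumulative >= 0:
        print(f"Break-even around month {month}")
        break
```

With these made-up inputs the hire doesn't pay back the investment until roughly month 14, which is the kind of horizon that makes a cancellable AI contract look less risky in the short term.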
Even then, why hire a junior dev instead of a mid-level developer who doesn't need mentoring? You can probably hire one for the same price as a junior dev if you hire remotely, even in the US.
Exactly this. 2023Q1 was when the interest rate hikes from the previous year really kicked in with full force. It was the first hiring market I'd seen in well over a decade where employers were firmly in the driver's seat, even for seniors.
I can imagine that there were a decent number of execs who tried ChatGPT, made some outlandish predictions, and based some hiring decisions on those predictions, though.
This is the big question. It could be any combination of the following and it likely depends on the company/position too:
- Generative AI is genuinely capable enough to replace entry level workers at a lower cost.
- The hype around Generative AI is convincing people who make hiring decisions that it is capable enough to replace entry level workers at a lower cost.
- The hype around Generative AI is being used as an excuse to not hire during an economic downturn.
This was close to my first thought as well. I don't think we're far enough along the LLM adoption curve to actually know how it will affect the business case, and thus employment, long term. In the last couple of years of the LLM/AI honeymoon, the changes made to accommodate the technology may obscure both direct and second-order effects.
Interesting. However just because this is true right now doesn't mean it will be true going forward. Unique to the current moment is that there are simultaneously (1) high interest rates and a challenging economy (2) a narrative that AI adoption should enable cutting junior roles. This could lead to companies that would anyway be doing layoffs choosing to lay off or not hire juniors, and replace with AI adoption.
To really test the implied theory that using AI enables cutting junior hiring, we need to see it in a better economy, in otherwise growing companies, or with some kind of control (though not sure how this would really be possible).
We had some marketing folks give us a company-wide demo of ChatGPT and some other gen AI tools, showing us how cool they are and how quickly they can now make stylish and sophisticated pitch decks and marketing materials.
And the entire time I'm watching this I'm just thinking that they don't realize that they are only demonstrating the tools that are going to replace their own jobs. Kinda sad, really. Demand for soft skills and creatives is going to continue to decline.
And then the customers use gen AI to summarize the same pitch decks/marketing materials so they don't have to look at them. Let's cut out the middle man and just send the prompt instead.
LLMs are good at creating single use documents, like a pitch deck used for one prospective customer (never to be used again). But for long lived documents, on which future work builds atop, the bar is higher and the value of LLMs is more grey.
In the late 90s you were considered a prodigy if you understood how to use a search engine. I had so many opportunities simply because I could find and retain information.
So LLMs have solved this. Knowing a framework or being able to create apps is not a marketable skill any longer. What are we supposed to do now?
It’s the soft skills that matter now. Being well liked has always been more important in a job than being the best at it. We all know that engineer who knows they are hot shit but everyone avoids because they are insufferable.
Those marketing people don’t need to spend a week on their deck any longer. They can work the customer relationship now.
Knowing how to iterate with an LLM to give the customer exactly what they need is the valuable skill now.
But I guess to me the question is: if you're management, do you expect your workers to do more / work faster (like a TAS, in a way)? Or do you expect to replace your workers entirely?
I personally think we're still a ways from the latter...
Generative AI may automate some entry-level tasks, but young professionals are not just “replaceable labor.” They bring growth potential, adaptation, and social learning. Without frameworks to manage AI’s role, we risk undermining the very training grounds that prepare the next generation of experts.
In the future, there will be two kinds of companies:
1. Those that encourage people to use AI agents aggressively to increase productivity.
2. Those that encourage people to use AI agents aggressively to be more productive while still hiring young people.
Which type of company will be more innovative, productive, and successful in the long run?
I don't understand how you want innovation and productivity in a world with a rapidly increasing population. We need fewer and fewer people while producing more and more of them. Where am I wrong?
That world was 30 years ago. In 2025 world average total fertility rate is 2.2, which is a shade above replacement rate (2.1). And 2.2 is a 10% drop since 2017 alone (when it was 2.46).
Because life expectancy is higher, the population will continue to increase. But not "rapidly".
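A quick sanity check of the figures quoted above (2.46 in 2017 vs. 2.2 in 2025), just to confirm the "10% drop" arithmetic:

```python
# Relative drop in world total fertility rate, using the figures quoted above.
tfr_2017 = 2.46
tfr_2025 = 2.2
drop = (tfr_2017 - tfr_2025) / tfr_2017
print(f"{drop:.1%}")  # prints 10.6%, roughly the "10% drop" cited
```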
Any job that doesn't creatively generate revenue will be systematized and automated as soon as possible. AI agents are just an acceleration factor for this.
It's really hard to adjust for economic factors here. I'm in agreement with the skepticism in this thread, as job numbers were revised downward heavily in both 2024 and 2025 (to negative in some months), indicating a poor economic situation.
The tragedy of the commons: companies acting in their self-interest at the expense of the industry by drying up the workforce pipeline. The next generation will pay, like when America stopped producing hardware.
It has been like this for decades by now, and your precious government loves this because they also consist of old people who hate the younger generations. Any and every time the government intervenes it is to stomp down on the youth and nothing else.
The pessimistic reading is well-represented, so here's another: AI changes the definition of "entry-level", but it doesn't eliminate the class of professional labor that experienced workers would rather not do.
Until AI can do literally everything we can, that class of work will continue to exist, and it'll continue to be handed to the least experienced workers as a way for them to learn, get oriented, and earn access to more interesting problems and/or higher pay while experienced folks rest on their laurels or push the state of the art.
A new Harvard study (62M workers, 285k firms) shows that firms adopting generative AI cut junior hiring sharply while continuing to grow senior roles, eroding the bottom rungs of career ladders and reshaping how careers start.
What was the incentive for companies to train juniors into seniors in the past, in the post-job-hopping era? Curious to know whether that incentive has warped over the past two decades or so, as someone who's just starting their career now.
In 10 years, where do the senior devs come from? Real question. Seems like with fewer entry-level jobs now, in 10 years there won't be seniors to hire.
Even if we grant this is going to be a problem, it makes no sense for any individual company to do anything about it. Why take on the cost of training a junior when they can bail in a few years? This is especially true if you're not a big tech company, which puts you at risk of having your junior-turned-senior employees poached by big tech.
In 10 years, the management (or "leadership" if you like the taste of boot) responsible for doing the cutting will have moved on to something else, with no consequences for them.
Junior devs eventually will have been brought up with agentic coding, etc. Hopefully whatever the "new way" becomes is how they'll be taught.
Currently, part of the problem is the taboo around using AI coding in undergrad CS programs. And I don't know the answer. But someone will find the right way to teach new/better ways of working with and without generative AI. It may just become second nature to everyone.
This is the same reason they force you to do the math by hand in undergrad and implement functions that are already in the standard libraries of most languages. Because you don't know anything yet, and you need to learn why the more automated stuff works the way it does.
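For illustration only, the kind of exercise meant here might be hand-rolling something the standard library already provides, such as a binary search. This is a hypothetical teaching example, not something from the comment above:

```python
import bisect

# Hand-rolled binary search: the sort of exercise that teaches why the
# "automated" version (bisect.bisect_left) works the way it does.
def my_bisect_left(sorted_items, target):
    lo, hi = 0, len(sorted_items)
    while lo < hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] < target:
            lo = mid + 1
        else:
            hi = mid
    return lo

data = [1, 3, 4, 4, 7, 9]
assert my_bisect_left(data, 4) == bisect.bisect_left(data, 4) == 2
```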
While agentic coding can make you productive, it won't teach you to deeply understand the source code, algorithms, or APIs produced by AI. If you can't thoroughly audit any source code created by an AI agent, then you are definitely not a senior developer.
This is just not true. I have witnessed people who would have been called dabblers or tinkerers just a few years ago become actual developers by using Cursor. They ask a few key questions when they get stuck about engineering best practices and really internalize them. They read the code they are producing and ask the assistant questions about their codebase. They are theorycrafting using AI, then implementing and testing.

I have witnessed this with my own eyes, and as AI has gotten better they have also been getting more knowledgeable. They read the chains of thought and study the outputs. They have become real developers with working programs on their GitHub. AI is a tool that teaches you as it is used, if you put in the effort. I understand many folks are 'vibe coding' and not learning a single thing, and I don't know if that's the majority or the minority, but the assertion that all people learn nothing from use of these tools is false.
You're talking about people who put in a significant non-trivial effort to thoroughly understand the code produced by the AI. For them, AI was just one path to becoming proficient developers. They would have gotten there even before the AI boom. I was not talking about such highly-motivated people.
Agentic coding is like leading and instructing a team of a bunch of very dumb but also very smart junior devs. They can follow instructions to the T and have great memory but lack common sense and have no experience. The more experienced and skilled their leadership, the better chance of getting a good result from them, which I don’t think is a good job (yet?) for an entry level human SWE.
Not even a little bit. Where I work we regularly churn through kids just out of college and most of them don't have Clue One how to operate anything on their computer.
Yeah, growing up in the 80s or 90s might have had you uniquely well-positioned to be "good with computers", because "the computer that has games and the internet" was (in some sense) the same as "the computer that adults are supposed to use for work".
That's not true anymore in the smart phone / tablet era.
5-10 years ago my wife had a gig working with college kids and back then they were already unable to forward e-mails and didn't really understand the concept of "files" on a computer. They just sent screenshots and sometimes just lost (like, almost literally) some document they had been working on because they couldn't figure out how to open it back up. I can't imagine it has improved.
Might have been the case before. But these days, kids are brought up on locked-down content-focused machines (e.g. ipads). They struggle with anything harder than restarting an app.
When my little cousin was three and already knew how to use the phone by himself people were claiming he was gonna be a tech wizard and everybody was talking about digital natives. But when he got to high school he didn't know how to turn a computer on. How useful is it to be god tier at getting results from LLMs, if you have zero clue if the result you got is any good?
> In 10 years where do the senior dev's come from?
From company interns. Internships won't go away, there will just be fewer of them. For example, some companies will turn down interns because they do not have the time to train them due to project load.
With AI, now employed developers can be picky on whether or not to take on interns.
I suppose the idea is that those junior developers who weren't hired will spend 10 years doing intensive, unpaid self-study so that they can knock on the door as experienced seniors by that time.
Are you serious? How on earth are these people going to eat or pay rent for 10 years? As well, most companies would laugh you out the door if you were applying a senior role without any experience working in the role.
I'm not laughing at all. I'm definitely not making fun of those who may be affected by this. My sarcasm was directed at people or companies planning to implement such ideas.
They will be promoted, but they won't have the requisite experience. We'll have people in the highest positions with the usual titles, but they will be severely underqualified. Enshittification will ensue.
Today they're admitting AI is hollowing out entry-level jobs. The reality is that it can and will replace mid-level and eventually even quite senior jobs.
Why?
It's already doing a lot of the load-bearing work in those mid-level roles too now, it's just a bit awkward for management to admit it. One common current mode of work is people using AI to accomplish their work tasks very quickly, and then loafing a bit more with the extra time. So leaders refrain from hiring, pocket the savings, and keep a tight lid on compensation for those who remain.
At some point they'll probably try to squeeze the workforce for some additional productivity, and cut those who don't deliver it. Note that the "ease" of using AI for work tasks will be a rationale for why additional compensation is not warranted for those who remain.
What happens when companies refuse to hire, even though there is an obvious need? It has to lead to reduced growth. If the majority of companies do this, I would think it would lead to a severe deflationary cycle.
If a few companies do this it is probably fine, but it's more interesting if most companies do this. It seems that it would be akin to a self-inflicted depression with severe deflation.
If entry-level roles are shrinking, how should companies rethink talent development?
Without the traditional “bottom rungs,” how do we grow future seniors if fewer juniors ever get the chance to start?
When were companies ever thinking about talent development, especially for SWE? We had some loose "mentorship" roles but IME most folks are left to their own devices or just learn by bandwagoning things from reddit.
The plan seems to be to hope that AI will be able to replace the senior ICs in the near future. They're certainly gutting the ranks of management today in a way that presupposes there will be far fewer ICs of all levels to manage soon.
I think open source contributions/projects will still be a way to gain verifiable experience.
Other than that, I guess developing software in some capacity while doing a non-strictly-software job - say, in accounting, marketing, healthcare, etc. This might not be a relevant number of people if 'vibe coding' takes hold and the fundamentals are not learned or are ignored by these accountants, marketers, healthcare workers, etc.
If that is the case, we'd have a lot of 'informed beginners' with 10+ years of experience tangentially related to software.
Edit: As a result of the above, we might see an un-ironic return to the 'learn to code' mantra in the following years. Perhaps now qualified 'learn to -actually- code'? I'd wager a dollar on that discourse popping up in ~5 years time if the trend of not hiring junior devs continues.
I'm looking forward to the weird counter-trend of "organic" programs, "humane" dev treatment, and software development taking a long time being seen as a mark of quality rather than stagnation or worry. :)
I'm half-joking, but I wouldn't be surprised to see all sorts of counterpoint marketing come into play. Maybe throw in a weird traditional bent to it?
> (Pretentious, douche company): Free-range programming, the way programming was meant to be done; with the human touch!
All-in-all, I already feel severely grossed out any time a business I interact with introduces any kind of LLM chatbot shtick and I have to move away from their services; I could genuinely see people deriving a greater disdain for the fad than there already is.
That's much longer than a quarterly earnings report away, which makes it "somebody else's problem" for the executives pushing these policies. There's no reason to expect these people to have a long-term strategy in mind as long as their short-term strategy gives them a golden parachute.
Presumably this also means the relative value of seniors is now increasing, as the pipeline to replace them is smaller.
It's like how the generic "we take anyone" online security degree has poisoned that market: nothing but hordes of entry-level goobers, but no real heavy hitters on the mid-to-high end. Put another way, the market is tight but there are still reasonable options for seniors.
Agree, increased value and demand for seniors. But how will the market solve the generation of new seniors if juniors are getting less opportunities?
Take the software development sector as an example: if we replace junior devs with AI coding agents and have senior devs review the agents' work, how will we produce more seniors (with wide experience in the sector) if the juniors are not coding anymore?
Who cares? This is a once in a lifetime opportunity to finally gatekeep software engineering the way lawyers and finance professionals do with their fields! Enjoy the windfall in 5 years!
The answer is that developers are a self selected group and always have been. If you don’t know how to keep up on your own and reinvent yourself as necessary, then forget it. No company or school will ever evolve a developer to the degree a self directed developer will do on their own.
If you are complaining about finding a job today, guess what, you’ll be complaining in five years too. If you’re 25 complaining about seniors, you’ll be 28 complaining about college graduates. It’s a BEAST out there. The job description literally requires you to be ready for entire paradigm shifts.
With all that said, one would really have to think about why they would want to invest into a career with a cadence like that? Shrugs. That’s the job description, be careful what you get involved with in life.
If it’s about the money, then I’d just get a nursing degree and never worry about shit until the end of my life, keep all of it invested for 40 fucking years without a flicker of a thought about job instability or performance. So, the “about the money” people are literally going down the least optimal path imho.
Unless you really want to spend a lifetime (and I mean that, people spend a LIFETIME doing this) sprucing up hobby GitHub projects to demo to employers or sell the same shit as everyone else trying on Product Hunt, unless you LOVE doing this, please, for the love of god, fucking reevaluate your future.
That's my best Tyler Durden "get your ass back to veterinarian school" impression.
> the largest effects in wholesale and retail trade
Hard for me to believe that AI in its current state is hollowing out junior shop assistant and salesperson roles. Either those jobs were already vulnerable to "dumb" touchscreen kiosks or they require emotional intelligence and embodied presence that LLMs lack.
This makes me think the conversation around AI and jobs is too focused on total employment, when the real story is how it shifts opportunities within companies. If juniors are getting fewer chances to enter, that could create long-term bottlenecks for talent growth.
The economic turmoil in the US is hollowing out the entry level jobs, AI is just the cover companies are using. The constant tariff changing means that companies have to be very pessimistic in their long term planning, as any assumptions they make can be turned on their head with no notice.
Another way to look at it is that legacy jobs have no future therefore there is no point bringing in the next generation into a dying system.
Another way to look at it is that hiring is fine, and that the vain entitled generation we all suspected was going to emerge feels that a job should absolutely be available to them, and immediately.
Another way to look at it is that journalism has been dead for quite a while, and writing about the same fear-based topics like “omg hiring apocalypse” is what makes these people predictable money (along with other topics).
Another way to look at it is that we raised a generation of narcissistic parents and children that have been going “omg grades”, “omg good college”, “omg internship”, “omg job” for so long that that these lamentations feel normalized. A healthy dose of stfu was never given to them. Neurotic motherfuckers.
> It could be any combination of the following and it likely depends on the company/position too.
Could even still be other things too.
> they don't realize that they are only demonstrating the tools that are going to replace their own jobs
Dev jobs too.
> I personally think we're still a ways from the latter...
Young people are cheap and they love AI!
> In 2025 world average total fertility rate is 2.2, which is a shade above replacement rate (2.1).
Many of the largest countries are experiencing similar declines, with fewer and fewer countries maintaining large birth rates.
This is cause for government intervention.
Some people don't want to hear that, but... I just want devs who actually read my PR comments instead of feeding them straight into an LLM and resubmitting the PR.
But if they're not hired...?
No, I was being sarcastic.
I'm tired of reading all these claims with no primary evidence to support them.
The US is going through a lot of upheaval which, whether you think it is positive or negative, is unique and a confounding factor for any such research.
Then again, we live under capitalism.
Is this a case of "correlation does not imply causation"?
Entry-level jobs get "hollowed out" in a stagnant economy regardless of "AI".
AI = not hiring because there's no new work, but spun as "AI". Markets are hungry for any utterance of the word AI from the CEO.
So ridiculous. But we've collectively decided to ignore the BS as long as we can scam each other and pray we're not the last one holding the bag.
You have to somehow have the discipline to avoid getting caught up in the noise until the hype starts to fade away.
They'd rather pay people to sit in a room pressing a button every hour than have them loitering around on UBI.
Either that or in the pod.