The ‘white-collar bloodbath’ is all part of the AI hype machine

by lwo32k on 5/30/2025, 1:38 PM with 1,240 comments

by simonsarris on 5/31/2025, 3:20 AM

I think the real white collar bloodbath is that the end of ZIRP was the end of infinite software job postings, and the start of layoffs. I think it's easy now to point to AI, but it seems like a canard for the huge thing that already happened.

just look at this:

https://fred.stlouisfed.org/graph/?g=1JmOr

In terms of magnitude the effect of this is just enormous and still being felt, and postings never recovered to pre-2020 levels. They may never. (With pre-pandemic job postings indexed to 100, software is at 61.)

Maybe AI is having an effect on IT jobs though, look at the unique inflection near the start of 2025: https://fred.stlouisfed.org/graph/?g=1JmOv

For another point of comparison, construction and nursing job postings are higher than they were pre-pandemic (about 120 and 116 respectively, where pre-pandemic was indexed to 100). Banking jobs still hover around 100.
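For anyone who wants to check those numbers rather than eyeball the graphs, here is a minimal sketch of pulling the Indeed job-postings series from FRED with pandas_datareader. The series IDs are my guesses at which series the linked graphs use, not something stated in the comment; confirm them against the FRED graph pages before trusting the output.

```python
# Sketch only: fetch Indeed job-postings indexes from FRED and print the latest values.
# The series IDs below are assumptions; check the linked FRED graphs for the exact ones.
import pandas_datareader.data as web

SERIES = {
    "software development": "IHLIDXUSTPSOFTDEVE",  # assumed series ID
    "construction": "IHLIDXUSTPCONST",             # assumed series ID
    "nursing": "IHLIDXUSTPNURSING",                # assumed series ID
}

for name, series_id in SERIES.items():
    df = web.DataReader(series_id, "fred", start="2020-01-01")
    latest = df[series_id].dropna().iloc[-1]
    # Per the parent comment, these series are indexed so pre-pandemic (Feb 2020) = 100.
    print(f"{name}: latest index = {latest:.0f} (pre-pandemic = 100)")
```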

I feel like this is almost going to become lost history because the AI hype is so self-insistent. People a decade from now will think Elon slashed Twitter's employee count by 90% because of some AI initiative, and not because he simply thought he could run a lot leaner. We're on year 3-4 of a lot of other companies wondering the same thing. Maybe AI will play into that eventually. But so far companies have needed no such crutch for reducing headcount.

by idkwhattocallme on 5/30/2025, 2:43 PM

I worked at two different $10B+ market cap companies during ZIRP. I recall in most meetings over half of the knowledge workers attending were superfluous. I mean, we hired someone on my team to attend cross-functional meetings because our calendars were literally too full for us to attend them ourselves. Why could we do that? Because the company was growing and hiring someone to attend meetings wasn't going to hurt the skyrocketing stock. Plus hiring someone gave my VP more headcount and therefore more clout. The market only valued company growth, not efficiency. But the market always capitulates to value (over time). When that happens all those overlay hires will get axed. Both companies have since laid off 10K+. AI was the scapegoat. But really, a lot of the knowledge worker jobs it "replaces" weren't providing real value anyway.

by tdeck on 5/31/2025, 2:17 AM

Maybe someone can help me wrap my head around this in a different way, because here's how I see it.

If these tools are really making people so productive, shouldn't it be painfully obvious in companies' output? For example, if these AI coding tools were an amazing productivity boost in the end, we'd expect to see software companies shipping features and fixes faster than ever before. There would be a huge burst in innovative products and improvements to existing products. And we'd expect that to be in a way that would be obvious to customers and users, not just in the form of some blog post or earnings call.

For cost center work, this would lead to layoffs right away, sure. But companies that make and sell software should be capitalizing on this, and only laying people off when they get to the point of "we just don't know what to do with all this extra productivity, we're all out of ideas!". I haven't seen one single company in this situation. So that makes me think that these decisions are hype-driven short term thinking.

by sevensor on 5/30/2025, 2:40 PM

What AI is going to wipe out is white collar jobs where people sleepwalk through the working day and carelessly half ass every task. In 2025, we can get LLMs to do that for us. Unfortunately, the kind of executive who thinks AI is a legitimate replacement for actual work does not recognize the difference. I expect to see the more credulous CEOs dynamiting their companies as a result. Whether the rest of us can survive this remains to be seen. The CEOs will be fine, of course.

by CKMo on 5/30/2025, 8:56 PM

There's definitely a big problem with entry-level jobs being replaced by AI. Why hire an intern or a recent college-grad when they lack both the expertise and experience to do what an AI could probably do?

Sure, the AI might require handholding and prompting too, but the AI is either cheaper or actually "smarter" than the young person. In many cases, it's both. I work with some people who I believe have the capacity and potential to one day be competent, but the time and resource investment to make that happen is too much. I often find myself choosing to just use an AI for work I would have delegated to them, because I need it fast and I need it now. If I handed it off to them I would not get it fast, and I would need to also go through it with them in several back-and-forth feedback-review loops to get it to a state that's usable.

Given they are human, this would push back delivery times by 2-3 business days. Or... I can prompt and handhold an AI to get it done in 3 hours.

Not that I'm saying AI is a god-send, but new grads and entry-level roles are kind of screwed.

by snowwrestler on 5/31/2025, 1:34 AM

Historically, people have been pretty good at predicting the effects of new technologies on existing jobs. But quite bad at predicting the new jobs / careers / industries that are eventually created with those technologies.

This is why free market economies create more wealth over time than centrally planned economies: the free market allows more people to try seemingly crazy ideas, and is faster to recognize good ideas and reallocate resources toward them.

In the absence of reliable prediction, quick reaction is what wins.

Anyway, even if AI does end up “destroying” tons of existing white collar jobs, that does not necessarily imply mass unemployment. But it’s such a common inference that it has its own pejorative: Luddite.

And the flip side of Luddism is what we see from AI boosters now: invoking a massive impact on current jobs as a shorthand to create the impression of massive capability. It’s a form of marketing, as the CNN piece says.

by michaeldoron on 5/30/2025, 3:07 PM

Every time an analyst cites the current state of AI-based tools as evidence that AI disruption is just hype, I think of the skeptics who dismissed the exponential growth of COVID-19 cases because of their initially low numbers.

Putting that aside, how is this article called an analysis and not an opinion piece? The only analysis done here is asking a labor economist what conditions would allow the claim to hold, and offering an alternative, already circulated theory that AI company CEOs are creating false hype. The author even uses everyday language like "Yeaaahhh. So, this is kind of Anthropic’s whole ~thing.~"

Is this really the level of analysis CNN has to offer on this topic?

They could have sketched the growth in foundation model capabilities vs. finite resources such as data, compute and hardware. They could have written about the current VC market and the need for companies to show results, not promises. They could have even written about the giant biotech industry and its struggle to incorporate exciting new drug discovery tools under slow-moving FDA approvals. None of this was done here.

by darth_avocado on 5/30/2025, 2:37 PM

I don’t understand how any business leader can be excited about humans being replaced by AI. If no one has a job, who’s going to buy your stuff? When the unemployment in the country goes up, consumer spending slows down and recession kicks in. How could you be excited for that?

by CSMastermind on 5/30/2025, 11:55 PM

Huge numbers of white-collar jobs have been automated since the advent of computers. If you look at the work performed by office workers in the 1960s and compare it to what people do today, it'd be almost unrecognizable.

They spent huge amounts of time on things that software either does automatically or makes 1,000x faster. But by and large that actually created more white collar jobs because those capabilities meant more was getting done which meant new tasks needed to be performed.

by qgin on 5/31/2025, 1:46 AM

I often see people say “AI can’t do ALL of my job, so that means my job is safe.”

But what this means at scale, over time, is that if AI can do 80% of your job, AI will do 80% of your job. The remaining 20% human-work part will be consolidated and become the full time job of 20% of the original headcount while the remaining 80% of the people get fired.

AI does not need to do 100% of any job (as that job is defined today) to still result in large scale labor reconfigurations. Jobs will be redefined and generally shrunk down to what still legitimately needs human work to get it done.
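A toy way to see the scale of that claim (the team size and split below are made-up numbers for illustration, not anything from the comment):

```python
# Illustrative only: if AI absorbs 80% of each role, the human-only remainder of a
# 10-person team consolidates into roughly 2 redefined full-time jobs.
team_size = 10
ai_share = 0.80                                   # assumed fraction of each job AI can do

human_fte_remaining = team_size * (1 - ai_share)  # 2.0 FTEs of human-only work
new_headcount = round(human_fte_remaining)
print(f"{team_size} workers -> {new_headcount} redefined roles, {team_size - new_headcount} cut")
```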

As an employee, any efficiency gains you get from AI belong to the company, not you.

by deadbabe on 5/30/2025, 9:51 PM

Something I’ve come to realize in the software industry is: if you have more smart engineers than the competition, you win.

If you don’t snatch up the smartest engineers before your competition does: you lose.

Therefore at a certain level of company, hiring is entirely dictated by what the competition is doing. If everyone is suddenly hiring, you'd better start doing it too. If no one is, you can relax; you could also pull ahead by deciding to hire rapidly, but this will tip off competitors and they too will begin hiring.

Whether or not you have any use for those engineers is irrelevant. So AI will have little impact on hiring trends in this market. The downturn we’ve seen in the past few years is mostly driven by the interest rate environment, not because AI is suddenly replacing engineers. An engineer using AI gives more advantage than removing an engineer, and hiring an engineer who will use AI is more advantageous than not hiring one at all.

AI is just the new excuse for firing or not hiring people, previously it was RTO but that hype cycle has been squeezed for all it can be.

by spcebar on 5/30/2025, 3:06 PM

Something is nagging me about the AI-human replacement conversation that I would love insight from people who know more about startup money than me. It seems like the AI revolution hit as interest rates went insane, and at the same time the AI that could write code was becoming available, the free VC money dried up, or at least changed. I feel like that's not usually a part of the conversation and I'm wondering if we would be having the same conversation if money for startups was thrown around (and more jobs were being created for SWEs) the way it was when interest rates were zero. I know next to nothing about this and would love to hear informed opinions.

by bachmeier on 5/30/2025, 3:14 PM

> AI is starting to get better than humans at almost all intellectual tasks

"Starting" is doing a hell of lot of work in that sentence. I'm starting to become a billionaire and Nobel Prize winner.

Anyway, I agree with Mark Cuban's statement in the article. The most likely scenario is that we become more productive as AI complements humans. Yesterday I made this comment on another HN story:

"Copilot told me it's there to do the "tedious and repetitive" parts so I can focus my energy on the "interesting" parts. That's great. They do the things every programmer hates having to do. I'm more productive in the best possible way.

But ask it to do too much and it'll return error-ridden garbage filled with hallucinations, or just never finish the task. The economic case for further gains has diminished greatly while the cost of those gains rises."

by monero-xmr on 5/30/2025, 2:35 PM

https://en.m.wikipedia.org/wiki/List_of_predictions_for_auto...

It wasn’t just Elon. The hype train on self-driving cars was extreme only a few years ago, pre-LLM. Self-driving cars sort of exist, in a few cities. Quibble all you want, but it appears to me that “uber driver” is still a popular, widespread job, let alone truck driver, bus driver, and “car owner” itself.

I really wish the AI CEOs would actually make AI useful in my life. For example, why am I still doing the dishes, laundry, cleaning my house, paying for landscaping, painters, and on and on? In terms of white collar work I’m paying my fucking lawyers more than ever. Why don’t they solve an actual problem?

by golol on 5/30/2025, 2:40 PM

> To be clear, Amodei didn’t cite any research or evidence for that 50% estimate.

I truly believe these kinds of claims don't deserve to be valued so much.

by hansmayer on 5/31/2025, 11:39 AM

It's so good to see the non-expert types finally starting to see the whole hype for what it really is -> the long tail of the last 20 years of incremental ML development, and not some revolutionary tech. We did not need this much hype around Web 1.0, which was immediately adopted because it was obviously, well, revolutionary.

by fny on 5/30/2025, 2:52 PM

I think everyone is missing the bigger picture.

This is not a matter of whether AI will replace humans wholesale. The more predominant effects are:

1. You’ll need fewer humans to do the same task. In other forms of automation, this has led to a decrease in employment.

2. The supply of capable humans increases dramatically.

3. Expertise is no longer a perfect moat.

I’ve seen 2. My sister nearly flunked a coding class in college, but now she’s writing small apps for her IT company.

And for all of you who poo-poo that as unsustainable: I became proficient in Rust in a week, and I picked up Svelte in a day. I’ve written a few shaders too! The code I’ve written is pristine. All those conversations about “should I learn X to be employed” are totally moot. Yes, APL would be harder, but it’s definitely doable. This is an example of 3.

Overall, this will surely cause wage growth to slow and maybe decrease. In turn, job opportunities will dry up and unemployment might ensue.

For those who still don’t believe, air traffic controllers are a great thought experiment—they’re paid quite nicely. What happens if you build tools so that you can train and employ 30% of the population instead of just 10%?

by chris_armstrong on 5/30/2025, 11:01 PM

The wildest claims are those of increased labor productivity and economic growth: if they were true, our energy consumption would be increasing wildly beyond our current capacity to add more (dwarfing the increase from AI itself).

Productivity doesn’t increase on its own; economists struggle to separate it from improved processes or more efficient machinery (the “multi factor productivity fudge”). Increased efficiency in production means both more efficient energy use AND being able to use a lot more of it for the same input of labour.

by elktown on 5/30/2025, 6:11 PM

Tech has a big problem with selective critical thinking: a perpetual gold rush causes people to adopt a stockbroker mentality of not missing out on the next big thing, be it the next subfield like AI or the next cool tech you can be an early adopter of. But yeah, nothing new under the sun; it's corruption.

by infinitebit on 5/31/2025, 6:00 AM

So glad to see an MSM outlet take the words of an AI CEO with even a single grain of salt. I’ve been really disappointed with the way so many publications have just been breathlessly repeating what is essentially a sales pitch.

(ftr i’m not even taking a side re: is AI going to take all the jobs. regardless of what happens the fact remains that the reporting has been absolute sh*t on this. i guess “the singularity is here” gets more clicks than “sales person makes sales pitch”)

by Lu2025 on 5/31/2025, 7:08 PM

I don't think that the white-collar layoffs of the last 3-4 years are due to AI. The tech layoffs of 2022 are explained in part by the impact of the 2017 tax reform. Before 2022, research and development expenses could be deducted from taxable income in full in the same year they were incurred. Tech companies classify a lot of their work as R&D. So those overpaid Facebook coders are essentially public charges! Somebody upthread said that programmers are disproportionately highly compensated in the US. They are, because it's not only the companies who pay for them, it's taxpayers, indirectly. Starting in 2022, the deal became a bit less sweet: R&D expenses had to be amortized over 5 years. What happened next was, in effect, collusion in response to the Great Resignation: several large tech companies had layoffs at the same time as a salary-compression move. The AI statements are mostly a scare tactic to put pressure on employees. For some industries and applications AI is revolutionary, but for coding it's good at autocomplete and not much else.
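A rough sketch of the cash-flow swing that amortization causes, assuming a 21% corporate rate and a 10% first-year deduction under the five-year schedule (treat both numbers as assumptions about the rule, not facts from the comment):

```python
# Illustrative only: year-one tax effect of moving $10M of R&D (mostly salaries) from
# full expensing to five-year amortization.
rd_spend = 10_000_000
tax_rate = 0.21                      # assumed corporate rate

old_deduction = rd_spend             # pre-2022: deduct everything in year one
new_deduction = rd_spend * 0.10      # assumed first-year share under 5-year amortization

extra_taxable_income = old_deduction - new_deduction
print(f"Extra year-one tax bill: ${extra_taxable_income * tax_rate:,.0f}")  # $1,890,000
```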

by keybored on 5/30/2025, 2:39 PM

> If the CEO of a soda company declared that soda-making technology is getting so good it’s going to ruin the global economy, you’d be forgiven for thinking that person is either lying or fully detached from reality.

Exactly. These people are growth-seekers first, domain experts second.

Yet I saw progressive[1] outlets reacting to this as neutral reporting. So it apparently takes a “legacy media” outlet to wake people out of their AI stupor.

[1] American news outlets that lean social-democratic

by K0balt on 5/31/2025, 12:28 PM

The “bloodbath” will be slow but is quite likely to be significant.

AI / GP robotic labor will not penetrate the market so much in existing companies, which will have huge inertial buffers, but more in new companies that arise in specific segments where the technology proves most useful.

The layoffs will come not as companies replace workers with AI, but as AI companies displace non-AI companies in the market, followed by panicked restructuring and layoffs in those companies as they try to react, probably mostly unsuccessfully.

Existing companies don’t have the luxury of buying market share with investor money, they have to make a profit. A tech darling AI startup powered by unicorn farts and inference can burn through billions of SoftBank money buying market share.

by econ on 5/31/2025, 4:01 AM

There used to be a cookie factory here that had up to 12 people sitting there all day doing nothing. If the machines broke down it really took all of them to clean up. This pattern will be rediscovered.

by jona777than on 5/31/2025, 1:22 PM

There will likely be more jobs because of AI. With more “knowledge” comes more responsibility. Spam folders only exist because of automated emails. That classification process is more work. We may find there are more needs to meet as AI advances, not fewer.

The fallacy is in the statement “AI will replace jobs.” This shirks responsibility, which immediately diminishes credibility. If jobs are replaced or removed, that’s a choice we as humans have made, for better or worse.

by josefritzishere on 5/30/2025, 3:05 PM

I don't think we've seen a technology more over-hyped in the history of industrialized society. Cars, which did fully replace horses, were not even hyped this hard.

by joshdavham on 5/31/2025, 12:24 AM

This type of hype is pretty perplexing to me.

Supposing that you are trying to increase AI adoption among white-collar workers, why try to scare the shit out of them in the process? Or is he instead trying to sell to the C-suite?

by AnimalMuppet on 5/30/2025, 3:01 PM

At least temporarily, it can be somewhat self-fulfilling, though. Companies believe it, think they'd better shed white-collar jobs to stay competitive. If enough companies believe that, white-collar jobs go down, even if AI is useless.

Of course, in the medium term, those companies may find out that they needed those people, and have to hire, and then have to re-train the new people, and suffer all the disruption that causes, and the companies that didn't do that will be ahead of the game. (Or, they find out that they really didn't need all those people, even if AI is useless, and the companies that didn't get rid of them are stuck with a higher expense structure. We'll see.)

by veunes on 5/31/2025, 12:53 PM

If the product can't speak for itself, scare people into believing it will soon

by HenryBemis on 5/31/2025, 8:50 PM

> To be clear, Amodei didn’t cite any research or evidence for that 50%

This reminds me of the "Walter White" meme: "I am the documentation." When the CEO of a company that makes LLMs says something like that, "I perk up and listen" (to quote the article).

When a doctor says "water in my village is bad quality, it gives diarrhea to 30% of the villagers", I don't need a fancy study from some university. The doctor "is the documentation". So if Anthropic/ChatGPT/LLaMA/etc. (mixing companies and products, but it's OK) say "so-and-so", it's because they see the integrations, enhancements, compliments, companies ordering _more_ subscriptions, etc.

In my current company (high volume, low profit margin) they told us "go all in on AI". They see that (e.g. with Notion-like tools) if you enable the "AI", that thing can save _a lot_ of time on "Confluence-like" tasks. So, paying $20-$30-$40 per person, per month, and that thing improving the productivity/output of an FTE by 20%-30% is a massive win.
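The arithmetic behind that "massive win" is easy to sketch; the fully loaded FTE cost below is my own assumed figure, not something from the comment:

```python
# Illustrative only: compare an AI seat license to the claimed productivity gain.
fte_monthly_cost = 6_000        # assumption: fully loaded monthly cost of a knowledge worker
seat_price = 30                 # middle of the $20-$40 per person, per month range above
productivity_gain = 0.20        # low end of the claimed 20%-30% improvement

value_of_gain = fte_monthly_cost * productivity_gain
print(f"Seat: ${seat_price}/mo vs. value of gain: ${value_of_gain:,.0f}/mo "
      f"({value_of_gain / seat_price:.0f}x the seat price)")
```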

So yes, we keep the ones we've got (because mass firings bring the ministry of 'labour', unions, bad marketing, etc. into play). Headcount will be reduced organically (retirements, people leaving for new jobs, etc.), combined with minimizing new hires, and boom! Savings!!

by WaltPurvis on 5/30/2025, 10:44 PM

I plugged those two quotes from Amodei into ChatGPT along with this prompt: "Pretend you are highly skeptical about the potential of AI, both in general and in its potential for replacing human workers the way Amodei predicts. Write a quick 800-word takedown of his predictions."

I won't paste in the result here, since everyone here is capable of running this experiment themselves, but trust me when I say ChatGPT produced (in mere seconds, of course) an article every bit as substantive and well-written as the cited article. FWIW.
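For anyone who wants to rerun the experiment via the API instead of the chat UI, here is a minimal sketch; the model name is a placeholder and the Amodei quotes would have to be pasted in from the article.

```python
# Sketch only: reproduce the "write a skeptical takedown" experiment programmatically.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

amodei_quotes = "<paste the two Amodei quotes from the article here>"
prompt = (
    "Pretend you are highly skeptical about the potential of AI, both in general and in "
    "its potential for replacing human workers the way Amodei predicts. "
    "Write a quick 800-word takedown of his predictions.\n\n" + amodei_quotes
)

resp = client.chat.completions.create(
    model="gpt-4o",  # placeholder model name
    messages=[{"role": "user", "content": prompt}],
)
print(resp.choices[0].message.content)
```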

by Animats on 5/30/2025, 9:03 PM

The real bloodbath will come when coordination between multiple AIs, in a company sense, starts working. Computers have much better I/O than humans. Once a corporate organization can be automated, it will be too fast for humans to participate. There will be no place for slow people.

"Move fast and break things" - Zuckerberg

"A good plan violently executed now is better than a perfect plan executed next week." - George S. Patton

by 1vuio0pswjnm7 on 5/30/2025, 5:01 PM

"If the CEO of a soda company declared that soda-making technology is getting so good it's going to ruin the global economy, you'd be forgiven for thinking that person is either lying or fully detached from reality.

Yet when tech CEOs do the same thing, people tend to perk up."

Silicon Valley and Redmond make desperate attempts to argue for their own continued relevance.

For Silicon Valley VC, software running on computers cannot be just a tool. It has to cause "disruption". It has to be "eating the world". It has to be a source of "intelligence" that can replace people.

If software and computers are just boring appliances, like yesterday's typewriters, calculators, radios, TVs, etc., then Silicon Valley VC may need to find a new line of work. Expect the endless media hype to continue.

No doubt soda technology is very interesting. But people working at soda companies are not as self-absorbed, detached from reality and overfunded as people working for so-called "tech" companies.

by bayareapsycho on 6/1/2025, 5:51 AM

My last company (F50, ass engineering culture, pretends to be a tech company) went and fired all of the juniors at a certain level because "AI"

The funny part is, most of those juniors were hired in 2022-2024, and they were better hires because of the harsher market. There were a bunch of "senior engineers" who were borderline useless and joined some time between 2018-2021

I just think it's kind of funny to fire the useful people and keep the more expensive ones around who try to do more "managerial" work and have more family obligations. Smart companies do the opposite

by cadamsdotcom on 5/30/2025, 6:09 PM

CEOs’ jobs involve hyping their companies. It’s up to us whether we believe.

I’d love to see a journalist use Claude to debunk Dario: “but don’t believe me, I’m just a journalist - we asked Dario’s own product if he’s lying through his teeth, and here’s what it said:”

by ghm2180 on 5/31/2025, 5:04 AM

I wonder whether the investors in the early printing press, the steam engine, or the Excel spreadsheet imagined the ways their tech would end up being used: soul-crushing homework (books), rapid and cruel colonization (steam engines and trains), innovative project management (Excel).

The demand for these products probably was not where it was expected to be at the time. Perhaps the answer to AI's biggest effect lies in how it will free up human potential and time.

If AI can do that — and that is a big if — then how and what would you do with that time? Well ofc, more activity, different ways to spend time, implying new kinds of jobs.

by trhway on 5/30/2025, 10:54 PM

Read up on PLTR in recent days: all these government layoffs (including by DOGE, which is well connected to PLTR), with the money redirected toward the Grand Unification Project using the PLTR Foundry (with AI) platform.

by phendrenad2 on 5/30/2025, 3:22 PM

These are the moments that make millionaires. A majority of people believe that AI is going to thoroughly disrupt society. They've been primed to worry about an "AI apocalypse" by Hollywood for their entire lives. The prevailing counter-narrative is that AI is going to flop. HARD. You can't get more diametrically opposed than that. If you can correctly guess (or logically determine) which is correct, and bet all of your money on it, you can launch yourself into a whole other echelon of life.

I've been a heavy user of AI ever since ChatGPT was released for free. I've been tracking its progress relative to the work done by humans at large. I've concluded that its improvements over the last few years are not across-the-board changes, but benefit specific areas more than others. And unfortunately for AI hype believers, it happens to be areas such as art, which provide a big flashy "look at this!" demonstration of AI's power to people. But... try letting AI come up with a nuanced character for a novel, or design an amplifier circuit, or pick stocks, or do your taxes.

I'm a bit worried about YCombinator. I like Hacker News. I'm a bit worried that YC has so much riding on AI startups. After machine learning, crypto, the post-COVID-19 healthcare bubble, fintech, NFTs, can they take another blow when the music stops?

by johnwheeler on 5/30/2025, 2:40 PM

I previously worked at a company called Recharge Payments, directly supporting the CTO, Mike—a genuinely great person, and someone learning to program. Mike would assign me small tasks, essentially making me his personal AI assistant. Now, I approach everything I do from his perspective. It’s clear that over time, he’ll increasingly rely on AI, asking employees less frequently. Eventually, it’ll become so efficient to turn to AI that he’ll rarely need to ask employees anything at all.

by ck2 on 5/30/2025, 9:16 PM

LLM is going to be used for oppression by every government, not just dictatorships but USA of course

Think of it as an IQ test of how new technology is used

Let me give you an easier example of such a test

Let's say they suddenly develop nearly-free unlimited power, ie. fusion next year

Do you think the world will become more peaceful or much more war?

If you think peaceful, you fail, of course more war, it's all about oppression

It's always about the few controlling the many

The "freedom" you think you feel on a daily basis is an illusion quickly faded

by ArtTimeInvestor on 5/30/2025, 3:17 PM

Imagine you had a crystal ball that lets you look 10 years into the future, and you asked it about whether we underestimate or overestimate how many jobs AI will replace in the future.

It flickers for a moment, then it either says

"In 2025, mankind vastly underestimated the amount of jobs AI can do in 2035"

or

"In 2025, mankind vastly overestimated the amount of jobs AI can do in 2035"

How would you use that information to invest in the stock market?

by topherPedersen on 5/31/2025, 4:48 AM

I could be wrong, but I think us software developers are going to become even more powerful, in demand, and valuable.

by globalnode on 5/31/2025, 12:59 AM

i really liked this article, it puts into perspective how great claims require great proof, and so far all we've heard are great claims. i love ml tech but i just dont trust it to replace a human completely. sure it can augment roles but thats not the vision we're being sold.

by randomname4325 on 5/31/2025, 3:47 AM

The only way to know for sure you're safe from replacement is if your job is a necessary part of something generating revenue and you're not easily replaceable. Otherwise you should assume the company won't hesitate to replace you. It's just business.

by ggm on 5/30/2025, 10:01 PM

Without well paid middle classes, who is buying all the fancy goods and services?

Money is just rationing. If you devalue the economy, implicitly you accept that, and the consequences for society at large.

Lenin's dictum comes to mind: a capitalist will sell you the rope you hang him with.

by dottjt on 6/1/2025, 2:52 AM

I think a huge tradeoff that people haven't mentioned is that in using AI to replace workers, you're introducing a dependency on AI that you previously didn't have. This poses a terrible long-term risk for companies.

by indigoabstract on 6/1/2025, 11:27 AM

So if I understand correctly, it's basically down between:

1. cure cancer

2. fix the economy

3. keep everybody happily employed.

And he's saying we can only pick two, or pick one. Except for the last one, that's not really an option.

by leeroihe on 5/30/2025, 8:37 PM

I used to be a big proponent of AI tools and LLMs, even built products around them. But to be honest, with all of the big AI CEOs promising that they're going to "replace all white collar jobs", I can't see that they want what's best for the country or the American people. It's legitimately despicable and ghoulish that they just expect everyone to "adapt" to the downstream effects of their knowledge-machine lock-in.

by throwaway48476 on 5/31/2025, 10:57 PM

The white collar bloodbath is the jobs that could have been automated pre-AI but weren't, due to organizational inertia, corporate fiefdom-building and an unwillingness to invest.

by DrillShopper on 5/30/2025, 2:34 PM

I look forward to the day where executive overpromises and engineering underdeliveries bring about another AI winter so the useful techniques can continue without the stench of the "AI" association and so the grifters go bankrupt.

by rjurney on 5/30/2025, 3:29 PM

Workers in denial are like lemmings, headed for the cliff... not putting myself above that. A moderate view indicates great disruption before new jobs replace the current round being lost.

by smeeger on 5/30/2025, 11:26 PM

if being redundant would lead to mass layoffs then half of white collar workers would have been laid off decades ago. and white collar people will fiddle with rules and regulations to make their ever more bloated redundancy even more brazen with the addition of AI… and then later when AI has the ability to replace blue collar workers it will do so immediately and swiftly while the white collar people get all the money. its happened a thousand times before and will happen again.

by stephc_int13 on 5/30/2025, 11:30 PM

The main culprit behind the hype of the AI revolution is a lack of understanding of its true nature and capabilities. We should know better: Eliza demonstrated decades ago how easily we can be fooled by language. This is different and more useful, but we rely so much on language fluency and knowledge retrieval as a proxy for intelligence that we are fooled again.

I am not saying this is a nothing burger, the tech can be applied to many domains and improve productivity, but it does not think, not even a little, and scaling won’t make that magically happen.

Anyone paying attention should understand this fact by now.

There is no intelligence explosion in sight, what we’ll see during the next few years is a gradual and limited increase in automation, not a paradigm change, but the continuation of a process that started with the industrial revolution.

by givemeethekeys on 5/31/2025, 4:14 PM

There is an AI bloodbath, and it is adding to the supply of labor in all the low-hanging fields that aren’t yet being decimated by AI.

by bawana on 5/30/2025, 8:44 PM

When are we going to get AI CEOs as a service?

by givemeethekeys on 5/31/2025, 4:11 PM

Many people are unable to find jobs because they are too old.

Even older people prefer to hire younger people.

by notyouraibot on 5/31/2025, 7:58 AM

The hype around AI replacing software engineers is truly delusional. Yes, they are very good at solving known problems, writing for loops and boilerplate code, but introduce a little bit of complexity and creativity and it all fails. There have been countless tasks I have given to AI where it simply concluded they weren't possible and suggested I use several external libraries to get them done; after a little bit of manual digging, I was able to achieve the same tasks without any libraries, and I'm not even a seasoned engineer.

by osigurdson on 5/31/2025, 1:16 AM

The real value is going to be in areas that neither machines nor humans could do previously.

by nova22033 on 5/31/2025, 8:40 PM

Does anyone have any experience with using AI tools on a massive legacy code base?

by whynotminot on 5/30/2025, 2:40 PM

There’s a hype machine for sure.

But the last few paragraphs of the piece kind of give away the game — the author is an AI skeptic judging only the current products rather than taking in the scope of how far they’ve come in such a short time frame. I don’t have much use for this short sighted analysis. It’s just not very intelligent and shows a stubborn lack of imagination.

It reminds me of that quote “it is difficult to get a man to understand something, when his salary depends on his not understanding it.”

People like this have banked their futures on AI not working out.

by franczesko on 5/31/2025, 6:06 PM

AI bubble burst will come first.

by infinitebit on 5/31/2025, 5:46 AM

I am SO thankful to see a news outlet take what tech CEOs say with a grain of salt re: AI. I feel like so many have just been breathlessly repeating anything they say without even an acknowledgement that there might be, you know, some incentive for them to stretch the truth.

(ftr i’m not even taking a side re: will AI take all the jobs. even if they do, the reporting on this subject by MSM has been abysmal)

by gcanyon on 5/30/2025, 11:50 PM

...everyone here saying "someday AI will <fill in the blank> but not today" while failing to acknowledge that for a lot of things "someday" is 2026, and for an even larger number of things it's 2027, and we can't even predict whether or not in 2028 AI will handle nearly all things...

by atleastoptimal on 5/31/2025, 9:23 AM

losing jobs is the most predictable hazard of AI, but far from the biggest

however there seems to be a big disconnect on this site and others

If you believe AGI is possible and that AI can be smarter than humans in all tasks, naturally you can imagine many outcomes far more substantial than job loss.

However many people don’t believe AGI is possible, thus will never consider those possibilities

I fear many will deny the probability that AGI could be achieved in the near future, thus leaving themselves and others unprepared for the consequences. There are so many potential bad outcomes that could be avoided merely if more smart people realized the possibility of AGI and ASI, and would thus rationally devote their cognitive abilities to ensuring that the potential emergence of smarter than human intelligences goes well.

by brokegrammer on 5/31/2025, 6:12 AM

We don't need AI to wipe out entry-level office jobs. David Graeber wrote about this in Bullshit Jobs. But now that we have AI, it's a good excuse to wipe out those jobs for good, just like Elon did after he acquired Twitter. After that, we can blame AI for the deed.

by Warh00l on 6/1/2025, 2:24 PM

pierceday.metalabel.com/aphone

by paulluuk on 5/30/2025, 2:48 PM

Around the time when bitcoin started to get serious public attention, late 2017, I remember feeling super hyped about it, and yet everyone told me that money spent on bitcoin was wasted money. I really believed that bitcoin, or at least cryptocurrency as a whole, would fundamentally change how banking and currencies would work. Now, almost 10 years later, I would say that it did not live up to my belief that it would "fundamentally" change currencies and banking. It made some minor changes, sure, but if it weren't for the value of bitcoin, it would still be a nerdy topic about as well known as Perlin noise. I did make quite a lot of money from it, though I sold out way too soon.

As a research engineer in the field of AI, I am again getting this feeling. People keep doubting that AI will have any kind of impact, and I'm absolutely certain that it will. A few years ago people said "AI art is terrible" and "LLMs are just autocomplete" or the famous "AI is just if-else". By now it should be pretty obvious to everyone in the tech community that AI, and LLMs in particular, are extremely useful and already have a huge impact on tech.

Is it going to fulfill all the promises made by billionaire tech CEOs? No, of course not, at least not on the time scale that they're projecting. But they are incredibly useful tools that can enhance the efficiency of almost any job that involves sitting behind a computer. Even just something like Copilot autocomplete, or talking with an LLM about a refactor you're planning, is often incredibly useful. And the amount of "intelligence" that you can get from a model that can actually run on your laptop is also getting much better very quickly.

The way I see it, either the AI hype will end up like cryptocurrency: forever a part of our world, never quite living up to its promises, but I made a lot of money in the meantime. Or the AI hype will live up to its promises, but likely over a much longer period of time, and we'll have to test whether we can live with that. Personally I'm all for a fully automated luxury communism model for government, but I don't see that happening in the "better dead than red" US. It might become reality in Europe though, who knows.

by theawakened on 5/31/2025, 10:04 AM

I've said this before and I'll say it again: The idea that 'AI' will EVER take over any programmer's job is ridiculous. These idiots think they are going to create AGI; it's never going to happen, not with this race of people. There is far too much ignorance in humanity. AI will never be able to be any better than its source, humanity. It's a soon-to-be realization for these billionaire talking heads. Nothing can rise higher than its source. Even if they cover every square foot of land with data centers, it'll never work like they expect it to. The AI bubble will burst so hard the entire world will quake. I give it 5 years max.

by jatora on 5/31/2025, 8:56 AM

While I agree that the current 'bloodbath' narrative is all hype, I'm honestly confused by a lot of the sentiment I see on here towards AI, namely the dismissal of continual improvement and the rampant whistling-past-the-graveyard attitude toward what is coming.

It is confusing because many of the dismissals come from programmers, who are unequivocally the prime beneficiaries of genAI capability as it stands.

I work as a marketing engineer at a ~1B company and the gains I have been able to provide as an individual are absolutely multiplied by genAI.

One theory I have is that maybe it is a failing of prompt ability that is causing the doubt. Prompting, fundamentally, is querying vector space for a result - and there is a skill to it. There is a gross lack of tooling to assist in this, which I attribute to a lack of awareness of this fact. The vast majority of genAI users don't have any sort of prompt library or methodology to speak of beyond a set of usual habits that work well for them.

Regardless, the common notion that AI has only marginally improved since GPT-4 is criminally naive. The notion that we have hit a wall has merit, of course, but you cannot ignore the fact that we just got accurate 1M context in a SOTA model with Gemini 2.5 Pro. For free. Mere months ago. This is a leap. If you have not experienced that as a leap then you are using LLMs incorrectly.

You cannot sleep on context. Context (and proper utilization of it) is literally what shores up 90% of the deficiencies I see complained about.

AI forgets libraries and syntax? Load in the current syntax. Deep research it. AI keeps making mistakes? Inform it of those mistakes and keep those stored in your project for use in every prompt.
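A minimal sketch of the workflow described above: keep a running "known mistakes" file and current library notes in the repo, and prepend both to every prompt. The client library, model name, and file paths are my own assumptions for illustration, not anything the commenter specified.

```python
# Sketch only: assemble a prompt from stored project context before calling the model.
from pathlib import Path
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set

def build_prompt(task: str) -> str:
    known_mistakes = Path("prompts/known_mistakes.md").read_text()   # hypothetical file
    current_syntax = Path("prompts/library_notes.md").read_text()    # pasted-in current docs
    return (
        "Project conventions and past mistakes to avoid:\n"
        f"{known_mistakes}\n\n"
        "Current library syntax and version notes:\n"
        f"{current_syntax}\n\n"
        f"Task:\n{task}"
    )

resp = client.chat.completions.create(
    model="gpt-4o",  # placeholder model name
    messages=[{"role": "user", "content": build_prompt("Refactor the export job to stream results.")}],
)
print(resp.choices[0].message.content)
```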

I consistently make 200k+ token queries of code and context and receive highly accurate results.

I build 10-20k loc tools in hours for fun. Are they production ready? No. Do they accomplish highly complex tasks for niche use cases? Yes.

The empowerment of the single developer who is good at manipulating AI AND an experienced dev/engineer is absolutely incredible.

Deep research alone has netted my company tens of millions in pipeline, and I just pretend it's me. Because that's the other part that maybe many aren't realizing - it's right under your nose - constantly.

The efficiency gains in marketing are hilariously large. There are countless ways to avoid 'AI slop', and it involves, again, leveraging context and good research, and a good eye to steer things.

I post this mostly because I'm sad for all of the developers who have not experienced this. I see it as a failure of effort (based on some variant of emotional bias or arrogance), not a lack of skill or intellect. The writing on the wall is so crystal clear.

by rule2025 on 5/31/2025, 7:44 AM

The real "white-collar massacre" is not caused by AI, but you have no irreplaceable, or the value created by hiring you is not higher than using AI. Businesses will not hesitate to use AI, you can't say that companies are ruthless, but that's the pursuit of efficiency. Just as horse-drawn carriages were replaced by cars and coachmen lost their jobs, you can't say it's a problem with cars.

History is always strikingly similar, the AI revolution is the fifth industrial revolution, and it is wise to embrace AI and collaborate with AI as soon as possible.