Anthropic's CEO Dario has annoyed me to no end with his "AI will take all the jobs in 6 months" doomer speeches on every podcast he graces with his presence.
Focusing on Dario, his exact quote IIRC was "50% of all white collar jobs in 5 years" which is still a ways off, but to check his track record, his prediction on coding was only off by a month or so. If you revisit what he actually said, he didn't really say AI will replace 90% of all coders, as people widely report, he said it will be able to write 90% of all code.
And these days it's pretty accurate. 90% of all code, the "dark matter" of coding, is stuff like boilerplate and internal LoB CRUD apps and typical data-wrangling algorithms that Claude and Codex can one-shot all day long.
Actually replacing all those jobs, however, will take time. Not just to figure out adoption (e.g. AI coding workflows are very different from normal coding workflows, and we're just figuring those out now), but to get the requisite compute. All AI capacity is already heavily constrained, and replacing that many jobs will require compute that won't exist for years. And he, as someone scrounging for compute capacity, knows that very well.
But that just puts an upper limit on how long we have to figure out what to do with all those white collar professionals. We need to be thinking about it now.
He's not right though. He's trying to scare the market into his pocket. It's well established that AI just turns devs into AI babysitters that are 10% more productive and produce 200% the bugs, and in the long-term don't understand what they built.
> It's well established that AI just turns devs into AI babysitters that are 10% more productive and produce 200% the bugs, and in the long-term don't understand what they built.
It's not well established at all. In fact, there is increasing evidence to the contrary if you look outside the HN echo chamber.
The nuanced take is that AI in coding is an amplifier of your engineering culture: teams with strong software discipline (code reviews, tests, docs, CI/CD, etc.) enjoy more velocity and fewer outages; teams with weak discipline suffer more outages. There are at least two large-scale industry reports showing this trend -- DORA 2025 and the latest DX report -- not to mention the endless anecdotes on this very forum.
> He's trying to scare the market into his pocket.
People say this, but I don't get it. Is portraying yourself as a destroyer of the economy considered good marketing? Maybe there was a case to be made for convincing the government to impose regulations on the industry, but as we're seeing and they're experiencing firsthand, the problem is the government.
If these tools were so great they wouldn't be struggling so hard to sell them. Great sign that the company has to mandate a "productivity" tool that the workers hate.
Hence why all these LLM companies love government contracts: they can't sell to consumers, so they'll just steal from taxpayers instead.
Ah yes, the mythical "valuations" based on unicorn dust and pixie horns. (Note that they don't define what a month actually is; my hunch is they take their best week and multiply it by 52.)
Valuations?! The "R" in "ARR" stands for "Revenue." Valuations are something entirely different and much higher.
And to anyone suggesting that these and other AI companies are lying about revenues or fudging the numbers: the figures are corroborated from THREE other angles -- the investors in these startups, the payment processor for these startups, and the people allocating budgets for the products from these startups! This thread (and its parents with links) is relevant: https://news.ycombinator.com/item?id=46773252
> Focusing on Dario, his exact quote IIRC was "50% of all white collar jobs in 5 years" which is still a ways off, but to check his track record, his prediction on coding was only off by a month or so. If you revisit what he actually said, he didn't really say AI will replace 90% of all coders, as people widely report, he said it will be able to write 90% of all code.
Ugh, people here seem to think that all software is React web apps. There are so many technologies and languages this stuff is not very good at. Web apps are basically low-hanging fruit. Dario hasn't predicted anything, and he does not have anyone's interests other than his own in mind when he makes his doomer statements.
The problem is, the low hanging fruit, the stuff it's good at, is 90% of all software. Maybe more.
And it's getting better at the other 10% too. Two years ago ChatGPT struggled to help me with race conditions in a C++ LD_PRELOAD library. It was a side project so I dropped it. Last week Codex churned away for 10 minutes and gave me a working version with tests.
I think that TypeScript is a language uniquely suited to LLMs though:
- It's garbage collected, so variable lifetimes don't need to be traced
- It's structurally typed, so LLMs can get away with duplicating types as long as the shape fits.
- The type system has an escape hatch (any or unknown)
- It produces nice stack traces
- The industry has more or less settled styling issues (i.e., most TypeScript looks pretty uniform stylistically).
- There is an insane amount of open source code to train on
- Even "compiled" code is somewhat easier to deobfuscate and read (because compiling TS is essentially JS-to-JS)
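The structural-typing point is easy to see concretely. A minimal sketch (the `User`/`Person` names are invented for illustration): a duplicated type with the same shape is interchangeable with the "canonical" one, and `any` opts out of checking entirely.

```typescript
// Structural typing: compatibility is decided by shape, not by name,
// so a model that re-declares a type it didn't know existed still
// produces code that type-checks.
interface User { id: number; name: string }   // the "canonical" type
type Person = { id: number; name: string }    // an LLM's duplicate

function greet(p: Person): string {
  return `hello ${p.name}`;
}

const u: User = { id: 1, name: "Ada" };
// A User is accepted where a Person is expected: the shapes match.
const msg = greet(u);

// The escape hatch: `any` silences the checker when the shape is unknown.
const blob: any = JSON.parse('{"name":"Grace"}');
const other = greet(blob);   // compiles; correctness is on you
```

In a nominally typed language the duplicate declaration would be a hard error, which is exactly the kind of friction an LLM working without full project context runs into.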
Contrast that with C/C++:
- Memory management is important, and tricky
- Segfaults give you hardly anything to work with
- There are like a thousand different coding styles
- Nobody can agree on the proper subset of the language to use (ie, exceptions allowed or not allowed, macros, etc.)
- Security issues are very much magnified (and they're already a huge problem in vibecoded typescript)
- The use cases are a lot more diverse. I.e., if you're using TypeScript you're probably either writing a web page or a server (maybe a command-line app); I'm lumping Electron in here, because it's still a web page and a server. C is used for operating systems, games, large hero apps, anything CPU- or memory-constrained, etc.
I'm not sure I agree that typescript is "90% of all software". I think it's 90% of what people on hacker news use. I think devs in different domains always overestimate the importance of their specific domain and underestimate the importance of other domains.
I wouldn't say TypeScript is 90% of all software exactly, but tons of apps are built on all kinds of technologies like Python/Django, Ruby on Rails, PHP, WordPress, "enterprise" Java and the like, primarily doing CRUD and data plumbing, especially for niche applications and internal LoB sites that we never see on the open Internet.
I agree C++ is harder, and I still occasionally find a missing free(), but Codex did crack my problem... including fixing a segfault! I had a bunch of strategically placed printfs gated behind an environment variable; it found those, added its own, set the environment variable, and examined the outputs to debug the issue.
I cannot emphasize enough how mind-blowing this is, because years back I had spent an hour-plus doing the same thing unsuccessfully before being pulled away.
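That env-var-gated logging trick isn't C-specific, by the way; here's a minimal sketch of the same pattern in TypeScript (the `MYLIB_DEBUG` variable and `dbg` helper are made-up names, not from the actual project):

```typescript
// Debug output that is silent by default but can be switched on
// without a rebuild, by setting an environment variable.
const debugEnabled: boolean = process.env.MYLIB_DEBUG !== undefined;

function dbg(...args: unknown[]): void {
  // Only emit when the (hypothetical) MYLIB_DEBUG variable is set,
  // and write to stderr so normal output stays clean.
  if (debugEnabled) console.error("[debug]", ...args);
}

dbg("entering hot path");            // no-op unless MYLIB_DEBUG is set
console.log("normal work happens here");
```

The nice property for agent-driven debugging is exactly what was described above: the gate is discoverable in the source, and the agent can flip it on for its own runs without touching production behavior.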
> 90% of all code, the "dark matter" of coding, is stuff like boilerplate and internal LoB CRUD apps and typical data-wrangling algorithms that Claude and Codex can one-shot all day long.
If you mean "us" on this forum, I would believe that. I would bet the number of engineers working on stuff "outside the distribution" is overrepresented here.
If you mean "us" as in all software engineers, not at all. The challenge we're facing is exactly that: reskilling the 90% of engineers who have been working on CRUD apps toward the 10% that is outside the distribution.
> 90% of engineers who have been working on CRUD apps
I am a 30-year "veteran" in the industry, and in my opinion this couldn't be further from the truth, but it is often quoted (even before AI). CRUD apps have been a solved problem for quite some time now, and while there are still companies that may allow someone to "coast" doing CRUD stuff, they are hard to find these days. There is almost always more to it than building dumb stuff. I have also seen (more and more each year) these types of jobs being off-shored to teams for pennies on the dollar.
What I have experienced a lot is teams where there are what I call "innovators" and "closers." "Innovators" do the hard work, figure shit out, architect, design... and then once that is done you give it to "closers" to crank things out. With LLMs, the "closers" part could now be "replaced," but in my experience there is always some part, whether it is 5% or 10%, that is difficult to "automate," so to speak.
I agree, I'd say we're talking about the same thing, just in different terms. When I said CRUD apps, it was a crude stand-in for what you call the "closing" work. Over-simplifying, but it's unglamorous, not too complicated, somewhat mechanical, mostly a translation into working code from high-level designs that come down from the "innovators."
But I am concerned precisely because AI is usurping that closing work, which accounts for the bulk of the team. Realistically the innovators will be the only people required. But the innovators are able to do the hard stuff by learning through a lot of hands-on experience and painful lessons, which they typically get by spending a lot of time in the trenches as closers.
And we're only talking about coding here, but this pattern repeats ALL over knowledge work: product, legal, consultancy, finance, accounting, administration...
So now the problem is two-fold: how do we get the closers to upskill to innovators a) without the hands-on experience b) faster than AI can replace them?
I don't understand why some of these AI companies don't check their egos at the door and hire public relations companies. Yes, I understand they are changing the world, but customers do not open their wallets when they are scared. Very few people I know are as avant-garde as I am with AI; most people look at these new technologies and simply feel fear. Why pay for something that will replace you?
It's to drive FOMO for investors. He needs tens of billions of capital and is trying to scare them into not looking at his balance sheet before investing. It's reckless, and is soaking up capital that could have gone towards more legitimate investments.
It certainly is. For people who have not heard the statements, here are some quotes. I bring them up, because I think it's worthwhile to remember the bold predictions that are made now and how they will pan out in the future.
Council on Foreign Relations, 11 months ago: "In 12 months, we may be in a world where AI is essentially writing all of the code."
Axios interview, 8 months ago: "[...] AI could soon eliminate 50% of entry-level office jobs."
The Adolescence of Technology (essay), 1 month ago: "If the exponential continues—which is not certain, but now has a decade-long track record supporting it—then it cannot possibly be more than a few years before AI is better than humans at essentially everything."
To be fair, it's hilarious how much verbiage was spent discussing AI 'getting out of the box', when the first thing everyone did with LLMs was immediately throw away the box and go "Here! Have the internet! Here! Have root access! Want a robot body? I'll get you a robot body."
"Y'know, like, the thing is, like, y'know, here's the thing..."
I totally feel for people with speech pathologies or anxiety that makes it harder for them to communicate verbally, but how is this guy the public face of the company and doing all these interviews by himself? With as much as is at stake, I find it baffling.
What I find so funny about heads of AI companies coming out saying things like this, is their own career pages suggest they don't actually feel that way.
He's annoyed me most with the way he speaks. I'm not sure if it's a tic or what, but the way he'll repeat a word 10x before starting a sentence is painful to listen to.
Yes, the CEOs of these AI companies are clearly not the people who should be selling AI products. They need to be hidden away and kept behind closed doors where they can do their best work. And they need advertising companies, PR firms and better marketing tactics to try and soothe the customers.