The Case Against the AI Job Apocalypse

Plain English with Derek Thompson · Derek Thompson with Alex Emas · May 12, 2026

Most important takeaway

Despite headline-grabbing layoffs and AI-CEO predictions, there is no strong economic evidence of an AI jobs apocalypse — most layoffs are happening at companies with weak stock performance, and a survey of 6,000 senior executives found 70% expect AI to add jobs or have no impact. Economist Alex Emas argues that the deep reasons (lump-of-labor fallacy, Jevons paradox, O-ring task interdependence, and a human desire for scarcity and connection) make a permanent collapse of labor unlikely even as AI grows dramatically more capable.

Summary

Key themes

  • The “AI jobs apocalypse” narrative is overstated. Companies citing AI for layoffs are mostly underperformers using AI as a convenient cover. Software engineering hiring, supposedly the most exposed sector, is recovering rather than collapsing.
  • The lump-of-labor fallacy: assuming today’s jobs are the only jobs that can exist. Since 1820, near-total automation of agriculture and most manufacturing did not destroy employment — it shifted dollars and humans toward newly affordable desires (healthcare, education, pet care, entertainment).
  • Jevons paradox: when something becomes cheaper or more efficient, total demand can explode rather than shrink. Coal demand rose with efficient steam engines; spreadsheet users grew 100x after Excel; software engineering may follow the same path.
  • O-ring jobs: most real jobs are tightly bundled tasks where one weak link wrecks the whole output. AI automating 4 of 5 tasks does not eliminate the job — it can actually raise the value of the remaining human task. Whether this leads to firings depends on the elasticity of consumer demand for the cheaper output.
  • Strong vs. weak bundles: radiologists, doctors, and teachers are “strong bundles” — the relational, judgment, and care-coordination tasks are inseparable from the automatable ones. Truck driving is a “weak bundle” — tasks can be split and automated independently.
  • Human privilege / scarcity: experiments show people pay sharply more for human-made goods and for goods that are scarce. AI abundance will increase the premium on scarce human work and relational services.
  • The dark counterpart: relational demand is rising while in-person human connection is declining. The economy may monetize loneliness via chatbots and paid relational services, but evidence (a 2026 Science study) suggests chatbot companionship increases loneliness rather than resolving it.
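The elasticity condition in the O-ring point above can be made concrete. A minimal sketch (all numbers hypothetical, not from the episode), assuming a constant-elasticity demand curve:

```python
import math

def human_hours_multiplier(price_ratio, human_task_share, elasticity):
    """Change in total human hours after AI automates part of a job.

    price_ratio      -- new price / old price of the output (e.g. 0.4)
    human_task_share -- fraction of each unit's work still done by a human
                        (e.g. 1 of 5 tasks = 0.2)
    elasticity       -- price elasticity of demand, with Q = k * P**(-elasticity)
    """
    quantity_multiplier = price_ratio ** (-elasticity)   # how much more output buyers demand
    return quantity_multiplier * human_task_share        # human hours scale with output x share

# Suppose AI automates 4 of 5 tasks and the output's price falls from $100 to $40.
# Humans now supply 1/5 of the work per unit, so demand must grow 5x to break even:
break_even = math.log(5) / math.log(1 / 0.4)   # elasticity at which human hours are unchanged
```

With near-unit elasticity, human hours fall; above the break-even elasticity (about 1.76 in this made-up example), the cheaper output expands demand enough that human employment rises, which is the Jevons-style outcome the episode describes.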

Actionable insights

  • Discount layoff press releases that name “AI” as the cause — check the company’s stock performance first.
  • For workers: invest in the relational, judgment, and integrative parts of your job (managing care teams, coaching, taste-making, client trust). These are the “O-ring” pieces that protect the bundle.
  • For software engineers: don’t assume your field is sunsetting. Jevons dynamics suggest demand for cheap, abundant software is still expanding.
  • For educators and students: design for nimbleness. Emas redesigns his curriculum nearly every year. Focus on skills that integrate human judgment with AI tooling rather than tasks AI does alone.
  • For businesses: where customers value the human in the loop (Starbucks reversed full automation for this reason), keep it. Scarcity and human authorship can command large premiums.
  • Be epistemically humble: Emas believes AGI/ASI is coming and holds wide confidence bands, but the historical base rate strongly favors continued high employment with changed job content.

Chapter Summaries

  1. Framing the apocalypse case. Derek opens by noting layoffs at Coinbase, Block, and Salesforce and AI-leader predictions of mass disemployment. He argues many of these companies have lost a third of their equity value and are using AI as cover; a 6,000-executive survey shows 70% expect AI to add jobs or have no impact.
  2. The strongest version of the apocalypse argument. Emas traces the worry to Ricardo’s 1820 “On Machinery” chapter and Samuelson’s 1989 “Ricardo Was Right.” The modern version (Philip Tremel) argues AI will produce endless cheap variety, capturing all consumer dollars and driving labor’s share toward zero.
  3. Starbucks and the relational economy. Starbucks over-automated, then reversed. People will pay more for a human in the loop. Emas defines a “relational” task as one where customers prefer (and pay for) human delivery even when output is identical.
  4. Lump of labor fallacy. If you told Ricardo nearly every 1820 job would be automated by 2026, he’d predict 90% unemployment. Instead, employment is near all-time highs. Cheaper agriculture freed dollars to chase new desires, creating whole new industries.
  5. Jevons paradox. Efficient steam engines increased coal demand; cheaper software may grow software engineering employment. Current data shows software hiring recovering from the post-COVID hiring overhang despite the spread of agentic coding tools (a "narrative violation").
  6. O-ring jobs. Named for the Challenger disaster, the model says tasks within a job are deeply interdependent. AI automating most tasks can make the remaining human task more valuable. Whether jobs disappear depends on elasticity of demand for the cheaper product (legal software example).
  7. Spreadsheets vs. horses. Two distinct stories: Excel grew spreadsheet employment 100x (short-run Jevons); agriculture shrank to a small GDP share but dollars and people moved to new sectors like pet care (long-run lump-of-labor). Both arguments support skepticism of an apocalypse.
  8. Strong vs. weak bundles. Radiologists need the automatable reading task to do the relational care-coordination task — strong bundle. Truck driving splits cleanly into independently automatable tasks — weak bundle. The strong-bundle jobs are most resistant to displacement.
  9. Scarcity and human privilege. Experiments (with Crystal Madras and Grayland Mandel) show willingness to pay nearly doubles when a product is artificially scarce, and human-made goods command a premium that doesn’t apply to AI-made goods. Human work is intrinsically scarce.
  10. Future of work and education. Job titles may persist (teacher, doctor, financial planner) but day-to-day work will shift toward the relational core, with AI handling the rest. New relational job categories will emerge that we cannot yet name. Educators must continually redesign curricula.
  11. The loneliness tension. Relational spending is rising even as everyday human connection declines. The economy may monetize loneliness; chatbot companionship appears to worsen loneliness in early studies. Our “stone-age brain” still demands real human connection.
  12. Conclusion. Four pillars against the apocalypse: lump of labor, Jevons paradox, O-rings, human privilege. Emas believes AGI and ASI are coming but argues 200,000 years of upheaval have left employment at all-time highs — that base rate should not be discounted.