AI Pulse

AI Could Let a Few People Control Everything — Permanently (Article by Rose Hadshar)

80,000 Hours Podcast • December 12, 2025 • 1h

Key Takeaways

  • AI could automate most human labor by 2047.
  • Small groups may control vast AI workforces, concentrating power.
  • AI-enabled power grabs could undermine democratic institutions.
  • Technical mitigations like alignment audits offer tractable solutions.
  • Few experts currently address AI-driven power concentration risk.

Pulse Analysis

The article by Rose Hadshar warns that advanced artificial intelligence could amplify existing wealth gaps into a new era of power concentration. While global inequality has modestly declined, AI promises to automate the majority of human tasks, potentially by 2047, stripping billions of workers of economic relevance. In a plausible 2029‑2035 scenario, a single AI firm gains a decisive capability edge, merges with rivals, and, through a government‑backed oversight council, directs millions of AI agents that run the economy, military, and policy decisions. This concentration would leave most citizens without meaningful influence over the future.

Four dynamics make AI‑enabled power concentration especially urgent. First, automation shifts productive capability from large human workforces to compact AI clusters, giving small entities outsized influence. Second, the resulting economic surplus empowers those groups to sway political institutions, eroding checks and balances designed for a labor‑based society. Third, combined wealth and AI control create feedback loops that reinforce dominance, while epistemic interference—AI‑curated information streams—weakens public understanding and coordination. Fourth, the field suffers from a talent shortage; only a few dozen specialists actively study these risks, leaving mitigation strategies under‑developed. Together, these forces could produce an unprecedented concentration of political and economic authority.

The report identifies actionable levers. Technical safeguards such as alignment audits, internal security protocols, and transparent model governance can limit actors' ability to weaponize AI. Policy measures—including antitrust enforcement on AI compute, public‑sector AI labs, and democratic oversight councils—could disperse capability concentration. Broadening research funding toward AI governance and encouraging open‑source frontier models may counterbalance private monopolies. While the problem remains under‑explored, the convergence of economic interests among current power holders and growing public awareness creates a window for coordinated intervention. Stakeholders positioned to influence AI development are urged to prioritize these mitigations before self‑reinforcing dynamics become irreversible.

Episode Description

Power is already concentrated today: over 800 million people live on less than $3 a day, the three richest men in the world are worth over $1 trillion, and almost six billion people live in countries without free and fair elections.

This is a problem in its own right. Still, power remains substantially distributed: global income inequality is falling, over two billion people live in electoral democracies, no single country accounts for more than a quarter of global GDP, and no company earns as much as 1% of it.

But in the future, advanced AI could enable much more extreme power concentration than we’ve seen so far.

Many believe that within the next decade the leading AI projects will be able to run millions of superintelligent AI systems thinking many times faster than humans. These systems could displace human workers, leading to much less economic and political power for the vast majority of people; and unless we take action to prevent it, they may end up being controlled by a tiny number of people, with no effective oversight. Once these systems are deployed across the economy, government, and the military, whatever goals they’re built to have will become the primary force shaping the future. If those goals are chosen by the few, then a small number of people could end up with the power to make all of the important decisions about the future.

This article by Rose Hadshar explores this emerging challenge in detail. You can see all the images and footnotes in the original article on the 80,000 Hours website.

Chapters:

Introduction (00:00)

Summary (02:15)

Section 1: Why might AI-enabled power concentration be a pressing problem? (07:02)

Section 2: What are the top arguments against working on this problem? (45:02)

Section 3: What can you do to help? (56:36)

Narrated by: Dominic Armstrong

Audio engineering: Dominic Armstrong and Milo McGuire

Music: CORBIT
