Data Centers: Can't Live With Em, Can't Live Without Em

by iHeartPodcasts

48m · January 15, 2026

Overview of Stuff You Should Know — "Data Centers: Can't Live With Em, Can't Live Without Em"

Hosts Josh and Chuck (with Jerry present) explore the history, technology, economics and environmental impacts of data centers — from early mainframes and WWII machines (Colossus, ENIAC) to modern hyperscale cloud facilities and the AI-driven boom that’s reshaping capacity, investment and local communities. The episode balances fascination with practical concern: data centers are essential infrastructure, but their rapid growth raises questions about energy, water, regulation and whether an AI investment bubble is forming.

Key topics covered

  • Short history and evolution: Colossus, ENIAC, mainframes (IBM System/360), PCs, internet era, cloud computing.
  • Modern data centers: hyperscale facilities, how the cloud changed access and business models.
  • AI impact: why AI workloads need GPUs, not just CPUs, and how that drives demand.
  • Economics: massive corporate investments, potential bubble concerns, financing risks.
  • Environmental & local impacts: electricity and huge water use, local utility strain, subsidies and community effects.
  • UK-specific developments, plus examples from the U.S. (Data Center Alley in Northern Virginia, sites in Georgia and Phoenix) and abroad (China Telecom's Inner Mongolia campus).

Timeline / history (concise)

  • WWII-era machines: Colossus (Bletchley Park) — early programmable electronic computer used for codebreaking; ENIAC — first general-purpose electronic computer.
  • 1950s mainframes: businesses gained access to centralized computing; Lyons in the U.K. built LEO (Lyons Electronic Office), which by 1951 handled payroll and (notably, per the hosts) some MoD work.
  • 1964: IBM System/360 — important mainframe family.
  • 1980s–90s: PC and Macintosh decentralized computing to desks.
  • Internet era (late 1990s–2000s): scale-up of data needs and e-commerce, followed by cloud computing (mid-2000s, e.g., AWS), which let businesses lease distributed infrastructure rather than own it.
  • COVID era: accelerated remote work and increased demand for data center capacity.
  • 2022 onward: AI boom after ChatGPT — explosive demand for GPU compute and hyperscale facilities.

How modern data centers and the cloud work (brief)

  • Cloud = distributed storage and compute hosted by third parties, not magic “ether.”
  • Hyperscale data centers host thousands of servers (often defined as >5,000 servers) and consolidate huge compute resources for many customers.
  • For general compute and storage, CPUs still dominate; AI training and inference need massive parallel processing, which is what GPUs (e.g., NVIDIA H100) provide (see the sketch after this list).
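
As a rough illustration of that parallelism point (my example, not from the episode), the sketch below uses NumPy with toy layer sizes to show that a neural-network layer is essentially one large matrix multiplication whose outputs are independent of one another, which is exactly the kind of work a GPU's thousands of cores can do at once.

```python
# Minimal sketch (illustrative numbers, not episode figures): why AI workloads
# favor GPUs. A neural-network layer is mostly one big matrix multiplication,
# and every output element can be computed independently of the others.
import numpy as np

batch, d_in, d_out = 64, 4096, 4096                    # toy layer sizes
x = np.random.rand(batch, d_in).astype(np.float32)     # input activations
w = np.random.rand(d_in, d_out).astype(np.float32)     # layer weights

y = x @ w   # the dominant operation in training and inference: a matmul

# Each of the batch * d_out outputs needs d_in multiply-adds, none of which
# depends on the others, so the work spreads naturally across the thousands
# of parallel cores on a GPU, versus the dozens on a typical server CPU.
flops = 2 * batch * d_in * d_out
print(f"~{flops / 1e9:.1f} GFLOPs for one small layer; real models chain thousands of layers")
```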

AI-specific pressure points

  • AI training links tens or hundreds of thousands of GPUs into large clusters — essentially one massive supercomputer.
  • Example references from the episode:
    • ChatGPT/early large-scale models drove a big jump in demand after 2022.
    • Reports of companies like xAI building very large GPU clusters (e.g., references to a roughly 200,000-GPU system, itself named Colossus); a rough power estimate for a cluster of that size is sketched after this list.
    • GPUs (notably NVIDIA’s H100 series) are in extreme demand; NVIDIA’s stock saw huge gains as a result.
  • Result: skyrocketing demand for hardware, more hyperscale buildouts, higher capital investment.
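
To make that scale concrete, here is a back-of-envelope power estimate. The numbers are my assumptions, not episode figures: roughly 700 W per H100-class GPU and a facility overhead factor (PUE) of about 1.3 for cooling and power distribution.

```python
# Back-of-envelope sketch (assumed numbers, not from the episode): electrical
# draw of a 200,000-GPU cluster.
gpus = 200_000
watts_per_gpu = 700      # assumed TDP for an H100 SXM-class accelerator
pue = 1.3                # assumed power-usage-effectiveness (cooling, distribution losses)

gpu_mw = gpus * watts_per_gpu / 1e6      # megawatts drawn by the GPUs alone
facility_mw = gpu_mw * pue               # megawatts at the facility level

print(f"GPUs alone: ~{gpu_mw:.0f} MW; whole facility: ~{facility_mw:.0f} MW")
# -> roughly 140 MW of silicon and ~180 MW overall, on the order of a small
#    power plant, which is why electricity supply dominates siting decisions.
```

Even under these conservative assumptions, a single cluster of that size needs a dedicated, power-plant-scale electricity supply, which is the dynamic behind the episode's discussion of grid strain and siting.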

Economics and investment

  • Massive corporate commitments cited in the episode:
    • Microsoft: cited $88 billion investment (2025 figure referenced by hosts).
    • Amazon: pledged around $150 billion over 15 years.
    • Google & Meta: hosts referenced combined spending of roughly $750 billion on equipment over two years.
    • Morgan Stanley projection (as cited by the hosts): approximately $3 trillion in data center spending from 2025 to 2030.
  • Concerns:
    • Unclear early monetization of many AI projects — some call it an “AI bubble.”
    • IMF and other analysts have warned about overheating and speculative risk.
    • Much of the financing may flow through private credit (less regulated, part of the "shadow banking" world), which raises systemic financial risk if projects don't deliver returns.

Environmental & local community impacts

  • Global share: data centers use roughly 1–1.5% of worldwide electricity today (hosts’ figure).
  • Local concentrations can be dramatic:
    • Ireland: data centers consume ~20% of national electricity in some coverage cited.
    • Northern Virginia (“Data Center Alley”): a huge local concentration — hosts say those centers use as much electricity as about 60% of Virginia households combined, and local electricity prices have reportedly risen (hosts cite a 267% increase since 2020 in areas around the alley).
    • Meta’s “Hyperion” project (an example from the episode) is projected to consume ~5 gigawatts, roughly half the peak load of New York City (hosts’ claim).
  • Water usage:
    • Cooling is a major issue: evaporative cooling (wet pads and other large-scale systems) removes heat by consuming significant volumes of potable water; it can reduce electricity use but uses far more water.
    • Examples cited: Phoenix data centers consuming millions of gallons per day (hosts referenced “7 million gallons per day” for Meta and Microsoft in Phoenix combined); the U.K. uses ~10 billion liters of drinking water for data centers annually (hosts’ figure; a quick unit conversion is sketched after this list).
    • Small towns hosting data centers can experience water shortages and higher utility rates.
  • Jobs vs. capital: very large investments may create relatively few local permanent jobs (example: a multi-billion-pound data center creating a few hundred jobs).
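
For a sense of how those two water figures compare (my arithmetic, not a calculation from the episode), the snippet below converts the UK's annual figure into a daily rate and into US gallons:

```python
# Unit-conversion sketch (my arithmetic): the UK's ~10 billion liters/year of
# data-center water use, expressed per day and in US gallons for comparison
# with the Phoenix "millions of gallons per day" figure above.
liters_per_year = 10e9
liters_per_day = liters_per_year / 365
gallons_per_day = liters_per_day / 3.785   # 1 US gallon ≈ 3.785 liters

print(f"~{liters_per_day / 1e6:.0f} million liters/day "
      f"≈ {gallons_per_day / 1e6:.1f} million US gallons/day")
# -> about 27 million liters/day, or roughly 7 million gallons/day nationwide.
```

By that conversion, the UK's nationwide total is on the same order as the aggregate Phoenix figure the hosts mention.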

UK-specific notes

  • UK ranks as one of the top countries for data center capacity (hosts cite it as third, behind the U.S. and Germany).
  • Microsoft announced a large investment (~$30 billion) in UK data centers; ~100 AI data centers planned in the UK (hosts’ references).
  • Some repurposing projects (e.g., converting existing industrial sites in Wales into data centers) have been proposed as “smarter” siting.
  • Concerns in the UK echo global ones: local resource strain, subsidy strategies, and questions about economic payback.

Notable quotes / host insights

  • “A succubus of electricity and water usage” — hosts use colorful language to highlight how data centers can be insatiable consumers of utilities.
  • Mainframes: still used today for very secure, high-throughput workloads (banks, healthcare, government).
  • Cloud = your data “somewhere else” but still physically stored and processed on servers in data centers.
  • AI has transformed demand dynamics: GPUs + massive clusters = new scale and new environmental/financial pressures.

Key stats and numbers (as discussed on the episode)

  • 150 zettabytes of data consumed worldwide in 2024 (1 zettabyte = 10^21 bytes, about a trillion gigabytes).
  • Global data consumption growth: ~2 zettabytes in 2010 → 150 ZB in 2024; the hosts use this to emphasize exponential growth (the implied annual rate is worked out after this list).
  • Hyperscale definition used: facilities hosting >5,000 servers.
  • Google Oregon data center >1.3 million sq ft; China Telecom center in Inner Mongolia ~10.7 million sq ft / 250 acres (hosts’ examples).
  • GPUs: some large AI training runs cited using ~20,000 GPUs; xAI and others aiming for much larger clusters (hosts cite 200,000 GPUs).
  • Data center electricity share: ~1–1.5% of global electricity now; Barclays (cited by the hosts) projects U.S. data center demand could reach ~13% of U.S. electricity by 2030.
  • Local examples: Data Center Alley examples, electricity price rises (~267% since 2020) in affected areas (hosts’ figure).
  • UK water use for data centers: ~10 billion liters/year (hosts’ figure).
  • Example jobs vs. investment: £10 billion data center → ~400 jobs (hosts cite this to show local job impact is relatively small).
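
As flagged above, here is a quick sketch of the zettabyte unit conversion and the growth rate implied by the hosts' two data-volume figures (my arithmetic, not theirs):

```python
# Quick arithmetic check on the hosts' data-volume figures (sketch only).
ZB = 1e21   # 1 zettabyte = 10^21 bytes
GB = 1e9    # 1 gigabyte  = 10^9 bytes
print(f"1 ZB = {ZB / GB:.0e} GB")   # 1e+12 GB, i.e. about a trillion gigabytes

# Compound annual growth rate implied by ~2 ZB (2010) -> ~150 ZB (2024):
years = 2024 - 2010
cagr = (150 / 2) ** (1 / years) - 1
print(f"Implied growth: ~{cagr:.0%} per year over {years} years")   # roughly 36%/year
```

Sustained growth of roughly 36% per year is the kind of curve the hosts are gesturing at when they describe data consumption as exponential.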

Problems, trade-offs and underlying tensions

  • Essential infrastructure vs. environmental and social cost: data centers power modern life and AI development but can strain local utilities and water supplies.
  • Subsidies and tax/land incentives: local governments often give generous incentives; much of the economic value may flow back to multinational headquarters.
  • Regulatory gaps: rapid growth outpaces local planning and regulatory oversight, particularly for water and electricity demands.
  • Financial risk: heavy private investment and private credit exposure could create systemic risks if promised AI returns do not materialize.
  • Inequitable local impacts: small communities may shoulder resource burdens without proportional economic benefit.

Main takeaways / actions to watch for

  • Data centers are growing fast — driven now especially by AI — and their footprint (energy, water, land) is substantial.
  • The cloud is not immaterial: everything stored “in the cloud” lives in physical data centers that require planning and resources.
  • Watch for policy changes: regulation of siting, water use, electricity supply, and subsidy transparency will be crucial to manage impacts.
  • Investors and the public should monitor whether AI spending is producing sustainable economic returns — and how financing risks (private credit) are managed.
  • Local communities should scrutinize promised benefits vs. long-term costs (utilities, taxes, water supply).

Recommended further reading / sources mentioned by hosts

  • Historical pieces on Colossus, ENIAC, IBM System/360 and the LEO computer (Lyons/LEO) for context on the evolution from single-site mainframes to distributed cloud infrastructure.
  • Reporting from financial and policy outlets (Financial Times, Barclays, Morgan Stanley, IMF) about AI investment forecasts and systemic risk.
  • Technical coverage of GPUs and AI clusters (search NVIDIA H100, GPU cluster training requirements).
  • Josh Clark’s limited series mentioned by hosts (The End of the World with Josh Clark) — includes discussion of AI.

Final note from the episode

Hosts emphasize that data centers are unavoidable infrastructure powering modern life — but the rapid scaling tied to AI presents new environmental, financial and social challenges that deserve public awareness and policy attention. They invite listeners to consider both the marvel and the costs behind the services we use every day.