As Companies Build Data Centers For AI, Communities Push Back

by Science Friday and WNYC Studios

11 min · December 5, 2025

Episode overview

This Science Friday episode (WNYC Studios) explores the rapid build-out of data centers driven by AI demand, and the resulting strains on electricity, water, local communities, and energy policy. Host Ira Flatow talks with Casey Crownhart, senior climate reporter at MIT Technology Review, about the scale of investment and resource use, cooling technologies, community pushback, and the gaps between corporate climate pledges and rising infrastructure needs.

Key takeaways

  • Data centers already account for a meaningful share of global electricity (about 1.5% in 2024), a share projected to roughly double by 2030.
  • Global AI and data-center investment is immense: roughly $580 billion in 2025, more than was spent that year on developing the global oil supply.
  • Per-query energy for large models is small (Google: ~0.24 Wh; OpenAI: ~0.34 Wh), but billions of queries aggregate into large loads.
  • Two-thirds of new data center developments since 2022 are in water‑stressed regions (e.g., Arizona, Nevada, Texas), creating acute local impacts on water and electricity systems.
  • Community resistance is already delaying/canceling projects: one tracker found about $93 billion in data-center projects delayed or canceled between March and June (year not specified in transcript).
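The per-query figures above look tiny in isolation, so it can help to see how they aggregate. A minimal back-of-the-envelope sketch, using the per-query estimates reported in the episode; the daily query volume is a hypothetical round number chosen for illustration, not a figure from the transcript:

```python
# Per-query energy estimates reported in the episode (Wh per query).
WH_PER_QUERY_GOOGLE = 0.24   # Google Gemini, per the episode
WH_PER_QUERY_OPENAI = 0.34   # OpenAI, per the episode

# Hypothetical daily query volume for illustration only (not a reported statistic).
QUERIES_PER_DAY = 1_000_000_000  # 1 billion queries/day, assumed

# Aggregate daily energy at the higher per-query estimate, converted Wh -> MWh.
daily_wh = WH_PER_QUERY_OPENAI * QUERIES_PER_DAY
daily_mwh = daily_wh / 1_000_000

print(f"{daily_mwh:,.0f} MWh/day")  # -> 340 MWh/day under these assumptions
```

Even under this modest assumed volume, a fraction of a watt-hour per query compounds into hundreds of megawatt-hours every day, which is why the episode frames this as a grid-scale problem rather than a personal-use one.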

Numbers and evidence

  • Global electricity share: ~1.5% of world electricity in 2024; projected to double by 2030.
  • Per-query estimates: Google Gemini ~0.24 Wh/query; OpenAI ~0.34 Wh/query (companies are beginning to publish per-query figures, but not total energy footprints).
  • Water: a majority of AI‑related water consumption may be “indirect” (used at power plants powering data centers); some estimates put that indirect share at >60%.
  • Local impacts can be large: a single data center can use more water than an entire county's residential supply.

How data centers use water and electricity

Electricity

  • AI model training and serving require large, continuous compute; aggregate demand is rapidly rising and can strain local grids.
  • Long lead times for new clean generation (renewables, nuclear) clash with faster timelines for data‑center construction and AI growth.

Water & cooling

  • Many data centers use evaporative cooling, which consumes significant volumes of high‑purity (sometimes drinking‑quality) water to avoid clogging and bacterial growth.
  • Water is also consumed during chip manufacturing (high-purity water needs), which can be a nontrivial share of total AI-related water use.
  • Cooling alternatives include direct liquid cooling and immersion cooling; these can significantly reduce water use but may be more expensive or increase electricity use (one cited trade-off: up to ~10% more energy in some cases).

Politics, community pushback, and corporate responses

  • Rising utility rates tied to new data-center loads have become political issues in recent elections (e.g., governors’ races in New Jersey and Virginia).
  • Virginia has the largest concentration of data centers globally, making it a focal point for debates over local rates and infrastructure.
  • Companies have made climate/renewable pledges, but rapid and large increases in demand complicate reaching those goals; some firms are pursuing long‑term options like power purchase agreements or nuclear offtake (example cited: Microsoft and Three Mile Island).
  • Some companies have reduced transparency about total energy footprints; per-query figures are a start but don't show total consumption.

Recommended actions and policy implications

  • Treat AI infrastructure as a systems problem, not solely a personal‑use issue: planning should consider grid capacity, water availability, and regional energy mixes.
  • Increase transparency and reporting from companies on total energy and water use (not just per‑query).
  • Encourage and mandate more water‑efficient cooling technologies where appropriate, with assessments of trade-offs (cost, electricity).
  • Align local permitting and grid planning with longer-term clean energy development timelines; avoid accelerating load growth faster than clean supply can come online.
  • Engage communities early; local pushback is already delaying/canceling projects and will grow without clearer protections for rates and resources.

Notable quotes

  • “Data centers accounted for about 1.5% of the world’s electricity consumption in 2024. And that’s set to double by 2030.” — Casey Crownhart
  • “This is overall a systems conversation that we need to be having, rather than… personal choices and personal use.” — Casey Crownhart
  • “This year, we spent more on data centers than the oil supply.” — Casey Crownhart (on 2025 investment comparisons)

Bottom line

AI-driven data-center expansion is moving fast, consuming growing shares of electricity and local water supplies, and straining political and community tolerance. Small per-query energy figures mask a large aggregate problem. Addressing it requires coordinated policy, greater corporate transparency, technology trade‑offs (water vs. energy), and community-centered planning.