We Didn’t Ask for This Internet

by New York Times Opinion

1h 27m · February 6, 2026

Overview of We Didn’t Ask for This Internet (New York Times Opinion)

This episode brings together Tim Wu (Columbia Law professor, former White House tech advisor) and Cory Doctorow (writer, EFF activist) to diagnose why major internet platforms have deteriorated and to propose realistic policy fixes. Wu frames the problem as "extraction": platforms leveraging monopoly power to take value far beyond what they provide. Doctorow calls it "enshittification," a predictable multi-stage decay in which platforms first lock users in, then degrade the user experience to serve business customers, and finally squeeze those business customers to benefit executives and shareholders. The conversation connects theory to concrete examples (Facebook/Meta, Amazon, Instagram, app ecosystem rules), explores labor and surveillance harms, critiques regulatory responses (e.g., GDPR cookie pop-ups), and ends with specific policy proposals.

Who’s in the conversation

  • Tim Wu — Columbia Law, author of The Age of Extraction; former special assistant to President Biden for technology and competition policy. Focus: extraction, antitrust, platform power.
  • Cory Doctorow — EFF-affiliated activist, science‑fiction author, author of Enshittification. Focus: platform degradation patterns, software freedom, privacy/interoperability.

Key concepts and definitions

  • Extraction (Tim Wu)

    • Economic concept: firms with market/monopoly power capture wealth far beyond the value/cost of what they provide.
    • In tech: platforms monetize users’ time, attention, and data (often without direct payment) and extract value from business customers and supply chains (e.g., sellers on Amazon).
  • Enshittification (Cory Doctorow)

    • A multi-stage pattern: platforms are initially user-friendly to attract and lock in users; then they degrade the experience for users while monetizing and locking in business customers; finally they extract most of the remaining value for executives and shareholders, leaving a “pile of shit.”
    • Drivers: loss of market discipline (less competition, regulatory capture), weakened worker power, and legal barriers to interoperability and reverse engineering.

Main examples and case studies

  • Facebook / Meta

    • Early promise: non-surveillance social network to lure MySpace users.
    • Stages: user acquisition → advertiser-centered surveillance → algorithmic feeds (prioritizing engagement over user intent) → pivot to the metaverse.
    • Consequences: downranking off-site links, ad fraud, publishers monetizing only inside the platform, and feeds showing mostly what platforms think will keep attention.
  • Instagram / TikTok competition

    • Algorithmic recommendation feeds (influenced by TikTok) increased time-on-platform and engagement, but at the cost of user agency. Revealed preference arguments (people stay, so they must prefer it) are undermined by power asymmetries and lack of real alternatives (e.g., blocking competing apps via platform rules).
  • Amazon marketplace

    • The early marketplace had lower fees and enabled small sellers; over time Amazon raised fees, promoted sponsored listings (an ad business worth tens of billions of dollars annually), and adopted rules that effectively force sellers to rely on Amazon services (fulfillment, Prime), increasing extraction.
  • App ecosystem & anti-circumvention

    • Section 1201 of the DMCA criminalizes many forms of reverse engineering / app modification. This shuts down competitive alternatives (e.g., apps that strip algorithmic suggestions) and prevents competition that would defend users’ interests.
  • Labor, surveillance, and algorithmic discrimination

    • Bossware and surveillance tools (office productivity analytics, driver cameras) degrade worker dignity.
    • Algorithmic wage discrimination: platforms (e.g., nurse staffing apps) buying credit data and offering lower wages to more indebted workers; cycle of increasing precarity.
    • Examples of camera/monitoring abuses (Amazon drivers, degraded working conditions).
  • Regulatory failures & GDPR

    • GDPR produced lots of cookie consent UI clutter rather than comprehensive privacy protection, partly because enforcement is fragmented (many big tech companies base European operations in Ireland where enforcement has been weak).
    • The cookie-popup problem is seen as a symptom of regulatory design and enforcement deficiencies.

Core problems the guests identify

  • Power asymmetries: users and workers lack countervailing power versus platforms (weakened competition, captured regulators, weak unions, legal barriers to interoperability).
  • Incentive architecture: platforms optimize engagement and extract rents, not user welfare or public goods.
  • Surveillance economy: unregulated data brokering enables price discrimination, predatory labor practices, and manipulation.
  • Erosion of trust: search, recommendations, and discovery are increasingly monetized and less trustworthy.
  • Policy paralysis: instead of making bold, value-based choices, regulators often default to disclosure regimes that shift burden to users.

Proposed solutions (high-level)

Cory Doctorow (top priorities he’d enact)

  • Repeal or narrow anti‑circumvention law (Section 1201 DMCA) so owners can legally modify devices/software they own.
  • Strong federal privacy law with a private right of action to enable enforcement by impacted individuals and civil society groups.
  • Interoperability mandates for social media — let users leave a network yet continue to receive and send messages (analogous to leaving a phone carrier).

Tim Wu (top priorities he’d enact)

  • Ban the worst and most toxic business models (e.g., child-targeted manipulation, extreme price discrimination / exploitative models).
  • Treat certain essential platform components like utilities: non‑discrimination duties (no self‑preferencing) and tighter rules for parts of the stack that are foundational to commerce.
  • Antitrust / competition pressure: keep dominant platforms “insecure” to encourage innovation rather than allow dominance through acquisitions and exclusion.

Other policy ideas discussed

  • Restore competition through antitrust enforcement and reversing behaviors that let incumbents buy emerging rivals (acquihires, serial buys).
  • Strengthen worker protections and ban invasive workplace surveillance (bossware).
  • Enforce meaningful privacy protections that make users “unoptimizable” (limit surveillance so platforms can’t compute perfectly targeted manipulation).
  • Design regulation to be enforceable rather than simply offloading decisions to consumers (avoid the failings of notice-and-consent defaults such as cookie pop-ups).

Notable insights and quotes (paraphrased)

  • “Platforms go bad in a pattern: lock users in, make things worse for them to make things better for business customers, then extract from businesses.” — Cory Doctorow (on the stages of enshittification)
  • “Extraction is when firms with market power take wealth far in excess of the value of the good provided.” — Tim Wu
  • Revealed preference (people using platforms) is inadequate evidence of consent when users lack realistic alternatives or are subject to manipulation.
  • Competition can be healthy or toxic — competition to maximize attention via manipulation is not socially desirable.

Practical takeaways / actions for listeners

  • Push for stronger privacy legislation and enforcement (federal privacy law, meaningful enforcement capacity).
  • Support interoperability and open alternatives (encourage apps/services that allow portability of social graphs and data).
  • Consider platform choices deliberately: favor services that limit surveillance, and support organizations and policies that protect workers and users.
  • Advocate for antitrust enforcement and transparency in platform practices (sponsored listings, ad targeting, recommender systems).

Further reading (books recommended by guests)

  • Tim Wu:

    • Small Is Beautiful — E. F. Schumacher
    • Manipulation: What It Is, How to Stop It — Cass Sunstein
    • The Rise and Fall of the Great Powers — Paul Kennedy
  • Cory Doctorow:

    • Careless People — Sarah Wynn-Williams (an insider account of Meta, corporate power, and geopolitics)
    • Little Bosses Everywhere — Bridget Read (on pyramid schemes and social rot)
    • Jules, Penny & the Rooster — Daniel Pinkwater (a recommended children’s book; lighter pick)

Bottom line

Wu and Doctorow offer complementary diagnoses: platforms extract economic and attention value (Wu) and follow a predictable path of degradation once market discipline is removed (Doctorow). Both argue the solution mix must include stronger privacy law, restored competition or utility-like rules for essential platform components, interoperability, workplace protections, and a regulatory willingness to ban the most toxic business models rather than hide behind disclosure regimes. The debate reframes familiar tech complaints as policy choices about what kind of society and economy we want to live in.