The Surprising Similarity Between the US and Chinese Internets

Summary of The Surprising Similarity Between the US and Chinese Internets

by Bloomberg

February 3, 2026 · 51 min

Overview of Odd Lots — "The Surprising Similarity Between the US and Chinese Internets"

This Bloomberg Odd Lots episode (hosts Joe Weisenthal and Tracy Alloway) features Yiling Liu, author of The Wall Dancers: Searching for Freedom and Connection on the Chinese Internet. The conversation uses individual stories and on-the-ground reporting to explain how the Chinese internet actually works — the dynamics of censorship, creativity, fandom, and nationalism — and draws surprising parallels between how social media evolved in China and in the U.S. Major recurring ideas: the “dance” between state and society (the book’s central metaphor), purposeful vagueness in rules, and the shared tribalizing and centralizing effects of modern platforms across political systems.

Key themes and framing

  • "Dancing in shackles" (The Wall Dancers): a metaphor for how Chinese netizens navigate constraints — a dynamic push-and-pull between creativity/connection and state control.
  • Rejects binary views: neither “China = technological juggernaut” nor “China = mindless, totalitarian censorship” fully captures reality. Liu emphasizes complexity and agency at the individual level.
  • Convergence across systems: despite different governance (state vs. market), both Chinese and American internets show similar social dynamics (tribalism, mobs, fandom tactics, algorithmic centralization).
  • Central claim: the problem is centralization of technology — whether concentrated in governments or a handful of tech oligarchs, centralization produces similar pathologies.

How censorship actually works in China

  • Vagueness by design: rules are often purposely ambiguous (no clear "red lines"), which encourages preemptive self-censorship and gives authorities discretionary power.
  • Evolving content priorities:
    • Early era: focus on preventing collective action (the "three T's": Tiananmen, Tibet, Taiwan).
    • Mid-2010s onward: expanded to ideological enforcement (e.g., removal of "unhealthy marital values," "sissy boys") and suppression of content that contradicts party narratives.
    • New targets: excessive flaunting of wealth, content judged to harm "positive energy."
  • Process and actors:
    • Top-level directives (e.g., Cyberspace Administration) can order platforms to downplay or remove items.
    • Platforms maintain large in-house censor teams; self-censorship is common.
    • Automated systems flag content first; human moderators review and enforce removals (a minimal illustrative sketch of this flow appears after this list).
    • Keyword databases are valuable proprietary assets that companies rely on to survive.
  • Labor intensity: censorship still requires vast numbers of human moderators; e.g., Weibo grew from ~150 censors at launch to thousands later.
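
To make the flagging flow concrete, below is a minimal, hypothetical sketch of an automated-first, human-second filter of the kind described above. The keyword tiers, names (AUTO_BLOCK, NEEDS_REVIEW, screen_post), and matching logic are illustrative assumptions only; real platform keyword databases are proprietary, far larger, and continuously updated.

```python
import re
from dataclasses import dataclass, field

# Hypothetical keyword tiers -- placeholders, not real platform data.
AUTO_BLOCK = {"forbidden_term_a", "forbidden_term_b"}    # removed automatically
NEEDS_REVIEW = {"sensitive_term_x", "sensitive_term_y"}  # queued for a human moderator

@dataclass
class Decision:
    action: str                      # "block", "review", or "allow"
    matched: list = field(default_factory=list)

def screen_post(text: str) -> Decision:
    """First-pass automated filter; humans make the final call on anything flagged."""
    tokens = set(re.findall(r"\w+", text.lower()))
    hard_hits = sorted(tokens & AUTO_BLOCK)
    if hard_hits:
        return Decision("block", hard_hits)
    soft_hits = sorted(tokens & NEEDS_REVIEW)
    if soft_hits:
        return Decision("review", soft_hits)  # human moderators enforce/delete
    return Decision("allow")

if __name__ == "__main__":
    print(screen_post("an ordinary post"))                    # -> allow
    print(screen_post("a post with sensitive_term_x in it"))  # -> review
```

In practice this is only the first layer: regulator directives shift what goes into each tier, and the flagged queue is worked by the large human moderation teams the episode describes.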

Communities, tactics and language

  • Netizens are resourceful: code-words, puns and homonyms (e.g., “grass mud horse,” Winnie the Pooh) are used to evade filters; Chinese writing's homophony amplifies this creativity.
  • Fandom tactics weaponized: “Little Pinks” (nationalist fan-like groups), patriotic trolls, and flooding (pumping out content to bury undesired posts) — tactics that resemble those of Western fandoms or political influencer groups.
  • Paid/organic blur: distinctions between paid propaganda (the Wumao / "50 Cent Army"), patriotic hobbyists, and genuine grassroots voices are often unclear.
  • Cleavages: the main social split Liu highlights is builders/tech insiders (optimistic, with a sense of agency) vs. ordinary users (pessimistic, feeling like NPCs), alongside wealth/inequality tensions.

Timeline & pivotal moments

  • 2008: Beijing Olympics + global financial crisis — pivotal for rising nationalism and distrust of Western models; catalyzed an inward turn among some Chinese netizens and officials.
  • 2009 Urumqi unrest: a turning point that reshaped domestic platforms; Weibo emerged as a survivor largely because it handled censorship better than rivals.
  • 2010s–2020s: growth of Little Pinks and organized online patriotism; platforms professionalize moderation and keyword systems.
  • Recent: TikTok ban scare in the U.S. produced a short-lived mass migration to Chinese platforms (e.g., Xiaohongshu / RedNote), illustrating cross-border social media flows and user adaptation.

Parallels with the U.S. internet

  • Tribalization, mob dynamics and fandom tactics appear on both sides — social media incentivizes group formation, identity-mobilization and harassment.
  • Centralization matters more than the type of central actor: a single platform owner (e.g., Elon Musk’s decisions at X) can exert as much influence over discourse as the Chinese state can through regulation and censorship.
  • Algorithms and opaque moderation regimes produce similar problems: information bubbles, amplification of ragebait, and difficulty auditing platform behaviors.
  • The actor mix may differ (state vs. corporations), but user behavior and incentives are often the same.

AI, algorithms and governance

  • AI already aids censorship: automated flagging reduces labor but human moderators remain crucial.
  • China has taken steps toward algorithm governance (e.g., algorithm registry requiring firms to submit AI tools to authorities) — a centralized regulatory model that some see as effective, others as Orwellian.
  • Emergent question globally: how to govern powerful, opaque recommendation systems and generative AI tools before harms compound.

Notable quotes & metaphors

  • "Dancing in shackles" — the lived metaphor for creativity under constraint.
  • "Vagueness is purposeful" — rules are intentionally fuzzy to encourage self-censorship and permit discretionary enforcement.
  • "We allowed too much of technology to be centralized" — Liu’s succinct diagnosis connecting U.S./China similarities.
  • Dolores Umbridge analogy — vague decrees that provoke over-compliance and anxiety among implementers.

Concrete takeaways

  • Censorship in China is not only top-down deletion; it’s a mixed system of directives, platform self-policing, algorithmic filtering, human moderation, and grassroots flooding.
  • Many social behaviors seen in the U.S. (mob behavior, fandom coordination, disinformation tactics) are mirrored in China — the platform architecture and incentives drive similarities across regimes.
  • Centralization of tech (whether corporate or state) is the structural problem that enables control, manipulation, and the tribalizing effects of social media.
  • AI will make content management easier and faster — but also create new governance challenges. China's algorithm registry is a notable, distinctive policy approach worth studying.

Recommended next steps / resources

  • Read: Yiling Liu, The Wall Dancers: Searching for Freedom and Connection on the Chinese Internet.
  • Follow: Yiling Liu (@yilingliu95) and nonprofit China Digital Times for translations and decoding of Chinese netizen slang and censorship tactics.
  • Experiment: for first-hand exposure, open a Weibo account (Chinese language skills help you get the most out of it) and observe content dynamics and code-words in context.
  • For scholars/policymakers: study algorithm registries and explore more auditable governance mechanisms for recommendation systems in your jurisdiction.

Final note

The episode reframes common assumptions: neither state censorship nor market-driven platform power alone explains online harms — incentives baked into centralized systems plus human social behavior produce remarkable cross-border similarities. Yiling Liu’s people-centered reporting makes those mechanisms tangible and shows how everyday creativity, fandom tactics, and governance choices shape what billions see and say online.