One Thing: Is This Social Media's Big Tobacco Moment?

by CNN Podcasts

25 min · March 29, 2026

Overview

This episode of CNN’s One Thing, hosted by David Rind, examines a watershed series of lawsuits accusing major social media companies, most notably Meta (Instagram/Facebook) and Google/YouTube, of designing platforms that harm young people’s mental health. Through the personal story of one young plaintiff (Caroline) and a recent Los Angeles jury verdict in a related case, the episode explores the legal theory targeting product design rather than user-posted content, the evidence revealed at trial, the possible industry and policy consequences, and practical steps for users and parents.

Key points and main takeaways

  • Recent trials are testing a new legal strategy: plaintiffs argue the platforms’ product design (algorithms, infinite scroll, autoplay, beauty filters, etc.) negligently promoted harmful content and addiction, which may sidestep Section 230 protections.
  • In a landmark Los Angeles case, a jury found Meta and YouTube liable for harms to a young plaintiff (Kaylee) and awarded several million dollars in damages; the companies plan to appeal.
  • The litigation is large in scale—consolidated cases include thousands of individual and family suits, school-district claims, and state attorneys-general actions—so this verdict could influence many future trials.
  • Internal documents and whistleblower testimony revealed during trials (e.g., warnings about safety risks, expert cautions on beauty filters) have been central to convincing jurors that companies knew of and failed to mitigate harms.
  • Tech companies say they already invest heavily in safety tools (age/parental controls, warnings, content removal) and dispute the verdicts; they argue platforms are different from publishers and will appeal.
  • Policy outcomes remain uncertain: Congress has bipartisan interest but faces political and technical hurdles; some states are advancing their own child-protection laws; other countries (e.g., Australia) have taken stricter approaches like age restrictions.

Notable evidence and insights from the episode

  • Personal story: Caroline, who began using social media around age 10–11, describes developing anorexia after being algorithmically fed extreme weight-loss content during the COVID-19 pandemic; she later sued Instagram and TikTok.
  • Whistleblower testimony: Former employees (e.g., Arturo Béjar) testified that they warned leadership about safety issues, including predator contact on Instagram.
  • Internal expert findings: Meta solicited input from 18 independent experts, who warned that allowing beauty filters for teens was dangerous; the filters nonetheless remained accessible to minors.
  • Product design features flagged as harmful: endless algorithmic feeds, autoplay, beauty filters, and other engagement-driven mechanics that amplify content and time-on-platform.
  • Legal pivot: Plaintiffs argue harm stems from how content is surfaced and engineered, not solely from user-posted content; this reframes liability beyond Section 230 defenses.

Legal and policy implications

  • Litigation strategy: Successful verdicts using product-design liability could open the door for many more damages awards or court-ordered platform changes.
  • Section 230: These cases aim to avoid Section 230 by holding platforms responsible for design and operation choices rather than user speech; jurors so far have accepted negligence claims tied to design.
  • Remedies: Future court orders could require substantive platform changes, such as age verification, limits on features available to minors, changes to recommendation algorithms, or restrictions on filters and autoplay.
  • Regulatory landscape: Federal legislation is politically challenging despite some bipartisan support; in the meantime, state-level laws and foreign models (Australia’s age restrictions) are shaping a patchwork approach.
  • Free-speech concerns: Civil libertarian groups may argue that design choices are a form of platform expression and implicate speech protections—this is likely to be litigated or raised on appeal.

Practical advice and recommended actions (for users and parents)

  • For parents:
    • Monitor and discuss kids’ social media use; consider age-appropriate limits and supervise accounts.
    • Use built-in parental controls, set device/app time limits, and review privacy settings.
    • Consider age verification, content filters, or limiting access to certain features (e.g., filters, reels/For You pages).
  • For teens and adult users:
    • Curate feeds aggressively: block accounts, keywords, and hashtags that trigger harmful content; use “not interested” and reporting tools.
    • Use screen-time limits and take breaks; enable time-reminder features.
    • Seek supportive, recovery-oriented communities and avoid engaging with triggering content.
  • If you or someone you know is struggling:
    • National Alliance for Eating Disorders hotline: 866-662-1235
    • National Suicide & Crisis Lifeline: 988

Quotes and soundbites worth remembering

  • “This proves that these companies are not invincible.” — Claire Duffy (CNN tech reporter)
  • “It’s not the user’s fault… These companies have purposefully designed this to make it addictive and to promote harmful content.” — plaintiff’s perspective (paraphrased)
  • Companies’ response: they disagree with verdicts, emphasize safety investments, and plan to appeal.

What to watch next

  • Appeals from Meta and Google/YouTube in the Los Angeles case; how appellate courts handle the product-design liability theory.
  • Upcoming bellwether trials in the consolidated litigation, which encompasses thousands of related cases.
  • State and federal legislative moves—particularly any laws requiring age verification, design constraints for minors, or national standards that override a state-by-state patchwork.
  • Reporting and document disclosures from ongoing litigation (internal research, whistleblower testimony) that could influence public opinion and juries.

Related content

  • CNN’s podcast Terms of Service (host Claire Duffy) for deeper reporting on tech policy, safety features, and the documents and testimony that emerged from these trials.