Overview of "Are we cooked? How social media shapes your language" (TED Tech — Adam Aleksic)
In this TEDx talk (featured on the TED Tech podcast and summarized on NPR's Short Wave), linguist Adam Aleksic explains how social media algorithms actively shape modern slang, accelerate language change, and influence identity formation. Using concrete examples (unalive, rizz, skibidi, the "Rizzler" song, cottagecore, and -pilled terminology), Aleksic shows how platform moderation, algorithmic reward systems, and commercial incentives create, amplify, and sometimes distort words and subcultures, with both creative and harmful consequences. The talk ends with a call for awareness rather than alarm.
Key points / main takeaways
- Social media algorithms are now active drivers of language change, not just passive broadcasting channels.
- Moderation policies can produce euphemisms: "unalive" emerged because direct terms like "kill" get suppressed on TikTok.
- Algorithms reward repetition and engagement, so catchy songs, audios, or tags (e.g., the "Rizzler" meme/song) can quickly make niche slang mainstream.
- Platforms encourage hyper-specific labels (e.g., cottagecore, goblincore) because they let algorithms deliver targeted, monetizable content and create identity niches.
- Viral spread flattens or erases etymology: words originating in Black or queer communities can be appropriated, exaggerated, and divorced from their social meaning.
- The circulation of extremist-adjacent vocabulary (e.g., "pilled", "sigma") can normalize or make those ideas more accessible, even when used ironically.
- Overall verdict: language will keep evolving, and that is not necessarily catastrophic, but users should be aware of algorithmic conditioning and the commercial and political stakes behind words.
Examples and case studies
"Unalive"
- Function: Euphemism used by young people to talk about death or self-harm in a less direct way.
- Origin driver: Platform moderation and content removal led users to invent a term that avoids automatic suppression (see the sketch below).
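To make the moderation pressure concrete, here is a minimal sketch of a blunt keyword filter; the flagged-word list and function are invented for illustration and are not TikTok's actual system. The point is only that exact-match filtering suppresses direct terms while a new coinage like "unalive" slips past it.

```python
# Toy illustration only: a blunt keyword filter of the kind that pushes
# users toward euphemisms. The flagged-word list is invented, not any
# platform's real moderation policy.
FLAGGED_WORDS = {"kill", "suicide", "dead"}

def is_suppressed(caption: str) -> bool:
    """Return True if any word in the caption matches the flagged list."""
    words = (word.strip(".,!?").lower() for word in caption.split())
    return any(word in FLAGGED_WORDS for word in words)

print(is_suppressed("they kill off the main character"))   # True  -> post downranked or removed
print(is_suppressed("they unalive the main character"))    # False -> the euphemism passes
```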
The "Rizzler" song (and viral audios)
- How it works: A catchy audio filled with slang (rizz, skibidi, gyat) goes viral; algorithms amplify it via engagement, leading to mass adoption (see the sketch after this list).
- Result: Words can go from obscurity to mainstream in months; some even become dictionary-recognized.
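As a rough illustration of that amplification loop, the toy simulation below uses hypothetical clip names, made-up numbers, and a deliberately crude "promote the current leader" rule, not any platform's real ranking formula. It shows how engagement-weighted promotion makes an already popular audio pull further and further ahead.

```python
import random

# Hypothetical engagement feedback loop: each round, the clip with the most
# engagement gets extra exposure, which earns it still more engagement.
clips = {"rizzler_song": 120, "niche_audio_a": 110, "niche_audio_b": 95}

random.seed(0)  # reproducible toy run
for round_num in range(1, 6):
    leader = max(clips, key=clips.get)            # promote whatever already leads
    clips[leader] += random.randint(50, 100)      # extra exposure -> extra engagement
    for name in clips:
        if name != leader:
            clips[name] += random.randint(0, 10)  # everything else grows slowly
    print(f"round {round_num}: {clips}")
```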
"-core aesthetics (cottagecore, goblincore, etc.)"
- Mechanism: Algorithms treat subcultures as metadata; creators produce content that fits these tags because it performs well.
- Commercial angle: Platforms and businesses capitalize on these identities by selling targeted products; users may adopt labels partly because the algorithm continuously affirms them (see the sketch below).
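A minimal sketch of the "subcultures as metadata" idea, with invented tags, users, and ad catalog: once users and content carry aesthetic labels, matching them to monetizable content is a simple lookup, which is part of why hyper-specific labels are attractive to platforms and advertisers.

```python
# Hypothetical sketch of treating aesthetic labels as targeting metadata.
# Tags, users, and the ad catalog are invented for illustration.
user_tags = {
    "user_1": ["cottagecore"],
    "user_2": ["goblincore", "cottagecore"],
}
ads_by_tag = {
    "cottagecore": ["linen dress ad", "mushroom lamp ad"],
    "goblincore": ["terrarium kit ad"],
}

def ads_for(user: str) -> list:
    """Return every ad whose tag matches one of the user's aesthetic labels."""
    return [ad for tag in user_tags.get(user, []) for ad in ads_by_tag.get(tag, [])]

print(ads_for("user_2"))  # the more specific the label, the easier the match
```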
"Pilled" / incel language spread
- Issue: Terms borrowed from extremist subcultures (black-pilled, sigma male) get meme-ified and recombined (e.g., "burrito-pilled"), indirectly increasing exposure to harmful ideologies.
- Danger: Even ironic use can make harmful concepts more accessible or normalized.
Appropriation of Black & queer slang
- Pattern: Many viral slang terms originate in marginalized communities as forms of expression and identity.
- Consequence: Online repackaging often strips words of context and meaning, leading to dilution, caricature, or disrespectful uses.
Implications & concerns
- Cultural erasure and appropriation: Viral spread can remove historical and social context from slang, marginalizing origin communities.
- Commercialization of identity: Algorithms incentivize identity labels that are easy to target for ads and product sales.
- Normalization of extremist tropes: Meme-ification can trivialize or disseminate harmful ideologies under the guise of humor.
- Rapid, opaque evolution: Language change happens faster and with less traceable lineage, making etymology harder to trace and accountability harder to establish.
Recommendations / action items
- Be aware: Think about where words come from before using or spreading them.
- Check etymology and context: Learn whether slang originated in marginalized communities or harmful groups.
- Avoid amplifying harmful terms: Don't casually reuse extremist-origin vocabulary, even ironically.
- Support origin communities: Give credit, and listen to voices from the communities that created the language.
- Demand platform transparency: Encourage social media companies to be clearer about how their algorithms promote or suppress language and communities.
- Educators & parents: Recognize algorithm-driven slang as part of cultural literacy — teach context without moral panic.
Notable quotes from the talk
- "We are entering an entirely new era of language change driven by social media algorithms."
- "You only got the word unalive because you can't say kill on TikTok."
- "Subcultures are the new demographics" (paraphrase of platform marketing that pushes niche identities).
- "We should be aware when the words we're using may have been engineered to sell us things."
- "If a word gets bent, we'll just come up with another word."
Bottom line
Language is evolving rapidly under the influence of social media algorithms. That evolution produces creative, colorful slang and new forms of identity — but it also accelerates appropriation, commercial exploitation, and the spread of harmful ideas. The recommended stance is informed awareness: enjoy new linguistic creativity, but recognize and interrogate where words come from, who benefits from their spread, and what harms they might carry.
Where to find the talk / further listening
- Original talk: TEDx (featured on the TED Tech podcast)
- Episode featured on NPR's Short Wave; search "TED Tech Adam Aleksic" or check TED Tech wherever you get podcasts.
