Dr. Kaitlyn Regehr: Smartphone Nation
Why It Matters
Because attention‑driven algorithms amplify harmful content to vulnerable youths, proactive regulation is essential to protect mental health and democratic discourse.
Key Takeaways
- Digital platforms lack the consumer protections required of food, medication, and cars
- The attention economy prioritizes sensational content, amplifying hate and disinformation
- Algorithmic feeds expose youths to escalating misogyny within days
- Neurodiverse and vulnerable users receive disproportionately harmful recommendations
- Safe digital ecosystems require proactive regulation, not reactive moderation
Summary
Dr. Kaitlyn Regehr’s talk “Smartphone Nation” warns that unlike food or medicine, digital services lack consumer‑level safeguards, turning users’ attention into a commodity sold to advertisers.
Her team conducted an algorithmic study on TikTok, creating archetypes from interviews with teenagers, then tracking the content fed to them. Within five days, male archetypes saw a four‑fold rise in misogynistic videos, while radical ideologies became increasingly normalized; neurodiverse or already vulnerable youths received even more of this harmful material.
Regehr emphasizes that “we are the product, not the consumer,” and counters free‑speech objections by reframing the issue as one of dissemination rather than expression: platforms should not algorithmically push suicide content to minors. She also notes the absence of formal communication channels between tech firms and governments for child safeguarding.
The findings suggest that reactive moderation is insufficient; policymakers should adopt proactive, broadcast‑style content rankings that prevent harmful algorithms from surfacing dangerous material. Implementing such safeguards could curb polarization, radicalization, and the mental‑health toll on young users.