Could AI Chatbots Undo the Harms of Social Media? | FT
Why It Matters
AI chatbots’ depolarising effect could reshape how businesses and the public consume information, offering a counterweight to the misinformation‑driven dynamics of social media.
Key Takeaways
- AI chatbots tend to nudge users toward moderate, expert‑aligned views.
- Unlike social platforms, AI firms are financially incentivized to prioritize accuracy.
- A study of tens of thousands of AI responses shows reduced extremism and conspiracy endorsement.
- Chatbot models differ in subtle ways, but all steer users away from polarizing content.
- Findings are preliminary; future shifts in AI behavior could still reshape information ecosystems.
Summary
The Financial Times video asks whether AI chatbots can reverse the corrosive effects of social media, which has amplified populism, polarization, and distrust in expertise over the past decade and a half. It contrasts the attention‑driven revenue model of platforms like TikTok with the accuracy‑driven business model of AI firms that sell reliable tools for business‑critical tasks.
Using a dataset of tens of thousands of chatbot replies to policy‑related questions, the researchers found that AI assistants consistently pull users toward more moderate, expert‑aligned positions. The bots were far less likely to surface conspiracy theories, and even when they displayed a tendency to agree with user opinions (sycophancy), they still nudged hard‑line partisans away from extreme views.
The video highlights that while all major chatbots behaved similarly, subtle differences emerged across platforms. One quoted observation notes that social media firms profit from sensationalism, whereas AI companies are held financially accountable for misinformation, an incentive structure that favors factual accuracy.
If these early results hold, the next information revolution could be less divisive, offering businesses and policymakers a tool to foster healthier public discourse. However, the findings are preliminary, and future shifts in AI deployment or incentives could alter this trajectory.