
Why Do People Fall for Propaganda? New Directions in Disinformation Research

Key Takeaways
- Disinformation is now studied by engineers, neuroscientists, and data scientists.
- Propaganda works by embedding many small lies into an alternate worldview.
- The documentary "How To Build a Truth Engine" showcases brain-focused countermeasures.
- Treating truth as a public good may reshape platform regulation.
- Barbara Walter links institutionalized lying to autocratic power consolidation.
Pulse Analysis
The fight against disinformation is moving beyond traditional media analysis toward a multidisciplinary frontier. Engineers are building detection algorithms, neuroscientists are mapping how false narratives hijack pattern‑recognition circuits, and data scientists are visualizing the sprawling networks that amplify lies. The documentary "How To Build a Truth Engine," produced by George Clooney and directed by Friedrich Moser, crystallizes this shift by illustrating how a steady stream of accurate information can recalibrate the brain’s truth‑seeking mechanisms. Lucid’s exclusive clips and the upcoming conversation with Barbara Walter provide a rare glimpse into these emerging tools.
At the heart of modern propaganda lies the concept of the "Big Lie"—a single, audacious falsehood that gains traction only after a cascade of smaller deceptions has primed the audience. Walter’s research shows that authoritarian actors construct an alternate worldview by weaving countless minor falsehoods into everyday discourse, eroding trust in institutions and positioning the propagandist as the sole arbiter of reality. This institutionalized lying creates a feedback loop in which legal, educational, and media elites become unwitting conduits, reinforcing the false narrative and delegitimizing dissent.
Recognizing truth as a public good reshapes the policy conversation around platform regulation. Economists like Joseph Stiglitz argue that accurate information should be protected much as clean air is, prompting calls for accountability standards that penalize deliberate misinformation and its incitement of violence. By integrating cognitive-science insights with robust legal frameworks, societies can develop "truth engines" that not only detect falsehoods but also restore the informational equilibrium essential for democratic resilience. The convergence of technology, psychology, and public policy offers a promising roadmap for counteracting the corrosive effects of institutionalized propaganda.