Scrolling for Answers: How Reliable Is Mental Health and Neurodivergence-Related Information on Social Media?

The National Elf Service (Mental Elf)
Apr 22, 2026

Why It Matters

Misinformation on platforms like TikTok can shape young users’ self‑diagnoses and treatment decisions, posing risks to public health. Understanding platform‑specific gaps helps clinicians, researchers, and policymakers target interventions and improve digital health literacy.

Key Takeaways

  • TikTok hosts the highest rate of mental‑health misinformation, at 35% of posts.
  • YouTube shows a lower misinformation rate (22%) and higher content reliability.
  • Professional creators generally produce more accurate information than non‑experts.
  • Neurodivergence topics face higher misinformation prevalence than other mental‑health conditions.
  • Platform algorithms and moderation policies drive variability in information quality.

Pulse Analysis

The rise of TikTok, Instagram, YouTube and Facebook has turned social media into a de facto health‑information marketplace, especially for adolescents and young adults seeking quick answers to mental‑health questions. While these platforms provide unprecedented access to peer support and lived‑experience narratives, they also lower the barrier for unverified claims to spread at scale. Prior research estimates that up to 80% of general health content on social networks is misinformation, a figure that prompted scholars to investigate whether mental‑health and neurodivergence topics follow the same pattern. The systematic review by Carter et al. (2026) offers the first comprehensive snapshot of this landscape.

Analyzing 5,057 posts from 27 studies, the review uncovered stark platform disparities. TikTok exhibited the highest misinformation prevalence at 35%, reflecting its algorithm‑driven, short‑form video format that rewards virality over verification. YouTube, with a more searchable architecture, posted a lower rate of 22% and generally higher content quality, particularly when videos were authored by clinicians or licensed therapists. Neurodivergence subjects such as ADHD and autism suffered the greatest misinformation burden, while anxiety and depression content on YouTube Kids showed virtually none. Professional creators consistently outperformed lay users, though some patient‑generated videos matched expert standards.

The findings carry clear implications for clinicians, researchers and regulators. Health providers should proactively discuss the credibility of online information during appointments, guiding patients toward vetted sources and encouraging critical appraisal skills. Researchers can build on this baseline by conducting longitudinal studies that track how misinformation influences help‑seeking behavior and treatment outcomes. Policymakers, meanwhile, face pressure to define transparent moderation criteria and to hold platforms accountable for algorithmic amplification of false claims. Ultimately, a coordinated effort across the digital ecosystem is needed to safeguard mental‑health literacy in an increasingly connected world.

