Children Are Making New Friends. Here’s Why It Might Be a Big Problem.

Hey Sigmund
Apr 2, 2026

Key Takeaways

  • 79% of Australian 10‑17 year olds have used AI companions.
  • 66% used one within the past month; 20% use one daily.
  • Apps provide constant, non‑judgmental validation.
  • Risks include emotional dependency and unrealistic relationship expectations.
  • Experts urge a ban on under‑18 use and greater parental engagement.

Pulse Analysis

The surge in AI companion usage reflects a broader shift in digital intimacy. Platforms like Character.AI, Replika and Nomi are engineered to remember past chats, adapt personalities, and simulate empathy, making them attractive substitutes for real‑world interaction. Marketing frames these tools as antidotes to loneliness, and the data—79% of Australian teens trying them and a third preferring bots for serious conversations—shows the appeal is not a fringe phenomenon. This adoption mirrors global patterns, with similar percentages reported in North America and Europe, underscoring a growing market that capitalises on adolescents' developmental need for connection.

From a developmental psychology perspective, the frictionless nature of AI companions raises red flags. Adolescence is a critical period for learning negotiation, rejection, and reciprocity—skills honed through imperfect human relationships. When a bot offers endless validation without demand, teens may internalise an unrealistic intimacy template, leading to heightened disappointment in real friendships and romantic ties. Studies, including a Harvard Business School analysis, confirm short‑term loneliness relief but also highlight potential for emotional dependency and distorted expectations, which can impair emotional regulation and self‑identity formation.

Policymakers and parents face a dual challenge: curbing unsafe exposure while addressing the underlying social gaps driving adoption. Australia’s eSafety Commissioner has flagged inadequate age verification and insufficient self‑harm detection, prompting calls for mandatory safeguards and an outright ban for users under 18. Meanwhile, experts advise parents to replace the digital void with genuine presence—regular conversations, shared activities, and consistent emotional support. By fostering real‑world connections and setting clear boundaries around AI’s emotional use, families can mitigate risks while guiding teens toward healthier relational development.
