
Massachusetts High Court: Claim Against Meta for Alleged Addiction of Children Can Go Forward Notwithstanding § 230
Key Takeaways
- Massachusetts court rejects § 230 shield for Instagram's design features
- Claims target infinite scroll, autoplay, and variable reward mechanisms
- Deceptive claims allege false safety statements to parents and regulators
- Ruling may spur similar lawsuits in other states
- Public nuisance theory expands liability beyond content publishing
Pulse Analysis
Section 230 of the Communications Decency Act has long been a legal bulwark for online platforms, insulating them from liability for user‑generated content. Yet the doctrine is not absolute: it does not protect companies when the alleged wrongdoing stems from the platform's own conduct or speech. The Massachusetts Supreme Judicial Court's opinion underscores this distinction, emphasizing that immunity applies only when a claim treats the service provider as a publisher of third‑party information. By focusing on Instagram's engineered features — endless scroll, autoplay videos, and intermittent variable rewards — the court concluded that the alleged harms arise from Meta's own design choices, not from the content users share.
The court’s analysis also highlights deceptive‑practice claims rooted in Meta’s public statements that Instagram is safe for children, despite internal evidence to the contrary. By treating those statements as the company’s own speech, the decision sidesteps Section 230’s content‑publisher shield. This approach signals that regulators and litigants can target the promotional narratives and design incentives that drive excessive use among minors. Companies may now need to reassess how they market safety features and age‑gating tools, ensuring that public claims align with internal research and product realities.
Beyond Massachusetts, the ruling could reverberate nationwide as states grapple with the societal costs of social‑media addiction. Legal scholars predict a wave of similar lawsuits leveraging unfair‑business‑practice and public‑nuisance theories, potentially prompting Congress to revisit Section 230’s boundaries. For tech firms, the decision serves as a warning: design decisions and corporate messaging are increasingly scrutinized under consumer‑protection law, and failure to adapt could invite costly litigation and heightened regulatory oversight.