
A petition for Supreme Court review challenges the Ninth Circuit’s ruling that Section 230 shields Twitter from civil liability despite its knowing possession and distribution of child sexual abuse material (CSAM). The petition argues that the Good Samaritan immunity in 47 U.S.C. §230(c) should not apply when a platform has actual knowledge of illegal content and deliberately declines to act. If the Court narrows Section 230’s scope, platforms could face liability for knowingly hosting CSAM. The case offers a focused vehicle for addressing broader Section 230 controversies.
Section 230 was enacted to encourage internet intermediaries to police harmful material without fear of constant litigation. The law’s Good Samaritan clause, codified at 47 U.S.C. §230(c), protects platforms that act in "good faith" to remove objectionable content. Over the past decade, courts have expanded that protection, often shielding companies from liability even when they knowingly host illegal material. This doctrinal drift has sparked a policy debate about whether the original intent of Section 230 is being eclipsed by a de facto immunity shield.
In the recent Ninth Circuit case, the court concluded that Twitter’s decision to retain CSAM after confirming the victims’ minor status fell within Section 230’s immunity umbrella. The petition filed by the National Center on Sexual Exploitation and the Antonin Scalia Law School argues that actual knowledge of criminal content defeats the Good Samaritan defense. By highlighting Twitter’s refusal to act until a Department of Homeland Security official intervened, the petition underscores a stark mismatch between statutory language and judicial interpretation, raising the prospect that platforms could profit from illicit material without legal consequence.
A Supreme Court ruling on this issue could reshape the liability landscape for all digital intermediaries. Narrowing Section 230 would compel platforms to adopt more rigorous detection and removal protocols for CSAM, potentially reducing the prevalence of child exploitation online. At the same time, it would signal to courts that immunity does not extend to deliberate, knowledge‑based wrongdoing, preserving the law’s original balance between free expression and accountability. Stakeholders—from tech firms to advocacy groups—are watching closely, as the decision may set a precedent for future disputes over platform responsibility and civil penalties.