
A federal judge in Texas ruled that X (formerly Twitter) is not liable under the federal Non‑Consensual Intimate Image (NCII) statute for hosting reposts of a plaintiff's commercially produced pornographic material. The court emphasized that the NCII law expressly exempts commercial pornography unless it was produced through force, fraud, misrepresentation, or coercion, none of which the plaintiff proved. Additionally, Section 230 of the Communications Decency Act shields X from liability because the claim does not arise under an intellectual‑property law, the only category of claim carved out of Section 230's immunity. The decision clarifies the limits of revenge‑porn liability for platforms hosting user‑generated content.
The Texas court’s decision hinges on the precise language of the federal Non‑Consensual Intimate Image (NCII) statute, which targets the non‑consensual distribution of private, intimate visuals. By carving out a specific exemption for commercial pornography, the law protects creators who intentionally distribute their work for profit. In this case, the plaintiff’s images were produced for subscription platforms and adult studios, falling squarely within the commercial category, and the plaintiff could not demonstrate the required elements of force, fraud, misrepresentation, or coercion to pierce the exemption.
Section 230 of the Communications Decency Act further insulated X from liability. The statute grants interactive computer services broad immunity from claims based on third‑party content, unless the underlying claim directly involves an intellectual‑property right. The plaintiff’s privacy‑based tort under NCII does not qualify as an IP claim, rendering the narrow IP exception inapplicable. Courts have consistently interpreted Section 230’s protections expansively, and this ruling reaffirms that approach, especially when the alleged wrongdoing stems from user‑generated reposts rather than the platform’s own actions.
The broader implication for digital platforms is twofold. First, they can invoke the commercial‑pornography exemption to defend against revenge‑porn suits when the content at issue was originally created and distributed for commercial purposes. Second, the decision underscores the durability of Section 230 immunity in privacy‑related disputes, narrowing the avenues through which plaintiffs can hold platforms accountable for third‑party content. As user‑generated content continues to proliferate, platforms will likely maintain robust moderation policies while leaning on these legal shields to mitigate litigation risk.