
Legal Pulse

Legal

X and XXX (but No XX): No Revenge Porn Liability for X Based on X User's Alleged Illegal Posting of Commercial Porn Depicting Plaintiff

The Volokh Conspiracy • February 25, 2026

Key Takeaways

  • NCII law excludes commercial porn without coercion
  • Court upholds X's Section 230 shield
  • Fraud allegation fails to trigger NCII exception
  • Intellectual property exception irrelevant to privacy claim

Summary

A federal judge in Texas ruled that X (formerly Twitter) is not liable under the federal Non‑Consensual Intimate Image (NCII) statute for a user's alleged posting of the plaintiff's commercial pornographic material. The court emphasized that the NCII law expressly excludes commercial porn unless it was produced by force, fraud, misrepresentation, or coercion, none of which the plaintiff adequately alleged. Separately, Section 230 of the Communications Decency Act shields X from liability because the claim does not arise under an intellectual‑property law, the only potentially relevant exception to that immunity. The decision clarifies the limits of revenge‑porn liability for platforms hosting user‑generated content.

Pulse Analysis

The Texas court’s decision hinges on the precise language of the federal Non‑Consensual Intimate Image (NCII) statute, which targets the non‑consensual distribution of private, intimate visuals. By carving out a specific exemption for commercial pornography, the law protects creators who intentionally distribute their work for profit. In this case, the plaintiff’s images were produced for subscription platforms and adult studios, falling squarely within the commercial category, and the plaintiff could not demonstrate the required elements of force, fraud, misrepresentation, or coercion to pierce the exemption.

Section 230 of the Communications Decency Act further insulated X from liability. The statute grants interactive computer services broad immunity from claims based on third‑party content, unless the underlying claim arises under an intellectual‑property law. The plaintiff's privacy‑based NCII claim does not qualify as an IP claim, rendering that narrow exception inapplicable. Courts have consistently interpreted Section 230's protections expansively, and this ruling reaffirms that approach, especially when the alleged wrongdoing stems from user‑generated posts rather than the platform's own conduct.

The broader implication for digital platforms is twofold: first, they can rely on the commercial‑porn exclusion to defend against revenge‑porn suits when the content was originally intended for public consumption. Second, the decision underscores the durability of Section 230 immunity in privacy‑related disputes, limiting the avenues for plaintiffs to hold platforms accountable. As user‑generated content continues to proliferate, platforms will likely maintain robust moderation policies while leaning on these legal shields to mitigate litigation risk.

