The rules dramatically tighten India’s content‑moderation regime, compelling global platforms to overhaul detection systems and potentially shaping international standards for synthetic‑media governance.
India has introduced a sweeping set of regulations targeting synthetic media, commonly known as deepfakes, imposing unprecedented takedown deadlines on online platforms.
Under the rules, non‑consensual nudity generated by AI must be removed within two hours, while any content ordered taken down by a court or law‑enforcement agency must be removed within three hours. These deadlines slash the previous national standard of 36 hours, and platforms are required to staff 24‑hour rapid‑response teams to meet them.
Officials highlighted the speed of the mandates, noting that “two hours down from 24” and “three hours is crazy” reflect a zero‑tolerance stance. The rules apply to all major social‑media and content‑hosting services operating in India, regardless of size.
The rules force tech firms to invest heavily in AI‑detection tools and compliance infrastructure, raising operating costs while setting a potential benchmark for other jurisdictions grappling with deepfake proliferation.