Grok AI: What Do Limits on Tool Mean for X, Its Users, and Ofcom?


The Guardian AI · Jan 15, 2026

Why It Matters

The restrictions aim to curb non‑consensual image manipulation, reducing legal risk for X and signaling compliance with the UK Online Safety Act. They also lower the probability of a full platform ban while keeping regulators watching.


Elon Musk’s X has announced it will stop the Grok AI tool from allowing users to manipulate images of people to show them in revealing clothing such as bikinis.

The furore over Grok, which is integrated into the X platform, has prompted a public and political backlash as well as a formal investigation by Ofcom, the UK’s communications watchdog.


What has X announced?

The social media platform said on Wednesday it had implemented “technical measures” to stop the @Grok account on X from allowing the editing of images of real people so that they appear to be in revealing clothing such as bikinis. Before this, users had been able to ask the @Grok account on X to manipulate images, with the result being published on the platform.

X said this restriction would apply to all users, including paid subscribers to X. There are about 300 million monthly users of X and up to 2.6 million subscribers.

The platform added that the ability to create and edit any images at all via the @Grok account would be limited to subscribers. This means individuals who attempt to break the law or X’s policies can be more easily traced.

X is also introducing further country‑specific limits that cover the Grok button inside X and the standalone Grok app, which is also owned by X’s parent company, xAI. Users in jurisdictions where such behaviour is illegal will be blocked from generating images of real people in bikinis, underwear and similar clothing via @Grok and the Grok button inside the app. This process, known as geoblocking, is expected to apply to the Grok app as well.

Distributing intimate images of people without their consent, known colloquially as “revenge porn”, is illegal in the UK, so geoblocking will be applied in this jurisdiction at least.
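
For illustration only, here is a minimal sketch of how a geoblocking check of this kind might work, assuming a user’s country can be inferred from their location and each request is given a coarse content label. The country codes, labels and function names below are hypothetical assumptions, not X’s or xAI’s actual implementation.

# Minimal, illustrative sketch of a geoblocking check. All names here are
# hypothetical assumptions, not X's or xAI's actual implementation.

# Jurisdictions where this kind of image manipulation is treated as illegal
# for the purposes of the example (the article names the UK).
RESTRICTED_JURISDICTIONS = {"GB"}

def image_edit_allowed(user_country: str, request_label: str) -> bool:
    """Return True if the requested image edit may proceed for this user.

    user_country: ISO 3166-1 alpha-2 code inferred from the user's location.
    request_label: coarse classification of the prompt, e.g. "revealing_clothing".
    """
    if request_label == "revealing_clothing" and user_country in RESTRICTED_JURISDICTIONS:
        return False  # geoblocked: refuse the edit in this jurisdiction
    return True

# A UK user asking for a "revealing_clothing" edit is refused; the same
# request from an unrestricted jurisdiction passes in this sketch.
print(image_edit_allowed("GB", "revealing_clothing"))  # False
print(image_edit_allowed("US", "revealing_clothing"))  # True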


What is new in X’s statement?

Prior to Wednesday, the @Grok account on X said it had switched off its image‑creation function for non‑subscribers – the vast majority of its user base. However, this was criticised by the UK government because it indicated the digital undressing of women and children could still be carried out by paying subscribers. The Grok app also appeared to be unaffected.

This latest announcement is the first statement by X on the matter since 4 January. It goes into greater detail than the @Grok message, making clear that there will be sweeping restrictions across the @Grok account and the Grok button inside the X app, with the separate Grok app also expected to be included.


How did the UK government react?

A Downing Street source described the move as a “vindication” for the UK prime minister. Keir Starmer has called the flood of digitally stripped images “disgusting” and “shameful”. The UK tech secretary, Liz Kendall, said she welcomed the move but still expected the facts behind what happened to be “fully and robustly established” by Ofcom’s ongoing investigation into X’s behaviour.

Last week, the government said it would support Ofcom if it decided to use the full suite of its powers under the country’s online‑safety laws, including a UK‑wide ban of the platform.


Does X’s announcement make a UK ban less likely?

Yes. A ban was always the nuclear option under the Online Safety Act (OSA) and is supposed to be reserved for serious, ongoing breaches of the law. X appears to have addressed this with its announcement.

“If the technical measures that X has taken work, then banning the platform is reduced as a possibility,” says Lorna Woods, a professor of internet law at the University of Essex.


Does X still face punishment?

Ofcom released a statement on Thursday saying it was still investigating X, which means it can still be punished.

“This is a welcome development,” said the regulator. “However, our formal investigation remains ongoing. We are working round the clock to progress this and get answers into what went wrong and what’s being done to fix it.”

X remains under investigation for the circumstances around the intimate‑image torrent, which started in December and accelerated after Christmas.

Ofcom is focusing on whether X has breached the act in the following ways:

  • failing to assess the risk of people seeing illegal content on the platform;

  • not taking appropriate steps to prevent users from viewing illegal content such as intimate‑image abuse and child sexual‑abuse material;

  • not taking down illegal material quickly;

  • not protecting users from breaches of privacy law;

  • failing to assess the risks X may pose to children; and

  • not using effective age‑checking for viewing of pornography.

If it is found to have breached the act in any of these ways, X still faces the prospect of a fine of up to 10% of global turnover, or of being forced to take specific steps to comply with the OSA.


What is the likely outcome?

This is Ofcom’s highest‑profile investigation yet. It could decide that X is now in compliance with the act and move on, as it did with Snapchat on Thursday.

In the Snapchat case, Ofcom had raised concerns that the platform failed to carry out an adequate risk assessment related to illegal content appearing on the site. It said Snapchat had addressed those concerns as part of the enforcement process and made changes, so no further action would be taken.

Nonetheless, if X is found to have breached the act, Ofcom may feel the need to issue a fine to set a precedent, given the profile of this case.
