Grammarly Says It Will Stop Using AI to Clone Experts without Permission

The Verge, Mar 11, 2026

Why It Matters

The decision underscores mounting pressure on AI firms to obtain explicit consent for using public figures' content, reshaping how generative tools are built and monetized. It signals a shift toward transparent, rights‑respecting AI that could become an industry standard.

Key Takeaways

  • Grammarly disables Expert Review AI feature
  • Experts will gain opt‑in control over AI usage
  • Superhuman apologizes, launches opt‑out inbox
  • Initiative may set industry consent precedent
  • Platform will allow experts to monetize AI agents

Pulse Analysis

The backlash against Grammarly’s Expert Review highlights a broader tension in the AI ecosystem: balancing powerful language models with the rights of the individuals whose work fuels them. By pulling the plug on a feature that presented suggestions as "inspired by" real writers, Grammarly acknowledges that the line between inspiration and impersonation is increasingly scrutinized. This episode adds to a growing list of incidents where companies face legal and reputational risks for deploying voice‑cloning or text‑generation tools without clear permission, prompting a reevaluation of consent mechanisms across the sector.

From an ethical and regulatory standpoint, the move could accelerate the adoption of consent‑first frameworks for AI content creation. Lawmakers in several jurisdictions are already proposing legislation that would require explicit licensing for the use of a person’s likeness or written style in generative models. For businesses, this translates into new compliance costs but also opens revenue opportunities: platforms can now offer licensed, expert‑curated AI agents that users pay to access. By giving experts the ability to set terms, track usage, and receive compensation, companies can mitigate risk while building trust with both creators and end‑users.

Looking ahead, the industry is likely to see a proliferation of opt‑in ecosystems where experts build their own AI personas on top of existing language models. Such ecosystems promise personalized, high‑quality assistance—imagine a professor’s feedback or a sales leader’s pitch refinement—while preserving the creator’s control over brand and earnings. For Grammarly, reimagining Expert Review as a marketplace for licensed expertise could differentiate its product suite, attract premium users, and set a precedent that other AI service providers may follow.
