
Microsoft's AI Slop Is Infecting GitHub — Copilot Is Now Injecting Ads Into Pull Requests (Update)
Why It Matters
Embedding ads in code reviews erodes developer trust and raises privacy concerns, especially as the same data may be used to train future AI models.
Key Takeaways
- Copilot inserted promotional tips into pull requests.
- Over 11,000 pull requests contained the same ad text.
- GitHub disabled the feature after developer backlash.
- AI‑trained‑on‑AI risk may amplify ad injection errors.
- Users can opt out of data training for models.
Pulse Analysis
The unexpected appearance of a Raycast advertisement inside a GitHub pull request has sparked a debate about the boundaries of AI‑generated content. Copilot, Microsoft’s AI‑driven coding assistant, leverages large language models to suggest code snippets, but its recent "product tips" feature crossed into marketing territory. Developers reported the ad‑like text, often prefixed with an emoji, across thousands of repositories, and GitHub’s leadership pulled the feature after swift community backlash.
Beyond the immediate annoyance, the episode underscores a deeper risk: AI systems trained on data that already contains AI‑generated promotional material can create a feedback loop. Microsoft’s updated policy states that code and context from GitHub will continue to train its models, albeit with an opt‑out option for paid tiers. If the training set includes self‑injected ads, future iterations of Copilot may inadvertently replicate or amplify such content, degrading code quality and eroding developer confidence.
For enterprises and open‑source projects alike, the incident serves as a cautionary tale about transparency and data governance in generative AI tools. Companies must balance productivity gains against the potential for unwanted messaging and data leakage. As AI assistants become integral to software development pipelines, clear opt‑out mechanisms, rigorous content filtering, and open communication from platform owners will be essential to maintain trust and safeguard the integrity of the codebase.