Bay Area Radio Station Clowned for AI-Generated ‘Fourth Period’ in Sharks Coverage

Awful Announcing
Apr 3, 2026

Key Takeaways

  • AI caption claimed nonexistent fourth hockey period
  • Sharks never trailed by more than one goal
  • KNBR host exposed AI usage on social media
  • Station removed Facebook post, kept TikTok video
  • Incident sparks debate on AI oversight in broadcasting

Summary

Bay Area sports station 95.7 The Game posted an AI‑generated caption claiming the San Jose Sharks rallied in a non‑existent fourth period, a factual error that quickly drew criticism. KNBR host Adam Copeland highlighted the mistake on social media and revealed the station relies on AI for its captions. The offending Facebook post was removed, but the clip remains on the station’s TikTok with a revised caption. The episode underscores the pitfalls of unchecked AI content in live sports coverage.

Pulse Analysis

The incident at 95.7 The Game illustrates a growing tension between speed and accuracy in sports broadcasting. By using an AI tool to auto‑generate social‑media captions, the station inadvertently produced a headline that referenced a "fourth period"—a term that simply does not exist in hockey. Fans and fellow broadcasters quickly called out the mistake, and the station’s rapid removal of the Facebook post highlighted the vulnerability of AI‑first workflows when factual verification is bypassed.

Beyond the immediate embarrassment, the episode raises broader concerns for media outlets that lean heavily on generative AI. While AI can streamline content creation, it lacks the contextual awareness that seasoned sports journalists bring, such as understanding sport‑specific terminology and game dynamics. In a market where credibility is a key differentiator, a single AI‑induced error can damage a station’s reputation and erode listener loyalty, especially in tightly knit fan communities like the Bay Area’s hockey audience.

Industry experts suggest a hybrid model: AI handles routine tasks like transcription and data aggregation, while human editors retain final authority over narrative elements. Implementing robust fact‑checking protocols and training staff to spot AI‑generated anomalies can mitigate risks. As AI tools become more sophisticated, broadcasters that invest in editorial oversight will likely maintain higher trust levels, positioning themselves as reliable sources in an increasingly automated media landscape.
