Industry Views

When AI Fools the Host: Mistake, Missed Opportunity, or Legal Minefield?

By Matthew B. Harrison
TALKERS, VP/Associate Publisher
Harrison Media Law, Senior Partner
Goodphone Communications, Executive Producer

Charlie Kirk’s tragic assassination shook the talk radio world. Emotions were raw, and broadcasters across the spectrum tried to capture that moment for their audiences. Charles Heller of KVOI in Tucson shared in these pages yesterday (9/16) how he, in that haze of grief, played what he thought were tribute songs by Ed Sheeran and Adele. Only later did he realize they were AI-generated.

Heller deserves credit for admitting his mistake. Many would have quietly moved on, but he turned the incident into a public reflection on accuracy and the challenges of this new AI age. That honesty does not weaken him – it underscores his credibility. Audiences trust the host who owns a mistake more than the one who hides it. In this business, candor is currency.

Still, the programmer in me sees an on-air opportunity. Imagine a segment called “AI or Authentic?” – play generated songs alongside real ones and invite the audience to decide. It would be interactive, entertaining, and a perfect spotlight on the very problem that fooled him. I’m sure there are folks out there who have already done this.

Here’s where the lawyer in me speaks up. Falling for a convincing fake is a mistake, not malice. For public figures like Adele or Sheeran, defamation requires proof of actual malice: that a host knew a statement was false or acted with reckless disregard for the truth. A one-off, good-faith error doesn’t reach that bar.

But liability doesn’t end there. Misattribution can raise right-of-publicity concerns. Saying Adele recorded a song she didn’t record isn’t defamatory, but it can still be an unauthorized use of her persona. And unlike defamation, intent doesn’t always matter. The safer route is clear labeling: “This may be AI.”

For those of us behind the glass, the lesson is simple: mistakes happen. But doubling down without context? That’s how little errors become legal problems. The law is forgiving of a slip in judgment. It is less forgiving if the same content is repackaged as fact without transparency.

Heller’s story isn’t embarrassing – it’s instructive. In the AI era, every broadcaster faces the same challenge: how to verify what feels authentic. The answer isn’t to shy away from the technology. It’s to make sure that you, not the algorithm, control the punchline.

Matthew B. Harrison is a media and intellectual property attorney who advises radio hosts, content creators, and creative entrepreneurs. He has written extensively on fair use, AI law, and the future of digital rights. Reach him at Matthew@HarrisonMediaLaw.com or read more at talkers.com.