By Matthew B. Harrison
TALKERS, VP/Associate Publisher
Harrison Media Law, Senior Partner
Goodphone Communications, Executive Producer
In a ruling that should catch the attention of every talk host and media creator dabbling in AI, a Georgia court has dismissed “Armed American Radio” syndicated host Mark Walters’ defamation lawsuit against OpenAI. The case revolved around a disturbing but increasingly common glitch: a chatbot “hallucinating” entirely false but believable information.
What Happened: A journalist asked ChatGPT to summarize a real court case. Instead, the AI invented a fictional lawsuit accusing Walters of embezzling from the Second Amendment Foundation, an organization that has never employed him. The journalist spotted the error and never published the inaccurate information. But the damage, at least emotionally and reputationally, was done. That untruth was out there, and Walters sued for defamation.
Last week, the court dismissed the case. It determined Walters was a public figure, and as such he had to prove “actual malice,” that is, that OpenAI knowingly or recklessly published falsehoods. He could not, and under this standard it may be all but impossible for anyone in his position to do so.
The judge also emphasized that the false information was never shared publicly. It stayed within a private conversation between the journalist and ChatGPT. No dissemination, no defamation.
But while OpenAI may have escaped liability, the ruling raises serious questions for the rest of us in the content creation space.
What This Means for Talk Hosts
Let’s be honest: AI tools like ChatGPT are already part of the media ecosystem. Hosts use them to summarize articles, brainstorm show topics, generate ad copy, and even suggest guest questions. They’re efficient — and also dangerous.
This case shows just how easily AI can generate falsehoods with confidence and detail. If a host were to read something like that hallucinated lawsuit on air, without verifying it, the legal risk would shift. It wouldn’t be the AI company on the hook — it would be the broadcaster who repeated it.
Key Lessons
- AI is not a source.
It’s a starting point. Just like a tip from a caller or a line on social media, AI-generated content must be verified before use.
- Public figures are more exposed.
The legal system gives less protection to people in the public eye — like talk hosts — and requires a higher burden of proof in defamation claims. That cuts both ways.
- Disclosure helps.
OpenAI’s disclaimers about potential inaccuracies helped them in court. On air, disclosing when you use AI can offer similar protection — and builds trust with your audience.
- Editorial judgment still rules.
No matter how fast or slick AI gets, it doesn’t replace a producer’s instincts or a host’s responsibility.
Bottom line: the lawsuit may be over, but the conversation is just beginning. The more we rely on machines to shape our words, the more we need to sharpen our filters. Because when AI gets it wrong, the real fallout hits the human behind the mic.
And for talk hosts, that means the stakes are personal. Your credibility, your syndication, your audience trust — none of it can be outsourced to an algorithm. AI might be a tool in the kit, but editorial judgment is still the sharpest weapon in your arsenal. Use it. Or risk learning the hard way what Mark Walters just did. Walters has yet to comment on what steps – if any – he and his lawyers will take next.
TALKERS publisher Michael Harrison issued the following comment regarding the Georgia ruling: “In the age of internet ‘influencers’ and media personalities with various degrees of clout operating within the same space, the definition of ‘public figure’ is far less clear than in earlier times. The media and courts must revisit this striking change. Also, in an era of self-serving political weaponization, this ruling opens the door to ‘big tech’ having enormous, unbridled power in influencing the circumstances of news events and reputations to meet its own goals and agendas.”
Matthew B. Harrison is a media attorney and executive producer specializing in broadcast law, intellectual property, and First Amendment issues. He serves as VP/Associate Publisher of TALKERS magazine and is a senior partner at Harrison Media Law. He also leads creative development at Goodphone Communications.