Industry Views

When the Algorithm Misses the Mark: What the Walters v. OpenAI Case Means for Talk Hosts

By Matthew B. Harrison
TALKERS, VP/Associate Publisher
Harrison Media Law, Senior Partner
Goodphone Communications, Executive Producer

In a ruling that should catch the attention of every talk host and media creator dabbling in AI, a Georgia court has dismissed “Armed American Radio” syndicated host Mark Walters’ defamation lawsuit against OpenAI. The case revolved around a disturbing but increasingly common glitch: a chatbot “hallucinating” false but believable information.

What Happened: A journalist asked ChatGPT to summarize a real court case. Instead, the AI invented a fictional lawsuit accusing Walters of embezzling from the Second Amendment Foundation, a group that has never employed him. The journalist spotted the error and never published the inaccurate information. But the damage, at least emotionally and reputationally, was done. That untruth was out there, and Walters sued for defamation.

Last week, the court tossed the case. It determined Walters was a public figure, and as such, he had to prove “actual malice,” meaning that OpenAI knowingly or recklessly published falsehoods. He couldn’t, and after this ruling, that bar may be all but impossible to clear.

The judge also emphasized that the false information was never shared publicly. It stayed within a private conversation between the journalist and ChatGPT. No dissemination, no defamation.

But while OpenAI may have escaped liability, the ruling raises serious questions for the rest of us in the content creation space.

What This Means for Talk Hosts

Let’s be honest: AI tools like ChatGPT are already part of the media ecosystem. Hosts use them to summarize articles, brainstorm show topics, generate ad copy, and even suggest guest questions. They’re efficient — and also dangerous.

This case shows just how easily AI can generate falsehoods with confidence and detail. If a host were to read something like that hallucinated lawsuit on air, without verifying it, the legal risk would shift. It wouldn’t be the AI company on the hook — it would be the broadcaster who repeated it.

Key Lessons

  1. AI is not a source.
    It’s a starting point. Just like a tip from a caller or a line on social media, AI-generated content must be verified before use.
  2. Public figures are more exposed.
    The legal system gives less protection to people in the public eye — like talk hosts — and requires a higher burden of proof in defamation claims. That cuts both ways.
  3. Disclosure helps.
    OpenAI’s disclaimers about potential inaccuracies helped the company in court. On air, disclosing when you use AI can offer similar protection and build trust with your audience.
  4. Editorial judgment still rules.
    No matter how fast or slick AI gets, it doesn’t replace a producer’s instincts or a host’s responsibility.

Bottom line: the lawsuit may be over, but the conversation is just beginning. The more we rely on machines to shape our words, the more we need to sharpen our filters. Because when AI gets it wrong, the real fallout hits the human behind the mic.

And for talk hosts, that means the stakes are personal. Your credibility, your syndication, your audience trust — none of it can be outsourced to an algorithm. AI might be a tool in the kit, but editorial judgment is still the sharpest weapon in your arsenal. Use it. Or risk learning the hard way what Mark Walters just did. Walters has yet to comment on what steps – if any – he and his lawyers will take next.

TALKERS publisher Michael Harrison issued the following comment regarding the Georgia ruling: “In the age of internet ‘influencers’ and media personalities with various degrees of clout operating within the same space, the definition of ‘public figure’ is far less clear than in earlier times. The media and courts must revisit this striking change. Also, in an era of self-serving political weaponization, this ruling opens the door to ‘big tech’ having enormous, unbridled power in influencing the circumstances of news events and reputations to meet its own goals and agendas.”

Matthew B. Harrison is a media attorney and executive producer specializing in broadcast law, intellectual property, and First Amendment issues. He serves as VP/Associate Publisher of TALKERS magazine and is a senior partner at Harrison Media Law. He also leads creative development at Goodphone Communications.

Industry News

OpenAI Loses Motion to Dismiss in Talk Host Defamation Case

Artificial Intelligence firm OpenAI was denied its Motion to Dismiss the defamation suit filed against it by talk show host Mark Walters, who hosts radio programs produced by his CCW Broadcast Media company. Walters claims that journalist Fred Riehl’s use of OpenAI’s ChatGPT created content stating that Walters was accused of embezzling funds from the Second Amendment Foundation, defaming him. No such accusation ever actually took place. In its Motion to Dismiss, OpenAI argued several points, including that Georgia is not the proper jurisdiction, but it summarized its argument that Walters’ claims didn’t meet the burden of defamation when it said, “Even more fundamentally, Riehl’s use of ChatGPT did not cause a ‘publication’ of the outputs. OpenAI’s Terms of Use make clear that ChatGPT is a tool that assists the user in the writing or creation of draft content and that the user owns the content they generate with ChatGPT. Riehl agreed to abide by these Terms of Use, including the requirement that users ‘verify’ and ‘take ultimate responsibility for the content being published.’ As a matter of law, this creation of draft content for the user’s internal benefit is not ‘publication.’”

Industry News

Yesterday’s (11/21) Top News/Talk Media Stories

The Israel-Hamas war and the negotiations for the release of the hostages; protests and anti-Semitism; Elon Musk sues Media Matters over content-related advertiser boycott of X; former President Donald Trump’s legal battles; the 2024 presidential race; JFK assassination anniversary; the Thanksgiving holiday weekend; and the firing and re-hiring of Sam Altman at OpenAI were some of the most-talked-about stories in news/talk media yesterday, according to ongoing research from TALKERS magazine.

Industry News

Yesterday’s (11/20) Top News/Talk Media Stories

The negotiations with Hamas over release of the hostages; Elon Musk sues Media Matters over its report on X content that’s caused advertisers to leave the social media platform; OpenAI staff threatens mass exit in wake of Sam Altman ouster; President Joe Biden turns 81; a federal appeals court rules only the U.S. AG can enforce section 2 of the Voting Rights Act; the Thanksgiving holiday and the forecast that could affect travel; former President Donald Trump’s legal battles; the 2024 presidential race; and the Supreme Court rejects Derek Chauvin’s appeal of his conviction in the death of George Floyd were some of the most-talked-about stories in news/talk media yesterday, according to ongoing research from TALKERS magazine.

Industry News

OpenAI Seeks Dismissal of Defamation Suit

Artificial Intelligence firm OpenAI has filed a Motion to Dismiss the defamation suit filed against it by talk show host Mark Walters, who hosts radio programs produced by his CCW Broadcast Media company. TALKERS reported the suit by Walters back on June 9 in which Walters claims that journalist Fred Riehl’s use of OpenAI’s ChatGPT created content stating that Walters was accused of embezzling funds from the Second Amendment Foundation, defaming him. No such accusation ever actually took place. In its Motion to Dismiss, OpenAI argues several points, including that Georgia is not the proper jurisdiction, but it summarizes its argument that Walters’ claims don’t meet the burden of defamation when it says, “Even more fundamentally, Riehl’s use of ChatGPT did not cause a ‘publication’ of the outputs. OpenAI’s Terms of Use make clear that ChatGPT is a tool that assists the user in the writing or creation of draft content and that the user owns the content they generate with ChatGPT. Riehl agreed to abide by these Terms of Use, including the requirement that users ‘verify’ and ‘take ultimate responsibility for the content being published.’ As a matter of law, this creation of draft content for the user’s internal benefit is not ‘publication.’”

Industry News

Radio Host Mark Walters Suing OpenAI for Defamation

Talk host Mark Walters, who produces and hosts Second Amendment-themed radio programs via his CCW Broadcast Media company, is suing OpenAI in a Georgia Superior Court claiming that OpenAI’s ChatGPT created a false case alleging that Walters embezzled funds from the Second Amendment Foundation. The complaint states that journalist Fred Riehl was researching the case of The Second Amendment Foundation v. Robert Ferguson and asked ChatGPT to provide a summary of that complaint. He received one stating that the suit’s plaintiff is Second Amendment Foundation founder Alan Gottlieb, who accuses Walters, as treasurer and chief financial officer, of embezzling funds. Walters says, and Gottlieb confirms, that he never served in either position and didn’t steal anything. In the AI world, false outputs from services like ChatGPT are called “hallucinations.” As with any defamation case, Walters will have to prove he’s suffered damages, but this case will be interesting to watch as it appears to be the first such legal case involving the work of AI. Read the New York Post’s story here.