Industry Views

Creators, Commentators, or Publishers: Liability Remains the Same

By Matthew B. Harrison
TALKERS, VP/Associate Publisher
Harrison Media Law, Senior Partner
Goodphone Communications, Executive Producer

The rise of independent, talk show-style political commentary on YouTube has created a new class of media actors who do not see themselves as broadcasters, journalists, or publishers. They see themselves as creators. That distinction is real in terms of identity, tone, and platform. It is not real where it matters most: liability.

The difference exists in how the work is produced and presented. It disappears the moment the content is published.

In practice, these creators are engaging in acts that courts have long recognized as publication. They are selecting topics, framing narratives, editing clips, and distributing content to large audiences. Those decisions are not neutral. They are editorial.

The absence of FCC regulation in this space has created a persistent misunderstanding. Traditional broadcasters operate under a regulatory framework that includes licensing and content restrictions. Independent creators do not. But the lack of FCC oversight does not reduce exposure. It removes one layer of regulation while leaving the core legal risk fully intact.

Defamation law applies equally to both groups. A false statement of fact about a real person that causes reputational harm can give rise to liability whether it is spoken on a licensed radio station or uploaded to a monetized YouTube channel. The standards may differ depending on whether the subject is a public or private figure, but the underlying obligation remains the same: accuracy matters.

There is no YouTube exception. There is no creator carveout. The law does not care how the content was distributed, what the platform calls you, or how you see yourself. It cares who made the statement, who chose to publish it, and whether it was false.

The structure of YouTube content introduces additional risk. Many creators rely on rapid production cycles and clip-based commentary. This increases the likelihood of error, particularly when context is compressed or omitted. Editing choices that seem minor from a production standpoint can materially change meaning, which is precisely the type of conduct that courts examine in defamation and false light claims.

Monetization further complicates the analysis. Revenue from ads, memberships, or sponsorships strengthens the argument that content is commercial in nature. That does not eliminate First Amendment protections, but it can influence how a court evaluates intent and reasonableness.

There is also a tendency to assume that platform norms provide a form of protection. If a piece of content is allowed to remain online, or even promoted by an algorithm, it can feel implicitly validated. That assumption is misplaced. Platform enforcement decisions are not legal determinations. They are business judgments.

The most important point is simple and often overlooked. Liability does not turn on intent. It turns on what was said, whether it was false, and whether reasonable steps were taken to verify it.

The platform may change how content looks. It may change how fast it spreads. It may change who gets to participate.

It does not change the consequences of getting it wrong.

Time passes. Technology and fancy packaging change. Exposure and liability do not. 

Matthew B. Harrison is a media and intellectual property attorney who advises talk show hosts, content creators, and creative entrepreneurs. He has written extensively on fair use, AI law, and the future of digital rights. Reach him at Matthew@HarrisonMediaLaw.com or read more at TALKERS.com.

Industry Views

Mark Walters v. OpenAI: A Landmark Case for Spoken Word Media

By Matthew B. Harrison
TALKERS, VP/Associate Publisher
Harrison Media Law, Senior Partner
Goodphone Communications, Executive Producer

When Georgia-based, nationally syndicated radio personality and Second Amendment advocate Mark Walters (longtime host of “Armed American Radio”) learned that ChatGPT had falsely claimed he was involved in a criminal embezzlement scheme, he did what few in the media world have dared to do. Walters stood up when others were silent and took on one of the most powerful tech companies in the world in a court of law.

Taking the Fight to Big Tech

By filing suit against OpenAI, the creator of ChatGPT, Walters became the first person in the United States to test the boundaries of defamation law in the age of generative artificial intelligence.

His case was not simply about clearing his name. It was about drawing a line. Can artificial intelligence generate and distribute false and damaging information about a real person without any legal accountability?

While the court ultimately ruled in OpenAI’s favor on the specific legal elements at issue, the impact of this case is far from finished. Walters’ lawsuit broke new ground in several important ways:

— It was the first known defamation lawsuit filed against an AI developer based on content generated by an AI system.
— It brought into the open critical questions about responsibility, accuracy, and liability when AI systems are used to produce statements that sound human but carry no editorial oversight.
— It added fuel to the ongoing conversation about the effectiveness of “use at your own risk” disclaimers when real-world reputational damage hangs in the balance.

Implications for the Radio and Podcasting Community

For spoken-word creators on any platform, whether terrestrial, satellite, or the open internet, this case is a wake-up call and a canary in the coal mine. Many shows rely on AI tools for research, summaries, voice generation, or even show scripts. But what happens when those tools get it wrong (beyond embarrassment and, in some cases, fines or termination)? And worse, what happens when those errors affect real people?

The legal system, as has often been noted, is still playing catch-up. Although the court ruled that the fabricated ChatGPT statement lacked the necessary elements of defamation under Georgia law, including provable harm and demonstrable fault, the decision highlighted how unprepared current legal frameworks are for this fast-moving, voice-driven digital landscape.

Where the Industry Goes from Here

Walters’ experience points to the urgent need for new protections and clearer guidelines:

— Creators deserve assurance that the tools they use are built with accountability in mind, extending to both copyright infringement and defamation.
— Developers must be more transparent about how their systems operate and the risks they create, including identifying bias and working to counteract it.
— Policymakers need to bring clarity to who bears responsibility when software, not a person, becomes the speaker.

A Case That Signals a Larger Reckoning

Mark Walters may not have won this round in court, but his decision to take on a tech giant helped illuminate how quickly generative AI can create legal, ethical, and reputational risks for anyone with a public presence. For those of us working in media, especially in formats built on trust, voice, and credibility, his case should not be ignored.

“This wasn’t about money. This was about the truth,” Walters tells TALKERS. “If we don’t draw a line now, there may not be one left to draw.”

To listen to a longform interview with Mark Walters conducted by TALKERS publisher Michael Harrison, please click here.

Media attorney Matthew B. Harrison is VP/Associate Publisher at TALKERS; Senior Partner at Harrison Media Law; and Executive Producer at Goodphone Communications. He is available for private consultation and media industry contract representation. He can be reached by phone at 724-484-3529 or email at matthew@harrisonmedialaw.com. He teaches “Legal Issues in Digital Media” and serves as a regular contributor to industry discussions on fair use, AI, and free expression.