Creators, Commentators, or Publishers: Liability Remains the Same
By Matthew B. Harrison
TALKERS, VP/Associate Publisher
Harrison Media Law, Senior Partner
Goodphone Communications, Executive Producer
The rise of independent, talk-show-style political commentary on YouTube has created a new class of media actors who do not see themselves as broadcasters, journalists, or publishers. They see themselves as creators. That distinction is real in terms of identity, tone, and platform. It is not real where it matters most: liability.
The difference exists in how the work is produced and presented. It disappears the moment the content is published.
In practice, these creators are engaging in acts that courts have long recognized as publication. They are selecting topics, framing narratives, editing clips, and distributing content to large audiences. Those decisions are not neutral. They are editorial.
The absence of FCC regulation in this space has created a persistent misunderstanding. Traditional broadcasters operate under a regulatory framework that includes licensing and content restrictions. Independent creators do not. But the lack of FCC oversight does not reduce exposure. It removes one layer of regulation while leaving the core legal risk, defamation and related claims, fully intact.
Defamation law applies equally to both groups. A false statement of fact about a real person that causes reputational harm can give rise to liability whether it is spoken on a licensed radio station or uploaded to a monetized YouTube channel. The standards may differ depending on whether the subject is a public or private figure, but the underlying obligation remains the same: accuracy matters.
There is no YouTube exception. There is no creator carveout. The law does not care how the content was distributed, what the platform calls you, or how you see yourself. It cares who made the statement, who chose to publish it, and whether it was false.
The structure of YouTube content introduces additional risk. Many creators rely on rapid production cycles and clip-based commentary. This increases the likelihood of error, particularly when context is compressed or omitted. Editing choices that seem minor from a production standpoint can materially change meaning, which is precisely the type of conduct that courts examine in defamation and false light claims.
Monetization further complicates the analysis. Revenue from ads, memberships, or sponsorships strengthens the argument that content is commercial in nature. That does not eliminate First Amendment protections, but it can influence how a court evaluates intent and reasonableness.
There is also a tendency to assume that platform norms provide a form of protection. If a piece of content is allowed to remain online, or is even promoted by an algorithm, it can feel implicitly validated. That assumption is misplaced. Platform enforcement decisions are not legal determinations. They are business judgments.
The most important point is simple and often overlooked. Liability does not turn on intent. It turns on what was said, whether it was false, and whether reasonable steps were taken to verify it.
The platform may change how content looks. It may change how fast it spreads. It may change who gets to participate.
It does not change the consequences of getting it wrong.
Time passes. Technology and packaging change. Exposure and liability do not.
Matthew B. Harrison is a media and intellectual property attorney who advises talk show hosts, content creators, and creative entrepreneurs. He has written extensively on fair use, AI law, and the future of digital rights. Reach him at Matthew@HarrisonMediaLaw.com or read more at TALKERS.com.
