Neutraliars: The Platforms That Edit Like Publishers but Hide Behind Neutrality
By Matthew B. Harrison
TALKERS, VP/Associate Publisher
Harrison Media Law, Senior Partner
Goodphone Communications, Executive Producer
In the golden age of broadcasting, the rules were clear. If you edited the message, you owned the consequences. That was the tradeoff for editorial control. But today’s digital platforms – YouTube, X, TikTok, Instagram – have rewritten that deal. Broadcasters and those who operate within the FCC regulatory framework are paying the price.
These companies claim to be neutral conduits for our content. But behind the curtain, they make choices that mirror the editorial judgment of any news director: flagging clips, muting interviews, throttling reach, and shadow banning accounts. All while insisting they bear no responsibility for the content they carry.
They want the control of publishers without the accountability. I call them neutraliars.
A “neutraliar” is a platform that claims neutrality while quietly shaping public discourse. It edits without transparency, enforces vague rules inconsistently, and hides bias behind shifting community standards.
Broadcasters understand the weight of editorial power. Reputation, liability, and trust come with every decision. But platforms operate under a different set of rules. They remove content for “context violations,” downgrade interviews for being “borderline,” and rarely offer explanations. No appeals. No accountability.
This isn’t just technical policy; it’s a legal strategy. Under Section 230 of the Communications Decency Act, platforms enjoy broad immunity from liability for the content their users post. A provision originally meant to protect good-faith moderation of obscene or otherwise objectionable material has become a catch-all shield, with only narrow carve-outs for areas such as federal criminal law and intellectual property.
These companies act like editors when it suits them, curating and prioritizing content. But when challenged, they retreat behind the label of “neutral platform.” Courts, regulators, and lawmakers have mostly let it slide.
But broadcasters shouldn’t.
Neutraliars are distorting the public square. Not through overt censorship, but through asymmetry. Traditional broadcasters play by clear rules – standards of fairness, disclosure, and attribution. Meanwhile, tech platforms make unseen decisions that influence whether a segment is heard, seen, or quietly buried.
So, what’s the practical takeaway?
Don’t confuse distribution with trust.
Just because a platform carries your content doesn’t mean it supports your voice. Every upload is subject to algorithms, undisclosed enforcement criteria, and decisions made by people you’ll never meet. The clip you expected to go viral? Silenced. The balanced debate you aired? Removed for tone. The satire? Flagged for potential harm.
The smarter approach is to diversify your presence. Own your archive. Use direct communication tools – e-mail lists, podcast feeds, and websites you control. Syndicate broadly but never rely solely on one platform. Monitor takedowns and unexplained drops in engagement. These signals matter.
Platforms will continue to call themselves neutral as long as it protects their business model. But we know better. If a company edits content like a publisher and silences creators like a censor, it should be treated like both.
And when you get the inevitable takedown notice wrapped in vague policy language and polished PR spin, keep one word in mind.
Neutraliars.
Matthew B. Harrison is a media and intellectual property attorney who advises radio hosts, content creators, and creative entrepreneurs. He has written extensively on fair use, AI law, and the future of digital rights. Reach him at HarrisonMediaLaw.com or read more at TALKERS.com.