Is it censorship?

Imagine a time in America when politics was deeply divided. Congressional investigations and accusations were flying in Washington. There was name-calling and finger-pointing. Content creators had their programming pulled by the major service providers. Programming deemed unacceptable was sometimes removed on the very day it was scheduled to air.

We’re not referring to the early 2020s but to the early 1950s. The issue was the “red scare” led by Senator Joseph McCarthy. The content creators were budding television broadcasters. The service providers were the television networks. The mere notion that an actor, writer or director was sympathetic to Communism could get a show canceled and people fired. The driving force was advertisers who didn’t want their products associated with anything “un-American.” It was a time when almost anything could be censored.

The parallels with our own times are striking. The major difference today is that anyone can upload or “broadcast” content from anywhere. You can even use your phone to go live or upload on the go. This is both amazing and a bit frightening. Just because one can broadcast content doesn’t mean one should. For example, there have been cases of violent crimes being live-streamed. If that content is removed, is it censorship? Where do we draw the line?

Who gets to draw the line?

The Federal Communications Commission has the job of regulating what is allowed to be broadcast. Part of its stated mission is “for the purpose of promoting safety of life and property through the use of wire and radio communications.” Yes, that does include the internet. However, YouTube alone has more than a billion users worldwide and more than 300 hours of content uploaded every minute. Is it even possible to regulate?

Just about every service provider identifies and removes content it deems harmful or dangerous. Some of this work happens through AI bots that scan content for keywords, music or images; platforms also rely on self-reporting from users. No, it’s not a perfect system, and content is sometimes removed erroneously. When that happens, the appeals process can seem like an impossible task for a creator. But the removal of content is not a signal of McCarthy-era censorship.
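To make the idea concrete, here is a toy sketch in Python of the simplest form of automated scanning: matching an upload’s metadata against a blocklist of keywords. This is purely illustrative, not any platform’s actual system; the keyword list and function name are invented for the example.

```python
# Toy illustration of keyword-based content flagging.
# Real moderation systems are vastly more complex, combining
# machine learning over audio, video and text with human review.
FLAGGED_KEYWORDS = {"example_banned_term", "another_flagged_phrase"}  # hypothetical

def flag_for_review(title: str, description: str) -> bool:
    """Return True if any flagged keyword appears in the upload's metadata."""
    text = f"{title} {description}".lower()
    return any(keyword in text for keyword in FLAGGED_KEYWORDS)

# A match doesn't prove a violation -- which is why false positives
# happen and why platforms maintain an appeals process.
print(flag_for_review("My travel vlog", "Scenes from example_banned_term"))  # True
```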

How to navigate the tug of war

How do we understand videos being removed from a platform? Remind yourself of the many reasons videos are removed, and don’t take it personally. Content is frequently removed for copyright violation; even a simple music bed from a royalty-free website could set off the alarm. On occasion, videos are removed because of software conflicts. You may send content to multiple platforms via multicasting, for example, and one platform may remove it due to a programming error. Or perhaps your content offended some of your fans, and they reported you.

You might also consider self-hosting. It’s easier than you might think: it can be as simple as uploading your video to a website via FTP, then linking to it. True, you are no longer in the big mainstream fishpond of the service providers, and it will be harder for people to find your work through search engines. But with some ingenuity, you can find an audience and connect them to your video.
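As a minimal sketch of that FTP step, here is what the upload could look like using Python’s standard-library ftplib. The host, credentials, directory and file names below are hypothetical placeholders; substitute your own hosting details.

```python
# Minimal self-hosting upload sketch using Python's standard ftplib.
# All values below are hypothetical placeholders.
from ftplib import FTP_TLS

HOST = "ftp.example.com"         # hypothetical hosting provider
USER = "your_username"           # hypothetical credentials
PASSWORD = "your_password"
LOCAL_FILE = "my_video.mp4"
REMOTE_DIR = "public_html/videos"

# FTP_TLS encrypts the login and transfer; plain FTP also works
# if your host doesn't support TLS.
ftp = FTP_TLS(HOST)
ftp.login(USER, PASSWORD)
ftp.prot_p()                     # switch the data channel to TLS
ftp.cwd(REMOTE_DIR)

# Upload the video file in binary mode.
with open(LOCAL_FILE, "rb") as f:
    ftp.storbinary(f"STOR {LOCAL_FILE}", f)

ftp.quit()
# The video would then be reachable at something like
# https://example.com/videos/my_video.mp4 -- link to it from any page.
```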

Content being removed by social media providers is, most likely, not an act of censorship in the legal sense, but there are similarities to those early days of television. What reunited the country and stopped the accusations was good journalism: a few content creators who weren’t afraid to take risks to expose the truth. Maybe that’s exactly what we need to reunite us again.


Matthew York
Matt York is Videomaker's Publisher/Editor.
