It’s not about Alex Jones.
This week, Facebook, YouTube, Apple and Spotify removed posts, videos and podcasts from Jones and his platform, Infowars. Many people across the political spectrum breathed sighs of relief. But these simultaneous removals raise questions about content regulation, censorship and who chooses what we can see, and they shine a harsh light on the challenges tech companies face in applying their own content guidelines.
Online platforms have been grappling for years with how to manage speech that they, and many of their users, dislike. The New York Times has a good primer on how various platforms have approached regulating users’ content.
As private companies, these firms are free to take down whatever they want. They also serve as the largest global forums for the exchange of ideas, and they presumably want to stay that way. But their guidelines on offensive content remain vague, subjective and confusing. What happens when the gatekeepers dislike speech you agree with? Who decides what is offensive enough to ban? Who draws the line? Do you really trust them to do it well?
As Nadine Strossen writes, “Entrusting these powerful private-sector companies to decide what we can see, hear and discuss online is, simply put, a very bad idea.”
Where do we go from here?
Before we enthusiastically outsource control of the online public sphere to Big Tech, we need to consider the implications. There are other ways to confront the toxic conspiracy theories Infowars peddles. We stand with our allies working in internet freedom, journalism and government transparency to defend free expression. As we deal with toxic online speech, we must remain true to our commitment to First Amendment principles.
Our democracy demands that we protect free expression.