The Scene:

Facebook, YouTube, and Apple announced on Monday that they will be removing content associated with InfoWars and its founder Alex Jones from their respective platforms. Jones is a longtime darling of the alt-right who frequently spreads misinformation with an apparent goal of undermining the Democratic Party and the Washington establishment. In covering this story, we explore student opinion on Facebook’s ongoing internal battle between protecting free speech and combating fake news.

The Takes:

Maryland Diamondback: Jack Lewis thinks “Facebook has, for too long, used the phrase ‘open platform’ to ignore its responsibility to monitor hate and lies.”

•    “The scope of Facebook's influence, coupled with its failure to act, demands smarter regulation.”

•    “Today, Facebook does not belong to the people — it belongs to those who know how to best exploit its flaws.”

•    “We cannot wait for Facebook and other social media companies to self-correct. It's not just elections that matter: Facebook is failing to monitor hate speech and discriminatory advertising.”

Harvard Crimson: Gabriel Karger wonders: “What does Facebook think free speech is for?”

•    “The fact that the company’s moderation guidelines were developed ad hoc and without user input over the span of several years is worrying and hard to defend.”

•    “When we stop pretending that online ‘platforms’ are amoral structures, we also see the urgent need to scrutinize their foundations.”

•    “As it stands, the question of who ought to define and regulate hate speech is a moot one. With the exception of some European authorities, Facebook and other companies are already answering it for us, whether or not we accept their verdicts.”

Maryland Diamondback: Liyanga de Silva believes “Facebook must crack down on hate speech beyond U.S. borders.”

•    Facebook “needs to develop stricter guidelines that account for the contexts of other countries and protect vulnerable communities from violence.”

•    For example, Sri Lanka recently blocked nationwide Facebook access because “Facebook had become a platform for spreading hate against Muslims and calling for violence against them.”

•    “Free speech arguments aren't quite valid in the debate over Facebook's role in political violence in Sri Lanka and Myanmar. Hate speech in those countries is not only expressing bigoted opinions but also explicitly inciting violence and encouraging genocide.”

•    “As the platform has grown and reached into most parts of the world, it has lost sight of national and local contexts.”

The Bottom Line:

While students seem to agree that Facebook must change its policies, they disagree on how far new policies should go and what form they should take. Free speech advocates believe Facebook should regulate only the most extreme forms of hate speech and direct incitement to violence, leaving everything else untouched. Others believe Facebook should censor more figures like Jones who contradict the site’s stated mission “to give people the power to build community and bring the world closer together.”