If you establish a social media site and allow the world at large to join and post, you’re running a risk. Some people will post pictures of kittens, old family photos, or corny but uplifting messages. Others, however, may want to post things that are disturbing. So you establish a content policy, but where do you draw the line? That’s an issue Facebook is wrestling with these days.
Facebook has an extensive set of “community standards” that address topics like “nudity and pornography,” “violence and threats,” and “hate speech.” One topic is “graphic content.” As Facebook puts it, people use the site to share their experiences and thoughts about issues, some of which “involve graphic content that is of public interest or concern, such as human rights abuses or acts of terrorism.” Facebook distinguishes between sharing such content for purposes of condemnation and sharing “to celebrate or glorify violence.” Facebook asks users to share content “in a responsible manner” and warn the audience about any graphic video. If Facebookers report that certain content violates the community standards, Facebook decides whether to remove it.
The most recent controversy involves a video showing a woman being decapitated. The British Prime Minister, David Cameron, and others criticized Facebook for not removing the video and for apparently loosening its standards on hyper-violent postings. Facebook reacted to the criticism, removed the video after determining that it improperly and irresponsibly glorified violence, and issued a “fact check” statement to explain its new approach and its decision.
It’s the right decision, of course, but it shouldn’t have been a hard decision to make in the first place. There is a big difference between disturbing images of starving children that sharpen an appeal for contributions to a hunger relief charity and a video of a planned execution by beheading. Line-drawing can be tough, but I would certainly draw the line so that videos showing real people actually being killed, tortured, or horribly injured are excluded, whether or not their accompanying text purportedly “condemns” such action.