For years, Facebook has faced harsh criticism for not doing enough to curb hate speech, terrorist propaganda and broadcasts of violence on its site. Now it hopes to clear up any confusion about its content-removal policies, the full guidelines for which were just released.

Tuesday's announcement comes a little more than a month after The New York Times reported that London-based political consulting firm Cambridge Analytica gathered users' Facebook data and claimed it could influence the behavior of American voters. Facebook's decision to release its full content guidelines could be seen as an attempt at greater transparency while the company is under scrutiny from multiple governments and privacy groups.

"The guidelines will help people understand where we draw the line on nuanced issues," wrote Monika Bickert, the vice president of global product management, in a blog post announcing Facebook's decision. "Second, providing these details makes it easier for everyone, including experts in different fields, to give us feedback so that we can improve the guidelines--and the decisions we make--over time."

Facebook organized its community standards into six categories: violence and criminal behavior, safety, objectionable content, integrity and authenticity, respecting intellectual property and content-related requests. Here's what you need to know:

  • Facebook does not allow individuals or organizations involved in terrorist activity, organized hate, human trafficking, criminal activity and mass or serial murder to have a presence on the site. What's more, it will remove content that supports or praises these groups or people.
  • Facebook prohibits anyone from selling or trading non-medical drugs, pharmaceutical drugs and marijuana. It also bans any exchange of firearms between individuals on Facebook.
  • To tackle bullying, Facebook vows to remove content that targets private individuals with the intent of shaming them. Recognizing that bullying can be especially harmful to minors, the company will give those users heightened protection; for example, it will remove content that contains negative character claims or physical descriptions of a minor.
  • Facebook's harassment policies prohibit targeting victims or survivors of violent tragedies with claims that they are lying about being victims. It will also remove any calls for self-injury or suicide.
  • Facebook will remove or restrict content that engages in copyright or trademark infringement after receiving a report from a rights holder or authorized representative.
  • Facebook acknowledged there is a "fine line between false news and satire or opinion," so it will not remove false news from the site, but it will reduce its distribution by placing such stories lower in the News Feed.

In addition to clearly stating what content is prohibited from the site, Facebook will, for the first time, allow users to appeal a decision to remove content. Previously, people could only appeal the removal of pages, groups or accounts. The social network also pledged to grow its team of content reviewers from 7,500 people to 20,000 by the end of 2018, according to Recode.

Published on: Apr 24, 2018