Facebook Shares Secret Community Standards For Censoring Users' Posts

by admin April 25, 2018 at 5:38 am

Facebook publicly shared the previously secret community standards that it uses to police its users' posts. The rulebook shows moderators how to deal with terrorism, hate speech, racism, self-harm, and other content that is against the rules.  ( Chip Somodevilla | Getty Images )

In an unprecedented move, Facebook released its private community standards that it has been using to moderate users’ content on the platform, answering calls for more transparency.

The company had previously kept the rules secret to prevent people from gaming the system.

Community Guidelines

Facebook has faced criticism for keeping its guidelines hidden, including instances in which it removed photos of artwork that it deemed to be in violation of its community standards.

This isn't the first time the public has been able to peek at Facebook's rules: The Guardian published leaked versions of them in 2017.

Facebook's Community Standards consist of a 27-page guide for its human content moderators. It is a hyper-specific manual that instructs moderators on how to handle a variety of topics that may be posted on Facebook, including terrorist propaganda, spam, hate speech, violence, fake news, and intellectual property theft.

Facebook vice president of global policy management Monika Bickert says that the guidelines were released to give people more clarity. Bickert also pointed out that another reason for releasing the guidelines is to receive feedback on the current rules. This allows Facebook to make changes if something isn’t working correctly.

Bickert added that Facebook remains concerned that bad actors may develop workarounds to the rules in order to distribute their content. Facebook has increased its number of content moderators to 7,500.

Facebook has also been found to enforce its own rules unevenly. A 2017 ProPublica investigation found that Facebook's rules protected white people over black children. The company also admitted to making mistakes on 22 of 49 posts that its content moderators handled.

Facebook’s Specificity

The released document is an updated version of the guidelines. Facebook had previously received criticism for the way it handles nudity on the platform, and the rules it applies to images of breasts are especially specific.

In its guidelines, Facebook says that it has clarified its stance on nudity over time. It acknowledges that nudity can be used as a form of protest, to raise awareness, or for educational and medical purposes. Contexts in which Facebook allows nipples to appear in posts include protests, breastfeeding, and post-mastectomy scarring. One of the more specific rules is that it does not allow uncovered female nipples of children older than toddler age.

Meanwhile, it allows nude figures from paintings, sculptures, and other forms of art. 


© 2018 Tech Times, All rights reserved. Do not reproduce without permission.
