According to the BBC, internal documents have revealed how Facebook censors what its users see.

The BBC's findings are based on reporting by the Guardian newspaper, which revealed manuals that guide moderators through the criteria used to judge whether posts are too violent, sexual, racist, or hateful, or whether they support terrorism.

RELATED: Facebook To Add An Additional 3,000 Employees To Review Facebook Live Videos Of Suicides And Crime

There are more than one hundred manuals that Facebook moderators must use to determine what can and cannot be posted. Facebook moderators interviewed by the Guardian described the policies used to judge content as "inconsistent" and "peculiar."

Moderators complained that sexual topics were the most "confusing" when it came to deciding whether a post could stay or had to be removed. The pressure of making these calls left moderators "overwhelmed," and they often have only seconds to decide. This, essentially, is the biggest issue.

RELATED: Facebook Responds to The Massacre In Cleveland, Promises to ‘Do Better’ With Graphic Content Censoring

Deciding what content people can and cannot see on a platform as big as Facebook is a task with enormous influence over what is and isn't considered acceptable in the world. The Open Rights Group, a digital rights organization, said the report shows how much power Facebook wields over its two billion users.

“Facebook’s decisions about what is and isn’t acceptable have huge implications for free speech,” said an ORG statement. “These leaks show that making these decisions is complex and fraught with difficulty…Facebook will probably never get it right but at the very least there should be more transparency about their processes.”

Facebook has stated that it plans to hire 3,000 more moderators to help spread the workload. In a statement, Monika Bickert, Facebook's head of global policy management, had this to say:

"We work hard to make Facebook as safe as possible, while enabling free speech…This requires a lot of thought into detailed and often difficult questions, and getting it right is something we take very seriously."

As social media giants such as Facebook keep growing, this issue is only going to get heavier. Balancing freedom of speech against what is unacceptable is no small task. It's hard enough to do in America alone; moderating on a global scale is a different kind of beast.


Source: BBC

