According to the BBC, internal documents have revealed how Facebook censors what users see.
The BBC's findings are based on a report by the Guardian newspaper, which revealed the manuals that guide moderators through the criteria used to gauge whether posts are too violent, sexual, racist, hateful, or supportive of terrorism.
There are more than one hundred manuals that Facebook moderators must use to determine what can and cannot be posted. Moderators interviewed by the Guardian said the policies Facebook uses to judge content were “inconsistent” and “peculiar.”
Moderators complained that sexual topics were the most “confusing” when deciding whether a post could stay or had to be removed. The pressure of making these decisions left moderators “overwhelmed,” with only seconds to judge each post, and this time pressure is arguably the biggest issue.
Deciding what content people can and cannot see on a platform as large as Facebook carries enormous influence over what is and isn’t considered acceptable. The Open Rights Group, a digital rights organization, said the report shows how much influence Facebook wields over its two billion users.
“Facebook’s decisions about what is and isn’t acceptable have huge implications for free speech,” said an ORG statement. “These leaks show that making these decisions is complex and fraught with difficulty…Facebook will probably never get it right but at the very least there should be more transparency about their processes.”
Facebook has stated that it plans to hire 3,000 more moderators to help spread the workload. In a statement, Monica Bickert, Facebook’s head of global policy management, said:
“We work hard to make Facebook as safe as possible, while enabling free speech…This requires a lot of thought into detailed and often difficult questions, and getting it right is something we take very seriously.”
As social media giants such as Facebook keep growing, this issue is only going to get heavier. Striking a balance between freedom of speech and what crosses the line is no small task. It’s hard enough in America alone; moderating on a global scale is a different kind of beast.