Secret until now, Facebook's internal Community Standards guidelines for moderators have been made public for the sake of clarification…
Today, Facebook revealed how it moderates content on its platform. The Community Standards page breaks down into six different sections: Violence and Criminal Behavior, Safety, Objectionable Content, Integrity and Authenticity, Respecting Intellectual Property, and Content-Related Requests.
Facebook Community Standards Practices Revealed
Each subsection clearly outlines what is and isn't acceptable, complete with lists of rules. It's important to note, however, that this doesn't change any established rules or introduce new ones. It's simply an explanation of how the social network handles violations. "…for the first time, we are publishing the internal implementation guidelines that our content reviewers use to make decisions about what's allowed on Facebook," states Monika Bickert, Facebook's VP of Global Policy Management.
Facebook is under fire over several recent scandals, among them third-party data mining allegations, leaked internal memos, and Android call and text data scraping. The company is also under scrutiny for its lack of transparency and its inconsistent treatment of some political content. Now, users have direct access to the rules, which might help when appealing takedown decisions.
On that note, Facebook is also creating an appeals process. It will allow users who believe their content removal was erroneous to challenge the decision. For better context, though, the company should also provide reasons why it doesn't take down certain content.
In May, Facebook will host forums in “Germany, France, the UK, India, Singapore, the US and other countries” to solicit user feedback.