Facebook has published a detailed post to its blog, announcing increased efforts to combat what it calls “problematic content.”
“Remove, reduce, inform.” That’s Facebook’s latest three-step plan to help rid the platform of misleading information. The company has written a piece entitled “Remove, Reduce, Inform: New Steps to Manage Problematic Content,” informing the public of its intentions.
Facebook Releases Three-Part Plan to Help Rein In the Spread of Problematic Content
Facebook will apply these principles not only to individuals but also to groups. In fact, the social network states that groups which “repeatedly share misinformation” will have less reach in the News Feed.
Another change seemingly takes a page from how Google ranks sites in its organic search results. Facebook says it will reduce the visibility of low-quality publications by measuring whether those sites are genuinely trusted sources, much as Google gives more weight to sites with inbound links from reputable sources than to sites with substantially fewer or none.
Facebook will also rely more on fact-checking from The Trust Project, which uses “Trust Indicators.”
Guy Rosen, VP of Integrity, and Tessa Lyons, Head of News Feed Integrity, explain:
“Since 2016, we have used a strategy called ‘remove, reduce, and inform’ to manage problematic content across the Facebook family of apps. This involves removing content that violates our policies, reducing the spread of problematic content that does not violate our policies and informing people with additional information so they can choose what to click, read or share. This strategy applies not only during critical times like elections, but year-round.”