Dive Brief:
- Facebook disclosed in a blog post today that it has initiated a program to work with third-party fact-checking organizations to help identify fake news stories.
- Potentially fake posts will be flagged by the Facebook community, generating a report that is sent to a fact-checking organization for evaluation. If confirmed as fake, the story will receive a “disputed” label, and Facebook will include a link to an article explaining why the label was applied.
- Disputed stories may appear lower in the news feed and, when users try to share them, trigger a pop-up warning that the story may be fake. Once a story has been flagged, it cannot be made into an ad or promoted (see the sketch after this list).
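To make the described workflow concrete, here is a minimal Python sketch of the flag-review-demote pipeline as the brief lays it out. Every name, threshold, and API in it is a hypothetical illustration, not Facebook's actual implementation.

```python
from dataclasses import dataclass
from typing import Optional

FLAG_THRESHOLD = 5  # hypothetical: reports needed before third-party review


@dataclass
class Story:
    url: str
    community_flags: int = 0
    disputed: bool = False
    explanation_url: Optional[str] = None


def handle_flag(story: Story, fact_checker) -> None:
    """Route a community-flagged story to a fact-checking organization."""
    story.community_flags += 1
    if story.community_flags >= FLAG_THRESHOLD and not story.disputed:
        verdict = fact_checker.evaluate(story.url)  # hypothetical API
        if verdict.is_fake:
            story.disputed = True
            story.explanation_url = verdict.explanation_url


def feed_score(story: Story, base_score: float) -> float:
    """Disputed stories may appear lower in the news feed."""
    return base_score * 0.5 if story.disputed else base_score  # demotion factor is a guess


def can_promote(story: Story) -> bool:
    """Once flagged as disputed, a story cannot be made into an ad or promoted."""
    return not story.disputed


def share(story: Story) -> None:
    """Sharing stays possible, but disputed stories trigger a warning first."""
    if story.disputed:
        print(f"Warning: this story has been disputed. See {story.explanation_url}")
    print(f"Shared {story.url}")
```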
Dive Insight:
Facebook is facing a backlash over its platform’s role in the dissemination of fake news, a problem that drew widespread attention following allegations that such content may have helped sway the results of the recent presidential election. Several weeks ago the company pivoted from its previous hands-off approach, promising to be more proactive. The new role for third-party fact-checkers and the “disputed” flag are among Facebook’s first steps to make good on that promise.
Still, the company insists it will proceed carefully so as not to position itself as an arbiter of truth and to ensure users have a voice. With this in mind, Facebook is focusing its efforts on clear hoaxes spread by spammers for their own gain, which VP of News Feed Adam Mosseri referred to in the blog post as “the worst of the worst.”
One potential pitfall of the strategy is that the disputed flag could start appearing so frequently that it loses meaning for users.
These steps, which include several ways to make it easier to report a hoax, are currently being tested and rolled out, and Facebook plans to iterate on and extend them over time.
The use of fact-checkers and a flagging system for letting users know when a post may be fake is in line with Facebook’s desire to help people decide for themselves what to trust and share by providing more context. It will still be possible to share disputed stories, but users will see a warning that a story has been disputed as they share.
Facebook’s research shows that one possible sign a story has misled users is that reading the article makes people significantly less likely to share it. As a result, the company will test incorporating this signal into ranking (a hypothetical formulation is sketched below). It will also try to reduce the financial incentives for spammers by eliminating the ability to spoof domains and by analyzing publisher sites to ensure its policies are being followed.
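As a rough illustration of how that read-but-don't-share signal might be folded into ranking, here is a short Python sketch. The metric, its names, and the weighting are assumptions made for illustration; Facebook has not published its actual formula.

```python
def misleading_penalty(impressions: int, reads: int,
                       shares_by_readers: int, shares_by_nonreaders: int) -> float:
    """Hypothetical penalty based on the signal Facebook describes: if reading
    an article makes people significantly less likely to share it, the story
    may have misled them. Returns a value in [0, 1]."""
    nonreaders = impressions - reads
    if reads == 0 or nonreaders <= 0:
        return 0.0
    reader_share_rate = shares_by_readers / reads
    nonreader_share_rate = shares_by_nonreaders / nonreaders
    # Penalize stories shared far more often by people who never read them,
    # which suggests the headline over-promises relative to the article body.
    return max(0.0, nonreader_share_rate - reader_share_rate)


def adjusted_rank(base_score: float, penalty: float, weight: float = 0.3) -> float:
    """Fold the penalty into a story's feed ranking; the weight is a guess."""
    return base_score * (1.0 - weight * penalty)
```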