Dive Brief:
- Facebook CEO Mark Zuckerberg published a post on Saturday outlining steps the social media giant is taking to combat fake news on its site.
- Zuckerberg pointed to steps Facebook is currently taking to battle fake news, including stronger detection, easier reporting by users, warnings added to flagged articles, a higher quality bar for related stories surfaced as links, disrupting the economics of fake news spam and listening to the news industry.
- The post also highlighted the philosophical as well as technical challenges Facebook faces as it tries to balance users' freedom of expression with the need to address misinformation, especially in light of the platform's significant influence.
Dive Insight:
Facebook has been hounded by a fake news issue that was thrust into the spotlight by the recent presidential election, with rampant claims that misinformation on the site helped sway users to vote for Trump, something Zuckerberg previously denied.
Fake news is becoming a big problem for Facebook in light of research that found 44% of U.S. adults rely on Facebook as their primary source of news. Zuckerberg has been clear in stating that Facebook is a technology company and not a media entity, but given the degree to which its users rely on the platform for basic news, that distinction is tough for Zuckerberg to credibly make.
A figure like 44% of adults relying on Facebook for news underscores the platform's unique role in users' lives. Saturday's announcement suggests the company may be bowing to public pressure and taking a closer look at the responsibilities and issues inherent in that role.
Fake news is a problem for the social media platform because it erodes the credibility of the entire brand. In the post, Zuckerberg writes that Facebook takes misinformation seriously, but also that in the past it has relied on its community to police what is fake news and what is legitimate.
“The problems here are complex, both technically and philosophically,” Zuckerberg wrote. “We believe in giving people a voice, which means erring on the side of letting people share what they want whenever possible. We need to be careful not to discourage sharing of opinions or to mistakenly restrict accurate content. We do not want to be arbiters of truth ourselves, but instead rely on our community and trusted third parties.”