Mark Zuckerberg outlines Facebook’s ideas to battle fake news


A week after trying to reassure the public that it was “extremely unlikely hoaxes changed the outcome of this election,” Facebook founder Mark Zuckerberg outlined several ways the company might try to stop the spread of fake news on the platform in the future.

“We’ve been working on this problem for a long time and we take this responsibility seriously. We’ve made significant progress, but there is more work to be done,” Zuckerberg wrote in a Friday night post on his own Facebook page. He then named seven approaches the company was considering to address the issue, including warning labels on false stories, easier user reporting methods and the integration of third-party verification.

“The problems here are complex, both technically and philosophically,” he cautioned, repeating the company’s long-standing aversion to becoming the “arbiters of truth” — instead preferring to rely on third parties and users to make those distinctions.

“We need to be careful not to discourage sharing of opinions or mistakenly restricting accurate content,” he said.

While none of the listed ideas is especially detailed, Zuckerberg’s post does offer a clearer view of the company’s thinking about the problem of fake news.

Facebook’s concern with fake news predates the 2016 elections. Hoaxes have long plagued the site’s algorithms, which incentivize the creation of content that its users would like to share, true or not.

But fake news – and specifically, Facebook’s role in spreading it – became a story of wide interest just after the elections, when critics accused the platform of influencing voters by letting political hoaxes, particularly those favorable to President-elect Donald Trump, go viral on a regular basis. Zuckerberg has strongly denied this, saying last week that the idea that Facebook swayed the elections in this way is “pretty crazy,” and that fake news “surely had no impact” on the outcome.

Zuckerberg did not contradict this denial on Friday, but his post reflects Facebook’s growing acknowledgment that it’s going to have to do a lot more about the plague of hoaxes and fake stories on the platform. On Monday, Facebook announced it was going to crack down on fake news sites that use its ad services to profit off hoaxes.

One of the ideas Zuckerberg presented on Friday indicates that the company wants to go further in “disrupting fake news economics,” and is considering more policies like the one it just announced, along with stronger “ad farm detection.”

Another promises stronger detection of misleading content. “This means better technical systems to detect what people will flag as false before they do it themselves,” Zuckerberg wrote.

News Feed can already make some guesses about whether a post is authentic or not based on the user behavior around it. On Friday, Zuckerberg specified that Facebook currently watches for things like “people sharing links to myth-busting sites such as Snopes” to determine whether a post might be misleading or false. Zuckerberg didn’t go into specifics about what more Facebook might be looking to do on this front.
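The mechanism can be sketched in rough terms. The toy Python below is purely illustrative, not Facebook’s system: it scores a post by the share of its comments that link to a known myth-busting site, in the spirit of the Snopes example Zuckerberg gives, and flags the post for review once that share crosses a threshold. The domain list, threshold, and function names are assumptions made for this sketch.

```python
# Purely illustrative sketch of a behavior-based "misleading content" signal,
# in the spirit of the example Zuckerberg gives (people sharing links to
# myth-busting sites such as Snopes). The domain list, threshold, and function
# names are assumptions for this sketch, not Facebook's implementation.
from urllib.parse import urlparse

FACT_CHECK_DOMAINS = {"snopes.com", "politifact.com", "factcheck.org"}

def links_to_fact_checker(url: str) -> bool:
    """True if the URL points at a known myth-busting site."""
    host = urlparse(url).netloc.lower()
    return any(host == d or host.endswith("." + d) for d in FACT_CHECK_DOMAINS)

def misleading_score(comment_urls: list[str], total_comments: int) -> float:
    """Fraction of a post's comments that link to a fact-checking site."""
    if total_comments == 0:
        return 0.0
    hits = sum(1 for url in comment_urls if links_to_fact_checker(url))
    return hits / total_comments

def should_flag_for_review(comment_urls: list[str], total_comments: int,
                           threshold: float = 0.05) -> bool:
    """Flag a post for human or third-party review once the signal crosses a threshold."""
    return misleading_score(comment_urls, total_comments) >= threshold
```

In a real system, a signal like this would presumably be one feature among many feeding a ranking or review pipeline rather than a hard rule.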

Facebook also indicated that it’s trying to find ways to rely more on users and third parties to help flag and classify fake stories. Zuckerberg listed “stronger reporting” methods for users and more input from “third party verification” services such as fact-checking sites. He also said Facebook was considering how to use those third-party and user reports as a source for displaying warnings on fake or misleading content.

Facebook would also improve the quality of the articles that appear in “related articles” under news stories posted to the site. And, Zuckerberg said, the company would “continue to work with journalists and others in the news industry” on the issue.

While Facebook has attracted the majority of scrutiny this week, it is hardly the only company struggling to address the spread of fake news on the Internet. On Monday, the top Google hit for the search “final election count” was a site falsely reporting that Trump had won the popular vote. Like Facebook, Google has taken steps this week to try to stop fake news writers from using its ad services to make money.