Fake news is a real problem. These college students came up with a fix.

When Nabanita De scrolled through her Facebook feed recently, she felt afraid. There were so many posts with competing information and accusations about Donald Trump and Hillary Clinton that she didn’t know where to begin separating the fearmongering from the reality.

The social media site has faced criticism since the presidential election for its role in disseminating fake and misleading stories that are indistinguishable from real news. Because Facebook’s algorithm is designed to determine what its individual users want to see, people often see only that which validates their existing beliefs regardless of whether the information is true.

So when De, an international second-year master’s student at the University of Massachusetts at Amherst, attended a hackathon at Princeton University this week where the goal was to develop a technology project in 36 hours, she suggested to her three teammates that they build an algorithm to authenticate what is real and what is fake on Facebook.

And they were able to do it.

The four students — De, with Purdue University freshman Anant Goel and sophomores Mark Craft and Qinglin Chen from the University of Illinois at Urbana-Champaign — built a Chrome browser extension that tags links in Facebook feeds as verified or not verified by taking into account factors such as the source’s credibility and cross-checking the content with other news stories. Where a post appears to be false, the plugin will provide a summary of more credible information on the topic online. A small blue box in the upper right-hand corner of the screen says either “verified” or “not verified.”

They’ve called it FiB.
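FiB’s own code isn’t reproduced in this story, but the general shape of such a browser plugin is easy to sketch. What follows is a minimal, hypothetical TypeScript content script: the verification endpoint (example.com/verify), its response format, and the inline badge placement are assumptions for illustration, standing in for FiB’s actual credibility scoring and cross-checking, which runs on its own backend.

// content-script.ts — hypothetical sketch of a feed-tagging browser extension script.
// The endpoint and response shape below are illustrative assumptions, not FiB’s API.

interface VerifyResult {
  verified: boolean; // assumed response field
}

// Ask a (hypothetical) verification service whether a linked story checks out.
async function verifyUrl(url: string): Promise<VerifyResult> {
  const res = await fetch(
    `https://example.com/verify?url=${encodeURIComponent(url)}`
  );
  return res.json() as Promise<VerifyResult>;
}

// Attach a small “verified” / “not verified” badge next to a link.
function addBadge(link: HTMLAnchorElement, verified: boolean): void {
  const badge = document.createElement("span");
  badge.textContent = verified ? "verified" : "not verified";
  badge.style.cssText =
    "margin-left:6px;padding:2px 4px;font-size:11px;color:#fff;border-radius:3px;" +
    `background:${verified ? "#4267b2" : "#b00020"};`;
  link.insertAdjacentElement("afterend", badge);
}

// Scan the feed for outbound links that haven't been checked yet.
async function tagFeedLinks(): Promise<void> {
  const links = document.querySelectorAll<HTMLAnchorElement>(
    'a[href^="http"]:not([data-fib-checked])'
  );
  for (const link of Array.from(links)) {
    link.dataset.fibChecked = "true";
    try {
      const result = await verifyUrl(link.href);
      addBadge(link, result.verified);
    } catch {
      // Leave the link untagged if the verification call fails.
    }
  }
}

// Facebook loads posts dynamically, so re-scan whenever the DOM changes.
new MutationObserver(() => void tagFeedLinks()).observe(document.body, {
  childList: true,
  subtree: true,
});
void tagFeedLinks();

Because a Facebook feed keeps loading new posts as the user scrolls, the MutationObserver re-runs the scan whenever the page changes; the real extension would also need permission to run on facebook.com and a backend doing the actual credibility checks.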

Since the students developed it in only 1 1/2 days (and have classes and schoolwork to worry about), they’ve released it as an “open-source project,” asking anyone with development experience to help improve it. The plugin is available for download to the public, but the demand was so great that their limited operation couldn’t handle it.

Ideally, Goel said, Facebook would team up with a third-party developer such as FiB so that the company could control all news feed data but let the developers verify it so Facebook couldn’t be accused of “hidden agendas or biases.”

The sponsors of the hackathon included Facebook and other major technology companies, but neither Facebook nor Google has contacted the students about FiB. Both companies have said this week that they will take steps to address the spread of fake news.

This year’s presidential election has shown how the lines have blurred between fact and lies, with people profiting from the spread of fake news. More than 100 news sites publishing fabricated pro-Trump content have been traced to Macedonia, according to a BuzzFeed News investigation. The Washington Post interviewed Paul Horner, a prolific fake-news creator, who said: “I think Trump is in the White House because of me. His followers don’t fact-check anything — they’ll post everything, believe anything.”

Melissa Zimdars, a communications professor at Merrimack College in North Andover, Massachusetts, said she’s seen a similar problem with students who cite questionable sources. So she created a list of fake, misleading or satirical sites as a reference, not as a direct response to the postelection fake news debate but to encourage students to be more media-literate by checking what they read against other sources.

The list, which she has continued to add to since she made it public earlier this week, has gone viral. She included tips for analyzing news sources:

“Watch out if known/reputable news sites are not also reporting on the story. Sometimes lack of coverage is the result of corporate media bias and other factors, but there should typically be more than one source reporting on a topic or event.

“If the story makes you REALLY ANGRY it’s probably a good idea to keep reading about the topic via other sources to make sure the story you read wasn’t purposefully trying to make you angry (with potentially misleading or false information) in order to generate shares and ad revenue.”

Zimdars said people have grown so distrustful of institutional media that they turn to alternative sources. A recent Pew Research Center survey found that only 18 percent of people have a lot of trust in national news organizations; nearly 75 percent said news organizations are biased.

It doesn’t help, Zimdars said, that news media, to be profitable, rely on “click-bait” headlines that are sometimes indistinguishable from the fake stories.

Paul Mihailidis, who teaches media literacy at Emerson College in Boston, had a different take.

“I don’t think a lot of people didn’t know; I think they didn’t care. They saw it as a way to advocate,” Mihailidis said. “The more they could spread rumors, or could advocate for their value system or candidate, that took precedence over them not knowing. A large portion of them didn’t stop to critique the information. One of the things that has happened is people are scrolling through [Facebook] and the notion of deep reading is being replaced by deep monitoring. They see a catchy headline, and the default is to share.”

And the way people consume news, by a flick of the thumb on a smartphone, means they are less likely to cross-check what they’re reading against other sources.

That’s where the plugin offers a simple solution.

“A few days back, I read an article telling people they can drill a jack into the iPhone 7 and have an earphone plug, and people started doing it and ruining their phones,” De said. “We know we can search on Google and research it, but if you have five minutes and you’re just scrolling through Facebook, you don’t have time to go verify it.”