Fake news on Facebook is a real problem. These college students came up with a fix in 36 hours.
Anant Goel, Nabanita De, Qinglin Chen and Mark Craft at Princeton's hackathon (Anant Goel)

When Nabanita De scrolled through her Facebook feed recently, she felt afraid. There were so many posts with competing information and accusations about Donald Trump and Hillary Clinton that she didn't know how to begin separating the fearmongering from the reality.
The social media site has faced criticism since the presidential election for its role in disseminating fake and misleading stories that are indistinguishable from real news. Because Facebook's algorithm is designed to determine what its individual users want to see, people often see only what validates their existing beliefs, regardless of whether the information being shared is true.
So when De, an international second-year master's student at the University of Massachusetts at Amherst, attended a hackathon at Princeton University this week, the prompt was simple: develop a technology project in 36 hours. She suggested to her three teammates that they try to build an algorithm that authenticates what is real and what is fake on Facebook.
And they were able to do it.
De, with Anant Goel, a freshman at Purdue University, and Mark Craft and Qinglin Chen, sophomores at the University of Illinois at Urbana-Champaign, built a Chrome browser extension that tags links in Facebook feeds as verified or not verified by taking into account factors such as the source's credibility and cross-checking the content with other news stories. Where a post appears to be false, the plug-in will provide a summary of more credible information on the topic online.
They've called it FiB.
Since the students developed it in only a day and a half (and have classes and schoolwork to worry about), they've released it as an open-source project, asking anyone with development experience to help them improve it. The plug-in is available for download to the public, but the demand was so great that their limited operation couldn't handle it.
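For developers curious how such an extension hooks into the feed, here is a rough, hypothetical sketch in TypeScript of what a Chrome content script that labels links might look like. It is not FiB's actual code: the domain list, the fibTagged marker and the labeling logic are stand-ins for the credibility checks and cross-referencing the team describes.

// Not FiB's real implementation: a minimal sketch of a content script
// that tags outbound links in a feed based on a placeholder domain list.
const KNOWN_UNRELIABLE = new Set(["example-hoax-site.com", "made-up-news.net"]);

function labelFor(url: string): "verified" | "not verified" {
  try {
    const host = new URL(url).hostname.replace(/^www\./, "");
    return KNOWN_UNRELIABLE.has(host) ? "not verified" : "verified";
  } catch {
    return "not verified"; // unparseable links get the cautious label
  }
}

function tagLinks(): void {
  document.querySelectorAll<HTMLAnchorElement>("a[href^='http']").forEach((a) => {
    if (a.dataset.fibTagged) return; // skip links already tagged
    a.dataset.fibTagged = "true";
    const badge = document.createElement("span");
    badge.textContent = ` [${labelFor(a.href)}]`;
    a.insertAdjacentElement("afterend", badge);
  });
}

// Facebook loads posts as you scroll, so re-scan whenever the page changes.
new MutationObserver(tagLinks).observe(document.body, { childList: true, subtree: true });
tagLinks();

In a real verifier, the static list would give way to live credibility checks and cross-referencing against other coverage of the same story, which is exactly the part the students are asking other developers to help build.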
So while FiB isn't currently up and running, when it works, this is what it looks like:
[Screenshot: a link in the feed tagged as verified]
When a link cannot be verified, it looks like this:
[Screenshot: a link tagged as not verified]
Goel said that ideally, Facebook would team up with a third-party developer such as FiB: the company would retain control of all its news feed data but let outside developers handle verification, so Facebook couldn't be accused of hidden agendas or biases.
The sponsors of the hackathon included Facebook and other major technology companies. FiB was awarded Best Moonshot by Google, but neither Facebook nor Google, which has its own problems with promoting fake news, has reached out about helping the students.
This presidential election year has shown how the lines between fact and falsehood have blurred, with people profiting off the spread of fake news. More than 100 sites pushing made-up pro-Trump content have been traced to Macedonia, according to a BuzzFeed News investigation. The Washington Post interviewed Paul Horner, a prolific fake-news creator, who said, "I think Trump is in the White House because of me. His followers don't fact-check anything; they'll post everything, believe anything."
Melissa Zimdars, a communications professor at Merrimack College in Massachusetts, said she's seen a similar problem with students who cite sources that are not credible. So she created a list of fake, misleading or satirical sites as a reference for them, not as a direct response to the postelection fake-news debate but simply to encourage her students to become more media literate by checking what they read against other sources.
Zimdars said media literacy has become a challenge because people have grown so distrustful of institutional media that they turn to alternative sources. A recent Pew Research Center survey found that only 18 percent of people have a lot of trust in national news organizations; nearly 75 percent said news organizations are biased.
It doesn't help, she said, that news media, to be profitable, rely on click-bait headlines that are sometimes indistinguishable from the fake stories.
Another problem, said Paul Mihailidis, who teaches media literacy at Emerson College in Boston, is that many people sharing links on Facebook don't care whether what they're sharing is true.
"I don't think a lot of people didn't know; I think they didn't care. They saw it as a way to advocate," he said. "The more they could spread rumors, or could advocate for their value system or candidate, that took precedence over them not knowing. A large portion of them didn't stop to critique the information. One of the things that has happened is people are scrolling through [Facebook] and the notion of deep reading is being replaced by deep monitoring. They see a catchy headline, and the default is to share."
That's where the plug-in offers a simple solution.
"A few days back, I read an article telling people they can drill a jack into the iPhone 7 and have an earphone plug, and people started doing it and ruining their phones," De said. "We know we can search on Google and research it, but if you have five minutes and you're just scrolling through Facebook, you don't have time to go verify it."