
College students create Facebook anti-fake news plugin at hackathon

Status: Not open for further replies.

Regiruler

Member
People who accomplish things in such a short time make me feel inconsequential.

But then I remember hackathons completely wring you out like a sponge, and I go back to browsing the internet.
 

tokkun

Member
Digging through the code, it looks like it just uses https://www.mywot.com/ to verify domains. It also does some image analysis, Twitter detection, etc. The big thing seems to be checking against that website, though.
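For anyone curious, the core of a domain-reputation check like that is pretty simple. Here's a minimal sketch; the lookup table is a hypothetical stand-in for the WOT service (the real plugin queries mywot.com over the network, and I'm not reproducing their API here):

```python
from urllib.parse import urlparse

# Hypothetical local reputation table standing in for a remote
# reputation service like mywot.com. Domains here are made up.
DOMAIN_REPUTATION = {
    "example-real-news.com": "trusted",
    "totally-real-news.co": "untrusted",
}

def check_link(url):
    """Return the reputation label for a shared link's domain."""
    domain = urlparse(url).netloc.lower()
    # Strip a leading "www." so both forms map to the same entry.
    if domain.startswith("www."):
        domain = domain[4:]
    return DOMAIN_REPUTATION.get(domain, "unknown")

print(check_link("http://www.totally-real-news.co/shock-story"))  # untrusted
```

The hard part obviously isn't the lookup, it's building and maintaining a reputation list that's actually accurate, which is why they lean on an existing service.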

OK. What's the actual value, though? Facebook has said that fake news is < 1% of all news stories. So if the false positive rate is 1%, it is worse than doing nothing.
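To put numbers on that base-rate problem (these figures are illustrative assumptions, not from Facebook): even with a detector that catches 90% of fake stories, a 1% prevalence plus a 1% false positive rate means roughly half of everything it flags is actually legitimate.

```python
# Sketch of the base-rate argument: at low prevalence, even a small
# false positive rate swamps the true detections.

def flag_precision(prevalence, tpr, fpr):
    """Fraction of flagged stories that are actually fake (precision)."""
    true_flags = prevalence * tpr         # fake stories correctly flagged
    false_flags = (1 - prevalence) * fpr  # real stories wrongly flagged
    return true_flags / (true_flags + false_flags)

# Assume 1% of stories are fake, the plugin catches 90% of them,
# and it wrongly flags 1% of real stories.
p = flag_precision(prevalence=0.01, tpr=0.90, fpr=0.01)
print(round(p, 3))  # ~0.476: roughly half of all flags would be wrong
```

Whether that's "worse than doing nothing" depends on how much a wrong flag costs, but it shows why a 1% false positive rate isn't as harmless as it sounds.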
 

Cyan

Banned
Why do people get news on FB in the first place?

smh

I think "get news" is the wrong term. It sort of implies people thinking to themselves "I wonder what is going on in the world" and then deciding to check Facebook to find out instead of turning on CNN or reading a paper or whatever. What's really happening is people are bored, go on Facebook (or other social media sites), see some story that's gone viral because it has properties like a clickbait title or content that makes people want to anger-share, and think they've learned something true. Or maybe they don't think it's absolutely true per se, but it's still wormed its way into their head without full vetting, and probably in a day or two they'll remember what it said but not where it came from.

"Getting news" from Facebook isn't the problem. Viral sharing of stuff created purely to farm clicks with complete disregard for things like truth or relevance or usefulness is the problem.

As a more general response to the story, I think it's great that these people are trying to help with this problem. Absent some kind of filter or method of separating real stuff from garbage, the garbage is always going to float to the top because garbage can be hand-crafted to grab our attention in a way that reality can't always be.

At the same time, it's important for people to take responsibility for their own sharing. Never assume someone sharing something has vetted it, especially if it's startling or scary or makes you upset. Don't share without checking it yourself. And if you do share something, and then it turns out to be misleading or false and someone points that out to you, instead of flailing for excuses for why you got it wrong ("it was so convincing!" "how was I supposed to know?" "reality is so weird now that it could've been true!"), accept that you were wrong and consider what you could've done to have discovered this on your own rather than from someone else post-sharing. And then start doing that going forward.

This is a problem Facebook and others can and should be working on, but it's also a problem we as individuals can and should be working on. Do your part!
 

Shig

Strap on your hooker ...
Probably shouldn't have put their names and faces out there so front-and-center. Alt-right vigilantes are gonna have a field day.
 

Cyan

Banned
OK. What's the actual value, though? Facebook has said that fake news is < 1% of all news stories. So if the false positive rate is 1%, it is worse than doing nothing.

This is wildly at odds with my own personal experience. Possibly my Facebook friends are unusually gullible or are secretly alt-right or whatever. Or possibly Facebook is talking about something different than I'm talking about. Do they include stories that are based on a kernel of fact but that take multiple leaps from there to make things sound as outrageous as possible? Stories that are actually mostly reasonable but have clickbait headlines which people then share without bothering to read the story?

Are they talking about the percentage of all distinct news story links shared on the site, or the percentage of all shares, or the percentage of all stories clicked on, or...
 
And Facebook keeps saying there's no way for them to fix the issue.

They're right, though. There's no way Facebook can fix conservative mistrust of facts. All their fixes basically nuke conservative "news" sources and would just send conservatives packing to get their "news" fix elsewhere.

I still think Facebook should do the right thing and pull the plug on the nonsense, but it's going to sting when they do.
 

JesseZao

Member
This is like making an extension that verifies scientific studies for anti-vaxxers; they just double down on ignorance in the face of reality.
 

Dan

No longer boycotting the Wolfenstein franchise
I think "get news" is the wrong term. It sort of implies people thinking to themselves "I wonder what is going on in the world" and then deciding to check Facebook to find out instead of turning on CNN or reading a paper or whatever. What's really happening is people are bored, go on Facebook (or other social media sites), see some story that's gone viral because it has properties like a clickbait title or content that makes people want to anger-share, and think they've learned something true. Or maybe they don't think it's absolutely true per se, but it's still wormed its way into their head without full vetting, and probably in a day or two they'll remember what it said but not where it came from.

"Getting news" from Facebook isn't the problem. Viral sharing of stuff created purely to farm clicks with complete disregard for things like truth or relevance or usefulness is the problem.

As a more general response to the story, I think it's great that these people are trying to help with this problem. Absent some kind of filter or method of separating real stuff from garbage, the garbage is always going to float to the top because garbage can be hand-crafted to grab our attention in a way that reality can't always be.

At the same time, it's important for people to take responsibility for their own sharing. Never assume someone sharing something has vetted it, especially if it's startling or scary or makes you upset. Don't share without checking it yourself. And if you do share something, and then it turns out to be misleading or false and someone points that out to you, instead of flailing for excuses for why you got it wrong ("it was so convincing!" "how was I supposed to know?" "reality is so weird now that it could've been true!"), accept that you were wrong and consider what you could've done to have discovered this on your own rather than from someone else post-sharing. And then start doing that going forward.

This is a problem Facebook and others can and should be working on, but it's also a problem we as individuals can and should be working on. Do your part!
Great post. Be responsible, people. It starts with you.

OK. What's the actual value, though? Facebook has said that fake news is < 1% of all news stories. So if the false positive rate is 1%, it is worse than doing nothing.
This is wildly at odds with my own personal experience. Possibly my Facebook friends are unusually gullible or are secretly alt-right or whatever. Or possibly Facebook is talking about something different than I'm talking about. Do they include stories that are based on a kernel of fact but that take multiple leaps from there to make things sound as outrageous as possible? Stories that are actually mostly reasonable but have clickbait headlines which people then share without bothering to read the story?

Are they talking about the percentage of all distinct news story links shared on the site, or the percentage of all shares, or the percentage of all stories clicked on, or...
The 1% figure is what Zuckerberg pulled out of his ass in his immediate defense of Facebook, claiming fake news isn't a problem and anyone claiming it might be just doesn't have empathy for Trump voters.
 

marrec

Banned
Alright, this is a nice effort and all but this here was dumb:

There were so many posts with competing information and accusations about Donald Trump and Hillary Clinton that she didn’t know how to begin deciphering the fearmongering from the reality.

Like, you can begin deciphering fake news stories from real news stories by verifying them for yourself and checking sources.

So are we so fucked as a society that instead of relying on our own critical thinking skills we have to have a plugin tell us whether a news story is fake or not?
 

Mark L

Member
Whenever people recommend curation I ask them to imagine that the curator believes the opposite of what they believe in.
 

Mark L

Member
The opposite of fake news isn't opinion, it's reality.


Well, let's say that we had internet curation handled by a government agency. So now Trump would be doing the curating. Good idea? After all, its mission statement would be that it was for removing falsehoods, so there would be no problem, right?
 

marrec

Banned
Well, let's say that we had internet curation handled by a government agency. So now Trump would be doing the curating. Good idea? After all, its mission statement would be that it was for removing falsehoods, so there would be no problem, right?

You are jumping pretty far from "Facebook's Mark Zuckerberg protecting the integrity of Facebook as a news source" to "Donald Trump Controls All Information!!!"

Sometimes a slippery slope fallacy is a necessary rhetorical tool; this is not one of those times.
 

Mark L

Member
You are jumping pretty far from "Facebook's Mark Zuckerberg protecting the integrity of Facebook as a news source" to "Donald Trump Controls All Information!!!"

I am in the curious position of arguing with GAF that a private corporation should act as gatekeeper for what constitutes legitimate news. You know what? I respectfully bow out of the discussion.
 

marrec

Banned
I am in the curious position of arguing with GAF that a private corporation should act as gatekeeper for what constitutes legitimate news. You know what? I respectfully bow out of the discussion.

Listen I'm not saying that a private corp acting as gatekeeper for what constitutes legitimate news is a good thing.

But that's also not what people are asking Facebook to do.

You're conflating what people share in their feeds with what Facebook promotes to you for ad revenue. Facebook absolutely should be curating what they present to you as legitimate news in exchange for money. It's the responsible thing to do.

However, they should not be curating what people share among their friends, that's the authoritarian thing to do (though, as a private organization, they have that right).

Do you see the difference?
 
Putting a "not verified" tag, while probably more accurate for the algorithm, probably isn't going to stop people from believing it.

If the tag said "Bullshit" it might, though.
 

Cyan

Banned
I am in the curious position of arguing with GAF that a private corporation should act as gatekeeper for what constitutes legitimate news. You know what? I respectfully bow out of the discussion.

Can you explain in more detail what you think people are actually arguing for? I think you may have misread or misunderstood the story in the OP.
 