Like other Facebook users, I saw the ad at the top of my feed this morning: "Amber," it said, "It's possible to spot false news." (Wait, what? Really?) "As we work to limit the spread, check out a few ways to identify whether a story is genuine." Click it, and you're led to a page with handy tips like "Be skeptical of headlines," "Watch for unusual formatting," and "Is the story a joke?" Oh, Facebook, always looking out for our best interests!
Or perhaps it's because the company drew heavy criticism for its role in helping disseminate false information during the 2016 election. Surely lighting a fire under its feet is a bill passed last week in Germany that can fine social media companies up to $53 million if they don't promptly remove hate speech and fake news from their sites.
And just today, British newspaper the Times accused Facebook of "refusing to remove potentially illegal terrorist and child pornography" (yikes!) after a reporter went undercover on the site and found offensive material being shared, and even promoted, through Facebook algorithms.
Blame it on those pesky algorithms; they've got a mind of their own! Zuckerberg did just that in a recent interview with Fast Company:
Go back a few years, for example, and we were getting a lot of complaints about click bait. No one wants click bait. But our algorithms at that time were not specifically trained to be able to detect what click bait was.
Facebook's call-and-response system, Zuckerberg said, is a "constant work in progress," and he almost started waxing philosophical about the whys and hows of (eventually) managing the human impulse to tell stories (and spread lies).
It's not like they are problems that exist because there's some kind of underlying, nefarious motivation. I mean, certainly giving people a voice leads to more diversity of opinions, which if you don't manage that can lead to more fragmentation, but I think this is kind of the right order of operations. You know, you give people a voice and then you figure out what the implications of that are, and then you work on those things.
In December, Facebook partnered with fact-checking sites Snopes and PolitiFact to launch focus groups that let users mark news stories as fake. And just today, the company orchestrated a massive crackdown in France, deleting over 30,000 fake accounts.
Those fake news tips circulating right now are part of the company's latest attempts to curb the spread of viral misinformation. Here's a good point-by-point breakdown of those tips, which explains why Facebook's fake news advice doesn't necessarily help. For instance, when trying to investigate the source of a news story, confirmation bias is a thing that exists:
Sure, this might eliminate the "Denver Guardians" and other baldly false news sources of the world, but on the other hand: Enough people trust Breitbart for it to be a distressingly powerful news organization. And if you check a shady news outfit's About section … "Well, it says here that this organization is 'dedicated to unraveling the criminal conspiracy that is the Clinton Foundation.' Sounds legit!"