Facebook’s CEO, Mark Zuckerberg, took to Facebook today to tell us he is finally going to do something about Facebook’s fake news problem, writing that,
“I want you to know that we have always taken this seriously, we understand how important the issue is for our community and we are committed to getting this right.”
According to Zuckerberg,
“The bottom line is: we take misinformation seriously. Our goal is to connect people with the stories they find most meaningful, and we know people want accurate information. We’ve been working on this problem for a long time and we take this responsibility seriously. We’ve made significant progress, but there is more work to be done.”
Keep in mind, Zuckerberg hasn’t until now taken this as seriously as many of us would have liked, initially saying in response to the hubbub,
“Personally I think the idea that fake news on Facebook, which is a very small amount of the content, influenced the election in any way — I think is a pretty crazy idea. Voters make decisions based on their lived experience.”
If anything was crazy, however, it was Zuckerberg’s rejection of reality. He now says Facebook has relied on “our community to help us understand what is fake and what is not,” but clearly that was not working.
There is mounting evidence about the scope of the fake news crisis on Facebook. Teenagers in Macedonia ran content farms whose fabricated stories generated hundreds of thousands of shares on Facebook during the presidential election. Fake news stories repeatedly trended and spread across Facebook. A BuzzFeed analysis even found that fake news outperformed real news during the final three months of the election.
Relying on people “sharing links to myth-busting sites such as Snopes — to understand which stories we can confidently classify as misinformation” is a ridiculous solution when conservatives in particular reject reality’s liberal bias, and with it sites like Snopes that tell them things uncongenial to their preconceptions. We’ve all seen this.
Zuckerberg’s change of heart seems grudging at best. He still claims, “the percentage of misinformation is relatively small,” but admits “we have much more work ahead on our roadmap.”
Accordingly, he is actually telling us what steps he will take to tackle the fake news plague.
“Normally we wouldn’t share specifics about our work in progress, but given the importance of these issues and the amount of interest in this topic, I want to outline some of the projects we already have underway”:
– Stronger detection. The most important thing we can do is improve our ability to classify misinformation. This means better technical systems to detect what people will flag as false before they do it themselves.
– Easy reporting. Making it much easier for people to report stories as fake will help us catch more misinformation faster.
– Third party verification. There are many respected fact checking organizations and, while we have reached out to some, we plan to learn from many more.
– Warnings. We are exploring labeling stories that have been flagged as false by third parties or our community, and showing warnings when people read or share them.
– Related articles quality. We are raising the bar for stories that appear in related articles under links in News Feed.
– Disrupting fake news economics. A lot of misinformation is driven by financially motivated spam. We’re looking into disrupting the economics with ads policies like the one we announced earlier this week, and better ad farm detection.
– Listening. We will continue to work with journalists and others in the news industry to get their input, in particular, to better understand their fact checking systems and learn from them.
The problem with some of these is that anyone can report anything they don’t like as “fake,” even if it is 100 percent factual. Under Zuckerberg’s new system, perfectly reasonable critiques of Donald Trump could be flagged as false by organized groups such as, for example, those Macedonian kids.
To be honest, Zuckerberg’s initiative sounds a lot like the empty prattle coming from Republicans in the House in that it says a lot without promising to do a lot. Google actually did something about its fake news problem by withholding its online advertising service from sites peddling fake news.
He can tell us “we have always taken this seriously” and that he understands “how important the issue is for our community,” but Facebook’s News Feed is designed to show us stories it thinks we want to read.
As Vox’s Matthew Yglesias puts it,
“A news diet overwhelmingly driven by shareability and algorithmic targeting is going to be profoundly misleading whether or not it contains fake news.”
So when Zuckerberg tells us “Our goal is to connect people with the stories they find most meaningful, and we know people want accurate information,” he has to know “meaningful” and “accurate” are not necessarily the same thing.
And that’s the problem. Facebook is designed to give you exactly what you want to hear, rather than what you need to hear. You know, like Fox News – and Donald Trump’s cabinet.