Facebook’s latest changes could make the misinformation problem worse

When Facebook announced the latest changes to its News Feed algorithm, most of the attention focused on how the new ranking system might affect the amount of traffic coming from the social network—which isn’t that surprising, since many media companies depend on that traffic for a large part of their digital revenue.

An even more important question, however, is whether these changes will actually help to solve any of the major problems Facebook claims it is trying to solve, such as the proliferation of “fake news” and misinformation on its platform. The answer remains to be seen, but there are good reasons to believe that the latest algorithm tweaks could make the problem worse instead of better.

Here’s why: Facebook co-founder and CEO Mark Zuckerberg and Adam Mosseri—the man in charge of the news feed—said in separate blog posts that the new algorithmic approach is designed to get away from passive consumption (which they agree might not be good for users, a somewhat surprising admission) and to focus more on personal posts that generate discussion and engagement.

Zuckerberg said the changes would promote content that is likely to “encourage meaningful interactions between people,” while Mosseri said it would highlight posts that “inspire back-and-forth discussion in the comments” and content “that you might want to share and react to.”

Media outlets are justifiably afraid this emphasis could hurt the reach of their news posts, because the assumption is that they won’t appear in a user’s feed unless someone shares them and they get lots of comments. What Facebook seems to be devaluing is the kind of passive consumption that makes up a lot of the news-related activity on Facebook: reading without commenting, watching videos with captions, and so on.

The company says it plans to promote “reputable” publishers into feeds even if they don’t get a lot of engagement, but it’s not clear how it will define that term, or how much promotion such posts will get.

If the system is designed to encourage conversation and spark reactions, the risk is that the sites which could get the biggest boost from these changes are the least credible ones: publishers who specialize in either completely fake stories, or stories that have a grain of truth but are wildly exaggerated. Why? Because misinformation is almost always more interesting than the truth.

After all, fake news stories don’t have to stick to the actual facts, and they don’t have to go out of their way to be balanced or objective. The real news is complicated and often boring. As a former Facebook product manager put it during the 2016 election: “Sadly, News Feed optimizes for engagement [and] as we’ve learned in this election, bullshit is highly engaging.”

In a recent report from First Draft News that looked at how disinformation works, Claire Wardle and Hossein Derakhshan pointed out that for many social-network users, sharing is performative—in other words, they don’t share fake news posts because they think the posts are factually accurate, but because doing so fits the worldview of a specific group they would like to belong to.

Given all of these factors, which posts are likely to get the magic combination of comments and engagement that Facebook says it will now optimize for? The worst of the worst.

We already have some evidence that this is the case, because Facebook has been doing a “split feed” experiment in several countries for the past several months, in which users get a feed made up primarily of content from their friends and family, and news posts appear in a separate feed.

Slovakian journalist Filip Struharik, who has been studying the impact of the change on the media in his country, said that his research shows mainstream news sites have seen engagement on Facebook (reactions, comments and shares) decline much more than lower-quality sites. Serious news sites saw engagement drop by almost 40 percent, while sites that are known for misinformation saw a drop of less than 30 percent.

An analysis done by BuzzFeed in 2016, meanwhile, showed that the top fake election news stories generated more total engagement than all the top election stories from 19 major news outlets combined.

In other words, unless Facebook takes extraordinary steps to insert reputable news into people’s news feeds regardless of what the algorithm says, sensationalized or outright fake news stories are likely to spark far more engagement and discussion than serious news stories, and that is going to make it even more difficult to stop the misinformation problem from accelerating.
