Here’s Why Fake News Is More Popular Than Real News

The rise of “fake” news, along with the role that platforms like Facebook played in that rise, has become one of the defining issues of the 2016 election. Did hoaxes and misinformation help Donald Trump win? If so, what, if anything, should Facebook do about it? And who is to blame?

What makes this issue so thorny is that firm answers to those questions are hard to come by, if not impossible. And finding solutions is not likely to get any easier.

Let’s take the first of those questions: Did fake news help Trump win? We simply don’t know. Some believe that hoaxes such as the false stories about Hillary Clinton murdering people definitely swayed the electorate, while others believe that all these stories did was confirm the biases voters already held.

Those who argue that fake news did play a role point to evidence that these kinds of stories were hugely popular on Facebook, and were spread far more widely than “real” news stories.

Note: This was originally published at Fortune, where I was a senior writer from 2015 to 2017.

A recent BuzzFeed investigation by hoax expert Craig Silverman, for example, showed that the 20 most shared fake stories were far more popular with users than the top 20 real news stories — and in some cases were more popular by an order of magnitude.

The top viral fake, titled “Pope Francis Shocks the World, Endorses Donald Trump for President,” came from a site called Ending the Fed and was shared or interacted with almost a million times. The top real news story, a Washington Post piece about Donald Trump’s history of corruption, fell short of that mark.

As both New York magazine and journalism researcher Mark Bunting have pointed out — and Silverman has admitted — the BuzzFeed sample is a relatively small one, and it doesn’t mean that all fake news was more popular than all real news. It also doesn’t measure the true reach of these stories, because Facebook doesn’t make it easy to do that (which is a big part of the problem when it comes to determining the size of the phenomenon).

Nevertheless, fake news is clearly spreading far and wide on Facebook, and the company’s response to it has been relatively weak. In effect, Facebook says it is concerned, but it doesn’t believe fake news is a big problem and doesn’t think it affected the election.

As a counterpoint to that, the Washington Post recently spoke to someone who makes a business out of creating fake news. Paul Horner told the newspaper that he created dozens of fakes and saw them shared widely by Trump fans, including the presidential candidate’s campaign manager.

My sites were picked up by Trump supporters all the time. I think Trump is in the White House because of me. His followers don’t fact-check anything — they’ll post everything, believe anything. His campaign manager posted my story about a protester getting paid $3,500 as fact.

Since Horner essentially admitted that he lies for a living, it’s difficult to know how much credence to give his report on the impact of fake news. Did it really help get Trump elected?

Even if it’s true that fake news stories just confirmed what people already thought, a phenomenon psychologists call “confirmation bias,” it’s still possible that having those beliefs confirmed every day on Facebook made people less likely to change their votes, or less likely to listen to competing arguments about a different candidate.

But when it comes to Facebook actually doing something about that, the issue gets even more complicated. For one thing, as journalist Jessica Lessin and others have pointed out, determining what is fake and what isn’t would give Facebook even more power and control over news content than it already has.


The biggest obstacle to solving this, however, is human nature. The fact that a news story has been shown to be untrue, whether by being labeled a fake or by being fact-checked by mainstream news organizations, doesn’t seem to affect whether people choose to share or believe it.

Media theorist Clay Shirky said recently that “people trade untrue stories that encapsulate things they believe about the world all the time,” and there is a lot of truth to that. People don’t share news stories on Facebook because they are true, or factual. They share them because they feel true, or because sharing them is a way of signaling membership in a specific cultural group.

As the author of a study on social echo chambers put it in a recent interview with the New York Times about fake news and Facebook, this “creates an ecosystem in which the truth value of the information doesn’t matter. All that matters is whether the information fits in your narrative.”

Even the act of fact-checking a fake news story has been shown to reinforce the beliefs of those who want it to be true. Mike Masnick of the technology commentary site Techdirt has said that when he fact-checks a story, he often gets dozens of responses from people claiming that Snopes.com, one of the Internet’s longest-surviving and most active fact-checking sites, is a liberal front.

This aspect of human nature is something that Facebook and other social networks have “weaponized,” as Josh Benton of the Nieman Lab put it recently. It is so easy to share something, and the social benefits are more or less the same whether it is true or not.

So what can we do about this kind of “post-truth” environment? It’s clear that Facebook could take some kind of action to identify hoaxes and block fake sites — although it is not in the company’s economic interests to do so, as Shirky noted. But even that won’t be enough.

It’s possible that the only way to stop people from sharing fake news is to try to understand *why* they are sharing it, and to spend some time getting at the root of those problems. Unfortunately, that is a much more time-consuming and difficult task than fact-checking a news story, and it’s something that today’s fragmented media industry isn’t particularly adept at doing.

Could Facebook help on that score? Could it find some way of introducing people to alternate viewpoints or promoting understanding rather than a cultural battleground? Only Mark Zuckerberg knows the answer.
