Facebook and the news: trends, filter bubbles and algorithmic bias

Controversy continued Thursday over the question of Facebook’s influence on the news that more than a billion people see every day. In the latest developments, the company’s editorial guidelines around its Trending Topics feature were leaked to The Guardian, and the social-networking giant quickly published its own version, along with another internal document describing how it decides which news to include and which to leave out.

The controversy started earlier this week with a piece by Gizmodo that looked at how several journalists who worked on the Trending Topics feature were treated by the social network, and what they were expected to do.

The original story mostly focused on how these editorial contractors believed they were simply training Facebook’s news-filtering algorithms, and didn’t feel that the social network cared about journalism much, except as raw material for its engagement engines. Then a second piece appeared that featured comments from an anonymous editor about how staff routinely kept certain sites and topics out of the Trending feed.

Note: This was originally published at Fortune, where I was a senior writer from 2015 to 2017

This sparked criticism that Facebook was biased against right-wing outlets (since many of the sites that were allegedly kept out were conservative-leaning). Facebook got some nasty comments, not just from the head of the Republican National Committee but also from the chairman of the Senate Commerce Committee, who asked CEO Mark Zuckerberg to make his staff available for questions about how editorial decisions were being made at the social network.

As a number of observers have pointed out (including Fortune), the question isn’t so much whether Facebook filters out certain kinds of news—something that newspapers and other media entities do every day without much scrutiny. The real point is that Facebook is orders of magnitude larger and more influential than any traditional media entity, and yet the ways in which it chooses the news its billion users see are still fundamentally opaque.

Whenever questions about its status as a media entity come up, Facebook typically argues that it isn’t a media entity at all; it’s just a social service that uses algorithms to show people content they might enjoy or want to see. Some of that happens to be news, but the Facebook argument is that it’s all done by algorithm, so there’s no real editorial activity.

What that ignores, of course, is that algorithms are programmed by human beings, and in the process countless journalistic decisions get made, including how to rank different news sources, what kinds of news to exclude, and so on.
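To make that point concrete, here is a purely illustrative sketch—not Facebook’s actual system; the source names, weights, blocklist and scoring formula are all invented—showing how human editorial choices end up baked into a ranking algorithm:

```python
# Illustrative only: every constant below is a human judgment call,
# which is the point -- "it's just an algorithm" doesn't mean "no editors."

# Hand-picked trust weights per source: deciding these IS an editorial decision.
SOURCE_WEIGHTS = {
    "example-wire-service.com": 1.5,
    "example-tabloid.com": 0.6,
}

# Sources excluded entirely: another human decision, invisible to the reader.
BLOCKED_SOURCES = {"example-blocked-site.com"}

def score_story(story):
    """Return a ranking score, or None if the story is filtered out."""
    if story["source"] in BLOCKED_SOURCES:
        return None  # the story simply never appears
    weight = SOURCE_WEIGHTS.get(story["source"], 1.0)  # the default is a choice too
    # Blending engagement with the hand-tuned source weight at a 70/30 split
    # is yet another judgment made by whoever wrote the code.
    return 0.7 * story["engagement"] + 0.3 * weight

stories = [
    {"source": "example-wire-service.com", "engagement": 0.4, "title": "Budget vote"},
    {"source": "example-tabloid.com", "engagement": 0.9, "title": "Celebrity feud"},
    {"source": "example-blocked-site.com", "engagement": 1.0, "title": "Rumor mill"},
]

ranked = sorted(
    (s for s in stories if score_story(s) is not None),
    key=score_story,
    reverse=True,
)
for s in ranked:
    print(s["title"], round(score_story(s), 2))
```

Even in a toy example like this, the weights, the blocklist and the scoring formula determine what readers see and what they never find out about—decisions a newspaper would call editorial.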

The reality is that Facebook routinely removes or censors content in the main news feed, whether it’s breast-feeding photos or pictures of the war in Syria, and it decides to down-rank or hide other kinds of content on a daily basis. Those are fundamentally editorial decisions that have an impact on the way that a billion users think about the world.

Although Facebook points out that Trending Topics is completely separate from the main news feed, the recent controversy has highlighted how much human beings are a part of everything Facebook does. And that is raising questions the social network hasn’t had to face before about how it makes the decisions it does.

On Thursday, the British newspaper The Guardian published what it said were leaked internal documents with editorial guidelines on how to handle Trending Topics, including rules around when to remove certain terms and when to include them.

Within hours of the Guardian story appearing, Facebook published a lengthy post on its site describing the purpose behind the Trending Topics section and explaining how items in that section are curated, first by an algorithm and then by human editors. The move appeared to be an attempt to get out in front of some of the criticism of potential bias, and Facebook also published a 28-page internal document on how Trending Topics functions.

In an interview with The Verge about Instant Articles (a feature that takes content from news partners and makes it mobile-friendly by customizing it for the Facebook platform), news-feed product manager Will Cathcart talked about Facebook’s approach to curating news, and maintained that all the social network wants to do is “give users what they want.” The definition of that, he said, is left up to the algorithm.

In effect, Cathcart said that with more than a billion users, Facebook can’t possibly make across-the-board decisions about what is newsworthy or what is crucial information for users to know and what isn’t. So everything is personalized, via the algorithm, in order to give users the impression that they are “informed,” as he described it.

Cathcart didn’t talk about any of the potential downsides of this approach, such as the “filter bubble” effect that can keep users from seeing potentially important topics because they don’t fit the platform’s preconceived notions of what that user is already interested in. Nor did he talk about whether Facebook bears any kind of editorial or journalistic responsibility because of its size and market power.

One upside of the Trending Topics controversy is that Facebook has become a little more open and transparent about how the feature works, and what principles guide those choices. But in many ways, the trending section is a sideshow. All of the same kinds of questions apply to how the main news feed works, and so far there hasn’t been much openness about that at all, nor any real admission that the company has any ethical or moral responsibility for how it shapes the worldview of its billion-plus users.
