Facebook recently announced that it has changed the way it handles a key section of its website. Instead of being curated and edited mostly by human beings, the social network said its “Trending Topics” feature will now be almost entirely produced and structured by computer algorithms.
This change was driven in part by a controversy that flared up earlier this year, in which human editors who worked on the feature said that they were encouraged to exclude certain conservative news sites and prevent them from trending, although Facebook denied this.
In its blog post about the new approach, the social network says it found no evidence of “systemic bias” in the way trending topics were selected, but nevertheless hopes that the change will make the feature “a way for people to access a breadth of ideas and commentary about a variety of topics.”
Presumably, Facebook is hoping that handing the feature over to an algorithm will make it easier to defend against these kinds of accusations, because computer code is seen as more objective and rational than human beings, and therefore not susceptible to bias.
Note: This was originally published at Fortune, where I was a senior writer from 2015 to 2017
The code behind Facebook’s news feed and trending features, however, isn’t some kind of omniscient, ruthlessly objective engine, as technology analysts continually point out. It’s designed and programmed by human beings, and in most cases it incorporates the biases of those programmers.
As it turns out, Facebook isn’t actually taking all of the human beings out of the Trending Topics process. The company noted in its post that human editors will still be used to weed out certain topics that don’t refer to actual news events. “For example, the topic #lunch is talked about during lunchtime every day around the world, but will not be a trending topic,” it said.
Another example presented itself on Monday, when a fake news story about Fox News host Megyn Kelly appeared in the trending-topics section and was called out by a number of journalists and other users.
The real point, however, is that simply moving from human editors to algorithms isn’t going to change the reality of whether Facebook’s news feed and trending-topics algorithms are biased. If either human beings or computer software is choosing which items qualify as interesting or newsworthy, then that decision automatically excludes certain other things.
Maybe the algorithm and the human editors will exclude topics like #lunch, but they may also exclude topics of genuine news value, and most users will never know. That creates a real risk for a social network that seems to want to become a hub for journalistic content.
In the aftermath of the police shooting of a black man in Ferguson, Mo., in 2014, the trending-topics feature showed nothing about the event to most users, showing instead innocuous posts about the “Ice Bucket Challenge.” Was that because most users weren’t sharing posts about Ferguson? Facebook would undoubtedly say yes, but the simple fact is that we don’t know.
These problems can become recursive as well. Even if the trending-topics feature faithfully represents what people are actually sharing or interacting with the most, any posts that Facebook’s news-feed algorithm hides or excludes outright, which it routinely does, will never get the chance to trend in the first place.
The bottom line is that Facebook’s programmers, who in a very real sense are also editors, are choosing what we see and when, and that has significant implications not just for journalism but for society as a whole.