Here’s why getting rid of human editors won’t solve Facebook’s bias problem

Facebook recently announced that it has changed the way it handles a key section of its website. Instead of being curated and edited mostly by human beings, the social network said its “Trending Topics” feature will now be almost entirely produced and structured by computer algorithms.

This change was driven in part by a controversy that flared up earlier this year, when human editors who worked on the feature said they had been encouraged to exclude certain conservative news sites and prevent their stories from trending, a claim Facebook denied.

In its blog post about the new approach, the social network says it found no evidence of “systemic bias” in the way trending topics were selected, but nevertheless hopes that the change will make the feature “a way for people to access a breadth of ideas and commentary about a variety of topics.”

Presumably, Facebook is hoping that handing the feature over to an algorithm will make it easier to defend against these kinds of accusations, because computer code is seen as more objective and rational than human beings, and therefore not susceptible to bias.

Note: This was originally published at Fortune, where I was a senior writer from 2015 to 2017
