Note: This was originally published as the daily newsletter for the Columbia Journalism Review, where I am the chief digital writer.
Last week, Facebook released a report detailing some of the most popular content shared on the site in the second quarter of this year. The report is a first for the social network, and part of what the company describes as an attempt to be more transparent about its operations: Guy Rosen, Facebook’s vice president of integrity, described the content review as part of “a long journey” to be “by far the most transparent platform on the internet.” If that is the case, however, the story behind the creation of the report shows the company still has a long way to go to reach that goal.
To take just one example, Facebook’s new content report appears to be, at least in part, a coordinated response to critical reporting from Kevin Roose, a New York Times technology columnist. Roose has been tracking the posts that get the most engagement on Facebook for some time, using the company’s own CrowdTangle tool, and has consistently found that right-wing pages get the most interaction from users.
This isn’t something Facebook likes to hear, apparently, so the content report tries to do two things to contradict that impression. First, it argues that engagement, or the number of times someone interacts with a link — the metric Roose uses for his Top 10 lists — isn’t the most important way of looking at content, and so it focuses instead on “reach,” or how many people saw a certain post. Second, it tries to show that even the most popular content amounts to only a tiny fraction of what gets seen on the platform (less than 0.1 percent, according to the report). As Robyn Caplan, a researcher with Data & Society, has pointed out, this seems to be an attempt to show that disinformation on the platform isn’t a big deal because so few people see it.