One big problem with Facebook as a platform for news: It deletes things

Note: This was originally published at Fortune, where I was a senior writer from 2015 to 2017.

As Facebook rolls out its “Instant Articles” initiative, in which news entities such as the New York Times and The Guardian are publishing directly to the social network, instead of just posting links to their own sites, media organizations and industry watchers are wrestling with the idea of Facebook as a platform for news. There’s the influence of the news-feed algorithm, for one thing, which is poorly understood—primarily because the company doesn’t really talk about how it works. But there’s also the fact that Facebook routinely deletes content, and it doesn’t talk much about that either.

In what appears to be one recent example, photojournalist Jim MacMillan happened to be walking through downtown Philadelphia shortly after a woman was run over by a duck boat (an amphibious vehicle that takes tourists around the harbor). Falling back on his journalistic training, he took a picture of the scene and posted it to his accounts on Instagram and Facebook, along with the caption “Police hang a tarp after a person was caught under #RideTheDucks boat at 11th and Arch just now. Looks very serious.”

“We do things like this to eliminate the possibility that loved ones will learn of the death from anyone but official sources and to spare viewers the traumatic effects of graphic imagery whenever possible,” he later wrote. “In other words, I was operating conservatively within standard practices of photojournalism. That was my best effort to be sensitive to the victim while responsible to the public’s right to know.”

When MacMillan went back to look at his earlier post, however, he found that it had been removed from both Instagram and Facebook, without any notice or alert. As he put it in his subsequent update: “Shouldn’t I have been offered the opportunity to respond? Clearly, I have made the mistake of placing any measure of trust in a corporate platform with little concern for truth or history. But now every post means less to me, knowing firsthand that only those which please Facebook will survive.”

Melody Kramer, a former digital strategist at National Public Radio, said on Twitter that “things like this should worry everyone who cares about news.”

[Duck boat photo]

In a comment on his own post, MacMillan said he had since been in contact with a couple of Facebook staffers about the incident; both said they could find no record of the image or post having been taken down, and in fact could find no evidence that anything of that nature had ever been uploaded. I asked for an official comment and was told that Facebook staff “reviewed both the Facebook and Instagram accounts and found no record of any action on our part to remove this content.”

Luckily, MacMillan said he always uses a third-party service called If This Then That (IFTTT) to post images and other content to Instagram and Facebook, so he had a permanent copy of both the image and the caption.
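
That habit generalizes. As a rough sketch of the same “keep your own copy” idea, the snippet below archives an image and its caption to local storage before anything is posted. The archive_post helper, the directory layout, and the field names are all my own invention, not part of IFTTT or any platform’s API.

```python
# A minimal sketch of archiving a post locally before publishing it.
# Everything here (helper name, layout, fields) is illustrative only;
# it is not IFTTT's API, just the general "archive first" practice.
import json
import shutil
import time
from pathlib import Path

ARCHIVE = Path("post-archive")

def archive_post(image_path: str, caption: str) -> Path:
    """Save a permanent local copy of an image and its caption."""
    stamp = time.strftime("%Y%m%d-%H%M%S")
    entry = ARCHIVE / stamp
    entry.mkdir(parents=True, exist_ok=True)
    # Copy the image file alongside a small JSON record of the caption.
    shutil.copy(image_path, entry / Path(image_path).name)
    record = {"caption": caption, "archived_at": stamp}
    (entry / "caption.json").write_text(json.dumps(record, indent=2))
    return entry

# Archive first, then post to the platforms however you normally would:
# archive_post("duckboat.jpg", "Police hang a tarp after a person was caught...")
```

Archiving before posting, rather than after, means the copy exists even if the upload is later removed without notice.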

Whether Facebook ever confirms that it deleted MacMillan’s image or not, it’s a well-known fact that the giant social network deletes and/or censors content all the time, for a variety of reasons. In some cases it has to remove content because of a court order, but in other cases it will remove even relatively inoffensive images and posts—such as pictures of women breast-feeding their children—because they breach the site’s community standards, or because someone complains about the content.

Investigative reporter and blogger Eliot Higgins has talked about how Facebook’s haste to censor what might be disturbing or controversial imagery can directly impact our understanding of a developing news story: the site removed several pages and posts from Syrian dissident groups and terrorist factions, but in doing so it effectively deleted a key source of information about chemical weapon attacks by the Syrian government. That meant less knowledge about a potential war crime and its aftermath.

As a corporate entity, Facebook has every right to delete or censor whatever it wants, of course, since the First Amendment only applies to the actions of the government in restraining speech. And it’s not surprising that a social network would want to maintain certain standards so that users aren’t offended by the images or commentary they see. But what happens when that network wants to become a platform for journalism?

Emily Bell of Columbia University’s Tow Center and NYU journalism professor Jeff Jarvis have both talked about how the media industry will have to confront this issue as platforms like Facebook become more and more important distribution channels for news. What responsibilities or duties does the platform have in such cases? Facebook maintains that it doesn’t choose what to show people—that its algorithm simply reflects the choices that users make—but this seems more like a way to dodge responsibility than an accurate reflection of what’s happening, especially when it is actively deleting newsworthy content.
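
Even taken at face value, the “algorithm just reflects user choices” defense glosses over the fact that someone at the platform decides which signals count and how much. The toy ranker below makes that concrete; it is purely illustrative, bears no resemblance to Facebook’s actual, unpublished algorithm, and every signal name and weight in it is my own invention.

```python
# A toy feed ranker, purely illustrative; not Facebook's actual
# algorithm, which has never been published. The point: even a ranker
# "driven by user behavior" embeds choices the platform itself makes.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    author_affinity: float  # how often the viewer engages with this author
    engagement: int         # likes, comments, and shares so far
    age_hours: float        # time since posting

def score(post: Post) -> float:
    # The weights (2.0 and 0.1) and the six-hour half-life are design
    # decisions made by the platform, not by users, which is why
    # "the algorithm just reflects user choices" is only half true.
    decay = 0.5 ** (post.age_hours / 6.0)
    return (2.0 * post.author_affinity + 0.1 * post.engagement) * decay

posts = [
    Post("breaking news photo", author_affinity=0.2, engagement=40, age_hours=1),
    Post("friend's vacation album", author_affinity=0.9, engagement=5, age_hours=3),
]
for post in sorted(posts, key=score, reverse=True):
    print(f"{score(post):6.2f}  {post.text}")
```

Swap the weights and the same posts rank in a different order; that is an editorial decision, however automated it may be.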
