It’s getting harder and harder to believe Facebook when it says it’s not a media company. The social network just said that even if an image or a story posted on the site breaches its community standards, it will leave the post up if it is deemed to be “newsworthy.”
But how will Facebook determine whether something is newsworthy and therefore deserves not to be deleted by the site’s censors? That remains unclear.
The site’s responsibilities as a media entity were also highlighted Friday by a report that some staffers wanted to delete posts by the Trump campaign because they believed the posts qualified as hate speech, but they were ultimately overruled by CEO Mark Zuckerberg.
According to the Wall Street Journal, the decision to allow Trump’s posts to remain resulted in complaints that the founder and CEO was bending the site’s rules for the Republican candidate. Some employees who review content on the site reportedly threatened to quit.
Note: This was originally published at Fortune, where I was a senior writer from 2015 to 2017.
When it comes to the newsworthiness exception, Joel Kaplan and Justin Osofsky — vice presidents of Global Public Policy and of Global Operations & Media Partnerships, respectively — said in a blog post that the site came to its decision based on feedback from users and partners.
Although the post doesn’t mention it specifically, much of this feedback likely came as a result of a recent incident in which Facebook deleted posts containing an iconic Vietnam War image of 9-year-old Kim Phuc running down the road naked after her village was bombed.
Not only did Facebook delete the original image after a Norwegian newspaper editor uploaded it as part of a series on war photography, but it also deleted the editor’s post about the deletion. It then blocked his account, and even deleted a post by the prime minister of Norway, who protested Facebook’s censorship of the image.
The social network apologized for the deletions, and said that staffers were compelled to remove the image because it depicts a naked child, which breaches the site’s community standards.
Now, Kaplan and Osofsky say that Facebook will leave up certain images and posts. “We’re going to begin allowing more items that people find newsworthy, significant, or important to the public interest — even if they might otherwise violate our standards,” they write. “We will work with our community and partners to explore exactly how to do this.”
The problem, as Facebook sees it, is that some images and content that seem innocuous in one country or culture may be seen as offensive, or even be illegal, in another.
Twitter has also struggled with this problem; its solution, introduced in 2014, is what it calls “country withheld content.” Tweets containing pro-Nazi references, for example, are blocked from view for anyone within Germany, since that kind of content is against the law there.
This may be the kind of thing that Kaplan and Osofsky are talking about when they say that the company is looking at “new tools and approaches to enforcement.”
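To make the idea concrete, here is a minimal sketch of how country-based withholding works in principle. The post IDs, country codes, and function names are purely illustrative assumptions; this is not Twitter’s or Facebook’s actual implementation.

```python
# Illustrative sketch only — not Twitter's or Facebook's real code.
# The general idea of "country withheld content": a post stays up
# globally but is hidden from viewers in countries where it is restricted.

# Hypothetical mapping of post IDs to the countries where each is withheld.
WITHHELD_IN = {
    "post-123": {"DE"},  # e.g. pro-Nazi content withheld in Germany
}

def is_visible(post_id: str, viewer_country: str) -> bool:
    """Return False if the post is withheld in the viewer's country."""
    return viewer_country not in WITHHELD_IN.get(post_id, set())

# The same post is hidden for a viewer in Germany but visible elsewhere.
assert is_visible("post-123", "DE") is False
assert is_visible("post-123", "US") is True
```

The point is simply that enforcement becomes a per-country decision rather than a global deletion.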
The larger issue, however, is that Facebook is increasingly having to make these kinds of calls about what content is permitted and what isn’t, and in fact it has been making those editorial decisions for years — all while denying that it is a media company.
For example, it routinely removes breast-feeding photos or even articles and images having to do with breast cancer, as it apparently did this week with a breast-cancer video advertisement.
Activists and political dissidents are also familiar with having their posts and even their accounts disappear without warning. Investigative journalist Eliot Higgins has talked about how Facebook’s deletion of pages involving violence in Syria has prevented journalists like him from collecting important information about the war there.
In addition to those kinds of decisions, Facebook has also been criticized for hosting so many hoaxes and fake news stories, many of which are produced by a shadowy group of political sites.
Human editors used to remove such fakes from the Trending Topics section of the site, but Facebook got rid of its editors following a controversy over alleged bias in their decision-making. Trending Topics is now run by algorithm, just like the main Facebook news feed.
Many believe Facebook can no longer argue that an algorithm makes all of its decisions about which news to include, and that it therefore has no responsibility to explain its news judgment or decision-making process. It is a media entity with 1.5 billion users, and it needs to start acting like one.