When you play a game, it’s handy to have a score, so you know how you did compared to all the other players. But what if the score is one that Facebook assigns you based on your estimated “trustworthiness,” and the criteria behind the score are kept secret from you? That appears to be the case, according to a report from The Washington Post on Tuesday. A Facebook product manager in charge of fighting misinformation (there’s a job title for the ages) told the paper that the social network has developed the ranking system over the past year as it has tried to deal with “fake news” on the platform. Every user is given a score between zero and one, which indicates whether they are considered trustworthy when it comes to either posting content or flagging already-posted content as fake.
It’s “not uncommon for people to tell us something is false simply because they disagree with the premise of a story or they’re intentionally trying to target a particular publisher,” product manager Tessa Lyons told the Post. The trustworthiness score is designed in part to guard against this kind of gaming of the process. Facebook also took pains to point out that there is no single “reputation score” given to users, and that the trustworthiness ranking is not an absolute indicator of a person’s credibility. It is just one measurement among thousands of behavioral clues, Lyons said, which Facebook uses to determine whether a post is legitimate and whether it was flagged improperly.
The speed with which Facebook tried to reassure users that they don’t have a single reputation score isn’t surprising, given all the attention on what is happening with social activity in China. There, the government is assigning all Chinese citizens a “social credit score” based on their behavior both online and offline, including what they share via networks like WeChat (which is a little like Facebook, Instagram, Snapchat, and PayPal combined into a single app). This social credit score can then be used to determine who gets access to certain services, including schools. No one is suggesting that anything quite so dystopian is going on at Facebook, but the idea of being assigned a secret trustworthiness score by a network that controls the information diet of more than two billion people still feels a tad uncomfortable.
Facebook’s personal trustworthiness score seems like a fairly obvious spinoff of its other attempts to crack down on fake news and misinformation, in the wake of the Internet Research Agency’s trolling of the network during the 2016 election, and the resulting Congressional hearings. The social network announced earlier this year that it is working on a trust score for publications, based in part on surveys of users to find out which media outlets they trust and don’t trust. Presumably that data will in turn influence Facebook’s rating of users who vote for specific publications in those surveys, or who routinely flag their content as fake.
The big question is what else a user’s trust rating will influence. Will it help determine whether their own content is favored by Facebook’s all-powerful News Feed algorithm? And if someone flags a lot of posts from media outlets as fake even when they aren’t, does that mean their own posts will be assumed to be fake as well? Will Facebook ever share this trustworthiness score with external partners such as banks and credit companies, or the federal government? As usual with so much that goes on at Facebook and the other major web platforms, we simply don’t know, and probably never will.
Here’s more on Facebook and its tangled relationship with both the media and politics:
- Fighting trolls: The social network says it is working hard to prevent Russian trolls or any other foreign agents from meddling in the upcoming US midterm elections, according to a Recode interview with Facebook product head Samidh Chakrabarti. “I feel like we have a good handle and a good plan for many of the problem types that we’re seeing,” he said. “But whether we will get far enough, fast enough, is really the question.”
- Facebook and hatred: Is Facebook helping to foment hatred towards immigrants? New research indicates that it might be. A study looked at thousands of anti-immigrant attacks in Germany and correlated them with various factors, and found that anti-immigrant violence was more likely in towns with higher rates of Facebook usage. This held true regardless of whether the town was large or small, wealthy or poor. Facebook-owned WhatsApp has also been associated with violence.
- No more discrimination: Facebook says it is removing about 5,000 criteria from its ad-targeting options, in an attempt to cut down on potential discrimination in advertising, according to a report by BuzzFeed. Advertisers will no longer be able to automatically hide their ads from users who say they are interested in things like Passover, Evangelicalism, Native American culture, Islamic culture, and Buddhism. The US Department of Housing and Urban Development recently filed a complaint alleging that the social network was enabling discriminatory housing practices with its ad targeting options.
- Banking info: According to a report earlier this month, Facebook has contacted some large US banks, asking for information on the banking habits of its users, including credit-card transactions and account balances. That might seem a little troubling in the context of the social network’s trustworthiness score, not to mention the recent controversy with data leakage via Cambridge Analytica, but the company says it is not “actively seeking” banking information on its users.
Other notable stories:
- Maya Kosoff at Vanity Fair says that by admitting in a recent interview that Twitter has a left-leaning bias, CEO Jack Dorsey essentially poured gasoline on the right’s favorite conspiracy theory: that the company deliberately squelches conservative content. “Dorsey effectively handed conservatives more ammunition, perpetuating the cycle that forces him to continually tiptoe around them,” she says.
- Now that more than 400 newspapers have declared their solidarity with each other in fighting the spread of President Trump’s anti-media attitudes, what should the press do next? Melody Kramer and Betsy O’Donovan put together a list for the Poynter Institute of seven things the media should be focusing on now, including showing their work to prove its value, and sharing information with other media outlets.
- Ruairi Casey writes for CJR about a unique publication called Kanere, which is written and published by and for residents of a large refugee camp in Kenya known as Kakuma. The paper was founded by Qaabata Boru, an Ethiopian journalism student who was jailed for an article he wrote in 2005 about that country’s elections, and who later fled the country to avoid persecution both for his political views and his status as a member of the Oromo ethnic group.
- In its latest analysis of the state of the news media, the Pew Research Center found that the audience for almost every major sector of the US news media fell last year, with the only significant exception being radio. Both local and network TV news fell by 7 percent, cable TV’s audience dropped by 12 percent, and digital news audiences declined by 5 percent. Circulation at US daily newspapers, meanwhile, fell by 11 percent.
- The Solutions Journalism Network said Tuesday that it has partnered with Google to launch a feature called “Tell Me Something Good,” which will be available via Google’s smart assistant device, the Google Home, as well as on any smartphone that has the Google Assistant app installed. Users will be able to say “Tell me something good,” and have a story from the Solutions Journalism Network read to them.
- Washington Post national political correspondent Jenna Johnson found something interesting being sold at a booth stationed outside Donald Trump’s rally in Charleston, West Virginia: Books that consist solely of the president’s tweets from the first year of his presidency, bound in blue leather, being sold for $35 each. “There’s going to be one volume for each year,” a woman selling the books told Johnson. “It’s a lot of tweeting.”