Note: This was originally published as the daily newsletter at the Columbia Journalism Review, where I am the chief digital writer.
In August, the major social-media platforms released statements about how they intended to handle misinformation in advance of the November 8 midterms, and for the most part Meta (the parent company of Facebook), Twitter, Google, and TikTok said it would be business as usual—in other words, that they weren’t planning to change much. As the midterms draw closer, however, a coalition of about 60 civil rights organizations says business as usual is not enough, and that the platforms have done far too little to stop continued misinformation about “the Big Lie”—that is, the unfounded claim that the 2020 election was somehow fraudulent. Jessica González, co-chief executive of the advocacy group Free Press, which is helping to lead the Change the Terms coalition, told the Washington Post: “There’s a question of: Are we going to have a democracy? And yet, I don’t think they are taking that question seriously. We can’t keep playing the same games over and over again, because the stakes are really high.”
González and other members of the coalition say they have spent months trying to persuade the major platforms to combat election-related disinformation, but their lobbying campaigns have had little or no impact. Naomi Nix reported for the Post last week that members of Change the Terms have sent multiple letters and emails, and raised their concerns in Zoom meetings with platform executives, but have seen little action as a result, apart from statements about how the companies plan to do their best to stop election misinformation. In April, the same 60 social-justice groups called on the platforms to “Fix the Feed” before the elections. Among their requests were that the companies change their algorithms to “stop promoting the most incendiary, hateful content”; that they “protect people equally,” regardless of what language they speak; and that they share details of their business models and moderation practices.
“The ‘big lie’ has become embedded in our political discourse, and it’s become a talking point for election-deniers to preemptively declare that the midterm elections are going to be stolen or filled with voter fraud,” Yosef Getachew, a media and democracy program director at the government watchdog Common Cause, told the Post in August. “What we’ve seen is that Facebook and Twitter aren’t really doing the best job, or any job, in terms of removing and combating disinformation that’s around the ‘big lie.’ ” According to an Associated Press report in August, Facebook “quietly curtailed” some of the internal safeguards designed to smother voting misinformation. “They’re not talking about it,” Katie Harbath, a former Facebook policy director who is now CEO of Anchor Change, a technology policy advisory firm, told the AP. “Best case scenario: They’re still doing a lot behind the scenes. Worst case scenario: They pull back, and we don’t know how that’s going to manifest itself for the midterms on the platforms.”
The Change the Terms coalition kept up the pressure on the platforms throughout the past six months because the groups wanted to “avoid what is the pitfall that inevitably has happened every election cycle, of their stringing together their efforts late in the game and without the awareness that both hate and disinformation are constants on their platforms,” Nora Benavidez, director of digital justice at Free Press, told the Post. The coalition first called for the reduction of online hate after the deadly neo-Nazi march in Charlottesville in 2017, according to the group’s website. “Yet some technology companies and social-media platforms remain hotbeds of hate and disinformation,” it says. “This is not hyperbole: The insurrection on the U.S. Capitol on Jan. 6, 2021, was organized with the use of major social-media platforms.”
As Nix notes, the coalition’s pressure on the social-media platforms was fueled in part by revelations from Frances Haugen, the former member of Facebook’s integrity team turned whistleblower who leaked thousands of internal documents last year. Haugen testified before Congress that shortly after the 2020 election, the company had rolled back many of the election integrity measures that were designed to stamp out misinformation. An investigation by the Post and ProPublica last year showed that a number of Facebook groups became hotbeds of misinformation alleging a fraudulent election in the days and weeks leading up to the attack on the Capitol. Efforts to police such content, the investigation found, “were ineffective and started too late to quell the surge of angry, hateful misinformation coursing through Facebook groups—some of it explicitly calling for violent confrontation with government officials.”
Facebook “took its eye off the ball in the interim time between Election Day and January 6,” a former member of Facebook’s integrity team told the Post and ProPublica. “There was a lot of violating content that did appear on the platform that wouldn’t otherwise have.” (Drew Pusateri, a spokesman for Meta, said in a statement to the Post and ProPublica at the time that “the notion that the January 6 insurrection would not have happened but for Facebook is absurd.”) A recent report showed that misinformation about the election helped create an entire ecosystem of disinformation-peddling social-media accounts, whose growth the platforms seem to have done little to stop. In May, the Post wrote about how Joe Kent, a Republican congressional candidate, had claimed “rampant voter fraud” in the 2020 election in an ad on Facebook. The ad was reportedly just one of several similar ads that went undetected by the platform’s internal systems.
YouTube told the Post the company planned to enforce its policies, and had removed “a number of videos related to the midterms.” TikTok said it supports the Change the Terms coalition because “we share goals of protecting election integrity and combating misinformation.” Facebook declined to comment, instead pointing to an August news release listing the ways the company said it planned to promote accurate information about the midterms. Twitter said it would be “vigilantly enforcing” its content policies. Earlier this year, however, the company said it had stopped taking steps to limit misinformation about the 2020 election. Elizabeth Busby, a spokesperson, told CNN at the time that the company hadn’t been enforcing its integrity policy related to the election since March 2021. Busby said the policy was designed to be used “during the duration” of an election, and that, with the 2020 election over, it was no longer necessary.
Here’s more on the platforms:
- Whiffing it: In Protocol’s Policy newsletter, Ben Brody writes that the election misinformation problem is about more than just the US. “Take Brazil,” he says. “President Jair Bolsonaro appears to be poised to lose his reelection bid, which he kicked off by preemptively questioning the integrity of the country’s vote.” Facebook has already missed a lot of misinformation in Brazil, critics say. In addition, Brody notes, there are potentially contentious elections elsewhere, including in nations “with civic turmoil or tenuous freedom, such as Turkey, Pakistan, and Myanmar.” Elections in India are also expected in 2024. “If we want to fix this, we need to acknowledge the problem is bigger than Big Tech whiffing it on content moderation, especially in the U.S.,” Brody writes.
- The time of Nick: Nick Clegg, president of global affairs at Meta, said he will be the one to decide whether to reinstate former president Donald Trump’s account in January of next year, according to Politico. Trump was banned from Facebook for two years in the wake of the attack on the Capitol on January 6, 2021. At an event in Washington put on by Semafor, the news startup from former media reporter Ben Smith, Clegg said whether to extend Trump’s suspension is “a decision I oversee and I drive,” although he said he would consult with Mark Zuckerberg, Meta’s CEO. “We’ll talk to the experts, we’ll talk to third parties, we will try to assess what we think the implications will be,” Clegg said.
- Predator and prey: More than 70 lawsuits have been filed this year against Meta, Snap, TikTok, and Google claiming that adolescents and young adults have suffered anxiety, depression, eating disorders, and sleeplessness as a result of their addiction to social media, Bloomberg reports. In at least seven cases, the plaintiffs are the parents of children who’ve died by suicide. Bloomberg said the cases were likely spurred in part by testimony from Facebook whistleblower Haugen, who said the company knowingly preyed on vulnerable young people to boost profits, and who shared an internal study that found Instagram worsened body-image issues for some adolescent girls.
Other notable stories:
- The public-stock offering of Truth Social, the media platform Donald Trump started after he was banned from Twitter and Facebook in January 2021, could be in trouble, CNBC reported yesterday. Digital World Acquisition Corp., the SPAC (special purpose acquisition company) that planned to take Truth Social public, has lost $138 million in private financing, the company said in a recent regulatory filing. Investors said they pulled their funds from Digital World because of legal problems facing the company and Trump, as well as the app’s lackluster performance. Trump has just over four million followers on Truth Social; at one time, he had over eighty million followers on Twitter.
- British broadcasters say they have been told by Buckingham Palace that they can save only 60 minutes’ worth of TV footage from Queen Elizabeth’s funeral, and that the royal family has a veto over any clips included in that total, the Guardian reported. “Once the process is complete, the vast majority of other footage from ceremonial events will then be taken out of circulation,” the paper wrote. “Any news outlets wishing to use unapproved pieces of footage would have to apply to the royal family on a case-by-case basis.”
- South Korea’s President Yoon Suk Yeol has accused the country’s media of damaging its alliance with the US, after a TV network aired a recording of him apparently swearing about US lawmakers following a session at last week’s United Nations General Assembly in New York. A South Korean broadcaster caught Yoon on tape talking to his aides after a chat with US President Joe Biden, and Yoon appeared to use a profanity in referring to the US Congress. “Wouldn’t it be too darn embarrassing for Biden if those fuckers at legislature don’t approve?” Yoon allegedly said. Yoon’s press secretary said the president was referring to the South Korean parliament.
- Journalists in Guatemala say public officials have used a law designed to stop violence against women to prevent journalists from reporting on certain stories, according to the Los Angeles Times. The paper cited eight recent instances in which journalists received restraining orders for causing “psychological violence” to the subjects of the stories they were reporting, in some cases the female relatives of public officials accused of corruption. The law was passed in 2008 to reduce the country’s high rates of gender-based violence.
- As protests over the death of Mahsa Amini continue to spread across Iran, the government has responded by shutting down mobile internet services and disrupting social-media sites like WhatsApp and Instagram, Wired reports. Social-media platforms have been filled with videos of female protesters burning their headscarves or waving them in the air, in defiance of the government’s ban on women showing their hair (Amini was arrested for allegedly wearing her headscarf improperly, and died in police custody).
- Police in Pakistan said Sunday that they had arrested Ayaz Amir, a well-known columnist and TV personality, for his alleged involvement in the death of his daughter-in-law. Amir reportedly appeared in court in Islamabad on Sunday for interrogation. He is accused of helping his son, Shahnawaz, who police say attacked and killed his wife, Sara Inam, 37, at their home. Police say Shahnawaz then attempted to hide his wife’s body in a bathtub.
- Some journalists in Chicago fear that their access to police radio frequencies may be limited after the Chicago Police Department announced that it will switch to digitally encrypted radio channels at the end of the year, the Chicago Tribune reported. City officials said the move will prevent outsiders from interjecting rogue chatter, and that Broadcastify, a live audio platform, will stream the department’s radio communications online for free, on a thirty-minute delay. However, dispatchers can censor information from those broadcasts, which has raised transparency concerns among local journalists.