Note: This was originally published as the daily newsletter for the Columbia Journalism Review, where I am the chief digital writer.
This week, Meta—the parent company of Facebook—was widely criticized for handing over private messages between a mother and her 17-year-old daughter in Nebraska, in which they discussed ending the girl’s pregnancy. According to the Lincoln Journal Star, police in Nebraska got an anonymous tip in April that the girl had suffered a miscarriage and then buried the remains. They charged the girl and her mother in early June with a felony—for disposing of the body—and two misdemeanors, for concealing evidence and making a false report. After those charges were filed, the police officer investigating the case got a court order that forced Facebook to produce the private message history between the mother and her daughter, and found evidence that they ended the pregnancy by using abortifacient pills. The mother was charged with two additional felonies: one for performing or attempting an abortion on a pregnancy at more than 20 weeks, which is illegal in Nebraska, and one for performing an abortion without a medical license.
Although the ending of the pregnancy and the court order both took place before Roe v. Wade was overturned, many argue the incident is still a sign of what might happen now that abortion has become illegal or is likely to become illegal in a number of states. The story “shows in shocking detail how abortion could and will be prosecuted in the United States, and how tech companies will be enlisted by law enforcement to help prosecute their cases,” Vice wrote, in a story detailing the text messages. But Andy Stone, head of communications at Meta, said on Twitter that “nothing in the valid warrants we received from local law enforcement in early June, prior to the Supreme Court decision, mentioned abortion. The warrants concerned charges related to a criminal investigation and court documents indicate that police at the time were investigating the case of a stillborn baby who was burned and buried, not a decision to have an abortion.” The warrants also originally included non-disclosure orders, Stone said, “which prevented us from sharing any information about them” (the orders have been lifted).
Casey Newton, who writes a technology-focused newsletter called Platformer, said the consensus he saw emerging on Twitter and elsewhere following the incident was that Facebook was wrong to turn the private messages between the mother and her daughter over to police. “But of course Facebook complied with law enforcement’s request,” he wrote. “All the company would have known at the time is that police were investigating a stillborn fetus, and on what basis could the company credibly reject that request?” Both Google and Facebook receive tens of thousands of such requests every year from government bodies and law enforcement, and expecting them to resist all of those demands seems naive. Even if Facebook could somehow determine whether a specific court order was worthy or not, Newton wrote, “there are costs to continuously flouting the government [and] you can bet that somewhere a Republican attorney general is salivating over a court battle that would put Facebook, abortion, and his name in the headlines.”
The larger question is what Facebook—or any of the other major social platforms—will do if there is a similar court order related to an actual abortion. As Corynne McSherry, legal director at the Electronic Frontier Foundation, told CNBC in June: “If you create huge databases of information, what you’re also creating is sort of a honeypot for law enforcement.” In his Twitter thread, Stone, the Meta spokesperson, didn’t elaborate on what the company might do about future requests, and most of the major platforms have been similarly close-mouthed about the issue. After Roe v. Wade was overturned, CNN asked all of the social networks how they plan to handle such requests for personal data; from many of them, including Amazon, Apple, Google, Meta, and Twitter, it received either no response, a “no comment,” or a simple restatement of company policy.
More restrictive abortion laws are part of the problem, but the bigger issue is the vast amount of personal data that Facebook and other services collect about their users. As Newton wrote in an earlier edition of his newsletter, abortion is “tech’s next big reputational risk. If Google and its peers aren’t going to stop cooperating with law enforcement, they need to start collecting less data.” Albert Fox Cahn, a lawyer who is also executive director of the Surveillance Technology Oversight Project, told Bobby Allyn, a reporter for NPR, that “Google is increasingly the cornerstone of American policing.” Since 2017, Google has complied with between 81 and 83 percent of the more than 50,000 law enforcement requests it gets every year. Meta said in its most recent transparency report that it got close to 60,000 requests for information in the US last year, and provided at least some information in response to more than 87 percent of them.
Data sharing happens on this scale in part because the US has no national privacy law covering personal data posted to social networks such as Facebook and Twitter. As the New York Times explained in a feature published last year, American privacy laws are a hodgepodge of different rules covering different sectors. “Historically, in the US we have a bunch of disparate federal [and state] laws,” said Amie Stepanovich, executive director at the Silicon Flatirons Center at Colorado Law. “[These] either look at specific types of data, like credit data or health information,” Stepanovich said, “or look at specific populations like children, and regulate within those realms.” A number of legislators have proposed that the US adopt a digital data code much like the European Union’s General Data Protection Regulation, or GDPR, but nothing has made it into law yet. And until it does, the social platforms will remain free to share whatever personal data they wish.
Here’s more on Facebook:
My body, my data: One of the bills before Congress that would restrict the sharing of private information is the My Body, My Data Act, Natasha Singer and Brian X. Chen note in the New York Times. “Introduced in June by Representative Sara Jacobs, a California Democrat, the bill would prohibit companies and nonprofits from collecting, keeping, using or sharing a person’s reproductive or sexual health details without the person’s written consent,” Singer and Chen write. “Another bill, the Fourth Amendment Is Not for Sale Act, would prevent law enforcement and intelligence agencies from buying a person’s location records and other personal details from data brokers.”
My phone, my location: After Roe v. Wade was overturned, Google announced that it would voluntarily begin removing personal location data that might show someone had visited sites such as counseling centers, domestic violence shelters, and abortion clinics, but other platforms, such as Facebook, have made no such commitments. And as Newton points out, “telecom companies like AT&T, Verizon, and T-Mobile collect and even sell sensitive user data, including the locations of the cellular towers that your smartphone pings as you move about the world. And they all (shamefully!) declined to comment about how their data collection will intersect with abortion prosecutions.”
My data, the FTC’s rules: On Thursday, the Federal Trade Commission started a process known as “rulemaking” around privacy and data sharing—the first step toward formal rules on what the companies it oversees can and can’t share. Such measures are needed “to stop corporations and other commercial actors from abusing people’s personal data,” the FTC said in a news release. In June, the agency announced that it was considering the rulemaking process in order to “safeguard privacy and create protections against algorithmic decision-making that results in unlawful discrimination.”
My chat, my secrets: Meta has talked in the past about offering users the ability to encrypt Facebook Messenger content, so that neither the company nor law enforcement could see what is written there, even with a warrant (messages sent via WhatsApp, which is also owned by Meta, are already encrypted by default). On Thursday, Meta said that it will begin testing end-to-end encryption as the default option for some users of Facebook Messenger on Android and iOS. A Facebook spokesperson told The Guardian the test is limited to a couple of hundred users for now, and that the decision to start rolling it out was “not a response to any law enforcement requests.”
Other notable stories:
ValueAct Capital Management, known as an activist investor that likes to force change at the companies it invests in, has accumulated a stake in the New York Times Co., Bloomberg reported Thursday. “San Francisco-based ValueAct said in a letter to investors Thursday that it now owns a 7% stake in the Times,” the news service said. The fund said it believed “the current valuation doesn’t reflect the company’s long-term growth prospects in almost any potential economic environment and that management has several opportunities to offset the macroeconomic headwinds that face the industry.”
Twitter posted an update on what it plans to do in order to curb misinformation during the US mid-term elections. The company said that its civic integrity policy “covers the most common types of harmful misleading information about elections and civic events, such as: claims about how to participate in a civic process like how to vote, misleading content intended to intimidate or dissuade people from participating in the election, and misleading claims intended to undermine public confidence in an election – including false information about the outcome of the election.”
The Economist ran a story about the epidemic of obesity among Arab women and used a photo of Enas Taleb, a veteran actress and talk-show host, to illustrate the problem. Now Taleb says she is suing the British publication. “I am demanding compensation for the emotional, mental and social damage this incident has caused me,” she told New Lines magazine. “My legal team and I are arranging the next steps.”
Pakistan’s interior minister, Rana Sanaullah, said he is filing sedition charges against Shahbaz Gill, an aide to Pakistan’s former Prime Minister Imran Khan, and ARY TV, a local media company, Reuters reported. Sanaullah said comments made by Gill and aired on ARY TV could incite mutiny among the country’s military. Police officials told Reuters that both Gill and Ammad Yousaf, head of news at ARY TV, have been arrested, and Pakistan’s media regulator said in a statement that it had ordered ARY News to be taken off air for airing “false, hateful and seditious” content.
Ben Thompson, a technology and media analyst who writes a newsletter called Stratechery, interviewed Meredith Kopit Levien, the CEO of the New York Times, about the newspaper’s strategy in buying Wordle and The Athletic, and about its new goal of 15 million subscribers by 2027. Asked what fears or concerns about the paper’s growth keep her up at night, Kopit Levien said, “I’m 51 years old, I run a public company, and I’m a mom. Everything keeps me up at night.”
“I started playing word games as a way to stop reading the news first thing in the morning,” Lyz Lenz writes, in a piece for the Nieman Journalism Lab. “Death counts, infection rates, mass shootings, disasters on our overheating planet, and what could I do about it all? I’ve protested, voted, and written. And yet, it doesn’t seem to make a material difference. Or that’s how it feels most days — tossing words into the unchanging void of history.”
A new Pew Research Center survey of American teenagers ages 13 to 17 finds TikTok has rocketed in popularity since its North American debut several years ago and now is a top social media platform for teens. Some 67 percent of teens say they ever use TikTok, with 16 percent saying they use it almost constantly. The share of teens who say they use Facebook, a dominant platform among teens in the Center’s 2014-15 survey, has plummeted from 71 percent then to 32 percent today.