Note: This was originally published as the daily newsletter for the Columbia Journalism Review, where I am the chief digital writer.
Several weeks ago, the Wall Street Journal published a series of six investigative stories about Facebook, alleging a pattern of questionable behavior on the part of both the social network and its photo-sharing service, Instagram. One alleged that changes to the Facebook news feed algorithm, purportedly designed to improve the news-reading experience, actually had the opposite effect, and “turned it into an angrier place.” Another said that the company knew about the negative effects its Instagram service was having on the mental health of young girls, because researchers working at Facebook had repeatedly mentioned them during briefings with senior executives, but Facebook took little or no action. Other Journal stories revealed a little-known feature that allowed celebrities to avoid responsibility for breaching Facebook’s rules, and claimed that the company knew its services were being used by drug cartels and human-trafficking networks but routinely failed to do anything to stop it (Facebook responded that the stories are inaccurate and that it cares deeply about the effect its products have on users, including young girls).
The Journal reports were all based on what the paper called “an extensive array of internal company communications” given to it by a whistleblower, a former Facebook staffer who copied the documents before quitting the company over disagreements with its behavior. On Sunday, the whistleblower revealed herself on 60 Minutes to be Frances Haugen, a former product manager at Facebook who has also worked for Google, Pinterest, Yelp, and a number of other technology companies. On Tuesday, Haugen testified before the Senate Commerce subcommittee on consumer protection, product safety, and data security about the potential dangers of Instagram for young users (Haugen also posted her testimony to her personal website). In both her 60 Minutes interview and her congressional testimony, Haugen made the same central point: Facebook knew about the dangers of the recommendation algorithms that power both it and Instagram, but chose to do nothing. It knew about these dangers, Haugen said, because the company’s own researchers had flagged them repeatedly in multiple research papers.
“The company’s leadership knows how to make Facebook and Instagram safer but won’t make the necessary changes because they have put their astronomical profits before people,” Haugen said during her testimony to the Senate committee. The bottom line, she said, is that Congress must take action; she compared Facebook to other industries, such as tobacco companies and car makers, that wound up being regulated by the government in order to protect consumers from harm. One of the big challenges with Facebook, she argued, is that legislators have no idea how the company’s products work, because it is so reluctant either to share data from its own internal research or to provide data to outside scientists. “This inability to see into Facebook’s actual systems and confirm they work as communicated is like the Department of Transportation regulating cars by only watching them drive down the highway,” Haugen told the Senate committee.
Many critics of Facebook, including law professor and former congressional candidate Zephyr Teachout (who spoke with CJR as part of a discussion series on our Galley platform), have argued that antitrust action is the only solution to the problems it creates, and that the company needs to be broken up and forced to sell subsidiaries like Instagram and WhatsApp. Surprisingly, Haugen said she disagrees with this approach. “I’m actually against the breaking-up of Facebook,” she said in Tuesday’s hearing. “If you split Facebook and Instagram apart, it’s likely that most advertising dollars would go to Instagram, and Facebook will continue to be this Frankenstein that is endangering lives around the world,” but without the necessary funds to pay for content moderation and other required work. Haugen said she believes that “regulatory oversight and finding collaborative solutions with Congress is going to be key, because these systems are going to continue to exist and be dangerous.”
Some industry watchers say Haugen’s proposed solution is among the most favorable potential outcomes for Facebook: it would mean no expensive breakup, and any new regulatory regime would be susceptible to lobbying, meaning the company might be able to shape the rules to its own benefit. Meanwhile, the company continued to disparage its former employee, suggesting she lacked the authority to be credible, even though the bulk of her whistleblowing drew on the company’s own research. Lena Pietsch, Facebook’s director of policy communications, dismissed Haugen as “a former product manager who worked for the company for less than two years, had no direct reports [and] never attended a decision-point meeting with C-level executives.” Mark Zuckerberg, the Facebook CEO, who has stayed out of the limelight over the past few weeks, said he and the rest of the company care deeply about safety, well-being, and mental health, and that “it’s difficult to see coverage that misrepresents our work and our motives.”
Here’s more on Facebook:
In the wake of the Journal’s reporting and Haugen’s congressional testimony, some critics are advocating that if regulators can’t find a way to hold Facebook responsible for the actual content it hosts—because of the legal protections contained in Section 230 of the Communications Decency Act—they might be able to hold it responsible for the way its algorithms recommend or promote certain kinds of content. Researcher Daphne Keller, however, argues that while this might sound like a great idea, it’s likely to be considerably harder than it sounds, because of the First Amendment and the way that courts have ruled on similar attempts to govern algorithmic behavior.
Nathaniel Persily, a professor of law and director of the Stanford Cyber Policy Center, is asking Congress to pass a law that would grant researchers access to information from Facebook about how its services impact society. Persily writes in an op-ed for the Washington Post that he resigned last year as co-chair of Social Science One, a partnership between researchers and Facebook, because of what he said was “years of frustration” over broken promises to share more data. “When Facebook did finally give researchers access to data, it ended up having significant errors—a problem that was discovered only after researchers had spent hundreds of hours analyzing it, and in some cases publishing their findings.”
Facebook and all of its subsidiary services, including Instagram and WhatsApp, went offline for most of the day on Monday. Some conspiracy theorists found it suspicious that a massive outage happened just as the company was under fire from Congress, but internet infrastructure company Cloudflare explained that a faulty update to the routing information Facebook publishes via the Border Gateway Protocol, or BGP, the internet’s core routing standard, was responsible. The result “was as if someone had pulled the cables from their data centers all at once and disconnected them from the Internet” (the company had its own less detailed explanation).
Other notable stories:
Meredith, which owns a stable of magazines including People and Better Homes & Gardens, is being acquired by Barry Diller’s IAC holding company, and will be merged with IAC’s Dotdash digital content group, formerly known as About.com, in a deal that is valued at $2.7 billion. Meredith acquired Time Inc. for $1.85 billion in 2018, and later sold Time magazine to Salesforce CEO Marc Benioff, as well as Sports Illustrated, Fortune and Money. Dotdash owns more than a dozen branded websites that post content related to health, finance, and lifestyle, including Investopedia, Serious Eats, Treehugger, and Brides.
A Reuters special report details how AT&T funded the creation and rise of the One America News Network. Founder and chief executive Robert Herring Sr. has testified that the inspiration to launch OAN in 2013 came from AT&T executives. “They told us they wanted a conservative network,” he said during a 2019 deposition seen by Reuters. “They only had one, which was Fox News, and they had seven others on the other [leftwing] side. When they said that, I jumped to it and built one.” In 2019, ninety percent of OAN’s revenue came from a contract with AT&T-owned television platforms, according to testimony from an OAN accountant.
Brandon Silverman, the founder of CrowdTangle, a Facebook-owned tool for social-media analytics, is leaving the company, according to a report from The Verge. For the past couple of months, Silverman has been embroiled in a controversy at Facebook over how transparent the company should be about content metrics, and what kinds of content perform the best. The controversy was sparked in part by reports from New York Times writer Kevin Roose that used CrowdTangle data to show how right-wing content drives a lot of engagement on the social network. According to a number of reports, Facebook recently disbanded the CrowdTangle team, which had been together since the service was acquired.
Wired magazine looks at how the International Consortium of Investigative Journalists coordinated reporting on the Pandora Papers document leak, which included almost three terabytes of data. “The Pandora Papers revelations came from an unfathomably big tranche of documents: 2.94 terabytes of data in all, 11.9 million records and documents dating back to the 1970s,” the magazine reports. “But how do you handle a massive leak of such size securely, when documents come in all sizes and formats, some dating back five decades?”
The British Broadcasting Corp., ITV, Channel 4 and ViacomCBS are building a shared service that would better promote their streaming brands, according to a report from Bloomberg. The broadcasters are developing a common platform in order to defend themselves against US tech giants and a planned overhaul of British TV laws, the Bloomberg report states. “The work is being loosely organized through the company Digital UK, owned by the BBC, ITV and Channel 4. The idea is to stay relevant and present a united front in negotiations with the new gatekeepers of streaming TV: Silicon Valley operating systems like Alphabet Inc.’s Android and smart TV manufacturers such as Samsung Electronics.”
The New York Times announced that it has hired Blake Hounshell, formerly Politico’s managing editor for Washington and politics, “to help us as we re-engineer and build a new team for On Politics—already one of the biggest and best newsletters of its kind. It will soon become part of the Times newsletter portfolio available only to paying subscribers.” As managing editor for Washington and politics at Politico, Hounshell oversaw coverage of Congress, the White House, the judiciary, and national security.
Bustle Digital, which revived Gawker this year after a couple of false starts, has rolled out a revamped version of Mic, another media asset that Bustle bought out of bankruptcy. “We are a place you can read a review of Lil Nas X’s new album and also a column about the existential feelings around climate change,” Shanté Cosme, Mic’s editor in chief, said in an interview with the New York Times. The makeover was led by Cosme and Joshua Topolsky, chief content officer at Bustle Digital and founder of The Outline, another New York-based media startup that was acquired by Bustle after it failed.
Charles McPhedran writes for CJR about a media war taking place in Belarus, featuring two exiles who have created the world’s largest Telegram channel, with over a million subscribers, and two popular YouTube channels. “Nexta’s hyperactive mixture of pointed, sometimes vulgar, videos, reader-generated exclusives, and calls to protest helped launch a street movement that posed one of the most serious threats to the grasp on power exerted by Lukashenko, Eastern Europe’s longest-lasting and perhaps fiercest dictator.”