Note: This was originally written for the daily newsletter at the Columbia Journalism Review, where I am the chief digital writer
Over the past decade, Google and Facebook have built globe-spanning digital platforms that impact almost every facet of our digital lives, and increasingly our physical lives as well, and often in harmful ways. Apart from their use of “surveillance capitalism” on a massive scale, or their distribution of disinformation during the 2016 election, the algorithms Google uses at YouTube have been implicated in the radicalization of alt-right adherents and followers of conspiracy theories like QAnon, and Facebook’s private groups and WhatsApp messaging service have been cited by the United Nations as helping to perpetuate a genocide against the Rohingya people in Myanmar. And yet traditional antitrust legislation, or at least the way it’s been interpreted for the past couple of decades, makes it difficult to regulate these two giant platforms — as does Section 230 of the Communications Decency Act, which absolves them of liability for anything that is posted by their users, and gives them wide latitude to moderate content as they wish.
Is there another path we could take that might allow us to harness the benefits of these huge services, while also blunting their negative effects? Dipayan Ghosh thinks there is. He’s the director of the Digital Platforms and Democracy Project at Harvard’s Shorenstein Center, a former policy adviser to the Obama administration, and a former adviser at Facebook. He’s also the co-author of a recent paper with Joshua Simons, a fellow at the Edmond J. Safra Center for Ethics at Harvard, a former adviser to the UK Labour Party, and a former policy adviser at Facebook. Their paper is titled “Utilities for Democracy: Why and How the Algorithmic Infrastructure of Facebook and Google Must Be Regulated.” CJR recently used its Galley discussion platform to speak with both men about their proposals, and about their belief that the algorithms used by both companies have become part of the infrastructure of our public sphere, and that Facebook and Google should therefore be regulated as public utilities.
“These companies control the social infrastructure we all use for communication and organization, political expression, and collective decision-making,” said Simons. “Their control over this infrastructure concentrates not only economic power, but social and political power too.” In effect, he and Ghosh argue, the kind of virtual monopoly or oligopoly that Google and Facebook have created isn’t that different from the massive “trusts” of previous generations, which controlled railways or oil production in much the same way Google controls search and Facebook controls social interactions. Innovation is a good thing, Simons says, but “it creates new concentrations of power — railroads, oil trusts, telecommunications companies — and those concentrations of power matter for democracy in different kinds of ways.” The strength of the public utility concept that was developed in the Progressive era, he says, was that it offered a way to think about how and why different kinds of corporations might pose a threat to democracy.
The first question we should ask, Ghosh argues, is whether Facebook is powerful enough to be considered a monopoly. “I think it is,” he says. “In fact, in several important markets including social media and web-based text messaging, Facebook is a dominant monopoly,” with more than 50 percent of the relevant market, and in some cases as much as 90 percent. The next question, Ghosh says, is whether the company has used this market power to cause broad social harm. The answer to this is also yes, he says. “I think we can make the case that Facebook has indeed caused harm in the three traditional areas where competition regulators look — namely, in market innovation; quality of service; and consumer pricing (i.e., the amount of data-and-attention monetized by the firm).” In each of these areas, says Ghosh, you could argue that Facebook has caused real harm not just to consumers but to society as a whole.
If both of those statements are true, Ghosh argues, then the only proper course of action is to regulate the companies in a variety of ways that reflect the different functions they serve in our society and our economy, and to “treat them like the utilities they are.” There is a case to be made, he says, that the two companies may actually be what are called “natural monopolies,” in the sense that the market barriers that come from the network effects they rely on can be insurmountable for smaller companies. And both have then reinforced those monopolies by acquiring firms like Instagram and Doubleclick, which make the barriers higher. “This is not innovation any longer,” Ghosh says. “It is a pair of behemoths getting ever fatter at the expense of everyone else.”
Here’s more on Google, Facebook and democracy:
Catch-22: In a recent discussion on Galley, author and freedom-of-information activist Cory Doctorow, whose latest book is called “How to Destroy Surveillance Capitalism,” said the problem with much of the technology regulation that is currently taking place, including laws against hate speech and other phenomena in a number of European countries, is that these regulations require massive amounts of moderation and oversight—and the cost of those solutions means that only huge platforms with dominant market positions can participate. “It’s not that I’m opposed to regulating Big Tech—quite the contrary!” he says. “It’s just that I think that regulations that have high compliance costs primarily work to benefit monopolies, who can afford the costs, and who can treat those costs as a moat that prevents new firms from entering the market.”
Collective goods: Olivier Sylvain, a professor of law at Fordham University and director of the McGannon Center for Information Research, said during a recent Galley discussion that much of the danger in online networks is unseen by users directly, and therefore regulation is needed. “Regulators and legislators are better positioned to intervene when consumers cannot easily see the deep or long-term harms and costs,” he said. Jennifer King, director of privacy at Stanford Law School’s Center for Internet and Society, said that privacy is a collective good. “I often analogize this to pollution and recycling; we are all harmed by the net effects of the individual negative actions we take, whether it is throwing away another piece of plastic, or sharing or disclosing more personal information online,” she says. “Both problems require systemic solutions.”
Too little, too late: Facebook recently made some changes to its rules aimed at clamping down on disinformation, including a ban on political ads containing misinformation. But as Steve Kovach pointed out, the changes don’t really do anything to stop anyone, including Donald Trump and his campaign, from posting misinformation on their personal or campaign pages, so long as the posts aren’t ads. The company added another new rule on Wednesday, saying it won’t allow any ads on the network that seek to delegitimize the outcome of an election. The new policy will prohibit any ads that call specific methods of voting inherently fraudulent or corrupt. The new rule comes after the president’s repeated false claims that voting by mail leads to election fraud.
Other notable stories:
The New York Times is putting together a team to re-report the story behind Caliphate, its critically acclaimed podcast on the Islamic State, after one of its central characters was arrested for allegedly faking his background in terrorism. “We are going to look for the truth of his story and inevitably we are going to also ask the question about how we presented him,” executive editor Dean Baquet told staff on Wednesday, according to a report from The Daily Beast. Hosted by Pulitzer Prize-nominated reporter Rukmini Callimachi, the podcast focused at length on the story of “Abu Huzayfah,” a Canadian who said he traveled to Syria to join the Islamic State before eventually becoming disillusioned. Last week, Canadian police arrested 25-year-old Shehroze Chaudhry and charged him with concocting a terrorist hoax. The Washington Post looked at previous questions about Callimachi’s reporting on terrorism.
In an interview with the Times, presidential debate moderator Chris Wallace called the event “a terrible missed opportunity.” Wallace conceded he was initially reluctant to step in during the Trump-Biden match-up, but was eventually forced to try to stop the president from constantly interrupting Joe Biden. “I’ve never been through anything like this,” Wallace said. On Wednesday, the Commission on Presidential Debates issued a statement saying the debate “made it clear that additional structure should be added to the format of the remaining debates to ensure a more orderly discussion of the issues,” and said it would be announcing changes soon. It also thanked Wallace for the “professionalism and skill he brought to last night’s debate.”
The Proud Boys, a far-right extremist group, received a flood of new members in the hours after Donald Trump mentioned the movement during the presidential debate on Tuesday. One of the group’s most popular social-media channels showed over 600 new members in the subsequent 24 hours, according to BuzzFeed, while another showed roughly 700. BuzzFeed chose not to name the networks on which the group organizes to avoid driving traffic to them, and added that some of the attention the group got could have come from screenshots and tweets that reporters and commentators circulated during and after the debate, according to anti-extremism researcher Joan Donovan of Harvard’s Shorenstein Center.
Journalism must show Trump as he is rather than seeking to make things appear normal, writes CJR editor and publisher Kyle Pope. “For the past three and a half years, we have covered this man, and this administration, with a willful, Groundhog Day forgetfulness. Maybe this will be the day the briefing room isn’t a spigot of misinformation. Maybe for once we can quote his enablers without them lying. Maybe, over the sound of helicopters, the president will take responsibility. Yet, despite our urgent desire for normalcy, normal never happens. But we get up the next morning and do it all again. Let’s put an end to this awful cycle.”
The Online News Association today announced a new project called Vision25: Building Racial Equity in Newsrooms, a collaboration between the ONA, the Maynard Institute and OpenNews. In a news release, ONA executive director and chief executive Irving Washington said that he hoped the new project would be “a catalyst in a social change movement that seeks to build journalistic institutions where newsrooms are actively anti-racist and collaborative, and journalists of color feel like they truly belong.” The group says it plans to develop industry standards for practices that can eliminate institutional racism, and to engage in training and community building.
New York magazine writes about the sudden departure of former Hearst president Troy Young, who was accused by a number of female staff of making rude and/or harassing remarks. “Whatever Young’s personal and interpersonal shortcomings, he had enjoyed the protection of Hearst’s leaders for the past seven years for a simple reason: He was taking Hearst to what seemed to be its inevitable future, one where the company’s growth was reliant on cheap, viral content, not lavish, expensive-to-produce print stories or digital features,” says the New York story. “But his departure came at yet another moment of reckoning for the already diminished magazine industry, which, after being hammered for years by declining print ad sales, saw the bottom fall out on digital advertising during the first few months of the pandemic.”
Senator Elizabeth Warren and Rep. Alexandria Ocasio-Cortez have pulled out of next week’s New Yorker Festival, the star-studded annual event held by The New Yorker magazine, in solidarity with unionized editorial staff members, according to a report in the New York Times. Warren, who made an unsuccessful bid to be the Democratic presidential nominee, and Ocasio-Cortez, a popular young representative from New York, were scheduled to appear as keynote speakers on Monday night, the first night of the annual event, but said they would honor the union’s virtual picket line.
Axios has managed to avoid any staff reductions in the past year, unlike most other media companies, and is on track to bring in revenues of about $58 million this year, up more than 30 percent over last year, according to a Wall Street Journal report. The company’s success is largely due to its sponsored-newsletter business, people close to the publication said, which produces more than 50 percent of the revenue. Early next year, the company plans to establish two-person newsletter teams in several local markets, starting with Minneapolis, Denver, Tampa, and Des Moines.