How can we fight back against surveillance capitalism?

Note: This was originally written for the daily newsletter at the Columbia Journalism Review, where I am the chief digital writer

Over the past decade, we’ve seen the rise of a new kind of corporate power, one built on an almost unprecedented level of digital surveillance of users, fueled by the demands of the global advertising industry—a phenomenon that Harvard professor emerita Shoshana Zuboff calls “surveillance capitalism.” Google, Facebook, and Amazon have built businesses that are worth trillions, yet the functioning of their algorithms remains entirely opaque to both users and regulators. How should we deal with this problem? Is there a way to dismantle these massive platforms without losing the benefits they bring, or will breaking them up cause more problems than it solves? To answer these and other related questions, CJR spent this week conducting a “slow interview” on its Galley discussion platform with author and freedom-of-information activist Cory Doctorow, whose latest book is called “How to Destroy Surveillance Capitalism.” In addition to writing science-fiction novels, Doctorow is also a special consultant to the Electronic Frontier Foundation, and holds an honorary doctorate in computer science from the Open University in the UK, where he is a visiting professor.

Both in his book and in our interview, Doctorow argues that Zuboff and others are right to be afraid of massive corporations like Google and Facebook, and their ability to track our every move online, or insert themselves into every conversation or transaction, because monopolies smother competition and innovation. But he says the idea that these digital behemoths can influence our thoughts or behavior through the power of their algorithms or ad targeting methods is mostly bunk. “Every person who’s claimed to have built a system of effective, long-term persuasion was either kidding themselves or the rest of us,” he says. The advertising industry loves to claim that it can trigger purchases and other behavior by using certain images or playing on human emotion, but most of that is flim-flam, says Doctorow, and the efforts of the digital platforms are likely not much more successful. Even Facebook’s notorious sociological experiment from 2010, in which it tried to encourage users to vote by offering an “I Voted” sticker, had an impact of less than 0.5 percent.

But while Facebook and Google may not hold some secret power to affect our decisions, that doesn’t mean they aren’t dangerous, Doctorow argues. The way they coerce us into doing their bidding, he says, is not through psychological tricks, but by using their monopoly powers to restrict our choices—by forcing us to use their app stores, dominating our search results, and commodifying our relationships and then holding them hostage within walled gardens. They also exercise their power via political influence, he argues—using their monopoly profits to lobby against technology regulation, including laws that would open up those walled gardens. “If we can freely choose which apps run on our devices, seize control over our locked-in social relations, and have a competing and diverse search industry,” Doctorow says, “not only will Big Tech have less control over our lives, they will also face competitive pressure that will deprive them of the monopoly rents they use to pursue political projects.”

The problem with much of the technology regulation that is currently taking place, he says, including laws against hate speech and other phenomena in a number of European countries, is that these regulations require massive amounts of moderation and oversight—and the cost of those solutions means that only huge platforms with dominant market positions can participate. In effect, says Doctorow, the laws only serve to cement the control that these companies have over the marketplace. “It’s not that I’m opposed to regulating Big Tech—quite the contrary!” he says. “It’s just that I think that regulations that have high compliance costs primarily work to benefit monopolies, who can afford the costs, and who can treat those costs as a moat that prevents new firms from entering the market.” Better to focus on the structural problems that have led to the creation of these giant companies in the first place, Doctorow says, such as the failure of antitrust law to stop the acquisitions and mergers that have allowed Google and Facebook to extend their dominance into new markets.

So what should be done? According to Doctorow, acquisitions that tend to strengthen a company’s monopoly should not be allowed, and even purchases of smaller companies (like Facebook’s acquisition of Instagram) should be blocked if they have the effect of removing a potential future competitor or reducing customer choice. And regulators should also force the digital platforms to open up to competitors, to allow interoperability of services, he says, but more than anything else they need to be broken up, not just brought to heel. As Doctorow puts it in his book—which is available on Medium—“We can work to fix the internet by breaking up Big Tech and depriving them of monopoly profits, or we can work to fix Big Tech by making them spend their monopoly profits on governance. But we can’t do both.”

Here’s more on surveillance capitalism:

Engaging vs. enraging: In a recent interview with Axios, Facebook chief executive Mark Zuckerberg said that the social network’s recommendation algorithms focus on promoting what he called “the most meaningful content.” But as New York Times writer Kevin Roose noted in a Twitter thread, there’s a fine line between meaningful engagement and the kind that relies on anger and other negative emotions. “This goes back to the ‘failure of empathy’ point I’ve been trying to make,” said Roose. “Partisan anger is core to many people’s identities, and their sense of belonging in community. QAnon is *super* meaningful in its followers’ lives. You can’t promote one without promoting the other.”

Empowering trolls: Facebook’s algorithms have helped promote content from a secret network of online trolls hoping to stoke fears about election fraud and promote skepticism about COVID regulations, according to a report from the Washington Post. The “troll farm” has been traced back to a group of teenagers—some of them minors—who have been paid to post this kind of content by Turning Point USA, a conservative organization based in Arizona and run by Charlie Kirk. The organization released a statement that said the posts were “sincere political activism conducted by real people who passionately hold the beliefs they describe online, not an anonymous troll farm.”

Locking and spying: The way that the tech giants have built and maintain their monopolies may be different, but the outcome is the same, Doctorow argues in his book. “Some tech companies want to lock their users in but make their money by monopolizing access to the market for apps for their devices and gouging them on prices rather than by spying on them (like Apple). Some companies don’t care about locking in users because they’ve figured out how to spy on them no matter where they are and what they’re doing and can turn that surveillance into money (Google). Facebook alone among the Western tech giants has built a business based on locking in its users and spying on them all the time.”

Other notable stories:

Facebook and Instagram flagged posts from the Fox News show “Tucker Carlson Tonight” as false information on Wednesday, saying that they repeated information about COVID-19 “that multiple independent fact checkers say is false.” The show posted a video on the social media platforms on Tuesday night with the caption “Chinese whistle-blower to Tucker: This virus was made in a lab & I can prove it.” The posts feature a segment in which Carlson interviewed Li-Meng Yan, a Chinese virologist who claims that the virus “is not from nature.”

Daniel Ellsberg, one of the most famous whistleblowers in US history, came to the defense of WikiLeaks founder Julian Assange on Wednesday in his legal fight to avoid extradition to the United States from Britain, arguing that the two have “very comparable political opinions.” The 89-year-old, who helped turn public opinion against the Vietnam War with his leaking of the so-called Pentagon Papers in 1971, said that Assange “cannot get a fair trial for what he has done under these charges in the United States.”

As part of CJR’s Year of Fear series, in which we have partnered with the Delacorte Review to bring readers reports about the pre-election atmosphere in four towns across America, Sandra Sanchez takes a look at the state of things in McAllen, Texas, which has been hit particularly hard by COVID-19. “There is now one month until early voting begins on October 13 in Hidalgo County, and it almost feels like there’s no upcoming election whatsoever,” she writes. “McAllen and South Texas are struggling. And the campaigns appear to be as well. The roads that are usually overwhelmed with colossal political billboard-like signs are now almost completely devoid of the expensive advertisements.”

NPR’s daily podcast Consider This began as Coronavirus Daily, but changed its name and broadened its scope in June as protests against police brutality and racial discrimination competed with the virus for headlines. Now, the short-form afternoon podcast is changing again, according to a report from Nieman Journalism Lab. NPR recently announced that Consider This will become a localized news podcast, which the public-radio service had been planning to debut earlier this year before COVID-19 came along. Listeners to the program who are in or near 10 cities will hear local updates after the national news show.

The New York Times and Facebook have struck a multi-year partnership to co-develop augmented-reality filters and effects on Instagram that help users access and contextualize Times journalism, according to Axios. To get the partnership off the ground, the Times has built a dedicated “AR Lab” team in its research and development unit that has more than a dozen employees, says Axios. That team will develop augmented reality filters and effects using a Facebook platform for developers called “Spark AR Studio.”

Bloomberg Media is relaunching its QuickTake video service on November 9 with 100 dedicated staffers as a global, 24-hour streaming service in an effort to reach a digital audience that the company refers to as “modern, rising leaders,” according to the Hollywood Reporter. QuickTake originally launched in 2017 as TicToc by Bloomberg, partnering with Twitter to produce breaking news and live event coverage. Bloomberg Media chief executive Justin Smith says the relaunch project is “one of the most consequential, if not the most consequential, launch in the news industry this year.”

Minnesota Public Radio fired a DJ for 89.3 the Current late Tuesday, a day after the resignation of an MPR reporter who investigated sexual-harassment allegations against the on-air personality. “Eric Malmberg will no longer be a DJ on The Current,” said a statement from MPR President Duchesne Drew. “Our hosts have to be able to attract an audience that wants to listen to them and trusts them and over the last 36 hours those conditions have changed for Malmberg.” Reporter Marianne Combs said Monday that she had resigned because of MPR’s decision to hold off on a story alleging sexual misconduct by Malmberg.

The Membership Puzzle Project, a three-year effort to understand how media companies can benefit from a focus on turning readers into members rather than just subscribers, has published a report on what it has learned from those years of research. The project—funded by the Lenfest Institute for Journalism, the Google News Initiative, and the Facebook Journalism Project, among others—says the guide “presents a practical, tactical guide to launching a membership program” and offers best practices, case studies, and step-by-step advice from around the world that will help newsrooms.
