What can we do about society’s ‘information disorder’?

Note: This was originally published as the daily newsletter at the Columbia Journalism Review, where I am the chief digital writer

In January, the Aspen Institute set up a Commission on Information Disorder, and announced a star-studded group of participants — including co-chair Katie Couric, former global news anchor for Yahoo, as well as Prince Harry, the Duke of Sussex — to look at solutions to the problem of rampant disinformation. Other not-so-famous members of the commission include Jameel Jaffer, executive director of the Knight First Amendment Institute; Yasmin Green, director of research at Google’s Jigsaw project (who took part in CJR’s symposium on disinformation in 2019); Alex Stamos, founder of the Stanford Internet Observatory; and Dr. Safiya Noble, co-founder of UCLA’s Center for Critical Internet Inquiry. The commission was funded by Craig Newmark, the founder of Craigslist (who is a member of CJR’s Board of Overseers). On Sunday, the group released its final report, with 15 recommended steps that it says could be taken by governments, technology companies, and others to help address the problem of disinformation.

In their introduction to the report, the commission’s three co-chairs—Couric, along with Chris Krebs, founding director of the US Cybersecurity and Infrastructure Security Agency, and Rashad Robinson, president of Color of Change—say information disorder slows down our response time on issues such as climate change, and also “undermines democracy [and] creates a culture in which racist, ethnic, and gender attacks are seen as solutions, not problems.” They add that while in the past there was a belief that the way to fight bad information was simply with more good information, “in reality, merely elevating truthful content is not nearly enough to change our current course.” In some cases, if promoting more factual information involves debunking hoaxes and conspiracy theories, those practices can actually exacerbate the problem, as Data & Society researcher Whitney Phillips (now a professor of media studies at Syracuse University) pointed out in a 2018 report, “The Oxygen of Amplification.”

The Aspen report notes that “there is an incentive system in place that manufactures information disorder, and we will not address the problem if we do not take on that system.” Some of the major players in that incentive system, according to the group, are large tech platforms such as Facebook, which it says have “abused customers’ trust, obfuscated important data, and blocked research.” The commission mentions one example CJR has also highlighted: the fact that Facebook shut down a research project run by scientists from New York University by turning off their access to the social network. “Critical research on disinformation—whether it be the efficacy of digital ads or the various online content moderation policies—is undercut by a lack of access to data and processes,” the report states. Several of its recommendations are aimed at solving this problem, including one that asks the government to require platforms to “disclose certain categories of private data to qualified academic researchers, so long as that research respects user privacy, does not endanger platform integrity, and remains in the public interest.”

Journalists and media organizations may be pleased to see that the report recommends government support for local journalism, including the Local Journalism Sustainability Act, which proposes federal tax credits as a way to subsidize local news subscriptions. The commissioners also argue that the industry needs to “adjust journalistic norms to avoid false equivalencies between lies and empirical fact in the pursuit of ‘both sides’ and ‘objectivity,’” an approach New York University journalism professor Jay Rosen has called “The View From Nowhere,” and a topic CJR has covered in depth both in the magazine and through our Galley discussion platform. In addition, the report notes that cable news, podcasts, YouTube, and talk radio “all play a unique role in inflaming disinformation and too often fail to hold accountable those who spread false statements on-air,” and that there continues to be a tension in the media between “the drive to maximize profit and the imperative to serve the public good.”

Although the Aspen report notes that disinformation is “a complex problem that didn’t begin with the Communications Decency Act of 1996 nor with Facebook’s founding in 2004, and will not be solved with mere cosmetic tweaks to certain algorithms,” the group does take a crack at trying to fix Section 230. That’s a clause in the CDA that gives digital platforms immunity from liability for the content they carry—and, theoretically, for the decisions they make about what to highlight with their algorithms (CJR has had a number of discussions on Galley about the challenges of Section 230). The commission recommends that Section 230 be amended to “withdraw platform immunity for content that is promoted through paid advertising and post promotion,” and also to remove immunity protection from “product features [and] recommendation engines.” Daphne Keller of Stanford’s Center for Internet and Society, however—who is quoted in the report—has raised concerns about the clash between these kinds of attempts to regulate algorithms and the First Amendment.

But perhaps the most ambitious, and potentially controversial, of the report’s recommendations comes at the very end: a proposal that Congress create and fund an independent non-profit organization that would be mandated to “invest in systemic misinformation counter-measures,” by funneling money from something called a Public Restoration Fund into research, education, and institutions such as libraries, hospitals, schools, and local news outlets “with an emphasis on community-level protections against misinformation.” This effort could be funded, the commission says, by general taxes, voluntary investment from tech companies, taxes on social media ads, and FTC fines. Not only that, but the report also recommends that Congress look into ways to “compensate individuals and communities who have been harmed by mis- and/or disinformation.” What exactly that compensation plan might look like isn’t clear.

Here’s more on disinformation:

Misinfo bots: Although critics of disinformation on social platforms often focus on individuals or “troll farms” such as Russia’s Internet Research Agency, a study led by John Ayers, who specializes in public health surveillance at the University of California, San Diego, found that much of the COVID-19 misinformation on Facebook is driven not by actual human users but by groups of automated accounts known as “bots.” “If we want to correct the ‘infodemic,’ eliminating bots on social media is the necessary first step,” Ayers said. “Unlike controversial strategies to censor actual people, silencing automated propaganda is something everyone can and should support.”

Propaganda: Anya Schiffrin wrote for CJR about an early attempt to understand deliberate misinformation, and specifically propaganda: a project that began in 1937, when journalist-turned-educator Clyde Miller, a former reporter for the Cleveland Plain Dealer, founded the Institute for Propaganda Analysis amid the rise of fascism in Europe. “There are three ways to deal with propaganda,” Miller said in a public lecture in 1939. “First, to suppress it; second, to try to answer it by counterpropaganda; third, to analyze it.” Miller got a $10,000 grant from department store magnate Edward A. Filene, and set up the Institute at Columbia University’s Teachers College, with seven staff members.

Force multiplier: The Aspen report notes that “at best, the truth is the best version of what we know in that moment with the evidence available,” and that misinformation and disinformation “are not the root causes of society’s ills but, rather, expose society’s failures to overcome systemic problems, such as income inequality, racism, and corruption, which can be exploited to promote false information online.” The report also quotes Mike Masnick, the founder of Techdirt and the Copia Institute (who took part in our Section 230 discussion), who wrote: “saying that the disinformation is the problem—rather than a way in which the underlying problem shows itself—misses the point entirely.”

Other notable stories:

On Sunday, two female journalists in India—Samriddhi Sakunia and Swarna Jha—were detained by police while covering anti-Muslim violence in the state of Tripura, according to a report from Indian news site Scroll.in. They were charged with offenses under the Indian Penal Code related to “promoting disharmony, enmity or feelings of hatred between different groups on the grounds of religion,” committing an “intentional insult,” and engaging in a conspiracy to fabricate and conceal records. “This is not the first time that journalists have been intimidated for doing their jobs in this country,” Digipub, an association of 11 Indian digital news organizations, told the site. The two were released on bail after protests by media organizations and other journalists, the BBC reported.

Clio Chang writes in New York magazine about Felicia Sonmez, a former Washington Post journalist, and her fight against the newspaper, which suspended her after she tweeted about rape allegations against basketball star Kobe Bryant following his death in a helicopter crash. In a statement through her lawyer, Sonmez told Chang that editors at the newspaper “retraumatized and humiliated me by forcing me to relive my assault at work, over and over, whenever news broke and a colleague would ask why I wasn’t allowed to cover the story.” Sonmez’s lawsuit against the paper reignited the issue, Chang reports. Former Post executive editor Marty Baron “was held in very high regard in the newsroom,” Christopher Ingraham, a Post reporter who left the paper in June, told Chang, but “I think for a lot of folks, some of the shine came off Marty after what happened with Felicia.”

Members of three bargaining units at the New York Times—the Times Guild, Times Tech Guild, and Wirecutter Union—are planning a rally outside the Times building on Tuesday, November 16, according to a statement released Monday. The purpose of the rally is to “call out the company for union-busting, including negotiating in bad faith and multiple unfair labor practices,” the statement said. Last week, Wirecutter staff announced that workers were prepared to strike during Black Friday because of the company’s “failure to bargain in good faith and to agree to a fair contract” despite nearly two years of bargaining. According to the NewsGuild, Times management has refused to meet with the Wirecutter union because, it says, it needs to prepare for the strike.

Substack, the subscription newsletter platform, says more than one million people have signed up to receive newsletters published through its service, according to a report from the Financial Times. The company said that is four times the number of subscribers its customers had last December. The paper reported that Substack, which is not profitable, would not provide its revenue figures (it takes 10 percent of the fees that newsletter authors collect from subscribers). Meanwhile, Delia Cai spoke to several former newsletter writers about burnout and their exit strategies from Substack, which has been criticized by some for replicating the flaws of traditional media.

Caleb Pershan writes for CJR about whether we still care about magazine covers. In the internet age, does a magazine cover still matter? “I think the realistic answer is no, it’s not as important as it once was,” Arsh Raziuddin, who has designed covers for The Atlantic and recently started as the creative director for Bon Appétit, told Pershan. Josef Reyes—the design director for Bloomberg Markets, who has worked for Wired and New York—agreed, saying: “The main goal with designing a magazine cover is that it sticks out of the newsstand, right? I think that’s kind of outdated now.”

The Federal Trade Commission has made it clear that it sees some common newspaper practices, including forcing subscribers to make a phone call in order to cancel their accounts, as “one of several ‘dark patterns that trick or trap consumers into subscriptions’” and therefore probably illegal, according to a report from the Nieman Journalism Lab. “The FTC vowed to ramp up enforcement on companies that fail to provide an ‘easy and simple’ cancellation process,” the Lab report said, including an option that’s at least as easy as the one to subscribe. California has required that news sites and other media businesses allow people to cancel their subscriptions online since 2018.

Paul Farhi, a media reporter with the Washington Post, posted a chart on Twitter comparing the web traffic of various news sites to last year’s numbers, based on data from traffic measurement company comScore. According to the chart, web traffic at the New York Times was down by more than 15 percent in October compared with the same month last year, and traffic at the Washington Post was down by more than 28 percent in the same period. Other big losers, according to the chart, included The Guardian (down more than 30 percent), Politico (down 48 percent), The Hill (down more than 38 percent), Vox (down over 44 percent), and ABC News (down 35 percent). Yahoo News and BuzzFeed went in the other direction, increasing their traffic by 38 percent and 33 percent respectively.

Almost 70 percent of US Twitter users say they get news from the service, according to a new Pew Research Center study that surveyed 2,548 Twitter users from May 17 to 31, 2021. But very few say the service is their most important source of news, the Pew research showed. Less than 10 percent of users who get news on the site said it was the most important source — close to 60 percent said it was important but not paramount. “One key area of news people rely on Twitter for is breaking news,” the research center reported. “Fully 70% of Twitter news consumers say they have used Twitter to follow live news events, up from 59% who said this in 2015.”
