Note: This was originally published as the daily newsletter at the Columbia Journalism Review, where I am the chief digital writer.
In January, the Aspen Institute set up a Commission on Information Disorder, and announced a star-studded group of participants — including co-chair Katie Couric, former global news anchor for Yahoo, as well as Prince Harry, the Duke of Sussex — to look at solutions to the problem of rampant disinformation. Other not-so-famous members of the commission include Jameel Jaffer, executive director of the Knight First Amendment Institute; Yasmin Green, director of research at Google’s Jigsaw project (who took part in CJR’s symposium on disinformation in 2019); Alex Stamos, founder of the Stanford Internet Observatory; and Dr. Safiya Noble, co-founder of UCLA’s Center for Critical Internet Inquiry. The commission was funded by Craig Newmark, the founder of Craigslist (who is a member of CJR’s Board of Overseers). On Sunday, the group released its final report, with 15 recommended steps that it says could be taken by governments, technology companies, and others to help address the problem of disinformation.
In their introduction to the report, the commission’s three co-chairs—Couric, along with Chris Krebs, co-founder of Aspen Digital, and Rashad Robinson, president of Color of Change—say information disorder slows down our response time on issues such as climate change, and also “undermines democracy [and] creates a culture in which racist, ethnic, and gender attacks are seen as solutions, not problems.” They add that while in the past, there was a belief that in order to fight bad information, all we need is more good information, “in reality, merely elevating truthful content is not nearly enough to change our current course.” In some cases, if promoting more factual information involves debunking hoaxes and conspiracy theories, those practices can actually exacerbate the problem, as Data & Society researcher Whitney Phillips (now a professor of media studies at Syracuse University) pointed out in a 2019 report on “The Oxygen of Amplification.”
The Aspen report notes that “there is an incentive system in place that manufactures information disorder, and we will not address the problem if we do not take on that system.” Some of the major players in that incentive system, according to the group, are large tech platforms such as Facebook, which it says have “abused customers’ trust, obfuscated important data, and blocked research.” The commission mentions one example CJR has also highlighted: the fact that Facebook shut down a research project run by scientists from New York University by turning off their access to the social network. “Critical research on disinformation—whether it be the efficacy of digital ads or the various online content moderation policies—is undercut by a lack of access to data and processes,” the report states. Several of its recommendations are aimed at solving this problem, including one that asks the government to require platforms to “disclose certain categories of private data to qualified academic researchers, so long as that research respects user privacy, does not endanger platform integrity, and remains in the public interest.”