Note: This was originally published as the daily newsletter for the Columbia Journalism Review, where I am the chief digital writer.
Several weeks ago, the Wall Street Journal published a series of six investigative news stories about Facebook, alleging a pattern of questionable behavior on the part of both the social network and its photo-sharing service, Instagram. One alleged that changes to the Facebook news feed algorithm, which were purportedly designed to improve the news-reading experience, actually had the opposite effect and “turned it into an angrier place.” Another said that the company knew about the negative effects its Instagram service was having on the mental health of young girls, because researchers working at Facebook had repeatedly raised the issue during briefings with senior executives, but Facebook took little or no action. Other Journal stories revealed a little-known feature that allowed celebrities to avoid responsibility for breaching Facebook’s rules, and claimed that the company knew its services were being used by drug cartels and human-trafficking networks but routinely failed to do anything to stop it (Facebook responded that the stories are inaccurate and that it cares deeply about the effect its products have on users, including young girls).
The Journal reports were all based on what the paper called “an extensive array of internal company communications” given to it by a whistleblower, a former Facebook staffer who copied the documents before quitting the company over disagreements with its behavior. On Sunday, the whistleblower revealed herself on 60 Minutes to be Frances Haugen, a former product manager at Facebook who has also worked for Google, Pinterest, Yelp, and a number of other technology companies. On Tuesday, Haugen testified before the Senate Commerce subcommittee on consumer protection, product safety, and data security about the potential dangers of Instagram for young users (Haugen also posted her testimony to her personal website). In both her 60 Minutes interview and her congressional testimony, Haugen made the same central point: that Facebook knew about the dangers of the recommendation algorithms that power both Facebook and Instagram, but chose to do nothing. It knew about these dangers, Haugen said, because the company’s own researchers had documented them repeatedly in internal research papers.
“The company’s leadership knows how to make Facebook and Instagram safer but won’t make the necessary changes because they have put their astronomical profits before people,” Haugen said during her testimony to the Senate committee. The bottom line, she said, is that Congress must take action, comparing Facebook to other industries, such as tobacco and automobiles, that wound up being regulated by the government in order to protect consumers from harm. One of the big challenges with Facebook, she argued, is that legislators have no idea how the company’s products work, because it is so reluctant either to share data from its own internal research or to provide data to outside scientists. “This inability to see into Facebook’s actual systems and confirm they work as communicated is like the Department of Transportation regulating cars by only watching them drive down the highway,” Haugen told the committee.