Facebook’s secret newsfeed experiments affected voter turnout in the 2012 election

As many users know by now, Facebook routinely experiments with the structure of the newsfeed, deciding which updates its ranking algorithm highlights and which it demotes or hides completely. Much of this experimentation is innocuous, but some of it has real-world consequences that extend beyond Facebook and into the disturbing realm of social manipulation. In the latest example, Mother Jones reports that the network tweaked the newsfeeds of almost 2 million users in 2012, and that the experiment materially affected voter-turnout rates.

As Techpresident founder Micah Sifry describes in the Mother Jones piece, the manipulation occurred as Facebook was developing a feature it calls its “voter megaphone” — that is, a tool that allows users to post a prominent “I’m Voting” button on their profile in order to encourage others to vote.

Experimenting on or manipulating users through tweaks to the newsfeed has been a controversial topic ever since Facebook admitted earlier this year that a team of researchers had modified users’ newsfeeds, changing the emotional content of what they saw in order to determine whether good or bad news could trigger a kind of “emotional contagion” and make them feel a certain way.

Some argued that this was just part of the tweaking that web-based services do all the time, and pointed out that the roughly 700,000 users affected were a tiny portion of the social network’s user base. But others were disturbed by what they saw as emotional manipulation without proper disclosure, and many saw it as another example of the potential flaws in an algorithmically filtered online environment.

Newsfeed changes boosted voting rates

The Mother Jones story describes how, in the months leading up to Election Day in 2012, Facebook changed the newsfeeds of 1.9 million users to see whether it could make those users more interested in political activity: it did this by increasing the number of hard-news items that appeared at the top of a user’s newsfeed. According to one of the Facebook data scientists involved, this change “measurably increased voter turnout.”
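To make that mechanism concrete, here is a purely hypothetical sketch of what boosting hard-news items in a ranked feed might look like. The Story fields, the rank_feed helper and the boost factor are all illustrative assumptions; Facebook has not published the actual ranking change.

```python
from dataclasses import dataclass

@dataclass
class Story:
    title: str
    base_score: float   # whatever relevance score the ranker already assigns
    is_hard_news: bool

def rank_feed(stories: list[Story], hard_news_boost: float = 1.5) -> list[Story]:
    """Sort stories so that boosted hard-news items tend to surface at the top."""
    def score(story: Story) -> float:
        return story.base_score * (hard_news_boost if story.is_hard_news else 1.0)
    return sorted(stories, key=score, reverse=True)

feed = [
    Story("Friend's vacation photos", 0.9, False),
    Story("Senate passes spending bill", 0.7, True),
    Story("Cat video", 0.8, False),
]
print([s.title for s in rank_feed(feed)])
# ['Senate passes spending bill', "Friend's vacation photos", 'Cat video']
```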

As described in a public talk given by Facebook data scientist Lada Adamic in 2012 (a video of which has since been removed from YouTube, according to Sifry), Facebook made the change to the feeds of almost 2 million users and then studied their behavior, finding a “statistically significant” increase in the amount of attention they paid to government-related news. The share who voted (or at least said they voted) rose from 64 percent to 67 percent.
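To give a sense of why a three-point jump at this scale would register as “statistically significant,” here is a minimal sketch of a two-proportion z-test. The even split of the roughly 1.9 million users into treatment and control groups, and the exact group sizes, are assumptions for the sketch rather than figures from Facebook’s study.

```python
import math

# Hypothetical group sizes: an even split of ~1.9 million users. The 64% and
# 67% self-reported turnout rates are the figures quoted in the article.
control_n, treatment_n = 950_000, 950_000
control_rate, treatment_rate = 0.64, 0.67

# Two-proportion z-test using a pooled rate under the null hypothesis of no effect
pooled = (control_rate * control_n + treatment_rate * treatment_n) / (control_n + treatment_n)
se = math.sqrt(pooled * (1 - pooled) * (1 / control_n + 1 / treatment_n))
z = (treatment_rate - control_rate) / se

print(f"z = {z:.1f}")  # well over 40, far beyond the 1.96 threshold for p < 0.05
```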

It is curious that Facebook officials apparently thought that testing such a major change to users’ feeds in the weeks before the 2012 election, precisely when people might be paying more attention to political news and cues, was benign and not worth sharing with users or the public. And according to Buckley, a Facebook spokesman, the public will not receive full answers until some point in 2015, when academic reports fully describing what Facebook did in 2012 are expected to be published.

Experiment in 2010 also affected turnout

As has been reported before, Facebook also experimented with the newsfeed in the months leading up to the 2010 midterm elections. It put different versions of the “I’m Voting” button on the pages of about 60 million users, placing it in different spots, and then studied how those users reacted and behaved. Two groups of about 600,000 users each served as controls: one group saw the button but got no additional information, and the other saw no button at all.
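As a rough illustration of that design, the sketch below randomly assigns a simulated user base to the three conditions in proportion to the group sizes described above. The condition names and the assignment code are assumptions for illustration, not Facebook’s actual method.

```python
import random
from collections import Counter

random.seed(42)

# Approximate group sizes from the 2010 experiment as described above:
# ~60 million saw the button, plus two control groups of ~600,000 each.
conditions = ["button_plus_info", "button_only", "no_button"]
weights = [60_000_000, 600_000, 600_000]

# Assign a small simulated sample of users proportionally to those sizes
sample = random.choices(conditions, weights=weights, k=100_000)
print(Counter(sample))
```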

The results of the experiment, which Facebook users were not made aware of, were published in 2012 in the journal Nature under the title “A 61-Million-Person Experiment in Social Influence and Political Mobilization.” The authors, who included both outside researchers and Facebook data scientists, checked voter records to see whether the changes actually affected voter turnout, and concluded that they did.

According to the study, the experiment increased voter turnout by at least 340,000 votes, or about 0.14 percent of the total voting-age population in 2010. The authors concluded that, compared with previous changes in turnout, this was “substantial”: the evidence indicated that “more of the 0.6 percent growth in turnout between 2006 and 2010 might have been caused by a single message on Facebook.”
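As a quick sanity check on those figures, the arithmetic below expresses roughly 340,000 additional voters as a share of the US voting-age population. The population figure of about 235 million is an assumption for the sketch, since estimates for 2010 vary.

```python
# Roughly 340,000 additional voters as a share of the voting-age population
extra_voters = 340_000
voting_age_population = 235_000_000  # assumed ballpark for the 2010 US figure

share = extra_voters / voting_age_population
print(f"{share:.2%}")  # 0.14%, matching the figure quoted in the study
```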

As Harvard Law professor Jonathan Zittrain wrote in a piece for The New Republic after news of the 2010 experiment emerged, the research raised the possibility that Facebook could actually influence the outcome of certain elections, perhaps even without meaning to do so. (Zittrain has proposed that large web companies like Facebook be defined as “information fiduciaries” with specific duties with regard to the information they collect about their users.)

What other behavior is Facebook influencing?

The Facebook spokesman told Mother Jones that the 2012 research was just part of the company’s ongoing experiments aimed at improving the quality of the newsfeed, and that it has nothing to hide. He said Facebook took down the video presentation by its data scientist because it didn’t want to pre-empt the research paper she is writing about the experiment. But despite such explanations, many users find the idea of being experimented on disturbing.

As sociologist Zeynep Tufekci pointed out in a recent research paper on the social impact of “big data” about human behavior, these kinds of experiments make people uneasy because they are “opaque, powerful and possibly non-consensual, in an environment of information asymmetry.” In other words, they make users feel as though they are being experimented on by an unseen and impersonal entity, for purposes they don’t really understand or, in many cases, are not even made aware of.

The fact that Facebook can manipulate the design of the newsfeed in ways that influence voter-turnout rates is fascinating, and perhaps even encouraging. But the implications are also disturbing: what other kinds of behavior could the network influence, or actually be influencing even now, without our knowledge?
