And what about this one?

Toying with our emotions is one thing; influencing the outcome of an election is another.

In an article dated June 1, 2014, published in The New Republic (about a month before the emotions experiment was published), Jonathan Zittrain wrote about a November 2010 Facebook experiment in which:

Facebook’s American users were subject to an ambitious experiment in civic-engineering: Could a social network get otherwise-indolent people to cast a ballot in that day’s congressional midterm elections?

Zittrain's source is probably a study published in Nature in September 2012: "A 61-million-person experiment in social influence and political mobilization." He makes it clear that Facebook only tried to get more citizens to vote, not to get them to vote in a particular way. He also reports that the experiment was a success: as a result of the intervention, apparently more people voted. But he explains that influencing not only the vote count but also how people vote could become a real possibility in the future:

Now consider a hypothetical, hotly contested future election. Suppose that Mark Zuckerberg personally favors whichever candidate you don’t like. He arranges for a voting prompt to appear within the newsfeeds of tens of millions of active Facebook users—but unlike in the 2010 experiment, the group that will not receive the message is not chosen at random. Rather, Zuckerberg makes use of the fact that Facebook “likes” can predict political views and party affiliation, even beyond the many users who proudly advertise those affiliations directly. With that knowledge, our hypothetical Zuck chooses not to spice the feeds of users unsympathetic to his views. Such machinations then flip the outcome of our hypothetical election.

Interestingly, while the emotions experiment caused quite a furor, at least for a few weeks, this arguably more disturbing experiment, which Facebook made no attempt to hide, hardly made headlines.


Go to: What isn't an experiment?