14 June 2016

Aeon Essays: “How the internet flips elections and alters our thoughts”

A study by Robert M Bond, now a political science professor at Ohio State University, and others published in Nature in 2012 described an ethically questionable experiment in which, on election day in 2010, Facebook sent ‘go out and vote’ reminders to more than 60 million of its users. The reminders caused about 340,000 people to vote who otherwise would not have. Writing in the New Republic in 2014, Jonathan Zittrain, professor of international law at Harvard University, pointed out that, given the massive amount of information it has collected about its users, Facebook could easily send such messages only to people who support one particular party or candidate, and that doing so could easily flip a close election – with no one knowing that this has occurred. And because advertisements, like search rankings, are ephemeral, manipulating an election in this way would leave no paper trail.

Are there laws prohibiting Facebook from sending out ads selectively to certain users? Absolutely not; in fact, targeted advertising is how Facebook makes its money. Is Facebook currently manipulating elections in this way? No one knows, but in my view it would be foolish and possibly even improper for Facebook not to do so. Some candidates are better for a company than others, and Facebook’s executives have a fiduciary responsibility to the company’s stockholders to promote the company’s interests.

Robert Epstein

Given the massive amount of information on the internet, and the fact that most of it reaches us filtered through proprietary algorithms, these results are hardly surprising. But the problem has started to resonate more in the media in recent months, beginning with reports that Facebook may have been suppressing conservative news sources in its feed. Google search results were also populated with fake news ahead of the third season of HBO’s Silicon Valley, and the company was recently accused of manipulating results in favor of Hillary Clinton.

To demonstrate how reality may differ for different Facebook users, The Wall Street Journal created two feeds, one “blue” and the other “red.” A source appears in the red feed if a majority of the articles shared from it were classified as “very conservatively aligned” in a large 2015 Facebook study; a source appears in the blue feed if a majority of its articles were classified as “very liberal.” These aren’t intended to resemble actual individual news feeds. Instead, they offer a rare side-by-side look at real conversations from different perspectives.
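
The selection rule behind the two feeds is simple enough to sketch. Below is a minimal Python illustration of that bucketing logic, assuming made-up source names, article labels and a helper function of my own – it is not the Journal’s actual data or code:

```python
from collections import Counter

# Hypothetical article-level alignment labels for a few sources, standing in
# for the classifications from the 2015 Facebook study the Journal relied on.
articles_by_source = {
    "example-right-site.com": ["very conservative", "very conservative", "moderate"],
    "example-left-site.com": ["very liberal", "very liberal", "very liberal"],
    "example-mixed-site.com": ["moderate", "very liberal", "very conservative"],
}

def bucket_source(labels):
    """Put a source in the 'red' or 'blue' feed if a majority of its shared
    articles carry the corresponding alignment label; otherwise skip it."""
    counts = Counter(labels)
    total = len(labels)
    if counts["very conservative"] / total > 0.5:
        return "red"
    if counts["very liberal"] / total > 0.5:
        return "blue"
    return None

for source, labels in articles_by_source.items():
    print(source, "->", bucket_source(labels))
```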

Even if there’s no conscious manipulation at work in either case, there’s always a certain ‘filter-bubble’ effect: because these companies build an advertising profile of users based on their online activities, personalized news feeds and search results are more likely to show people stories similar to what they (or their friends) read or liked in the past, reinforcing their beliefs instead of challenging them.
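
To see why personalization reinforces rather than challenges, consider a toy ranker: score each candidate story by its similarity to an average of what the user engaged with before. Everything below – the topic vectors, the scoring, the story titles – is a made-up sketch of the general idea, not how Facebook or Google actually rank content:

```python
# Toy 'filter bubble': the user's profile is the average of topic vectors of
# stories they liked; ranking candidates by similarity to that profile keeps
# surfacing more of the same.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def build_profile(liked_stories):
    """Average the topic vectors of stories the user liked in the past."""
    n = len(liked_stories)
    dims = len(liked_stories[0]["topics"])
    return [sum(s["topics"][i] for s in liked_stories) / n for i in range(dims)]

def rank_feed(candidates, profile):
    """Order candidate stories by similarity to the user's profile."""
    return sorted(candidates, key=lambda s: dot(s["topics"], profile), reverse=True)

# Hypothetical topic axes: [politics_left, politics_right, tech]
past_likes = [
    {"title": "Story A", "topics": [1.0, 0.0, 0.2]},
    {"title": "Story B", "topics": [0.9, 0.1, 0.0]},
]
candidates = [
    {"title": "Left-leaning take", "topics": [0.8, 0.1, 0.1]},
    {"title": "Right-leaning take", "topics": [0.1, 0.9, 0.1]},
    {"title": "Neutral tech piece", "topics": [0.1, 0.1, 0.9]},
]

profile = build_profile(past_likes)
for story in rank_feed(candidates, profile):
    print(story["title"])
```

The more the ranking leans on that profile (and on friends who share it), the narrower the feed gets.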

As always, don’t believe everything you see on TV – and now on your news feed.
