Thursday, July 31, 2014

Facebook Follies

My last post was on The Future of Facebook? so I might as well follow up with another post on Mark Zuckerberg's creation, which some might call the Frankenstein monster of social networks.

And speaking of electric shocks to the system, though in this case not so much "It's alive!" as "he might be dead," there is an eerie resonance between Facebook's experimental manipulation of users' emotions in 2012 and Stanley Milgram's famous obedience to authority experiments, conducted over half a century earlier. Click on the link to read the summary of the study over on Wikipedia.

About a decade after the experiments, Milgram published a book entitled Obedience to Authority, which went over the experiments, and contextualized them in light of the excuse used by many Nazis after World War Two that they were only following orders—Milgram significantly referenced Hannah Arendt's Eichmann in Jerusalem—and controversies concerning our own conduct of the Vietnam War, notably the My Lai Massacre. Milgram's book was required reading in Neil Postman's media ecology doctoral program.

I've included links to the Milgram book here, as well as Arendt's famous report on the Eichmann trial, where she coined the phrase, the banality of evil, for your convenience. And we certainly can make an equation of it:

Obedience to Authority = Banality of Evil

Anyway, back when I was a doctoral student, Postman also showed us the documentary about Milgram's experiments, which is a little hokey, maybe laughable for being so, but also quite disturbing. It occurred to me that I should check on YouTube to see if it was there, and lo and behold:

[the documentary on Milgram's obedience experiments, embedded from YouTube]
For Milgram, the moral of the story was how willing so many of us are to obey authority even while disagreeing with what we are told to do; most of the subjects who went all the way with the electric shocks were not sadists, and in fact were quite disturbed by what was happening. When asked about it after the experiment, they said they wanted to stop, but the experimenter wouldn't let them, this despite the fact that no force was used, no coercion or persuasion, beyond the experimenter's insistence that the subject continue to give the victim electric shocks. Milgram referred to this as the agentic shift: the subjects ceded their independence as agents, abandoning their sense of responsibility for what was happening, and their freedom of choice.

Put another way, the relationship, in this case an authority relationship, was more powerful than, and overwhelmed, the content of the situation, the fact that an innocent person was being harmed. This is consistent with our understanding of relational communication as established by Gregory Bateson and Paul Watzlawick, and parallels McLuhan's famous dictum, the medium is the message.

The results would no doubt be different today than they were back in the day when respect for authority and a desire for conformity were quite powerful. It was the kind of culture we associate with the fifties, but it extended into the early sixties, at least. The point that Milgram missed, however, was that he himself had conducted a cruel set of experiments, inflicting psychological damage on some of his subjects, all in the name of a more abstract higher authority: Science. There are many forms of obedience, after all.

In response to these experiments, and others like them, but especially these, rules were put in place governing the use of human subjects. An experiment like this probably could not be conducted today, at least not at a university, where any study involving human subjects has to be reviewed and approved by an Institutional Review Board. So I really have to wonder: how in the world did my undergraduate alma mater, Cornell University, approve the participation of two of its faculty in an experiment in which the emotional states of Facebook users were manipulated without their knowledge, without even their awareness that they were subjects in an experiment?

The study was carried out in 2012, and the results recently published in the Proceedings of the National Academy of Sciences of the United States of America. The article is authored by Adam D. I. Kramer of Facebook, co-authored by Cornell faculty Jamie E. Guillory and Jeffrey T. Hancock, and entitled, Experimental evidence of massive-scale emotional contagion through social networks. I'm not sure whether that link will work if you're not at a university that subscribes to the journal, so here is the abstract:


Emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness. Emotional contagion is well established in laboratory experiments, with people transferring positive and negative emotions to others. Data from a large real-world social network, collected over a 20-y period suggests that longer-lasting moods (e.g., depression, happiness) can be transferred through networks [Fowler JH, Christakis NA (2008) BMJ 337:a2338], although the results are controversial. In an experiment with people who use Facebook, we test whether emotional contagion occurs outside of in-person interaction between individuals by reducing the amount of emotional content in the News Feed. When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred. These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks. This work also suggests that, in contrast to prevailing assumptions, in-person interaction and nonverbal cues are not strictly necessary for emotional contagion, and that the observation of others’ positive experiences constitutes a positive experience for people.


There is also a shorter paragraph summarizing the significance of the study:


We show, via a massive (N = 689,003) experiment on Facebook, that emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness. We provide experimental evidence that emotional contagion occurs without direct interaction between people (exposure to a friend expressing an emotion is sufficient), and in the complete absence of nonverbal cues.

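To make the experimental design concrete, here is a minimal sketch, in Python, of the kind of manipulation and measurement the abstract describes: withholding a fraction of emotion-bearing posts from a simulated News Feed, then tallying the emotional tone of what users go on to post. The word lists, function names, and probabilities are my own illustrative assumptions, crude stand-ins for the word-count text analysis the study actually relied on.

import random

# Toy stand-ins for the word lists used to classify posts; the real
# dictionaries contain thousands of entries (assumption: a handful of
# words suffices to illustrate the mechanism).
POSITIVE = {"happy", "great", "love", "wonderful", "fun"}
NEGATIVE = {"sad", "awful", "hate", "terrible", "lonely"}

def emotion(post):
    """Crudely classify a post as positive, negative, or neutral."""
    words = set(post.lower().split())
    if words & POSITIVE:
        return "positive"
    if words & NEGATIVE:
        return "negative"
    return "neutral"

def filtered_feed(feed, suppress, omit_prob=0.5):
    """Withhold a random fraction of posts carrying the suppressed
    emotion, mimicking the reduction of emotional News Feed content."""
    return [post for post in feed
            if emotion(post) != suppress or random.random() > omit_prob]

def tone_counts(posts):
    """Tally the tone of a user's subsequent status updates, which is
    the outcome the study measured."""
    counts = {"positive": 0, "negative": 0, "neutral": 0}
    for post in posts:
        counts[emotion(post)] += 1
    return counts

The contagion hypothesis predicts that tone_counts, taken over the posts of users whose feeds passed through filtered_feed, shifts away from the suppressed emotion, and that is, to a modest extent, what the study reported.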
Now, if you can access the article, you can see in the comments section a number of individuals expressing concerns about the ethics, and even the legality, of the study, as well as some defense of it, stating that it was a minimal risk study that did not require informed consent. There are other criticisms as well, concerning the methodology and reasoning used to interpret the data, but let's set those aside for now. Instead, let's go to an article published by the Christian Science Monitor on July 3rd, where I was one of the new media scholars asked to comment on the revelations regarding Facebook's questionable research.

The article, written by Mark Guarino, is entitled Facebook experiment on users: An ethical breach or business as usual? and it starts with the following blurb: "Many Internet companies collect user data. But privacy experts and Internet users question whether Facebook's 2012 experiment marked a breach of corporate ethics and public trust." And here's how it begins:


It's not yet clear if Facebook broke any laws when it manipulated the news feed content of nearly 700,000 users without their explicit consent to test whether social networks can produce "emotional contagion."

(It turns out, to a modest extent, they can.)

But the uproar after release of the results of this 2012 study is raising new questions on how pervasive such practices are–and the extent to which they mark a breach of corporate ethics.


While it is generally known that Internet companies such as Facebook, Google, Microsoft, Twitter, and Yahoo, claim the right to collect, store, access, and study data on their users, the Facebook experiment appears to be unique.

Unique is a bit of an understatement. Facebook users do willingly provide the social network with a great deal of personal information, while at the same time making that information accessible to others: to Facebook friends, of course, often at least in part to public view, and to any third party whose applications users may be willing to approve. That's understood. It is also no secret that Facebook makes money from advertising, and delivers advertising targeted to users based on the personal information we all provide. Nor does anyone try to disguise the fact that Facebook can track when people click on a link, so that advertisers and marketing professionals can know how many people clicked on their advertisement, and of that group, how many actually made a purchase. We may or may not approve of all or some of this, and some of us may not be aware of the extent to which it all works, but none of it is kept hidden from users. As they used to say on The X-Files, the truth is out there.
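To put some hypothetical numbers on that kind of tracking, here is a minimal sketch of the click-through and conversion arithmetic involved; the figures and function name are invented purely for illustration.

def funnel_rates(impressions, ad_clicks, purchases):
    """Rates an ad platform can measure directly, click by click,
    with no need for survey-based audience estimates."""
    click_through = ad_clicks / impressions  # share of viewers who clicked
    conversion = purchases / ad_clicks       # share of clickers who bought
    return click_through, conversion

# Invented numbers, purely for illustration:
ctr, conv = funnel_rates(impressions=1_000_000, ad_clicks=20_000, purchases=500)
print(f"click-through: {ctr:.1%}, conversion: {conv:.1%}")
# prints: click-through: 2.0%, conversion: 2.5%

This sort of direct measurement is exactly what the old mass media lacked, which is why, as I note in my comment further down, the likes of the Nielsen ratings had to be invented.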

But this is a horse of another color, and by this I mean both Facebook and the experiment, as the article proceeds to make clear:


Not only is the company the largest social network in the world, the kind of information it accumulates is highly personal, including user preferences spanning politics, culture, sport, sexuality, as well as location, schooling, employment, medical, marriage, and dating history. The social network algorithms are designed to track user behavior in real time – what they click and when.

The Information Commissioner's Office in the United Kingdom announced the launch of an investigation to determine whether Facebook broke data protection laws governed by the European Union. The Federal Trade Commission in the US has not yet said whether it is launching a similar probe or not. On Thursday, the Electronic Privacy Information Center, a civil liberties advocacy group in Washington, filed a formal complaint with the FTC, urging action.

The experiment, conducted over a week in January 2012, targeted 689,003 users who were not notified that their news feed content was being manipulated to assess their moods in real time. The study determined that an increase in positive content led to users posting more positive status updates; an increase in negative content led to more negative posts.

So now that the facts of the matter have been established, it's time to raise the question of ethical conduct, or lack thereof:


What alarmed many Internet activists wasn't the use of metadata for a massive study, but rather the manipulation of data to produce a reaction among users, without their knowledge or consent, which they see as a violation of corporate ethics.


Just to interrupt again for a moment: why specifically activists? Doesn't this pretty much marginalize the concern? We could instead say that this alarmed citizens' groups; that might sound a little better, but still. Doesn't this alarm Facebook users in general? Or, as we used to refer to them, citizens? Just asking, mind you... Okay, back to the article now:

“It’s one thing for a company to conduct experiments to test how well a product works, but Facebook experiments are testing loneliness and family connections, and all sorts of things that are not really directed toward providing their users a better experience,” says James Grimmelmann, a law professor and director of the Intellectual Property Program at the University of Maryland Institute for Advanced Computer Studies in College Park. “These are the kinds of things that never felt part of the bargain until it was called to their attention. It doesn’t match the ethical trade we felt we had with Facebook,” Professor Grimmelmann says.

Many academics studying tech and online analytics worry about the ethics involving mass data collection. A September 2013 survey by Revolution Analytics, a commercial software provider in Palo Alto, Calif., found that 80 percent of data scientists believe in the need for an ethical framework governing how big data is collected.


So now it's academics and activists; that's a little better, but academics are not exactly part of the mainstream, or part of what Nixon used to call the Silent Majority. Oh well, let's hear what Facebook had to say in response to all this:


Facebook leaders expressed remorse, but they stopped short of apologizing for the experiment, which reports show reflect just a small portion of the studies that the company regularly conducts on its nearly 1 billion users. On Wednesday, Facebook COO Sheryl Sandberg told The Wall Street Journal the study was merely “poorly communicated.... And for that communication, we apologize. We never meant to upset you.”

In response to its critics, Facebook notes that policy agreements with users say that user data can be used for research. However, the term “research” was added in May 2012, four months after the study took place. Others say the complexities of the tests require stricter oversight, now that it is known the company has been conducting hundreds of similar experiments since 2007 without explicitly notifying the public.


Oh yeah, don't forget to read the fine print, right, and be sure to carefully review every update to the policy agreement that comes out. How about a different point of view, one that reflects a little bit of common sense?


“Burying a clause about research in the terms of use is not in any way informed consent," says Jenny Stromer-Galley, an associate professor who studies social media at the School of Information Studies at Syracuse University in New York.

"The issue is that people don’t read terms of use documents, and ethical principles mandate that people involved in basic research must be informed of their rights as a participant,” she adds.


Some say Facebook could have avoided the controversy simply if it had provided more transparency and allowed its users to opt out.

Transparency would be a start, but if they had been open and clear about the experiment, basically they would not have been able to carry it out. It would have been like Stanley Milgram telling his subjects, I only want to see if you'll do what this guy says, even though no one's forcing you, and by the way, the other guy isn't really getting any electric shocks. No doubt, had he done that, the Obedience to Authority experiments would have been just as effective and elucidating (please note I am being sarcastic here, obviously those experiments would have been useless and pointless).

Well now, we come to the end of the article, and guess who gets the last word?


Lance Strate, professor of communications and media studies at Fordham University in New York City, says that the revelations, which are among many such privacy violations for Facebook, suggest social networks have outlived their purpose because they no longer adhere to the Internet values of “openness, honesty, transparency, and free exchange.”

“With this move, Facebook has violated the essential rules of online culture, and now begins to appear as an outsider much like the mass media industries. It is almost impossible to recover from the breaking of such taboos, and the loss of faith on the part of users. Zuckerberg started out as one of us, but now we find that he is one of them,” Professor Strate says.



What do you think? Too melodramatic, too extreme, too much? I felt the situation called for a strong comment, a strong condemnation, and if you think that quote is harsh, here is the entirety of the comment I provided:


With the revelations concerning Facebook's latest social experiment, the social network is now dead. By dead, I mean that it has squandered its most precious resource, its credibility and the trust of its users, and is now an antisocial network. Facebook has previously run into repeated privacy concerns regarding its users, but most users have significantly reduced expectations for their privacy in an online environment. What individuals do not expect is to be manipulated, and in fact when attempts at psychological manipulation are unveiled, they often have a boomerang effect, resulting in individuals doing the opposite of what they are expected to do, thereby rejecting the manipulation, and the manipulators.

There is nothing new about mass media organizations and the advertising industry conducting behavioral research on audiences, but such research usually involves subjects who are willing participants, answering surveys, taking part in focus groups, and consenting to be subjects in psychological experiments. Some of the research was necessary simply because mass media had no way to directly measure the size and characteristics of their audience, hence, for example, the famous Nielsen ratings. But social media made much of that sort of research unnecessary, as it became possible to track exactly how many people signed in to a particular site, clicked on a given advertisement, and subsequently purchased a particular product. It is well known that Facebook delivers ads tailored to different users based on the information they provide, willingly, in their profiles, and that adding various applications, on Facebook and elsewhere, allows other commercial interests to gain access to that same information, information otherwise freely available online anyway. The point is that all this gathering of data is done with the users' consent, but Facebook's manipulation of the users' news streams was carried out without anyone's permission, or awareness. This is where they crossed the line.

Whatever the corporate ethics may be, and some may go so far as to say that the phrase "corporate ethics" is an oxymoron, there is an ethos that has long dominated the internet, one that emphasizes the values of openness, honesty, transparency, and free exchange. With this move, Facebook has violated the essential rules of online culture, and now begins to appear as an outsider much like the mass media industries. It is almost impossible to recover from the breaking of such taboos, and the loss of faith on the part of users. Zuckerberg started out as one of us, but now we find that he is one of them.

The result may not be the immediate demise of our leading social network, but it is the beginning of the end for Facebook as the dominant force in the online environment. Facebook may hang on as a kind of online registry for folks to find each other, but its other functions are being usurped by a variety of new services, and even that basic function of connecting with others is no longer a monopoly for Facebook. Younger users will be more inclined to leave Facebook altogether than older users, and it may be that many will continue to maintain profiles on the service, use it for messaging perhaps, but will stop checking their news streams.

In the short run, Google has much to gain from Facebook's loss, as users may turn to the Google+ service as a less objectionable alternative to Facebook. Indeed, Google+ has in effect been waiting in the wings, ready to usurp Facebook's place as the dominant social networking site, waiting in other words for Facebook to stumble and fall. And the decline of Facebook seems all but inevitable, based on past experience of the decline and fall of so many other companies that once dominated the new media landscape, from IBM to AOL to Microsoft to Yahoo to MySpace. Ultimately, though, the door is open for newer services to provide alternatives to Facebook based on entirely new models of connection, perhaps in a more complex mode involving many specialized sub-networks. By the end of the decade, some new service that is only getting started just now may well be the dominant force in online communications.


So, agree or disagree? Either way, you should know that this is not an isolated incident. Just recently, on July 29th, TechNewsWorld ran an article on a similar incident involving an online dating site: OkCupid's Confessed Hijinks Get Thumbs-Down. In it, my colleague and friend Paul Levinson had this to say:


"I think use of customers and users without their consent in experiments, for any reason, is unethical and outrageous," said Paul Levinson, professor of communications and media studies at Fordham University.

"People have a right to know if they're participating in an experiment—if their information is being deliberately controlled and manipulated to test a theory, if their activity is monitored beyond just general statistics of how often they log on, etc. The surest to get people angry and wanting to have nothing to do with social media is to make them unwitting guinea pigs," he told TechNewsWorld.


Paul and I often differ in our views on new media, and technology in general, but on this we are very much in agreement. To borrow a phrase from Jorge Luis Borges, the internet may be a garden of forking paths, but it is not a maze (although it has much to do with our amazing ourselves to death), and we are not lab rats trying to run through it as part of some social media experiment. A warning to all would-be web experimenters out there: watch out, because these lab rats bite. In a big way, like megabite and gigabite. Do not count on us being obedient to your online authority. Experiments have a funny way of backfiring on their experimenters. Just ask Dr. Frankenstein.
