Facebook Experiment

Jul 07, 2014 at 04:45 am by bryan


One of the guiding principles of social science research is the notion of “informed consent,” the idea that you have to give explicit permission before you can be inspected, injected or infected. And that you have the right to know the specific treatment and possible side effects before your mind or body is manipulated, selected or neglected.



But somehow the folks at Facebook got the idea that the rules don’t apply to them. And that a few sentences buried who-knows-how-far down in the user agreement actually count as informed consent.

That’s what is at the heart of the latest Facebook controversy: the revelation that the service manipulated users’ moods and emotions by changing the words in the news feeds sent to some 700,000 users. And they did it without any informed consent and without any oversight by an outside reviewer.

Now, some folks are claiming that websites do this kind of thing all the time. And indeed they do. It’s called A/B testing, and it happens when a web host changes the content of a page to measure the reaction: which advertisement gets more click-throughs, A or B?
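For the curious, here is a minimal sketch of what an A/B test looks like in code. Every name here is hypothetical, standing in for whatever instrumentation a real site would use:

```python
import random

# Minimal A/B test sketch (hypothetical names throughout):
# each visitor is randomly shown version A or B of an ad,
# and the host compares which version draws more click-throughs.

VARIANTS = {"A": "ad_banner_a.html", "B": "ad_banner_b.html"}
impressions = {"A": 0, "B": 0}
clicks = {"A": 0, "B": 0}

def serve_ad() -> tuple[str, str]:
    """Pick a variant at random, count the impression, return it."""
    variant = random.choice(list(VARIANTS))
    impressions[variant] += 1
    return variant, VARIANTS[variant]

def record_click(variant: str) -> None:
    """Count a click on whichever variant the visitor was shown."""
    clicks[variant] += 1

def click_through_rate(variant: str) -> float:
    """Clicks divided by impressions: the number the host compares."""
    shown = impressions[variant]
    return clicks[variant] / shown if shown else 0.0
```

Nothing in that sketch looks at what the visitor does afterward; it only measures the reaction to the page itself. That is the line Facebook crossed.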

But Facebook went a crucial step further: After viewers read a particular news story, Facebook tracked the emotional content of the messages they sent to someone else. Facebook wasn’t just tracking whether you liked a particular ad; it was tracking how you interacted with your friends. In other words, it wasn’t just that you were being manipulated: you were an unwitting party to manipulating your friends as well.

This is not the first time Facebook has hidden behind the “opt-out” smoke screen. It’s time for services to provide more “opt-in” preferences to ensure informed consent, and informed participation, no matter what the subject.
