Facebook apparently wants to control your mood. Along with researchers from Cornell University and the University of California San Francisco, the world’s largest online social networking site tweaked users’ news feeds to see how the changes affected users’ emotions and what they would write in response.
Revelations of this secretive experiment on hundreds of thousands of users show the increasing power of such sites, and of such large companies, to manipulate people in sophisticated ways.
For the experiment, Facebook changed the news feeds of nearly 700,000 people for one week in 2012. The company conducted the psychological experiment by filtering the content of those feeds, so that some users saw fewer posts with positive, happy language while others saw fewer posts with sad or negative language. Researchers then studied what users posted to gauge how contagious emotions are between people who never meet in person, but only write online.
Such a study might produce interesting results if conducted in a professional manner. But standard academic and business practices require the express consent of research participants. Facebook claims it had that consent, since it asks users to accept its terms of service when setting up their pages. Whether that all-inclusive OK at the start constitutes consent is doubtful. Surely it is not consent to be a research subject at any time, in any way, for Facebook.
Perhaps few of those who had their Facebook pages manipulated, tracked, studied and analyzed knew, or ever will know, that they were the subjects of a vast psychological experiment. No doubt most of them do not much care. However, the revelations expose the degree to which large and profitable companies are interested in discovering how to manipulate the emotions, thoughts and lives of consumers.
Like many large companies and governments, Facebook holds near-monopoly power over a certain area of consumer activity. The implications of that power are frightening. One wonders whether Facebook might seek contracts with advertisers, or with governments, to control access to information that could adversely affect them.
If powerful entities learn how to manipulate the emotions of users, they will no longer need the old methods of blocking or censoring news. They could simply feed users whatever content they like and let the users pass on the resulting feelings to others in their own words.
Facebook was not forthcoming about the purposes of the experiment. Among the company’s more than a billion users worldwide, many had at least one response to the secretive experiment — outrage. Facebook learned at least one thing from its mood-manipulating experiment: such experiments produce a lot of negative emotions in response.