Facebook’s signature blue face is turning red as the social network faces inquiries from Europe’s privacy regulators following its data scientists’ experimental tweaking of the emotional content of posts in the news feeds of nearly 700,000 users.
When the news broke, Facebook initially stated that it was merely conducting user testing to see if emotions were contagious. In this case, they are, and the company's actions have roused the ire of social media, prompted an apology from COO Sheryl Sandberg, and perhaps violated local privacy laws.
The European agencies involved include Ireland's Office of the Data Protection Commissioner and Britain's Information Commissioner's Office. While it's not clear where the affected users, roughly one out of every 2,500 on the service, reside, 80 percent of Facebook's 1.2 billion total users are based outside North America.
“We’re aware of this issue, and will be speaking to Facebook, as well as liaising with the Irish data protection authority, to learn more about the circumstances,” a spokesman for the British regulator commented. Sandberg today apologized while on a trip to India, in comments published by the Wall Street Journal.
“This was part of ongoing research companies do to test different products, and that was what it was; it was poorly communicated,” Facebook’s #2 executive said of the one-week experiment in 2012. “And for that communication we apologize. We never meant to upset you.”
“Facebook cannot control emotions of users. Facebook will not control emotions of users,” Sandberg stated in a TV interview.
The controversial study involved tweaking the number of positive and negative posts a pool of users saw in their feeds and measuring their responses, in a bid to show how emotions spread through social media.
From the clandestine psychological study, Facebook found that “emotions expressed by friends, via online social networks, influence our own moods, constituting, to our knowledge, the first experimental evidence for massive-scale emotional contagion via social networks, and providing support for previously contested claims that emotions spread via contagion through a network.”
As criticism escalates, Richard Allan, Facebook's director of policy in Europe, shared the company's statement that "The study was done with appropriate protections for people's information, and we are happy to answer any questions regulators may have."
“The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product,” wrote Adam Kramer, the data scientist responsible for the study, on his Facebook page. “We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends’ negativity might lead people to avoid visiting Facebook. We didn’t clearly state our motivations in the paper.”
Beyond the manipulation of the user experience to elicit and measure emotions, the situation "represents only a sliver of the kind of problematic privacy issues brought on by its competitors on a regular basis." It also angered some who felt the research was done only so the results could be served up to advertisers, helping them tailor their messages, content selection, and tone of voice for maximum impact with the site's user base.
What's equally surprising to some about the research is the general public's naïveté about what happens with their data and content shared on social networks, even behind privacy settings. "Facebook could be doing this sort of manipulation all the time, and the fact is they probably are," observed Adi Kamdar of the Electronic Frontier Foundation (EFF).
"We as users should use the publication of this study as a glimpse into the sort of power that Facebook has. Consumers should understand that Facebook is not a neutral platform. Facebook is an online tool that is run by a for-profit company that wants to tweak settings to provide a better product and also make more money. It's become such an important part of our lives. We have the expectation that it is a public forum and that nothing will be altered or changed in any way, and that isn't totally true."
Facebook isn't the only brand attempting to measure customers' emotions. British Airways recently conducted a "Happiness Blanket" test to gauge passengers' emotions and improve the customer experience, but its test subjects opted in, giving permission to wear the neuro-sensor blanket and headgear while being filmed for a promotional video.