

Facebook is a schoolyard bully that's going down

Alex Burinskiy | July 9, 2014
Facebook has grown and evolved in recent years. In addition to connecting people online, it bombards users with unnecessary ads and useless sponsored stories. And it runs experiments on its users. Columnist Alex Burinskiy is not amused.

As a Facebook user myself, I've thought about shutting down my account more than once, and every new breach of trust Facebook puts me through makes it more likely. Facebook has consistently used its own Terms and Conditions to justify using your profile pictures in advertising content and claiming full rights to any pictures you post. The company has always kept security settings buried deep in the user interface, a major issue that's left many users angry and pushed some to just pick up and leave.

That experiment

In January 2012, Facebook ran a sociological experiment as part of a joint venture between Cornell University and the company's own Core Data Science Team. The experiment involved 689,003 users and more than three million status updates, and it was designed to determine whether the emotional content of those updates could influence Facebook friends' emotional states. What it did instead was dynamite any sense of trust among users.

Cornell has since released a statement putting some distance between itself and Facebook, directing any ethics questions to Facebook itself. "Professor [Jeffrey] Hancock and Dr. [Jamie] Guillory [now at the University of California, San Francisco] did not participate in data collection and did not have access to user data. Their work was limited to initial discussions, analyzing the research results and working with colleagues from Facebook to prepare the peer-reviewed paper."

Adam Kramer, the Facebook researcher who worked on the study, posted his own statement (on Facebook, of course) to try to reassure users that the company's internal self-policing ensured the study was ethical.

"The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product," Kramer wrote. "We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends' negativity might lead people to avoid visiting Facebook.

"...I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused," Kramer said in his post. "In hindsight, the research benefits of the paper may not have justified all of this anxiety."

Now, let's be real: studies like this can be useful, given that social media is a major part of our day-to-day lives. But manipulating people's emotions in any way can be extremely dangerous, and the possibility of hurting someone, no matter how slightly, is too high a price to pay.

Facebook's senior leadership, in particular those involved with the study, apparently slept through their ethics courses in college. A company technically has the right to hide behind its Terms and Conditions as much as it wants, but that doesn't make dubious actions right. In fact, it just reinforces the growing belief that Facebook acts according to its own rules, users be damned. And younger users, the very ones it needs to stay alive, are the ones who feel this the most.

 

