Facebook's icky psychology experiment is actually business as usual


This kind of creepy manipulation is actually standard procedure in all kinds of media.

Unless you've been living under a rock for the last couple of weeks, you've no doubt heard about Facebook's creepy, secret psychological experiment designed to see whether negative newsfeed posts inspire more negativity -- and vice versa. I don't want to excuse Facebook's behavior, which has prompted a (sort-of) apology from Facebook COO Sheryl Sandberg, as well as an ongoing stream of condemnation and outrage from legitimate psychologists and Internet commentators. I too was weirded out by the revelations, feeling manipulated, as if my privacy had been invaded without my permission.

But the more I think about this kerfuffle, the more I realize that while Facebook's actions may have been unseemly and morally wrong, they were hardly unusual. In fact, far more egregious personal violations of this kind take place all the time, generating little or no comment. These kinds of "experiments" in modifying our emotions are not just common; they are the everyday stock-in-trade of a wide variety of businesses and media outlets.

Everybody manipulates emotions

The vast majority of the advertising business is built on manipulating emotions, from tear-jerking AT&T commercials to personal grooming ads built on the idea of making you embarrassed about some bodily function you wouldn't otherwise give a second thought. Apart from a few price-oriented ads for staples or feature-oriented ads for business products, almost all advertising is about affecting your moods and feelings to spur a purchase or other behavior.

When was the last time you saw a car ad, for example, that focused solely on getting you from here to there instead of making you feel like a winner for sitting in the XYZ Model Dreadnought 3000?

It's not just ads, of course. Stores spend millions on everything from decor to music to influence the way you feel while shopping, hoping that you'll spend more if they get it right. Writers and moviemakers devote their careers to eliciting emotional reactions, and when they succeed we celebrate them for it.

So what's different with Facebook?

Sure, the Facebook experiment could have negative impacts on users, but so too do all those other examples of emotional manipulation. Based on public comments, several things seem to be bugging people, but they don't really add up to a significant difference from what's already going on.

Is secrecy the problem?

First, the Facebook experiment was secret. The folks at Facebook Labs who conducted the test didn't see the need to tell anyone what they were doing (and no one else at the company stepped in to suggest that maybe that wasn't such a good idea). The difference, I guess, is that it's OK to manipulate your emotions in situations where that's expected, but not to do it unannounced in what people assumed was an objective environment.

Of course, this was an experiment designed to figure out what might happen, not a plan to elicit a specific result, like selling cars or deodorant. But advertisers carry out all kinds of tests all the time. Amazon, for example, is the king of A/B testing, constantly tweaking everything from prices to page designs to see which works better. In fact, Facebook's manipulation may be less egregious because the company wasn't really trying to drive specific behavior.
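For readers unfamiliar with the mechanics, the A/B testing described above is conceptually simple. The toy sketch below is my own illustration, not Amazon's or Facebook's actual code: it deterministically buckets each user into one of two variants (so a given user always sees the same version of a page) and then compares conversion rates between the groups.

```python
import hashlib

def assign_variant(user_id: str, experiment: str) -> str:
    """Deterministically bucket a user into variant A or B.

    Hashing user_id together with the experiment name gives a stable
    roughly 50/50 split, so the same user always sees the same variant.
    """
    digest = hashlib.md5(f"{experiment}:{user_id}".encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

def conversion_rates(events, experiment="price-test"):
    """events: iterable of (user_id, converted) pairs.

    Returns the conversion rate for each variant.
    """
    counts = {"A": [0, 0], "B": [0, 0]}  # variant -> [conversions, total]
    for user_id, converted in events:
        bucket = counts[assign_variant(user_id, experiment)]
        bucket[0] += int(converted)
        bucket[1] += 1
    return {v: (c / t if t else 0.0) for v, (c, t) in counts.items()}
```

In a real deployment you would also run a statistical significance test before declaring one variant the winner, rather than eyeballing the raw rates.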

Facebook's real issue?

So here's what I think is behind all the outrage. Despite all the evidence to the contrary, many people think of Facebook differently than they do advertisements, retailers, and traditional media. Because our Facebook newsfeed is populated by our friends, it's easy to forget that it's not a clear window into what our friends are saying to us, but merely a stream of content that Facebook is choosing to show us.

The Facebook user agreement (which no one, including me, bothers to read) apparently makes it clear that the company can mess around with the newsfeed or any other part of the service, at any time for any reason without any notice. If you didn't know that, you simply weren't paying attention. As many people have pointed out, when dealing with free services like Facebook (or Google, or Twitter, or many others), we users are not the customers...we are the product. Having that reality slapped in our faces is what is really bothering people here.

Ironically, I actually think the controversy is a good thing. I still use Facebook - it remains a great way to stay in touch with people I might otherwise not communicate with and a powerful channel to reach many people at once. But Facebook's misguided experiment makes it easier to remember what's really going on with social media services (Facebook, Twitter, etc.) and harder to be naive about their real priorities. Seeing a few extra negative (or positive) posts back in 2012 seems a small price to pay for that essential reminder.

Join the CIO Australia group on LinkedIn. The group is open to CIOs, IT Directors, COOs, CTOs and senior IT managers.
