The Outrage Over Facebook's 'Creepy' Experiment Is Out-Of-Bounds -- And This Study Proves It


Two years before Facebook began a study that critics are branding as unethical and manipulative, another researcher used the social media site to do a similar experiment. And his story pokes holes in many of the arguments that Facebook did something wrong.

In Facebook's experiment, researchers found that manipulating the emotional content of the posts in users' Facebook feeds had a mildly contagious effect: It caused people to use slightly more or less positive language when writing their own updates.

But since Facebook voluntarily published its findings in PNAS, the company's been under attack, with dozens of news outlets featuring stories critical of Facebook's "secret mood manipulation study."

Critics say that Facebook needed to get informed consent from users, as one would in a medical experiment, before messing with their Facebook pages in ways that might affect their mood. Others argue that the research should have been reviewed by an institutional review board, the panel that vets the ethics of research on human subjects at academic institutions, especially because Facebook's data scientist partnered with university academics and published the research in a journal.

But here's the thing. Facebook isn't alone in running academic-minded experiments on its users. Previous studies have raised the same informed-consent issues without drawing anything close to this level of public pressure.

Take Timothy Ryan, a political scientist who will be on the faculty at UNC-Chapel Hill this fall, and who authored a study called "What Makes Us Click? Demonstrating Incentives for Angry Discourse with Digital-Age Field Experiments."

Ryan's study was published in the Journal of Politics in October 2012. It was peer-reviewed. It was cheered by fellow academics.

"My study is four years old," Ryan says. "It's not a secret—I've talked to a number of different people about it. And I don’t think I've ever gotten pushback."

And unlike Facebook's much-debated study, Ryan's research did receive approval from an institutional review board—a fact that he double-checked this week. "All this discussion led me to go back and make sure I got IRB approval," says Ryan. "It was a relief."

Ryan's study turned on a clever question: Could changing the language and images in Facebook ads affect user behavior? The short answer: yes. Through a series of ad tests that played on users' emotions, Ryan found that liberal voters were more likely to click on ads that actively angered them.

(Ryan shared some sample ads with me, and I've pasted them below.)

One important point: Ryan didn't get people's informed consent before actively angering them... and his IRB was OK with that.

Informed consent would have been "unnecessary and impossible," Ryan told me. "There were something like 14 million people in my study."

Ryan notes that other academics have used Facebook to experiment with even more fraught issues, like driving people to vote.

He also says that the study Facebook is getting flak for is well within the standards of his field.

The risk to users from Facebook was quite low – a mild change in the composition of their News Feeds – and within the realm of what they’d otherwise experience, Ryan argues. The benefit: “We better understand what can make people happy or sad."

So why did Facebook's study upset so many people, while Ryan's study, which was designed to actually anger users, went overlooked?

Maybe because people feel betrayed. "It comes from people whose fears about Facebook's manipulation of what they see on the News Feed have been confirmed," says Alex Howard, a well-known research analyst who's written about these issues.

"I was trying to manipulate emotions through advertisements," Ryan says, but Facebook "was trying to do it through the News Feed. [And] I think people have different expectations on ads versus News Feeds."
