10 Other Facebook Experiments On Users, Rated On A Highly-Scientific WTF Scale

A former Facebook data scientist who set out to defend the company's emotion-manipulation experiment on nearly 700,000 users decried the hubbub about it, saying that experiments are happening all the time at the company and that every Facebook user has been part of one at some point. Yes, we can all wear t-shirts saying, "I, too, am a Facebook lab rat." Most of the "experiments" are A/B testing that's standard on the Web and rather boring -- such as which color blue you're most likely to click or how big an ad needs to be for you to notice it -- but some are interesting enough to warrant academic write-ups for shedding new light on human behavior.

Here's other research done by Facebook data scientists on users (possibly on you) that we know about because it's been published. I've given each of these studies a "WTF rating." In my opinion, the hubbub-inducing study in January 2012, which involved curating the emotional content of users' News Feeds to see if Facebook could manipulate their emotions, is the most intrusive, WTF-y of experiments for poking and prodding users emotionally to see what happens, so that's the high end of the scale. These other studies run by Facebook data scientists, sometimes in collaboration with academic researchers, are in reverse chronological order. Facebook data scientists who pop up often as study creators are Adam Kramer, who conducted the emotion study; Cameron Marlow, who founded Facebook's in-house sociology team but has since left the company; and Dean Eckles, an academic cited in Eli Pariser's Filter Bubble for his work on persuasion.

Study 1: Rumor Cascades

What Facebook wanted to find out: How easy is it for lies to spread?

When it happened: July and August 2013

How many users: ?

How they did it: Researchers looked at over 200,000 photo comments posted to the site with Snopes.com links. Snopes is a rumor-debunking site, so a Snopes link in the comments indicates the shared photo was a case of someone being duped and then erroneously passing the rumor on to their friends -- such as people claiming this guy was Trayvon Martin at 17 or that Obamacare would tax non-medical items like clothes and rifles. They then looked at how viral those photos went.
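
For the curious, here's roughly how that kind of analysis could look in code: scan each reshared photo's comment thread for snopes.com links, then compare how far flagged and unflagged photos spread. This is just a sketch with made-up field names, not the researchers' actual pipeline.

```python
# Minimal sketch of the rumor-cascade idea: flag reshared photos whose
# comment threads contain a snopes.com link, then compare their spread.
# Field names (reshares, comments, text) are illustrative, not Facebook's schema.
import re
from statistics import mean

SNOPES_RE = re.compile(r"https?://(?:www\.)?snopes\.com/\S+", re.IGNORECASE)

def is_snoped(photo: dict) -> bool:
    """True if any comment on the photo links to snopes.com."""
    return any(SNOPES_RE.search(c["text"]) for c in photo["comments"])

def compare_virality(photos: list[dict]) -> dict:
    """Average reshare count for snoped vs. un-snoped photos."""
    snoped = [p["reshares"] for p in photos if is_snoped(p)]
    clean = [p["reshares"] for p in photos if not is_snoped(p)]
    return {
        "snoped_avg_reshares": mean(snoped) if snoped else 0.0,
        "clean_avg_reshares": mean(clean) if clean else 0.0,
    }

photos = [
    {"reshares": 1200, "comments": [{"text": "Debunked: https://snopes.com/fact-check/xyz"}]},
    {"reshares": 40, "comments": [{"text": "So cute!"}]},
]
print(compare_virality(photos))
```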

What Facebook found out: People like spreading rumors. Outrageous stuff travels farther and faster than debunking of that outrageous stuff. But researchers also noted that posts that "get Snoped" are 4.4 times as likely to get deleted.

WTF rating: Low. Interesting study, but it's a reminder that Facebook can easily figure out which users are the dumb, rumor-spreading ones, and that Facebook has the ability to keep track of the dumb thing you posted even after you delete it. Business case for this could be that Facebook is a news service and wants to stop the spread of false information.

***

Study 2: Calling All Facebook Friends: Exploring requests for help on Facebook

What Facebook wanted to find out: Who asks for something on Facebook?

When it happened: Two weeks in July and August 2012

How many users: 20,000 users

How they did it: Researchers looked through public status updates looking for requests like "What movie should I watch tonight?," "Is it okay to eat canned food that expired in 2005?" or "I need a ride to the airport."

What Facebook found out: Researchers were more interested in people asking for help than whether they got it. People who visit Facebook less often, but who have a lot of friends, are most likely to ask for help with stuff.

WTF score: Nil. These are public updates; no surprise that someone's collecting and studying them. There would be a business case in saying that Facebook helps you solve your problems -- if the study had actually proved that it does.

***

Study 3: Self-censorship on Facebook

What Facebook wanted to find out: How many people hold back from blasting the network with their thoughts on something?

When it happened: 17 days in July 2012

How many users: 3.9 million users

How they did it: They tracked every entry of more than 5 characters in a comment or compose box that didn't get posted within 10 minutes.
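
To make that concrete: the logic boils down to flagging any compose session where more than five characters were typed but nothing was posted within ten minutes. Here's a back-of-the-envelope sketch, assuming you already had client-side logs of typing sessions; the field names are invented for illustration.

```python
# Sketch of the self-censorship metric described above: an entry "counts"
# if the user typed more than 5 characters and did not post within 10 minutes.
# The log format (typed_chars, started_at, posted_at) is hypothetical.
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class ComposeSession:
    user_id: int
    typed_chars: int
    started_at: datetime
    posted_at: datetime | None  # None if the text was never posted

def is_self_censored(s: ComposeSession, window: timedelta = timedelta(minutes=10)) -> bool:
    if s.typed_chars <= 5:
        return False
    return s.posted_at is None or (s.posted_at - s.started_at) > window

def self_censorship_rate(sessions: list[ComposeSession]) -> float:
    """Fraction of users with at least one censored entry (the study's 71% figure)."""
    censored_users = {s.user_id for s in sessions if is_self_censored(s)}
    all_users = {s.user_id for s in sessions}
    return len(censored_users) / len(all_users) if all_users else 0.0

sessions = [
    ComposeSession(1, 42, datetime(2012, 7, 1, 12, 0), None),            # typed, never posted
    ComposeSession(2, 80, datetime(2012, 7, 1, 12, 0),
                   datetime(2012, 7, 1, 12, 2)),                         # posted within 2 minutes
]
print(self_censorship_rate(sessions))  # 0.5
```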

What Facebook found out: We're thinking things that we don't put down to digital paper. 71% of the users "self-censored," drafting comments that they never posted.

WTF score: Medium. One of the two authors of this study is our friend Adam Kramer, the Facebook data scientist who ran the emotion-manipulation experiment. The study tracked these entries regardless of a user's privacy settings, though it only recorded that something was entered, not what it was. Still, it led to quite a few headlines about Facebook tracking what you don't do on Facebook. A Facebook spokesperson told me at the time that the site doesn't usually do that. The business case for running this study is a little hard to parse, though you could argue it's necessary for Facebook to figure out how to get us all to overshare to keep its site going.

***

Study 4: Selection Effects in Online Sharing: Consequences for Peer Adoption

What Facebook wanted to find out: Does broadcasting that you plan to buy something make your friends jump on the same opportunity?

When it happened: Two month period in 2012

How many users: 1.2 million users

How they did it: Users who claimed "Facebook Offers" -- such as an offer for free lace panties from Victoria's Secret -- were put into two groups. One group had the offers they claimed auto-shared so that friends would see it in their News Feeds. People in the other group were graciously given a button to click if they wanted to broadcast the offer claim to their friends.
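
Mechanically this is a plain two-arm A/B test: each offer-claimer gets bucketed into "auto-share" or "opt-in share," and you compare how often their friends go on to claim the same offer. Here's a hedged sketch of that bucketing and comparison, with made-up names rather than whatever Facebook's experiment framework actually uses.

```python
# Two-arm experiment sketch for the Offers study: randomly assign each
# offer-claimer to auto-share or opt-in share, then compare how many of
# their friends go on to claim the same offer. All names are illustrative.
import random
from collections import defaultdict

def assign_arm(user_id: int, seed: str = "offers-experiment") -> str:
    """Deterministic 50/50 bucketing by user id, like a typical A/B split."""
    rng = random.Random(f"{seed}:{user_id}")
    return "auto_share" if rng.random() < 0.5 else "opt_in_share"

def friend_claim_rate(claims: list[dict]) -> dict:
    """claims: [{'claimer_id': int, 'friend_claims': int, 'friends_exposed': int}, ...]"""
    totals = defaultdict(lambda: {"claims": 0, "exposed": 0})
    for c in claims:
        arm = assign_arm(c["claimer_id"])
        totals[arm]["claims"] += c["friend_claims"]
        totals[arm]["exposed"] += c["friends_exposed"]
    return {arm: t["claims"] / t["exposed"] if t["exposed"] else 0.0
            for arm, t in totals.items()}

claims = [
    {"claimer_id": 101, "friend_claims": 3, "friends_exposed": 200},
    {"claimer_id": 102, "friend_claims": 1, "friends_exposed": 40},
]
print(friend_claim_rate(claims))
```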

What Facebook found out: Friends are more likely to also claim the offer when you actively decide to share it with them. But when it comes to the sheer numbers game, more offers get claimed when everyone in your News Feed gets spammed every time.

WTF score: Medium-high. Auto-sharing is creepy, and users had the active or "passive" sharing randomly assigned to them, changing their experience (and their spammed friends' experience) of the site. When given the option to share, only 23% chose to do so. There is a clear business case for Facebook finding out how to get offers claimed, as it's key to revenue. Media reports at the time the paper came out claimed Facebook was killing the auto-sharing, though Facebook's Help Page indicates you still have to opt out, offer by offer.

***

Study 5: Social Influence in Social Advertising: Evidence from Field Experiments

What Facebook wanted to find out: Do ads work better on you when your friends' names appear next to them, endorsing them?

When it happened: 2011

How many users: 29 million users

How they did it: They showed users ads with and without a "Kashmir Hill likes this" style endorsement from users' friends and then measured clicks.

What Facebook found out: The stronger your buddy bond with the person endorsing the ad, the more likely you are to click.

WTF score: Nil. This is exactly the kind of test I'd expect Facebook to run. Clear business case for making ads work better. Facebook has dealt with that whole legality-of-putting-users'-likenesses-in-endorsements thing, so now it's smooth sailing, though I certainly expect they're running experiments all the time to optimize advertising.

***

Study 6: Inferring Tie Strength from Online Directed Behavior

What Facebook wanted to find out: Which of your Facebook friends are true IRL friends?

When it happened: 2010/2011

How many users: 789 users

How they did it: The relatively tiny group of users were recruited with Facebook ads to take a survey asking who their closest friends were in real life.

What Facebook found out: The more you interact with someone on Facebook, the more likely you are to be IRL friends. And you're as likely to interact with them publicly, putting your closeness on display for all to see via Wall posts, as to send them private messages.

WTF score: Low. I initially wondered if those survey takers realized that by naming their closest friends they were signing up to have all of their Facebook activity scrutinized by researchers. The study says this was IRB approved and that written informed consent was obtained from all participants, so Facebook does know how to do that! The business lesson for Facebook (and other social networks): "It is not critical to have information about private communication behavior in order to characterize the likelihood that two users are closely connected in the real world."

***

Study 7: The Role of Social Networks in Information Diffusion

What Facebook wanted to find out: How does information spread on Facebook?

When it happened: Seven weeks between August and October 2010

How many users: 253 million users (At the time, this was half of all Facebook users)

How they did it: Researchers "randomly" assigned 75 million URLs a "share" or "no-share" status. The links could have been news articles, or job offers, or an apartment for rent, or news of an upcoming concert -- any kind of link that Facebook users share. If a link got a "no-share" status, it was "disappeared," meaning it wouldn't show up in News Feeds. "Directed shares, such as a link that is included in a private Facebook message or explicitly posted on a friend’s wall, are not affected by the assignment procedure," wrote the researchers. Well, thanks for that! Researchers then compared the virality of links that were allowed to be seen with those that weren't. The researchers wanted to know whether the censored information would still "find a way," Jurassic Park style, to spread.

What Facebook found out: Unsurprisingly, you're more likely to spread information if you can see friends sharing it. Researchers found that distant friends are more likely to expose you to novel information than your close friends, as judged by your likelihood of sharing it after seeing it.

WTF score: Medium-High. Hope no important information got censored. I'm not sure about the business case for this one. One of the closing lines reads a little ominously in context: "The mass adoption of online social networking systems has the potential to dramatically alter an individual's exposure to new information," including making sure they don't have access to it at all.

***

Study 8: A 61-million-person experiment in social influence and political mobilization

What Facebook wanted to find out: Can it encourage people to vote?

When it happened: 2010 midterm elections in the U.S.

How many users: 61,279,316 users over the age of 18

How they did it: They offered test subjects an 'I Voted' button at the top of their News Feeds and information on how to find their polling place. Some users also saw the names of their friends who had clicked the button. The control group got no prompt to vote. Then the researchers checked public voting records to see which of the millions actually voted.

What Facebook found out: Peer pressure works. People were more likely to click the "I Voted" button if their friends' names appeared there. When researchers checked actual voting records, they found that people who got the "I Voted" message in their News Feed were 0.39% more likely to have actually voted, and were more likely to have voted if their friends' names appeared. Those are minuscule percentages, but the researchers think their experiment resulted in 340,000 votes that wouldn't have otherwise happened.

WTF score: High. While getting people to perform their civic duty and vote is an admirable enterprise, this enters serious social-control territory. As many critics have pointed out, Facebook could theoretically put the "I Voted" button only in the feeds of users who are in favor of immigration reform (something Zuck has been pushing for with his FWD.us lobby group) or only in the feeds of Republicans or Democrats to potentially swing elections. None of the users realized that they were part of this experiment or that Facebook would go looking for their names in voting records, though the researchers did come up with a privacy-preserving way to do that. There wasn't an obvious business case for this one; it was a pure can-we-really-do-this study.

The researchers in the paper said they certainly want to do more of these: "[T]he growing availability of cheap and large-scale online social network data means that these experiments can be easily conducted in the field. If we want to truly understand—and improve—our society, well-being and the world around us, it will be important to use these methods to identify which real world behaviours are amenable to online interventions."

There's evidence that Facebook ran another experiment around the election in 2012, but we won't know what it was until (or if) there's a paper published about it.

***

Study 9: The Spread of Emotion Via Facebook

What Facebook wanted to find out: Does your emotional state affect your friends?

When it happened: Some three-day period prior to 2012 (when the paper was published)

How many users: 151 million users

How they did it: Run by Adam Kramer, this was the precursor to the "emotional contagion" study. In this study, he just looked at 1 million users' status updates, rating them as positive or negative based on terms used, and then looked at the positivity or negativity of the posts of those users' 150 million friends.
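
The underlying mechanic is lexicon-based word counting: tally positive and negative terms in each update and label the post accordingly. Here's a toy version -- the real study leaned on an established emotion-word lexicon, not this five-word list.

```python
# Toy lexicon-based sentiment labeling, as a stand-in for the emotion-word
# counting described above. The word lists below are illustrative only;
# the actual study relied on an established psycholinguistic lexicon.
import re

POSITIVE = {"happy", "great", "love", "awesome", "excited"}
NEGATIVE = {"sad", "angry", "terrible", "hate", "awful"}

def label_post(text: str) -> str:
    """Label a status update 'positive', 'negative', or 'neutral' by word counts."""
    words = re.findall(r"[a-z']+", text.lower())
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"

print(label_post("So excited for the weekend, love you all!"))  # positive
print(label_post("Terrible day. I hate Mondays."))              # negative
```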

What Facebook found out: Your happy updates lead friends to suppress their negative posts, and your negative posts lead them to suppress their happy ones. If you say something upbeat on Facebook, one out of every 100 friends (who wouldn't have otherwise, according to the study) will do the same within 3 days.

WTF score: Low. Assessing the emotional tone of status updates on Facebook is pretty mundane. However, it did lead this researcher down the path of trying to see if it was possible for Facebook to actively manipulate the emotions of individual users based on which of their friends' posts they expose them to.

Kramer warned users not to try to manipulate friends themselves: "These results suggest that posts to Facebook have the ability to affect our friends’ subsequent posts... [H]owever, we would not advise Facebook users to go around expressing disingenuous positivity or to suppress the expression of negative emotions in order to keep one’s friends happy." Facebook though did go on to do something similar to see what would happen.

***

Study 10: Feed Me: Motivating Newcomer Contribution in Social Network Sites

What Facebook wanted to find out: How do we get newcomers to Facebook to stick around?

When it happened: 15 weeks starting in March 2008

How many users: 140,292 newcomers

How they did it: They looked at the activity of a bunch of people who joined Facebook and stuck around for at least 3 months to see what they looked at on the site, who interacted with them and how, and what they contributed. Seven of them were also invited in for face-to-face interviews about using Facebook; the rest simply had their clicks studied.

What Facebook found out: Facebook was surprised to discover that newcomers didn't find it especially compelling to be tagged in photos, but they did really like getting comments on photos they uploaded. They basically looked to their friends' activity to figure out what they were supposed to do on the site. It helps to show newcomers their friends adding, tagging and commenting on photos to addict them to the site. "It is vital for developers of social networking sites to encourage users to contribute content, as each individual’s experience is dependent on the contributions of that person’s particular set of connections," they write.

WTF score: Low. This is exactly the kind of research you'd expect from Facebook: how do we addict people to this site? It's a reminder to users though that on every site they are on, their every click may be getting measured and weighed, whether they're a site virgin or not.

***

There are lots of other studies that have come out of Facebook: Is fandom contagious? (Spoiler: Yes.) How many status updates are political? (Spoiler: Less than 1%) Can you predict if someone is lonely based on their Facebook activity? (Spoiler: Yes.) How close do you live to your Facebook friends and can your location be predicted based on their locations? (Kind of.)

As people have said before, a terrible outcome of this controversy would be that Facebook starts doing experiments secretly. I hope instead it leads to more transparency, corporate standards for running experiments, and the ability for users to opt in or out of being part of experiments, much like beta testers for new apps and hardware. Facebook could take a page from the Google Glass book and call those who are willing to be part of mass psychological experiments Facebook Explorers.