It’s possible that you were just involved in a massive psychological experiment — but don’t worry, nobody else knew about it either.
According to a study recently published in the Proceedings of the National Academy of Sciences (PNAS), data scientists at Facebook tweaked the News Feed algorithm for 689,003 users, manipulating the types of posts they saw each day. For one week in 2012, the algorithm filtered out a disproportionate share of either positive or negative posts from those users’ feeds.
The point was to see whether emotions could be transferred virtually, just as they are in face-to-face interactions. And sure enough, the scientists found that people who saw fewer positive posts in their News Feeds created fewer positive posts themselves. Conversely, the fewer negative posts people saw, the more positive they became online.
“Emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness,” wrote study authors Adam Kramer, Jamie Guillory, and Jeffrey Hancock. “We provide experimental evidence that emotional contagion occurs without direct interaction between people (exposure to a friend expressing an emotion is sufficient), and in the complete absence of nonverbal cues.”
But the creepy part of the whole thing? None of these Facebook users knew their News Feeds were being manipulated. It also means an algorithm was perfectly capable of messing with people’s emotions for an entire week.
However, the study is covered under the Facebook Data Use Policy, which users must agree to before signing up for the service. The policy gives the company the right to access and use information people post to the site, and according to A.V. Club, it also lists a number of potential uses for this data, “including troubleshooting, data analysis, testing, research and service improvement.”
So if you felt depressed for one week back in 2012 without knowing why, that mystery may have just been solved.