Facebook was always too busy selling ads to care about your personal data

(in)Secure is a weekly column that dives into the rapidly escalating topic of cybersecurity.

Last year, Facebook collected over nine billion dollars in ad revenue in a single quarter. That’s a lot of ads. As a trade-off for using a free service, people on Facebook put up with the proliferation of these ads in their news feeds. But what if the trade-off involved more than that? What if it involved your personal data being sold off without your consent?

Facebook’s latest scandal involves a data analysis firm called Cambridge Analytica, which was supplied with the personal data of 50 million Facebook profiles without those people’s consent, data that was then used in the election of a certain presidential candidate. On its own, the scandal is more than a little troubling, and it provides a startling look into how little the world’s biggest social media platform cares about personal data.

Let’s be clear. This doesn’t involve an actual data breach. It’s merely a policy no one at Facebook cared to enforce.

Under the guise of academic research

Using personal data for the sake of academic research has been a weak point in Facebook’s privacy policy for years now — and it’s the first vulnerability the collaborators involved with the Cambridge Analytica scandal exploited.

Despite the name, Cambridge Analytica has no official connection to academia. It’s a research organization founded with the specific purpose of influencing the electoral process, and it was run by former Trump aide Steve Bannon and hedge fund billionaire Robert Mercer.

Cambridge Analytica Facebook breach. Photo: Bryan Bedder/Getty Images

The facade of academic research was used as an entry point by an important figure in the operation: Aleksandr Kogan, a researcher who worked at Cambridge University and, briefly, at St. Petersburg State University. According to a report by the New York Times, when doing work for Cambridge Analytica, Kogan told Facebook that he was collecting data for academic purposes rather than political ones.

The description for the app said, word for word, “This app is part of a research program in the Department of Psychology at the University of Cambridge.” Apparently, Facebook did nothing to verify that claim. To make things worse, Kogan stated that he later changed the reason for his use of the data, and Facebook never bothered to inquire about it further.

Facebook has been giving the data of its users to academic researchers for years now — and not in secret. It freely provided personal data from its users to Harvard University for an academic study back in 2007. Other examples since then include a partnership with Cornell University on influencing the mood of Facebook users, and a 2017 study of how AI could guess a person’s sexual orientation from only a photograph.

These studies were all met with public outrage, but Facebook emphasized that they weren’t the result of data breaches or significant holes in the company’s research protocols. It saw them as only “minor oversights.”

There’s little reason to believe a platform that views massive misuse of data without consent as “minor oversights” cares about your privacy. And that’s not where it ends.

Under the guise of a personality quiz

The other area where Facebook’s data policies are weak lies in something we all know too well: personality quizzes. They’re prominent on Facebook, and Kogan used this weak point to collect the data that Cambridge Analytica purchased from him.

Through Global Science Research (GSR), a separate company he created, Kogan developed a Facebook app called thisisyourdigitallife, and GSR paid a group of 270,000 people to download the app and take its personality quiz. That might not sound like much, but the app was then allowed to collect data from each of those people’s friends as well. The result was data on 50 million profiles, now in the hands of Cambridge Analytica. That’s a lot of data.
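To see how 270,000 quiz takers balloon into 50 million profiles, it helps to run the numbers. The sketch below is a rough back-of-the-envelope estimate; the average friend count is an assumption chosen for illustration, not a figure from any report.

```python
# Back-of-the-envelope look at the friend-permission multiplier described above.
installs = 270_000   # quiz takers (figure reported in the article)
avg_friends = 185    # ASSUMED average friend count; illustrative, not reported

# Each installer exposed their own profile plus their friends' profiles.
# Overlap between friend lists is ignored, so treat this as a rough ceiling.
profiles_reached = installs * (1 + avg_friends)

print(f"{profiles_reached:,} profiles")  # 50,220,000 -> in line with the ~50M reported
```

Even allowing for heavy overlap between friend lists, an average of a couple hundred friends per installer is enough to put tens of millions of profiles within reach of a single quiz app.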

Christopher Wylie, one of the founders of Cambridge Analytica, blew the whistle on how the data firm harvested data from millions of Facebook users. Photo: Jake Naughton for The Washington Post via Getty Images

Never did Facebook inform its users that their data was being used without their consent. That alone may put the company in breach of British data protection law.

According to The Guardian, Facebook learned in 2015 that this trick had been used to mine massive amounts of data, data that was then used by the Ted Cruz presidential campaign. Facebook’s response was to send Cambridge Analytica an official letter, obtained by the Times, stating the following: “Because this data was obtained and used without permission, and because GSR was not authorized to share or sell it to you, it cannot be used legitimately in the future and must be deleted immediately.”

Over two years passed before Facebook even followed up on its request. “If this data still exists, it would be a grave violation of Facebook’s policies and an unacceptable violation of trust and the commitments these groups made,” a blog post from Facebook stated. Eventually, the company did get around to it, but the episode shows that Facebook’s problem isn’t a lack of policies. It’s that they aren’t enforced.

Cambridge Analytica wasn’t the only organization bending Facebook’s privacy policies. A former Facebook employee told The Guardian: “My concerns were that all of the data that left Facebook servers to developers could not be monitored by Facebook, so we had no idea what developers were doing with the data.”

That’s from Sandy Parakilas, who was the platform operations manager in 2011 and 2012. “Once the data left Facebook servers there was not any control, and there was no insight into what was going on.”

Who could be bothered to care?

As reported by the Times, Columbia University research director Jonathan Albright summarized the problem well: “Unethical people will always do bad things when we make it easy for them and there are few — if any — lasting repercussions.”

https://www.facebook.com/zuck/posts/10104712037900071

Facebook will make sure it takes care of this specific problem, sure. After staying silent for days after the story broke, Facebook CEO Mark Zuckerberg finally made an official statement, in which he accepted at least some responsibility for what happened: “We have a responsibility to protect your data, and if we can’t then we don’t deserve to serve you.”

He also vowed to take other steps, such as auditing suspicious apps and limiting the amount of data developers can access from applications. These policies will all help prevent a very similar scenario from unfolding again, but cybersecurity is about more than reacting to the last incident. It requires a proactive approach to plugging holes in the system before they’re exploited.

For a company that lives and dies on the trust people place in it when giving away personal information, you’d think it would take these issues a little more seriously across the breadth of its platform. If it doesn’t make massive changes to the way things are done at all levels of privacy and security, #deleteFacebook could grow into far more than just a hashtag.
