At a Capitol Hill hearing Tuesday — no, not the one with the impeachment and such — Sen. Dick Durbin (D-Illinois) asked Jay Sullivan, Facebook’s product management director for privacy and integrity in Messenger, whether Facebook collected any data from its Messenger Kids app. It was the exact same question, Durbin said, that he had posed to Mark Zuckerberg last year, when he received an answer he deemed unsatisfactory.
“I have significant concerns that the data gathered by this app might be used or sold,” Durbin told Sullivan. “[Zuckerberg] responded, ‘in general, that data is not going to be shared with third parties.’ I said his use of that term was ‘provocative and worrisome.’” Durbin then asked Sullivan the same question. “Is your answer that there is no information collected via Messenger Kids that is shared by Facebook to any third parties?”
“Yes,” Sullivan replied. “We don’t sell or share data with third parties.”
Facebook’s entire business is built on monetizing users’ data for advertisers. In the past three years, Facebook has been caught up in a whirlwind of scandals, most of them involving third parties improperly accessing, using, selling, or stealing the personal data of its billions of users. Since then, Facebook has been beating the privacy drum; most recently, it openly flouted requests from three major world governments — the U.S., the U.K., and Australia.
In October, the U.S. Department of Justice, the U.S. Department of Homeland Security, the Australian Minister for Home Affairs, and the U.K.’s Home Department wrote an open letter to Zuckerberg, requesting that Facebook “not proceed with its plan to implement end-to-end encryption across its messaging services … without including a means for lawful access to the content of communications to protect our citizens.” The governments were essentially asking Facebook for a special backdoor into its messaging apps that would allow authorities to read your private messages during the course of investigations.
Facebook’s response, sent on December 9 and first reported by the Washington Post, was, in essence, “no.”
“Cybersecurity experts have repeatedly proven that when you weaken any part of an encrypted system, you weaken it for everyone, everywhere,” the letter, obtained by Digital Trends, reads. “The ‘backdoor’ access you are demanding for law enforcement would be a gift to criminals, hackers and repressive regimes … It is simply impossible to create such a backdoor for one purpose and not expect others to try and open it.”
Facebook’s letter cited experts from groups like Amnesty International and the Center for Democracy and Technology as backing Facebook’s stance: Encryption for everyone, or no one.
In a statement to Digital Trends, a Department of Justice spokesperson said, “Facebook’s refusal to acknowledge the serious public safety impact of its business decisions, including on vulnerable children, has triggered a bipartisan backlash. The tech industry needs to start thinking seriously about solving this problem of its own creation.”
Hannah Quay-de la Vallee, senior technologist at the Center for Democracy and Technology, told Digital Trends that this encryption means that users won’t “have to trust Facebook anymore.”
“Anytime you say ‘Facebook’ and ‘trust’ in the same sentence, that feels hard,” she said. “They’ve done things that users were surprised by, which is not a great look. But the idea of end-to-end [encryption] is, you don’t need that trust. The tech guarantees that Facebook won’t be able to see certain kinds of data.”
Quay-de la Vallee also clarified that the type of data that Facebook lets advertisers use to target you is different from the type of data they’re now talking about encrypting. “Facebook has long said that they don’t monetize the content of private messages,” she said. “That’s the data they’re talking about encrypting anyway. All of the data that they monetize, your demographic data, and who you connect with, that is still available to law enforcement.”
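The guarantee Quay-de la Vallee describes — that with end-to-end encryption you no longer need to trust the company relaying your messages — can be illustrated with a toy key exchange. The sketch below is a deliberately insecure, minimal Python illustration (toy Diffie-Hellman plus an XOR cipher; real messengers use vetted protocols like the Signal protocol, and all the parameter values here are assumptions chosen for readability). The point it shows: a server that sees only public values and ciphertext has no way to recover the message.

```python
# Toy illustration of the end-to-end principle: the relay server never
# sees a secret, so it cannot decrypt. NOT real cryptography.
import hashlib

# Toy-sized group parameters (real Diffie-Hellman uses far larger
# primes, e.g. the RFC 3526 groups; these values are illustrative).
P = 0xFFFFFFFB  # a small prime modulus (insecure on purpose)
G = 5           # generator

def public_value(secret: int) -> int:
    """Derive a shareable public value: G^secret mod P."""
    return pow(G, secret, P)

def shared_key(their_public: int, my_secret: int) -> bytes:
    """Both sides compute the same value: (G^a)^b == (G^b)^a mod P."""
    shared = pow(their_public, my_secret, P)
    return hashlib.sha256(str(shared).encode()).digest()

def xor_cipher(key: bytes, message: bytes) -> bytes:
    """Toy symmetric cipher: XOR with a repeating key (not real crypto)."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(message))

# Alice and Bob each keep a secret; only public values cross the server.
alice_secret, bob_secret = 1234, 5678
alice_pub, bob_pub = public_value(alice_secret), public_value(bob_secret)

# Each side derives the same key without ever transmitting it.
key_a = shared_key(bob_pub, alice_secret)
key_b = shared_key(alice_pub, bob_secret)
assert key_a == key_b

ciphertext = xor_cipher(key_a, b"meet at noon")
# The server relays alice_pub, bob_pub, and ciphertext -- but without
# either secret it cannot derive the key, so the content stays private.
plaintext = xor_cipher(key_b, ciphertext)
```

A "backdoor" of the kind the governments requested would amount to giving the server (or anyone who compromises it) a copy of a secret, which is why Facebook's letter argues it cannot be limited to one purpose.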
“Encryption is a fundamental part of the right to privacy,” said Tanya O’Carroll, the head of Amnesty International’s tech arm. “You cannot exercise the right to privacy if you don’t have strong end-to-end.”
But O’Carroll agreed that encryption was only one small part of the whole security picture. “One of the really big problems of this business model is that it incentivizes turning everything into data,” O’Carroll said. “Regardless of what [Facebook does] now with end-to-end, we’d like to see governments address wider business regulations that just restrains the amount of data that’s collected in the first place.”
Facebook can do the right thing here, but its fundamental business model of essentially selling your data to advertisers (and others) shows that privacy still isn’t really a focus at the company.
“I don’t think we should think of this as a seal of approval on Facebook’s privacy record,” O’Carroll continued. “It’s a strong step in the right direction, but we’re actually strongly calling for an overhaul of the business model itself.”
Update 12/11: Added Department of Justice statement.