British lawmakers released a 250-page trove of internal Facebook documents on Wednesday, shedding new light on how the secretive company operates.
The cache of documents, which includes emails, slideshows, and other materials, is part of a lawsuit against Facebook by the app development company Six4Three. A California judge had sealed the documents, but when Six4Three CEO Ted Kramer traveled to the UK last month, British Parliament compelled him to hand them over.
After reviewing the materials, the Digital, Culture, Media and Sport Committee made them public today, and it turns out they contain a few new revelations—particularly about how the company guarded, or failed to guard, user data, as well as how it used that valuable data to get a leg up on competitors.
British Member of Parliament Damian Collins has been leading the charge on getting Facebook to answer for its actions before lawmakers, summoning Facebook CEO Mark Zuckerberg several times to testify before securing the testimony of Facebook executive Richard Allan last week. In a Twitter thread linking to the documents Wednesday, Collins said they “raise important questions about how Facebook treats users’ data, their policies for working with app developers, and how they exercise their dominant position in the social media market.”
“We don’t feel we have had straight answers from Facebook on these important issues, which is why we are releasing the documents,” Collins added.
Facebook fought back against allegations the committee subsequently made in its summary of the company documents, saying that “the documents Six4Three gathered for their baseless case are only part of the story and are presented in a way that is very misleading without additional context.”
“Like any business, we had many internal conversations about the various ways we could build a sustainable business model for our platform,” a Facebook spokeswoman told Mother Jones in an email. “But the facts are clear: we’ve never sold people’s data.”
Here are Mother Jones’s key takeaways from the just-released documents:
Facebook tried to hurt competitors by eliminating access to its data.
Zuckerberg and company at Facebook saw the now-defunct video-sharing app Vine as enough of a potential threat back in early 2013 that they decided to restrict the Twitter-owned app’s access to Facebook data.
“Twitter launched Vine today which lets you shoot multiple short video segments to make one single, 6-second video,” Facebook vice president Justin Osofsky wrote to Zuckerberg and others the day Vine launched, according to the emails released by Parliament. “Unless anyone raises objections, we will shut down their friends API access today. We’ve prepared reactive PR.”
“Yup, go for it,” Zuckerberg responded.
UK Parliament charged that the move was “evidence of Facebook taking aggressive positions against apps, with the consequence that denying them access to data led to the failure of that business.”
Likely in response to the disclosure on Wednesday, Vine’s cofounder tweeted, “Competition sucks, right? No. It allows for products to improve, become available to more people, at lower costs. Strive to build new things that people want and influence other creators for the cycle to continue.”
A Facebook spokeswoman said “these kinds of restrictions are common across the tech industry,” but noted that Facebook had recently decided to end its policy of cutting off competitor access to data.
Zuckerberg likes to keep a personal eye on Facebook’s competitors.
Zuckerberg may have weighed in on Facebook’s actions against Vine because the app fell under the company’s policy of maintaining a “small list of strategic competitors that Mark personally reviewed,” a practice also revealed in Wednesday’s disclosure.
“Apps produced by the companies on this list are subject to a number of restrictions outlined below,” the internal policy from the documents reads, referring to a uniform list of rules that followed the stipulation.
“Any usage beyond that specified is not permitted without Mark level sign-off.”
Facebook gave a special group of companies exemptions from its data restrictions.
Until 2015, the company had allowed developers to access significant amounts of user data, including data belonging to users’ friends, even without those friends’ permission. This was how Cambridge Analytica was able to obtain data on 87 million Facebook users. But the company later restricted this broad flow of data.
According to the documents, though, not everyone had to play by the same rules; Facebook kept a “white list” of several companies that were exempt from the new restrictions. The list included Airbnb, Netflix, and Lyft, allowing them to still access data of a user’s friends.
Some Facebook employees anticipated public backlash on data collection.
Facebook leadership has seemingly been caught flat-footed by the public backlash to its data privacy practices, but the rest of the company was not completely oblivious to the impending blowback.
Back in February 2015, Facebook employee Mark Lebeau expressed concern about one such practice that would’ve let Facebook gain access to users’ call logs on their Android phones, outside of the Facebook app.
“This is a pretty high-risk thing to do from a PR perspective but it appears that the growth team will charge ahead and do it,” Lebeau cautioned in an email chain with several other employees.
“We think the risk of PR fallout here is high,” Lebeau wrote. “Screenshot of the scary Android permissions screen becomes a meme (as it has in the past), propagates around the web, it gets press attention, and enterprising journalists dig into what exactly the new update is requesting, then write stories about ‘Facebook uses new Android update to pry into your private life in ever more terrifying ways – reading your call logs, tracking you in businesses with beacons, etc.’”
In turn, according to the documents, Facebook appeared to pursue a different approach that did not require as explicit a permission notification.
Parliament summed up the interactions as evidence of a deliberate attempt by Facebook to obfuscate the impact of its policies. “To mitigate any bad PR, Facebook planned to make it as hard as possible for users to know that this was one of the underlying features of the upgrade of their app,” the summary said.