Late Tuesday, Facebook suspended accounts, apps and pages of New York University researchers Laura Edelson and Damon McCoy and their colleagues over their refusal to stop using a browser extension to research political advertising and disinformation on Facebook.
In a statement on a corporate blog, Facebook Product Management Director Mike Clark said:
For months, we’ve attempted to work with New York University to provide three of their researchers the precise access they’ve asked for in a privacy protected way. Today, we disabled the accounts, apps, Pages and platform access associated with NYU’s Ad Observatory Project and its operators after our repeated attempts to bring their research into compliance with our Terms. NYU’s Ad Observatory project studied political ads using unauthorized means to access and collect data from Facebook, in violation of our Terms of Service. We took these actions to stop unauthorized scraping and protect people’s privacy in line with our privacy program under the FTC Order.
While Clark accused the NYU team of using its browser extension to collect data “about Facebook users who did not install it or consent to the collection,” Protocol reporter Issie Lapowsky pointed out that this framing obfuscates the facts of the case:
It’s an accusation that evokes the worst of the Cambridge Analytica scraping scandal, but one that leaves out key details that Protocol revealed earlier this year in a story about Facebook’s dispute with the NYU researchers and the fraught relationship between platforms and researchers generally. The users who had data collected without their consent aren’t private users: they’re advertisers, whose ads are by definition already public, and whose information Facebook itself stores in an ad archive.
Indeed, Facebook is quick to make the connection to Cambridge Analytica, and to use that scandal as a cudgel against the NYU researchers.
Updated Facebook statement on NYU Ad Observatory. “Research is not an excuse to break privacy rules and scrape user data — no matter the intent. As most will remember, we paid a $5 billion fine to the FTC for a developer scraping data under the auspices of ‘research.’” (1/2) — Andy Stone (@andymstone) August 4, 2021
There were three academics connected to the Cambridge Analytica scandal of the 2016 election. One settled with the FTC for deceptive practices for misleading people who took their personality quiz (Aleksandr Kogan). His junior partner joined Facebook after co-developing this rogue quiz, but was dismissed from the company without explanation when Congress asked questions (Joseph Chancellor). The other academic connected to the scandal took Cambridge Analytica to court in the UK using European Data Protection laws because US voter data was processed there (me).
I know the Cambridge Analytica scandal better than almost anyone. That’s why I’m calling out the company for the uniquely shameful act of invoking it to cast aspersions on the NYU team. Facebook’s response illustrates why it is a bully of a company, gleefully misrepresenting the Cambridge Analytica scandal while invoking it as a pretext to shut down this particular academic research project at NYU.
The Cambridge Analytica scandal centered on a deceptive Facebook application that surreptitiously pilfered data from millions of people through Facebook’s Friends API. It was the commercialization of academic work that Menlo Park had grandfathered in, authorizing extended use of the API to continue collecting behavioral data after access had been restricted for everyone else.
The NYU Ad Observer browser extension does not collect personal data even though it processes data from Facebook’s Ad Library and CrowdTangle analytics APIs. Granted no special status by Facebook, it has served as an independent non-commercial observation project that gathers insights into the experiences of volunteers who securely donate their data to the researchers.
The NYU tool is neither surreptitious nor deceptive because it does not seek to gather data from users abusively, unlike Kogan and Chancellor’s app, which Facebook originally blessed as a legitimate research tool. On the contrary, the Ad Observer browser extension is a paragon of user integrity and privacy-protecting research techniques, and it fills a significant gap left by the limited ad library tools Facebook supplies for civil society’s use. Engineers at Mozilla who reviewed the open source code concur.
Facebook claims they shut down the researchers accounts due to privacy problems with the Ad Observer. In our view, those claims simply do not hold water. We know this because we reviewed the code ourselves before recommending it. Read more here: https://t.co/75WKo0v5xK — Mozilla (@mozilla) August 4, 2021
Facebook might be pleased to see concerns related to Cambridge Analytica attached to its handling of the NYU Ad Observer case, a fig leaf for its original sin. But the Kogan/Chancellor personality quiz and the NYU Ad Observer browser extension do not resemble each other in any substantial way beyond the vague fact that Kogan and Chancellor were once academic researchers. They sold US Facebook data to a firm funded by a GOP donor, an outgrowth of the British military-industrial complex that later became a subject of the Mueller investigation. They did so deceptively, in contravention of UK law, never mind Facebook’s own policies.
By contrast, NYU’s researchers have contributed open source tools for voluntary, opt-in research of vital importance, under no false pretext. The only “privacy” this ban ostensibly protects is that of Facebook’s customers: advertisers. It is easy to see why Facebook fears having the parameters of its ad targeting exposed to public scrutiny; the only privacy Facebook really cares about is the secrecy of its own power. The FTC Consent Decree does not require Facebook to restrict activities like the NYU Ad Observer. The Ad Observer threatens Facebook because it challenges Facebook’s monopoly on the truth about itself. Despite the company’s spin, this ban is not a carefully considered internal safeguard thwarting another Cambridge Analytica.
Facebook’s legal argument is bogus. The 2019 FTC order has two relevant sections. Section II restricts how *Facebook* shares user information. It doesn’t preclude *users* from volunteering information about their experiences on the platform, including through a browser extension.— Jonathan Mayer (@jonathanmayer) August 4, 2021
Indeed, this is a case of selective enforcement. As former FTC chief technologist and White House senior advisor Ashkan Soltani notes, “under a theory that the NYU browser extension has ‘the potential to access Covered Information’ one might ask why Facebook has not similarly enforced these policies” against other extensions. One could point to the Republican political data broker Bridgetree, a commercial entity that boasts of scraping Facebook user data and selling access to it to clients. Taken at face value in a technical sense, Facebook’s public relations statements would probably entail banning password managers as well. Regardless of what critics say, Facebook will likely get away with this rank hypocrisy and absurd gaslighting, because advertisers have become dependent on the company for access to audiences and customers.
Except, NYU Ad Observatory doesn't scrape 'user data' but instead, permits users to voluntarily donate information about only the ads they received from advertisers— ashkan soltani (@ashk4n) August 5, 2021
Most of this information @Facebook itself makes available via its "Ad Library Tool" (although unreliably) /1 https://t.co/OoiZ5bsPer pic.twitter.com/38X9lgDTc4
What’s needed to rein in Facebook is legislative action and regulation. On Wednesday, Senators Mark Warner (D-VA), Ron Wyden (D-OR) and Amy Klobuchar (D-MN) all slammed Facebook for its actions. Klobuchar said “it is vital that social media companies both protect user data and improve transparency,” and that she is “deeply troubled by the news that Facebook is cutting off researcher access to political advertising data, which has shown that the company continues to sell millions of dollars’ worth of political ads without proper disclosures.” Wyden said “after years of abusing users’ privacy, it’s rich for Facebook to use it as an excuse to crack down on researchers exposing its problems,” and said he asked the FTC to “confirm that this excuse is as bogus as it sounds.” Warner praised the Ad Observer tool, which he said “repeatedly facilitated revelations of ads violating Facebook’s Terms of Service, ads for frauds and predatory financial schemes, and political ads that were improperly omitted from Facebook’s lackluster Ad Library.”
These statements are a good start. It is time for lawmakers to demand more from these companies: to put guardrails in place and secure the data rights necessary for accountable, ethical, independent, non-commercial scholars to study the role our data and these dominant platforms play in our democracy. Perhaps the curious timing of Facebook’s action illustrates why it won’t hesitate to obstruct research that gets too close to the truth about itself:
Although Facebook made its demand to Edelson and McCoy in October of last year, it did not move to shut down the researchers’ Facebook accounts until yesterday, hours after Edelson had informed the platform that she and McCoy were studying the spread of disinformation about January 6 on the social media platform.
The stakes are high, and legislation is the necessary next step. Let’s ask whether Facebook is compliant with European and new state data privacy laws that grant people rights to access their data. Properly enforced, those laws would require that our microtargeting data be included in the data download features essentially mandated by what’s already on the books. Instead of needing to install privacy-protecting browser extensions, people could freely donate their downloaded data to trusted researchers bound by legal data rights and institutional responsibilities. Either way, don’t let Facebook use Cambridge Analytica to deflect accountability.
David Carroll is an associate professor of media design at Parsons School of Design at The New School. He is known as an advocate for data rights by legally challenging Cambridge Analytica in the UK in connection with the US presidential election of 2016, resulting in the only criminal conviction of the company by the Information Commissioner’s Office. This work is featured in the BAFTA and Emmy nominated Netflix original documentary The Great Hack (2019) and his writings on the effort have been published in WIRED, PAPER, Quartz, The Guardian, Motherboard, and The Boston Review. He was awarded prizes from The Philosophical Society and the Law Society at Trinity College Dublin in 2019. He received a BA from Bowdoin College and an MFA from Parsons.