
5 Directions for the FTC on Data & Surveillance: Consumer Protection Rules

Ellen P. Goodman / Jul 27, 2021

The Federal Trade Commission (FTC) has been tasked in President Joe Biden's Executive Order on Competition with tackling unfair and deceptive data and surveillance practices. More enforcement actions are needed, but so are rulemaking and other ex ante guidance. For at least two decades, Congress has been “about to” pass comprehensive privacy legislation. As Jessica Rich, former Director of the Bureau of Consumer Protection, put it, the FTC made privacy law recommendations “just a few years after the internet became an everyday medium, four years before Facebook was created, and seven years before the iPhone would be introduced.”

Why now? There are several pretty good comprehensive privacy bills out there, including ones introduced by Senator Sherrod Brown (D-OH) and Senator Maria Cantwell (D-WA) that would increase FTC authority (or create a new agency). Maybe Congress will act soon; probably it won’t. It’s past time for the FTC to start making rules under Section 5 (Magnuson-Moss) or under other statutory authority (e.g., COPPA), issuing guidance on unfair data practices, conducting 6(b) investigations, and using whatever enforcement authority it still has, like its penalty offense authority for companies on notice that what they’re doing is illegal. This is all even more urgent:

After the Supreme Court’s 2021 TransUnion decision, which makes it harder for individuals to vindicate their data and privacy rights.

After the 11th Circuit’s 2018 LabMD decision, which makes it harder for the FTC to hold companies responsible for data breaches without clearer standards.

After the Supreme Court’s 2021 AMG decision, which kneecaps the FTC’s ability to order financial restitution under Section 13(b).

There is now bipartisan support at the FTC for making rules about data governance. Republican Commissioner Christine Wilson in April told Congress that the FTC should act. Two months before, she had signaled as much at the University of Colorado Law School’s Silicon Flatirons event, to which Colorado Attorney General Phil Weiser responded, “I wish I could give you a virtual hug.”

While rulemaking will take a long time, the process could drive support for comprehensive data legislation, provide thought leadership and convening for stakeholders, help concentrate civil society efforts, develop the record supporting theories of harm and modes of redress, put wrongdoers on notice of legal risk, and increase enforcement in the states and at the FTC.

So what to do? The FTC doesn’t have the authority or the resources to regulate in all the ways that Congress could legislate to limit data collection and use, require transparency and accountability, and give consumers more control and relief — all the things peer countries have been doing for years and along the lines of what President Obama called for almost a decade ago with his administration’s Consumer Privacy Bill of Rights. What the agency can and should do is begin to establish guidance for “tip of the spear” practices that are most harmful and will, if regulated, have the most impact on business practices, consumer welfare, and liberty.

Below are five categories of high impact data use and abuse that the FTC should regulate or, at the very least, guide.

1. Focus on most vulnerable consumers and communities

The FTC has already started to focus on the intersection of civil rights/human rights and data. In AppFolio, it enforced FCRA against biased tenant screening, which can unfairly deny people housing they have every right to obtain. The ACLU and Upturn have petitioned the FTC to turn this concern into guidance, for example, making it clear that using “unreliable or overly loose matching criteria, such as name only searches, to match applicants with public records” and “reporting sealed or expunged records” are violations of FCRA.

Health information is also shared in ways that can subordinate the vulnerable. In Flo Health, the FTC enforced against a company abusively sharing women’s ovulation data. This is one of many examples of health data unprotected by HIPAA being shared in unfair and harmful ways that not only compromise individual privacy but can also feed predictive algorithms responsible for systemic harms.

Children are a vulnerable group being exploited for data in captive environments where they have little choice but to relinquish biometric and other sensitive data. Civil rights groups have called on the FTC to issue a rulemaking under its COPPA authority on discriminatory practices in ad tech and ed tech that affect children.

Hiring algorithms that make it especially hard for candidates who don’t resemble past hires, including people of color and other diverse constituencies, to get jobs are the target of an ACLU and Upturn letter that urges a rulemaking on biased hiring algorithms. EPIC has sought the same. What is so useful about such a project is that FTC guidance on what counts as biased and unfair automated decision systems in hiring can either be applied directly to, or be instructive for, other algorithmic operations. For example, if a hiring algorithm is unfair because it relies on facial recognition, such a finding might apply more broadly to all automated decision systems that use facial recognition.

The recent revelations about the spyware Pegasus being used for comprehensive surveillance of journalists and dissidents highlight the special vulnerability of the brave. The FTC has jurisdiction over only commercial entities. But even if the entity doing the spying is a government, businesses that develop the spyware and sell it for undisclosed and abusive purposes may be engaging in unfair practices to the extent they threaten vulnerable populations who are consumers of the infected devices. The consumer harm is there, nested in an attack on the trust and freedom necessary for self-governance.

This touches on a more general point that I’ll address in a later post. What President Biden’s Executive Order did so well was to connect democracy, economic liberty, equity, and the prevention of abusive data practices. In its guidance, rulemaking, and enforcement actions, the FTC should look for the nexus of these interests and go after practices that inflict consumer harm while eroding these intersecting fundamental interests.

2. Define the meaning and function of fair algorithmic audits

A principal intervention to prevent biased hiring and other algorithmic harms is to require algorithmic audits. Draft legislation calls for this. Civil society groups have called for it. But what should auditing look like? To what extent should it be standardized? What should be the role of independent auditors? How should audits differ across sectors, and how public should audit results be? Mona Sloane of NYU has written very perceptively about these questions. They should all be addressed in a rulemaking or guidance, following the development of a record.
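To give a concrete flavor of what even a narrow audit check involves, here is a minimal sketch in Python of one screen an auditor might run: the “four-fifths rule” for disparate impact drawn from EEOC guidance. The group labels, data, and function names are hypothetical illustrations, not an FTC standard, and a real audit would sweep far wider than a single selection-rate ratio.

```python
# A minimal sketch of one check an algorithmic audit might run: the
# "four-fifths rule" disparate impact screen drawn from EEOC guidance.
# All names and data below are hypothetical; this is an illustration,
# not an FTC-endorsed audit standard.

from collections import defaultdict

def selection_rates(decisions):
    """decisions: iterable of (group, was_selected) pairs.
    Returns the selection rate for each group."""
    totals = defaultdict(int)
    selected = defaultdict(int)
    for group, was_selected in decisions:
        totals[group] += 1
        selected[group] += int(was_selected)
    return {g: selected[g] / totals[g] for g in totals}

def four_fifths_flags(decisions, threshold=0.8):
    """Flags any group whose selection rate falls below `threshold`
    (80% by default) of the most-selected group's rate, a common
    first-pass screen rather than a complete audit."""
    rates = selection_rates(decisions)
    top = max(rates.values())
    return {g: rate / top < threshold for g, rate in rates.items()}

# Hypothetical hiring outcomes: group B is selected at half group A's rate.
outcomes = [
    ("A", True), ("A", True), ("A", False), ("A", False),   # rate 0.50
    ("B", True), ("B", False), ("B", False), ("B", False),  # rate 0.25
]
print(four_fifths_flags(outcomes))  # {'A': False, 'B': True}
```

Even this toy screen surfaces the questions above: who picks the threshold, on what data the check runs, how often it is repeated as the system updates, and whether the results must be disclosed.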

Relatedly, what should an algorithmic impact assessment look like? The proposed Algorithmic Accountability Act tasks the FTC with overseeing these. Commentators have put great faith in the use of this tool to avoid algorithmic harms, much as NEPA’s environmental impact statement was designed to stop bad projects. What should an assessment accomplish? How thorough must it be? At what point in the lifespan of automated decision systems, which are dynamic and constantly updating, should it be conducted? These are all questions for rulemaking and guidance.

3. Get to internet of things data abuses before they are intractable

The FTC settled a case with Tapplock, the manufacturer of smart locks, for deceiving consumers about how secure the locks were. It went on to advise internet of things businesses that they should implement “security by design.” What does that require in practice? More guidance is needed to ensure that people’s smart home devices are not hacked and their data not unfairly or deceptively shared. Because smart home data is in the home, data abuse abets domestic abuse. Home surveillance abets police surveillance. And one arm of a company can exploit the data gathered by another, resulting in economic and dignitary harms to consumers and harm to competition. Competition and consumer protection interests will intertwine when consumers are locked into a smart home device platform or ecosystem because Nest and Echo systems, for example, don’t interoperate. The FTC should address what security and interoperability obligations IoT companies should have, especially in the home, and how providers need to empower vulnerable consumers.

4. Re-theorize the harms for ad tech abuses and dark patterns

The FTC held a workshop on dark patterns that push consumers to give up data they don’t want to share. Apple has now provided a natural experiment to see how much consumers actually object to having their data collected and shared by advertisers — apparently 96% of iPhone customers are taking advantage of Apple’s do-not-track option. So much for the privacy paradox!

Substantial harms, financial and non-financial, from current practices in ad tech and platform design abound. Just look at COVID misinformation, supported by advertising and pushed on consumers profiled as good marks, and its attendant physical and economic harms. At the same time, the leading ad tech companies are engaging in anticompetitive behavior, which the House Judiciary Committee detailed and which the FTC will learn more about from its 6(b) investigation of social media companies. It makes sense for the FTC, with its limited resources, to prioritize data practices that are both anticompetitive and unfair/deceptive. Ad tech data practices fit the bill.

Relatedly, the FTC should provide guidance or adopt rules on unfair/deceptive social media and website practices as they relate to Section 230 immunity. As Chris Hoofnagle has shown (discussing Abika.com, which helped people buy confidential records from third-party private investigators), the FTC has already used its Section 5 authority to carve out a small space from the Section 230 privilege for companies that design their websites to inflict consumer harm. Proposals to reform Section 230 that would empower courts to assess liability in more cases, such as those Danielle Citron and I have made, would work much better if the FTC provided guidance on where it thinks internet service providers are responsible for harming consumers.

5. Address the connections: B2G and B2B

In 2012, Julia Angwin came out with a story about how Google bypassed Apple protections against third-party cookies to force them on consumers, resulting in an FTC enforcement action for deception. Then in 2013, just after the Snowden revelations, Ashkan Soltani broke the news that the NSA used Google cookies to find hacking targets. Of course, there was nothing the FTC could do about that.

It is now of course unremarkable that government grabs the data that companies collect, causing substantial injury to consumers as well as to liberty interests. Indeed, this nexus between government surveillance and promiscuous commercial data collection is what seems to have warmed Commissioner Wilson to FTC data regulation. She wrote to Senator Ron Wyden (D-OR) in praise of the Fourth Amendment Is Not For Sale Act “which closes the legal loophole that allows the federal government to purchase commercial data on individuals without a warrant.” At the state and local levels, the connections between invasive data practices of tech companies (e.g., Clearview AI) and government surveillance are multiplying fast as smart city applications give public functions to private entities.

President Biden’s EO tasked the FTC with creating more space for small businesses by enforcing antitrust law and creating rules for fair competition. Consumer protection law and rulemaking can help here too. The FTC not long ago started something it called Operation Main Street to take down scammers targeting small businesses. There is a data corollary to this effort: guidance or rules on fair data practices between businesses with market power and their dependents could be important. The EU has undertaken this with its platform-to-business regulation on promoting fairness and transparency for business users of online intermediation services. These rules require platforms to give businesses some of the same rights that data protection gives to individuals, including transparency about how they are treated (e.g., ranking, data collection) and means of redress. Empowering smaller businesses like this is likely to give consumers more power in data protection battles.

***

It has been time for the FTC to make rules on data and surveillance for a long time. For many practices, it’s past time. For many more practices, there is still time for the FTC to exercise leadership, and it seems to have the right team in place to do it.

Authors

Ellen P. Goodman
Ellen P. Goodman is a Professor at Rutgers Law School, Co-Director of the Rutgers Institute for Information Policy & Law (RIIPL), and a Senior Fellow at the Digital Innovation & Democracy Institute at the German Marshall Fund.
