Willmary Escoto is U.S. Policy Analyst and Eric Null is U.S. Policy Manager at Access Now.
How many whistleblowers will it take to rein in Big Tech?
The Facebook Papers and whistleblower scandals shed light on just how badly online technology companies need oversight and regulation. While the Facebook (now Meta) scandals have helped reinforce that Big Tech cannot govern itself, we’ve known this for years. Countless tech firms engage in severe privacy intrusions and data-driven discrimination, particularly harming people of color, women, low-income people, immigrants, and other marginalized communities.
Even with weeks of headline stories, Congressional hearings, and renewed public pressure, passing comprehensive privacy legislation is a huge undertaking, and it will take a significant amount of time. While legislation may be the lodestar, the Federal Trade Commission (FTC) doesn’t have to wait for Congress. The agency is well positioned to act: it already has the authority to issue trade rules that protect people’s privacy and civil rights. Specifically, the agency should initiate a rulemaking proceeding, and quickly, to protect civil rights and set clear safeguards on collecting and using personal data.
We need privacy and civil rights protections at the federal level
The past several years of damning revelations and studies have shown that we need more privacy and civil rights protections in the U.S. The ultimate goal, of course, is to help better protect marginalized people from dangerous data practices, including algorithmic discrimination and civil rights violations online.
The ledger of data harms to marginalized people is extensive. For example, data can be used to target marginalized groups for high-risk loans and credit cards. Unchecked data collection has also been used to erode democracy through deceptive voter suppression and misinformation targeting Black people and Latinos online. Online searches using Black-associated names were once shown to be more likely to display ads relating to arrest records and high-risk credit cards. Algorithms have also been linked to communities of color paying 30% more for auto insurance premiums than white communities with similar accident costs. Further, some universities use “risk algorithms” and race-based data as a proxy to predict whether students will drop out of school, resulting in Black students being up to 4 times more likely to be labeled ‘high risk’ than white students.
Algorithms can also limit people’s choices because of how advertisers and platforms limit ad distribution. For example, in 2019, HUD sued Facebook for violating the Fair Housing Act by allowing advertisers to determine who receives an ad for housing based on race, gender, and other protected characteristics. A recent study also showed that, even when advertisers were not explicitly discriminatory, Facebook could skew delivery of the advertisement in a discriminatory way, which had particularly harmful effects when showing housing and employment ads. Even back in 2017, several companies, including Amazon, Verizon, and UPS, used Facebook to exclude older workers from employment ads.
The FTC has the expertise and authority to pass privacy and civil rights rules
While we have long advocated that Congress pass strong data protection legislation, achieving this goal is difficult and could take a long time. And now, with online privacy rapidly deteriorating, there’s more urgency than ever for robust and clear privacy rules. As nine Senators said in a letter to FTC Chair Lina Khan, the Commission “has substantial institutional knowledge and expertise to contribute to the legislative process through its track record of enforcement and its existing privacy authorities,” and thus should engage in rulemaking.
The FTC has been the primary privacy enforcer for decades and its expertise in data is indisputable. It has undertaken extensive privacy-related enforcement, including a $5 billion fine against Facebook in 2019 for violating its previous consent decree. The agency also has the power to demand documents from U.S. companies, which it recently exercised to determine that internet service providers routinely collect extensive data on their customers that can be sold or shared with third parties.
Initiating such a rulemaking would push the privacy debate forward in an unprecedented way. It would empower the Commission to more thoroughly defend online privacy and promote civil rights by establishing clear rules against discriminatory and abusive data practices through a transparent and democratic process. The FTC could also lay out any requirements restricting data collection based on the types of personal data involved, how the data is used, and who is using and sharing the data. The rules could ultimately limit data collection, require companies to allow people to delete their data, and curb discriminatory data uses. The rules could also require much-needed transparency into algorithms. But given that rulemaking proceedings at the FTC can take several years, the agency should begin the process as soon as possible.
Big Tech has been allowed to run rampant because there is little oversight and essentially no rules of the road for data practices. The FTC can and should seize the moment and initiate a rulemaking proceeding to make tough but necessary decisions on data practices. It’s time to act.