Facebook Settles Lawsuit Stemming from HUD Discrimination Complaint

Justin Hendrix / Jun 21, 2022

Meta has settled a lawsuit stemming from an administrative complaint brought by the U.S. Department of Housing and Urban Development (HUD), which alleged that the company's ad targeting systems were used to propagate discriminatory advertising in violation of the Fair Housing Act (FHA). The settlement was announced in a blog post by Roy L. Austin Jr., the company's Vice President of Civil Rights and Deputy General Counsel, and in a news release from the U.S. Department of Justice.

Filed in the U.S. District Court for the Southern District of New York, the lawsuit alleged "that Meta’s housing advertising system discriminates against Facebook users based on their race, color, religion, sex, disability, familial status and national origin."

The settlement agreement says "HUD investigated the administrative complaint in accordance with 42 U.S.C. § 3610(a)-(b), and the Secretary determined there was reasonable cause to issue a Charge of Discrimination under 42 U.S.C. § 3610(g)(2)," the provision that authorizes the Secretary to bring a charge on behalf of aggrieved persons.

Austin's post says the company will introduce changes to the way its ad system works:

Today’s announcement reflects more than a year of collaboration with HUD to develop a novel use of machine learning technology that will work to ensure the age, gender and estimated race or ethnicity of a housing ad’s overall audience matches the age, gender, and estimated race or ethnicity mix of the population eligible to see that ad. To protect against discrimination, advertisers running housing ads on our platforms already have a limited number of targeting options they can choose from while setting up their campaigns, including a restriction on using age, gender or ZIP code. Our new method builds on that foundation, and strives to make additional progress toward a more equitable distribution of ads through our ad delivery process. To implement this change while also taking into account people’s privacy, we will use the privacy-preserving approaches we’re pursuing to measure race and ethnicity at the aggregate level.
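
Meta has not published the details of this system, but the underlying idea it describes, comparing the demographic mix of users an ad has actually reached with the mix of the eligible population and then nudging delivery toward under-served groups, can be illustrated with a brief sketch. The function below is hypothetical and not Meta's code; the group labels and the simple ratio-based weighting are assumptions for illustration:

```python
# Illustrative sketch only, not Meta's actual system, whose internals are
# not public. It shows the idea described in the post: compare the
# demographic mix of users an ad has reached with the mix of the eligible
# population, and boost delivery weight for under-represented groups.

def delivery_weights(eligible_mix: dict[str, float],
                     delivered_counts: dict[str, int]) -> dict[str, float]:
    """Return per-group multipliers that up-weight under-served groups.

    eligible_mix: share of each demographic group among users eligible to
                  see the ad, e.g. {"A": 0.6, "B": 0.4}.
    delivered_counts: impressions each group has received so far.
    """
    total = sum(delivered_counts.values())
    if total == 0:
        # No deliveries yet: start from the eligible distribution itself.
        return dict(eligible_mix)
    weights = {}
    for group, target_share in eligible_mix.items():
        actual_share = delivered_counts.get(group, 0) / total
        # Groups delivered below their eligible share get a weight above 1.
        weights[group] = target_share / max(actual_share, 1e-9)
    return weights

# Example: group "B" is 40% of the eligible audience but has received only
# 20% of impressions, so its delivery weight is boosted to 2.0.
print(delivery_weights({"A": 0.6, "B": 0.4}, {"A": 80, "B": 20}))
```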

Assistant Attorney General Kristen Clarke of the Justice Department’s Civil Rights Division called the agreement historic:

As technology rapidly evolves, companies like Meta have a responsibility to ensure their algorithmic tools are not used in a discriminatory manner. This settlement is historic, marking the first time that Meta has agreed to terminate one of its algorithmic targeting tools and modify its delivery algorithms for housing ads in response to a civil rights lawsuit. The Justice Department is committed to holding Meta and other technology companies accountable when they abuse algorithms in ways that unlawfully harm marginalized communities.

The original complaint, filed in 2019, noted that Facebook's systems enabled a range of discriminatory practices, including offering "advertisers hundreds of thousands of attributes from which to choose, for example to exclude 'women in the workforce,' 'moms of grade school kids,' 'foreigners,' 'Puerto Rico Islanders,' or people interested in 'parenting,' 'accessibility,' 'service animal,' 'Hijab Fashion,' or 'Hispanic Culture.'" Facebook also "offered advertisers the ability to limit the audience of an ad by selecting to include only those classified as, for example, 'Christian' or 'Childfree.'"
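
To make the mechanism at issue concrete, here is a minimal, hypothetical sketch of attribute-based include/exclude targeting of the kind the complaint describes. It is not Facebook's actual ad API; the user records and interest labels are invented for illustration:

```python
# Hypothetical sketch of attribute-based targeting of the kind the
# complaint describes, not Facebook's actual ad API. Excluding an interest
# category silently removes every user tagged with it from the audience.

users = [
    {"id": 1, "interests": {"parenting", "soccer"}},
    {"id": 2, "interests": {"Hijab Fashion", "cooking"}},
    {"id": 3, "interests": {"gardening"}},
]

def build_audience(users, include=None, exclude=frozenset()):
    """Keep users matching any `include` interest and no `exclude` interest."""
    audience = []
    for user in users:
        if include is not None and not (user["interests"] & include):
            continue
        if user["interests"] & exclude:
            continue
        audience.append(user)
    return audience

# Excluding "Hijab Fashion" and "parenting" removes users 1 and 2 entirely.
print([u["id"] for u in build_audience(users, exclude={"Hijab Fashion", "parenting"})])
# [3]
```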

The Justice Department says the settlement includes the following:

By Dec. 31, 2022, Meta must stop using an advertising tool for housing ads known as “Special Ad Audience” (previously called “Lookalike Audience”), which relies on an algorithm that, according to the United States, discriminates on the basis of race, sex and other FHA-protected characteristics in identifying which Facebook users will be eligible to receive an ad.
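
Meta's implementation is proprietary, but lookalike-style tools are generally understood to expand an advertiser's "seed" list of customers into a larger set of similar users. A minimal sketch of that idea follows, with randomly generated feature vectors standing in for real user data; it also suggests why a skewed seed audience tends to produce a similarly skewed expansion:

```python
# Conceptual sketch of a "lookalike"-style audience expansion tool, not
# Meta's implementation. It takes a seed list of an advertiser's existing
# customers and finds the platform users whose feature vectors are most
# similar. If the seed skews toward one race or sex, the expansion tends
# to reproduce that skew even when those fields are never used directly,
# because correlated features stand in for them.

import numpy as np

def expand_audience(seed_vectors: np.ndarray,
                    user_vectors: np.ndarray,
                    k: int) -> np.ndarray:
    """Return indices of the k users closest to the seed centroid."""
    centroid = seed_vectors.mean(axis=0)
    distances = np.linalg.norm(user_vectors - centroid, axis=1)
    return np.argsort(distances)[:k]

rng = np.random.default_rng(0)
seed = rng.normal(loc=1.0, size=(50, 8))     # seed audience clustered in one region
users = rng.normal(loc=0.0, size=(1000, 8))  # broader user base
print(expand_audience(seed, users, k=5))     # the 5 most seed-like users
```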

Meta has until December 2022 to develop a new system for housing ads to address disparities for race, ethnicity and sex between advertisers’ targeted audiences and the group of Facebook users to whom Facebook’s personalization algorithms actually deliver the ads. If the United States concludes that this new system sufficiently addresses the discriminatory disparities that Meta’s algorithms introduce, then Meta will fully implement the new system by Dec. 31, 2022.
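
The settlement does not spell out the exact compliance metric here, but one simple way to quantify such a disparity is the total variation distance between the demographic mix of the eligible audience and the mix of users who actually saw the ad. The sketch below is an illustration under that assumption, not the parties' agreed standard:

```python
# A minimal sketch of how a delivery disparity could be quantified; the
# settlement's actual compliance standards are negotiated by the parties
# and not specified here. This compares the demographic shares of the
# eligible (targeted) audience with the shares of users who actually saw
# the ad, using total variation distance: 0.0 means identical mixes.

def delivery_disparity(eligible_mix: dict[str, float],
                       delivered_mix: dict[str, float]) -> float:
    groups = set(eligible_mix) | set(delivered_mix)
    return 0.5 * sum(abs(eligible_mix.get(g, 0.0) - delivered_mix.get(g, 0.0))
                     for g in groups)

# Women are 50% of the eligible audience but received 30% of impressions:
print(delivery_disparity({"women": 0.5, "men": 0.5},
                         {"women": 0.3, "men": 0.7}))  # 0.2
```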

If the United States concludes that Meta’s changes to its ad delivery system do not adequately address the discriminatory disparities, the settlement agreement will terminate and the United States will litigate its case against Meta in federal court.

The parties will select an independent, third-party reviewer to investigate and verify on an ongoing basis whether the new system is meeting the compliance standards agreed to by the parties. Under the agreement, Meta must provide the reviewer with any information necessary to verify compliance with those standards. The court will have ultimate authority to resolve disputes over the information that Meta must disclose.

Meta will not provide any targeting options for housing advertisers that directly describe or relate to FHA-protected characteristics. Under the agreement, Meta must notify the United States if Meta intends to add any targeting options. The court will have authority to resolve any disputes between the parties about proposed new targeting options.

Meta must pay to the United States a civil penalty of $115,054, the maximum penalty available under the Fair Housing Act.

Authors

Justin Hendrix
Justin Hendrix is CEO and Editor of Tech Policy Press, a nonprofit media venture concerned with the intersection of technology and democracy. Previously, he was Executive Director of NYC Media Lab. He spent over a decade at The Economist in roles including Vice President, Business Development & Inno...