Analysis

How States Are Taking on Algorithmic Pricing

Max Morgan / Mar 20, 2026

Max Morgan is a Tech and Public Policy Scholar at Georgetown University.

New York Governor Kathy Hochul.

From apartment rents to grocery carts, algorithmic pricing tools are quietly reshaping what Americans pay. In 2022, ProPublica's investigation of RealPage demonstrated how algorithmic recommendations enabled landlords to coordinate rent increases across markets, leaving tenants to foot the bill. More recently, in 2025, Consumer Reports documented the impact of Instacart's AI-enabled pricing software, Eversight, finding that the platform showed different prices to different users for identical products and inflated grocery bills by as much as 23 percent based on users' shopping patterns and browsing history.

The federal government has taken initial steps to scrutinize algorithmic pricing practices. The Department of Justice filed suit against RealPage, alleging that its rental pricing software facilitated coordination among competing landlords, effectively enabling algorithmic price-fixing in rental housing markets. The DOJ and RealPage ultimately settled in 2026. The Federal Trade Commission (FTC) also launched a Section 6(b) study in 2024 to examine surveillance pricing practices across industries and released preliminary insights in early 2025. However, current FTC leadership has signaled that continuing the study is not a priority, leaving federal regulation uncertain.

State legislatures have responded with a dramatic surge in activity: 24 states introduced 51 bills curbing algorithmic pricing practices in early 2025, compared to just 10 bills across all states in 2024. New York became the first state to enact legislation specifically targeting algorithmic price-fixing in rental housing. As states move forward where federal agencies have hesitated, both regulators and industry face challenges in enforcement and compliance.

Different problems may require different regulatory approaches

State legislative activity has targeted two distinct practices—algorithmic price-fixing and surveillance pricing—that raise fundamentally different concerns and present distinct detection challenges for regulators, researchers, and investigators.

Algorithmic price-fixing

Algorithmic price-fixing refers to "the use of software—often trained on nonpublic competitor data or used across multiple sellers in a market—to coordinate prices, effectively reducing market competition." This practice creates market-wide concerns, such as when a pricing platform aggregates data from competing landlords or retailers and uses it to decrease competition and raise prices across the market. Such price-fixing activities are generally prohibited under the Sherman Act.

ProPublica's 2022 investigation of RealPage demonstrated the consumer harms this practice can cause. Reporters analyzed marketing materials, landlord testimonials, and rental price data to show how algorithmic recommendations led to anticompetitive coordinated rent increases. The Biden administration's Council of Economic Advisers conducted a similar analysis, estimating that algorithmic coordination in rental housing increased costs to renters by $70 per month on average.

New York's S.7882, the first law of its kind enacted in the United States, targeted this practice by prohibiting coordinated housing rental pricing decisions through shared data and algorithmic recommendations. The law directly responds to the RealPage investigation and applies specifically to the rental housing sector.

Enacting such legislation, however, raises an important question: how can regulators identify algorithmic price-fixing in practice? Investigators need to identify which sectors have intermediaries collecting and sharing pricing information across competitors, then determine whether that information sharing leads to anticompetitive coordination rather than market efficiency. This may require collecting data from earnings reports, company white papers, and business testimonials, then cross-referencing it with publicly available pricing information. Such investigations could take years to complete and require significant resources to analyze business practices and pricing data across markets.

Surveillance pricing

Surveillance pricing (sometimes also referred to as personalized pricing) involves using software to set "customized prices for a consumer or group of consumers based on behavior, biometrics, location, or other personal characteristics." This practice raises concerns about individual consumer discrimination rather than market-wide coordination. Such activities prevent consumers from comparing prices and planning expenses and, in certain circumstances, may discriminate against consumers on the basis of protected classes like race.

In 2025, Consumer Reports identified how grocery chain Kroger used personal and behavioral data, some of it incorrect, to determine which discounts shoppers were eligible for. Later that year, Consumer Reports recruited 193 participants to document how Instacart showed different prices to different users for identical products, inflating grocery bills by as much as 23 percent based on users' shopping patterns and browsing history.

While no surveillance pricing bills were introduced in 2024, 13 bills targeting the practice were introduced across multiple states in 2025. Aside from New York's amendment to General Business Law Section 349-a, which requires businesses to disclose surveillance pricing practices to consumers, no surveillance pricing legislation has been enacted yet. States pursuing such legislation include Colorado, California, Kentucky, and New York, among others.

While most state legislative proposals prohibit making individualized pricing decisions based on personal or behavioral data, state approaches vary in their enforcement mechanisms and exemptions. New York's S.8623 establishes a private right of action and exempts voluntary loyalty programs that offer uniform discounts, while Kentucky's HB33 does not establish a private right of action and allows for personalized offers in loyalty programs if the programs are voluntary.

Surveillance pricing presents different enforcement challenges because it targets individual consumer discrimination rather than market-wide coordination. Investigators would need to determine which businesses engage in surveillance pricing, what data influences prices, and how those practices impact consumers. Controlled participant studies, like the Consumer Reports investigation, face significant resource constraints—recruiting participants, controlling for confounding variables, and ensuring statistical validity requires substantial time and funding.
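The core analysis in a participant study of this kind can be simple even when the logistics are not: collect the price each participant saw for the same product, then measure the spread. The sketch below illustrates that step with invented participant IDs, products, and prices; the 10 percent flagging threshold is an arbitrary assumption, not one drawn from any actual study or statute.

```python
# Illustrative sketch of the kind of analysis a controlled participant
# study might run: compare prices different participants saw for the same
# product and flag items where the spread exceeds an (arbitrary) threshold.
# Participant IDs, products, and prices are invented for the example.

observed = [
    # (product, participant_id, price_seen)
    ("olive_oil", "p1", 9.99),
    ("olive_oil", "p2", 11.49),
    ("olive_oil", "p3", 12.29),
    ("bananas",   "p1", 0.59),
    ("bananas",   "p2", 0.59),
    ("bananas",   "p3", 0.59),
]

def price_spread(rows):
    """Max markup over the lowest price any participant saw, per product."""
    by_product = {}
    for product, _pid, price in rows:
        by_product.setdefault(product, []).append(price)
    return {
        product: (max(prices) - min(prices)) / min(prices)
        for product, prices in by_product.items()
    }

spreads = price_spread(observed)
flagged = [p for p, s in spreads.items() if s > 0.10]  # >10% spread
print(spreads)   # olive_oil ~0.23 (a 23% spread), bananas 0.0
print(flagged)   # ['olive_oil']
```

The hard part, as the paragraph above notes, is not this arithmetic but everything around it: recruiting enough participants, holding location and timing constant, and ruling out innocent explanations like regional pricing.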

Observational datasets like the National Internet Observatory, which combines personal data—browser history, demographics—with pricing information, could enable researchers to detect discriminatory pricing patterns at scale. However, access to such datasets remains limited, the analytical infrastructure to interpret this data is still developing, and it’s unclear whether these datasets will remain relevant as shopping shifts to AI chatbots.
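As a minimal sketch of what detection at scale could look like with such a dataset, the example below compares average prices paid for the same product across behavioral segments. Everything here is synthetic and hypothetical: the segment labels, products, and prices are invented, and a real analysis would need careful controls for location, timing, and product availability that this toy omits entirely.

```python
# Illustrative sketch: with observational data linking behavioral segments
# to prices paid, one can compare average prices across segments for the
# same product. A real analysis would need controls for location, timing,
# and availability; this toy example (with invented data) has none.
from collections import defaultdict

records = [
    # (product, segment, price_paid) -- synthetic data
    ("detergent", "frequent_deal_seeker", 7.99),
    ("detergent", "frequent_deal_seeker", 8.09),
    ("detergent", "infrequent_shopper",   9.49),
    ("detergent", "infrequent_shopper",   9.59),
]

# Accumulate (sum of prices, purchase count) per (product, segment) pair.
totals = defaultdict(lambda: [0.0, 0])
for product, segment, price in records:
    t = totals[(product, segment)]
    t[0] += price
    t[1] += 1

for (product, segment), (s, n) in sorted(totals.items()):
    print(f"{product} / {segment}: avg ${s / n:.2f} over {n} purchases")
```

A persistent gap between segments for identical products, across many products and after controls, is the kind of pattern such datasets could surface, subject to the access and infrastructure limits described above.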

Legislative effectiveness and industry compliance remain open questions

Effective regulation of algorithmic pricing requires drawing difficult distinctions that current legislation has only begun to address. Regulators face unresolved questions about scope. While housing and groceries have received the most scrutiny, pricing intermediaries operate across many sectors with far less visibility.

Beyond determining where regulation should apply, policymakers must also decide which pricing practices should fall within its reach. Within regulated sectors, the line between harmful and beneficial practices remains difficult to draw. Loyalty programs and personalized discounts can benefit consumers, while pricing on the basis of protected characteristics like race or income may harm them. The same behavioral and demographic data that powers a loyalty discount could, in a different context, constitute discriminatory pricing. Legislation like Illinois SB 2255, which would broadly prohibit the use of surveillance data in individualized pricing, risks sweeping in practices consumers may benefit from, while more targeted approaches like New York's S.8623 may leave harmful practices unaddressed.

Industry compliance presents a separate set of challenges, including whether companies will be willing to rearchitect their products to eliminate prohibited data uses. The National Retail Federation sued New York over its surveillance pricing disclosure requirements, claiming the law compels speech and burdens retailers who use algorithmic pricing for legitimate purposes. The legal challenge suggests that industry may resist any compliance obligations it views as overly restrictive.

Even for companies compelled to comply, the timeline of enforcement may present challenges. Compliance with data use regulation requires labor-intensive work: identifying which data inputs feed pricing algorithms, tracing where that data originated, and rearchitecting systems to eliminate prohibited uses.
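One step in that compliance work, identifying which inputs feed a pricing algorithm and which are prohibited, can be sketched as a simple audit check. The feature names and the list of prohibited categories below are entirely hypothetical and not drawn from any actual statute or product; the point is only to show the shape of the mapping a compliance team would have to build and maintain.

```python
# Illustrative sketch of one compliance audit step: given a pricing model's
# declared input features and the data categories a statute prohibits,
# flag features that must be removed or justified. Feature names and the
# prohibited list are hypothetical, not drawn from any actual law.

PROHIBITED_CATEGORIES = {"browsing_history", "location", "biometrics"}

pricing_model_inputs = {
    # feature name -> data category it is derived from
    "days_since_last_purchase": "transaction_history",
    "pages_viewed_last_week":   "browsing_history",
    "distance_to_store_km":     "location",
    "cart_size":                "transaction_history",
}

violations = {
    feature: category
    for feature, category in pricing_model_inputs.items()
    if category in PROHIBITED_CATEGORIES
}
print(violations)
```

The labor-intensive part is producing an accurate feature-to-category mapping in the first place: tracing each input back through data pipelines to its origin, which is exactly the lineage work the paragraph above describes.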

As an example of federal data use regulation, the FTC has used its authority to supervise companies following privacy violations, requiring them to establish comprehensive privacy programs and undergo regular audits, similar to the data governance and system auditing that algorithmic pricing laws will demand. The FTC imposed broad data use restrictions on Meta in 2019. Even as Meta claimed to have made substantial investments, it struggled to comply with the requirements years afterward, with independent assessors noting substantial issues in its compliance program in 2021.

With state algorithmic pricing laws at varying stages of enactment and implementation, the path forward remains uncertain. Key questions persist: when enforcement will meaningfully begin, whether firms will come into compliance, and whether the technical and institutional changes can be completed within the statutory timeframes. As these laws move from proposal to practice, their real-world impact will depend not just on legislative design, but on the capacity and willingness of both regulators and industry to operationalize them.
