Proposed legislation sponsored by Reps. Ken Buck (R-CO), David Cicilline (D-RI), Lori Trahan (D-MA) and Burgess Owens (R-UT) would “require that internet platforms give users the option to engage with a platform without being manipulated by algorithms driven by user-specific data.” The draft bill was first reported by Axios.
The language is nearly identical to a previous Senate version of the bill sponsored by Sen. John Thune (R-SD) that was reintroduced in June.
The language would not apply to any internet platform with fewer than 500 employees, less than $50 million in annual revenue, or that “collects or processes on an annual basis the personal data of less than 1,000,000 individuals.” It also excludes any platform “operated for the sole purpose of conducting research that is not made for profit either directly or indirectly.”
Sen. Thune discussed the motivation for the legislation during recent Senate testimony by Facebook whistleblower Frances Haugen. Haugen told Sen. Thune that engagement-based ranking, which dictates how a user sees the News Feed, can be dangerous, and that it is designed to serve Facebook’s business model.
The language of the bill hinges on what it calls “opaque algorithm requirements.” An opaque algorithm “makes inferences based on user-specific data to select the content the user sees.” The bill would require services employing such algorithms to inform users that they are in use and how they operate. It would also require the platform to offer an alternative “version of the platform that uses an input-transparent algorithm and enables users to easily switch between the version of the platform that uses an opaque algorithm and the version of the platform that uses the input-transparent algorithm by selecting a prominently placed icon, which shall be displayed wherever the user interacts with an opaque algorithm.”
The Federal Trade Commission (FTC) would be responsible for enforcing the Act, with any violation treated as an “unfair or deceptive act or practice” under the Federal Trade Commission Act (15 U.S.C. 41 et seq.). The language has a carve-out for “age appropriate content filters,” and for “data provided for express purpose of interaction with the platform,” such as user-supplied “search terms, filters, speech patterns…. save preferences” or location, or other user choices such as which accounts to follow or channels to subscribe to. The language would restrict platforms from using a user’s history, behavior, or related inferences.
This piece will be updated.
Justin Hendrix is CEO and Editor of Tech Policy Press, a new nonprofit media venture concerned with the intersection of technology and democracy. Previously, he was Executive Director of NYC Media Lab. He spent over a decade at The Economist in roles including Vice President, Business Development & Innovation. He is an associate research scientist and adjunct professor at NYU Tandon School of Engineering. Opinions expressed here are his own.