Congresswoman Lori Trahan, D-MA3, yesterday announced the Digital Services Oversight and Safety Act (DSOSA) with cosponsors Rep. Adam Schiff, D-CA28 and Rep. Sean Casten, D-IL6. A press release says it is “comprehensive transparency legislation to establish a Bureau of Digital Services Oversight and Safety at the Federal Trade Commission (FTC) that would have the authority and resources necessary to hold powerful online companies accountable for the promises they make to users, parents, advertisers, and enforcers.”
“The need for federal action to rein in the dominance and abuses of large online companies is obvious. Congress has failed to keep up with the digital marketplace, and users are paying the price,” said Rep. Trahan. “The Digital Services Oversight and Safety Act will be a shot of expertise in the arms of enforcers and legislators alike, helping to inform comprehensive and long overdue updates to the laws that govern the internet. Comprehensive transparency and product safety oversight are necessary complements to ongoing efforts to reform antitrust and data protection laws, and this new Bureau will be key to getting us there.”
Rep. Trahan previously introduced the Social Media DATA Act, which sought to create a transparency standard for digital ads; similar language is incorporated into the new legislation. The DSOSA offers an alternative mechanism for platform transparency and researcher access to that of a Senate proposal, the Platform Accountability and Transparency Act, put forward by Senators Chris Coons (D-DE), Rob Portman (R-OH) and Amy Klobuchar (D-MN) last year.
The DSOSA would establish a Bureau of Digital Services Oversight and Safety within the FTC and staff it with “at least 500 positions” to include “at least 80 technologists,” “80 sociotechnical experts” (defined as “an information science researcher, privacy or human rights advocate, international data governance expert, sociologist, psychologist, ethicist, language scholar, statistician, user interface designer, child development scholar, or an individual with expertise in another related field or application,”) and “at least 15 constitutional lawyers.”
The bill would also establish an “internal complaint-handling system” that would allow individuals to “appeal content moderation action” by a platform, establish transparency standards on moderation policies, and require reporting to the FTC on content moderation practices and internal risk assessments related to the platform’s function and policies. For instance, platforms would be required to disclose the “hiring and training of human content moderators, trust and safety personnel, engineers focused on detecting and reducing systemic risks” and so on.
The FTC would also collect best practices and provide them to the platforms as “evidence based nonbinding guidance” on ways to address systemic risks. The bill notes that “The guidance shall focus on product design features and content moderation processes that aim to be content neutral.” An advisory committee would be established to “solicit views” on such guidance, including from “communities most impacted by the systemic risks” identified in the legislation, which include discrimination, as well as from current and former content moderators and platform employees.
A section on recommender systems requires a range of transparency features on what information is used to power the system, and requires platforms to “provide an option that does not rely on any of the user’s personal information (either collected or inferred) to determine the order of information presented to the user.” There is a provision for users to opt in to allow use of personal information to enable certain features.
A section on “independent research facilitation” establishes an “Office of Independent Research Facilitation” inside the Bureau of Digital Services Oversight and Safety that would administer access to platform data for research into “the impacts of the content moderation, product design decisions, and algorithms of covered platforms on society, politics, the spread of hate, harassment, and extremism, security, privacy, and physical and mental health.” The bill also sets out a method for certifying researchers and “host organizations” that would be permitted to conduct research with platform data.
The bill would create a “Federally Funded Research and Development Center” to enable “certified researchers to perform studies requiring information from multiple covered platforms.” A “Research Fellowship Program” is established to provide researchers an opportunity to work at the Bureau. A safe harbor provision for both platforms and certified researchers is included.
The bill also imagines a “High-Reach Public Content Stream” to be made available to researchers by the platforms. The “stream” would contain information about “pieces of high-reach and high-engagement public content, such as user-generated posts, texts, hyperlinks, images and videos,” data on how frequently such content is shared, and other metrics including engagement, exposure, interaction with high-profile accounts, and other information.
Violations of the legislation would be enforceable by the FTC as unfair or deceptive acts or practices.
The announcement of the DSOSA included endorsements from individuals such as Jonathan Greenblatt, CEO and National Director of the Anti-Defamation League; Laurel Lehman, a policy analyst at Consumer Reports; Tracy Rosenberg, Executive Director of Media Alliance; Nathan Miller, Campaign Director at Avaaz; Jesse Lehrich, Co-Founder of Accountable Tech; Imran Ahmed, CEO of the Center for Countering Digital Hate; Nathalie Marechal, Senior Policy and Partnerships Manager at Ranking Digital Rights; Irene Ly, Policy Counsel at Common Sense Media; Hany Farid, Professor at the University of California, Berkeley; and Paul Barrett, Deputy Director of the NYU Stern Center for Business and Human Rights.
“Representative Trahan’s bill offers a novel approach that would empower the Federal Trade Commission, the public, and public interest researchers to investigate the ways online platforms impact consumers and the information ecosystem,” said Lehman.
Rep. Trahan said the legislation will be featured in an Energy and Commerce Committee legislative hearing next week.
“Social media platforms have reshaped everything from the way we communicate with family and friends to how we consume the news, participate in political discourse, and disseminate important information on which we rely. And as the digital landscape continues to evolve and expand, the public should have transparency about how technology platforms function and whether there are sufficient guardrails in place,” said Rep. Schiff. “This legislation will take the long-overdue step of giving federal regulators insight into how these companies operate, so they can issue evidence-based guidance and hold them accountable.”
Justin Hendrix is CEO and Editor of Tech Policy Press, a new nonprofit media venture concerned with the intersection of technology and democracy. Previously, he was Executive Director of NYC Media Lab. He spent over a decade at The Economist in roles including Vice President, Business Development & Innovation. He is an associate research scientist and adjunct professor at NYU Tandon School of Engineering. Opinions expressed here are his own.