John Perrino is a policy analyst at the Stanford Internet Observatory.
As the European Union, United Kingdom and other governments around the world advance laws to hold social media companies accountable, the clock is ticking for the U.S. to pass legislation ahead of the midterm elections. While neither the EU’s Digital Services Act nor the UK’s Online Safety Bill would fit the U.S. regulatory system, both rest on a shared set of basic principles: transparency and user control in platform design.
As the final details of the EU’s Digital Services Act (DSA) are published in the coming weeks, Congress has an opportunity to adopt and adapt provisions that address important tradeoffs and technical challenges holding back legislation. While some of the more ambitious proposals for regulating tech are hung up in partisan debates, other low-hanging legislative fruit is held back by the need for technical protocols, such as data access protections and platform design requirements. Principles and agreements on these issues are already in the late stages of development in the DSA following extensive input, including from American tech companies and civil society.
The German Marshall Fund’s Susan Ness and R Street’s Chris Riley recently argued in The Hill for the idea of “modularity,” or developing “discrete standards, protocols, codes of conduct, or oversight systems” that can fit disparate regulatory systems. Provisions in the DSA, which is furthest along after a political agreement reached in April, seem like a good place to start, particularly the transparency and research access provisions in Article 31.
Social media data access for researchers in the DSA is mirrored in U.S. transparency and online safety proposals. A key holdup for social media transparency legislation is concern over how to build strong privacy and civil liberties protections for data that is shared between social media platforms and researchers, or made public. Meanwhile, a plan to address that same challenge in the DSA is reportedly nearing completion. A detailed plan for privacy and security protections for data access and sharing can address lingering questions and concerns about what data is needed for research, how platforms will provide access to researchers and the public, and which types of protections are needed to secure different data categories.
Another strong area for early engagement and collaboration is harmonizing standards and technical principles for platform design requirements in the DSA with similar language proposed in the UK Online Safety Bill, as well as in the Kids Online Safety Act introduced by Sens. Richard Blumenthal (D-CT) and Marsha Blackburn (R-TN). There are good examples to draw from in other contexts, too. For instance, Australia has engaged industry and civil society since 2018 to develop and embed safety-by-design standards in software applications, and the Future of Tech Commission developed a framework for online safety policy in collaboration with 150 experts and events across the U.S. and EU.
Industry would, of course, also benefit from a modular approach to social media regulation. Consistency for regulatory compliance and technical requirements would help the largest social media companies and their potential competitors avoid a patchwork of requirements across the largest Western markets. In fact, some social media sites are already implementing changes to comply with forthcoming safety design regulations, and the largest platforms will need to comply with EU requirements for data sharing standards that could also be used for U.S. transparency requirements.
Wonky concerns over technical standards may be holding up some of the less discussed but most promising pieces of online trust and safety legislation presently in Congress. A modular approach is far from a “copy-paste” solution, but it could provide universal technical solutions and some common frameworks for design and legal requirements that avoid reinventing the wheel.
A potential path forward for Congress to act on social media accountability was previewed during a recent Senate Judiciary subcommittee hearing on “Platform Transparency.” Sens. Chris Coons (D-DE) and Ben Sasse (R-NE) voiced shared concerns about the imbalance of power between social media platforms and their users, and the need to shine a light on platforms to build accountability through research and public understanding.
Even critics at the hearing were supportive of the proposal; they simply highlighted important challenges and tradeoffs for privacy, security and industry compliance that would need to be addressed before the legislation moves forward. Luckily, civil society has developed a roadmap and proposed solutions for most of those challenges through painstaking work and long meetings that informed EU policymakers.
American companies and civil society groups have long argued that the U.S. should take the lead on social media laws and regulations. Legislators in a divided Congress can still achieve meaningful first steps to build public understanding and transparency that hold platforms accountable, but time is running out to pass any legislation ahead of this year’s midterm elections. Congress doesn’t have to start from scratch on the technical standards and protections holding back important platform transparency and design requirements. It can borrow and refine what’s working elsewhere.