
Tech Firms Take First Step Toward Self-Regulation on Trust & Safety

David Morar / Sep 25, 2022

David Morar is an affiliate with R Street Institute, a founding fellow of the Digital Interests Lab, project manager with the Private Ordering Observatory and a visiting professor at FGV’s Center for Technology and Society.

Platform governance is a tightrope tech firms must walk. Shutterstock

When it comes to platform governance, the status quo is not tenable. Almost any fix—be it company policy or legislation—is extremely difficult to get right, and nearly impossible to scale without major trade-offs and consequences. But while thoughtful legislation is being discussed and passed in Europe, other jurisdictions– including the United States– are stuck either flexing authoritarian muscles or impotently proposing to take a swipe at important protections for free expression.

Given the current legislative and regulatory impasse, it is also important to look at what tools are available for better self-governance. On this score, the industry does not have a laudable track record. For years, tech firms resisted regulation while ignoring or underinvesting in efforts to address the harms of their products and services. And the techlash has produced a variety of underpowered attempts at better governance at the company level.

But the industry has perhaps finally gotten serious about self-regulation, with the advent of the Digital Trust and Safety Partnership (DTSP), a membership organization composed of a growing number of tech firms. Its first major product– the DTSP Safe Assessments Report, a synthesis of self-assessments by Discord, Google, LinkedIn, Meta Platforms, Inc., Microsoft, Pinterest, Reddit, Shopify, Twitter, and Vimeo published this summer– gives some indication of what is possible through the partnership, and presents an opportunity to evaluate its role and function in a complex ecosystem.

The Merits of Industry Collaboration

DTSP aims to address the platform governance challenge by providing a forum for tech platforms to hold conversations, establish best practices and, where possible, collaborate and collectively take action on trust and safety principles. There are both normative and descriptive reasons for enthusiasm about what some more cynical observers would call yet another industry association.

Of course, there are dangers to industry collaboration– which is why there are laws against collusion, price fixing and market manipulation. With regard to trust and safety, there is the threat of “cartelization,” a flattening of policy approaches across companies that could bring about numerous other unintended consequences.

But collaboration in industry, at its best (and with guardrails in place), is often fruitful and necessary, particularly when an industry is dealing with complex challenges that affect the broader ecosystem. Such collaboration has been at the forefront in other areas, like privacy, where industry-leaning but formally independent organizations such as the International Association of Privacy Professionals (IAPP) and the Future of Privacy Forum (FPF) build and maintain strong relationships within and across stakeholder groups, and form an ecosystem of scholarship, training and advocacy.

The creation of DTSP took into consideration the myriad areas where its potential members can differ, as is evident from the wide range of companies that are now part of it. It is built on five commitments to product development, governance, enforcement, improvement and transparency. DTSP’s principles define its role within a nascent ecosystem that also includes the Trust and Safety Professional Association and the Integrity Institute, organizations focused— much like IAPP and FPF in the realm of privacy— on the professionalization of the space, and on the provision of thought leadership and analysis.

A Look at the Safe Assessments Report and Beyond

The tech industry has historically been reluctant to share much about its practices beyond bland blog posts and problematic transparency reports, meaning much of what is known comes from leaks, whistleblower accounts, or journalistic exposés. The recently released DTSP Safe Assessments Report, on the other hand, shows some promise in offering another vantage point from which to evaluate these firms. The report paints a picture of an industry that, unsurprisingly, believes it is generally successful and on the right track.

Top-level findings focus on the developmental maturity of core content moderation processes across member companies, while acknowledging that work remains to be done around delivering on policies and implementing third-party and user feedback. It is also encouraging that a majority of the companies report beginning to formalize relationships between Trust & Safety and product teams– a step that will undoubtedly lead to a more holistic T&S perspective.

But while the report’s synthesis of first-party assessments has value, it also has obvious limitations. It offers little insight into the status or comparative sophistication of the individual companies that completed assessments, and no way to interrogate the veracity of the findings.

DTSP appears aware of these limitations, and promises that this inaugural report is a starting point. The next step, which DTSP has mentioned elsewhere and which is detailed in the conclusion of the Safe Assessments Report, is the design, creation and implementation of third-party assessments. Even if released only at an industry-wide level of abstraction, these will be important for a more objective view of the industry. Such assessments are the bare minimum needed for DTSP to gain legitimacy beyond its utility to its members. While the organization is of course beholden to its members, third-party assessments should be built with input from audit experts and other stakeholders. If these assessments are the stepping stone to more robust programs, like a certification, they should be preceded by or based on research that can clearly delineate what actually matters, rather than just what industry is interested in putting out.

It is crucial, of course, for civil society to be meaningfully included in the assessment process. DTSP has consulted with members of civil society and subsequently hired a staffer precisely for this purpose. Even so, inclusion of independent and diverse experts and organizations should be a more integral part of the fabric of the institution. An industry-based group can remain true to its goals while meaningfully including voices from civil society in its processes and outputs, thus ultimately strengthening its role and status by showing some fidelity to interests other than its own.

Toward A Healthier Ecosystem

DTSP’s existence and goals signal a welcome realignment of the industry toward action. However necessary this association may be, it is by no means sufficient. More needs to be done by industry and by the rest of the ecosystem. Researchers are coalescing, forming their own versions of cross-stakeholder groups. Collaboration between industry and civil society is needed, with more frequency and in more venues, including in ad-hoc ways. The push to build codes for researcher access to data is one successful example of multi-stakeholder dialogue and action, but it should not remain the only one.

More collaboration can even just mean keeping open lines of communication between and within stakeholder groups, which can help mitigate lower-level issues. Further, established structures—existing and future—are crucial to this ecosystem, with multistakeholder institutions helping tackle harms that are persistent but not necessarily pervasive. The linchpin of the platform governance space is thoughtful regulation, informed by the work of industry and civil society stakeholders, to strengthen the frameworks and principles already in place and to tackle specific society-level challenges. The tech industry, through the creation of DTSP, has signaled it is interested in having a conversation. It’s incumbent on the rest of us to make sure it’s a dialogue, and not just a monologue delivered entirely on industry’s terms.
