Perspective

Is Trust & Safety Dead, or Just Evolving?

Dean Jackson / Sep 26, 2025

Dean Jackson is a contributing editor at Tech Policy Press.

US President Donald Trump and First Lady Melania Trump hosted business and technology leaders, including Meta founder and CEO Mark Zuckerberg, for a dinner in the State Dining Room at the White House on Thursday, September 4, 2025. (Official White House Photo by Andrea Hanks)


Is this the end of Trust & Safety? Is anyone left to defend it? It had a golden age, but that age appears to be over. In its place is a grim realpolitik or, some say, a performative compliance with new regulations. There is a sense of crisis but little consensus about what to do.

In a forthcoming research paper for the Emory Law Journal, University of Virginia School of Law professor Danielle Keats Citron and University of California, Irvine School of Law professor Ari Ezra Waldman add to the chorus of concern and predictions about the profession tasked with trying to keep screens (mostly) free of nudity, graphic violence, hate speech, and various other forms of problematic behavior.

Citron and Waldman lay out a compelling history of the Trust & Safety field, starting with the 1996 passage of Section 230 of the Communications Decency Act. Section 230 allows platforms that host user-generated material to operate their services largely without risk of liability, but provides no instruction about how to do so. Citron and Waldman describe how companies operating in this vacuum gradually built internal structures, processes, and rules that became the foundation for Trust & Safety today. As the social media industry grew, moderation became more complex. Companies learned by watching their peers and competitors, and external pressure from users, regulators, journalists, scholars, politicians, and advertisers encouraged changes to corporate policies and practices. This was the period when platforms like Twitter evolved from libertarian playgrounds to companies with detailed community guidelines.

Citron and Waldman implicitly criticize what they see as the conventional wisdom that economic trends killed Trust & Safety’s momentum. I’m less convinced that many people who were paying attention ever believed this narrative; if nothing else, the stupendous sums of money tech companies are throwing at artificial intelligence suggest their financial straits were never that dire. Regardless, Citron and Waldman write that the changes afflicting Trust & Safety are “not simply due to economics but also to new laws and practices of the individuals and organizations that make up the field.” Echoing Stanford scholar Daphne Keller, they worry that “harms may go unaddressed if companies replace action with paperwork” as they pursue compliance under new laws like the Digital Services Act (DSA). Citron and Waldman are not warning that Trust & Safety is dead. They are afraid it will become a zombie.

“Proceduralist compliance,” they write, “is the dominant form of governance practice in neoliberalism.” The suggestion seems to be that regulations such as the DSA are not a floor on which Trust & Safety can build, but a glue trap from which it will never rise. Citron and Waldman see the signs of this shift, for example, in the rise of vendors selling tools and compliance strategies for platforms large and small alike. “Rather than focusing on strategies to make a platform safer, compliance professionals focus their energies on identifying strategies to comply with the law. Those questions could conceivably have the same answer, but the former is steeped in substantive results whereas the latter lends itself to creative lawyering and avoidance,” they argue.

Maybe so, but it feels strange to hear platform self-governance associated with “substantive results” in 2025, as Meta founder and CEO Mark Zuckerberg rolls back hate speech rules, Grok goes full Mecha-Hitler, and teenagers confide in AI chatbots with tragic results. The recent proliferation of harms attributed to Trust & Safety’s decline—AI spam, nonconsensual nudity, violent hate speech—does not stem from a culture of regulatory compliance, but rather from an apparent lack of concern for potential consequences.

Citron and Waldman’s argument is slow to incorporate politics as an influence on the evolution of Trust & Safety. In fairness, their objective is to describe how organizational theory predicts the field’s trajectory; but the absence of politics for most of the paper only highlights its importance. It was, after all, social pressure and the threat of regulation, not regulation itself, that backstopped many of Trust & Safety’s most important achievements over the last fifteen years.

It is at the very end of the paper, when they are recommending solutions to the field’s atrophy, that Citron and Waldman’s ideas become exciting, particularly when they imagine what tech workers could do to change things. “An independent, grassroots union of privacy professionals that would help prospective tech industry employees negotiate job security could help ameliorate some of the worst influences of the profit-at-all-costs mission. A similar worker-powered union could help strengthen the position of trust and safety professionals,” they write, recognizing that Trust & Safety workers have lost important forms of leverage over the past five years, and that winning that leverage back might require labor organizers to reclaim worker power from Big Tech more generally. But that may be a tall order in Trump's America, where the National Labor Relations Board has been hobbled after a relentless campaign led by tech executives.

Citron and Waldman also recommend changes to Section 230—specifically in line with the proposed Intimate Privacy Protection Act—which raise questions similar to their criticisms of the DSA’s compliance requirements. According to draft text, the Act would require platforms to have reporting and takedown procedures for certain types of content, including cyberstalking, intimate depictions, and digital forgeries which are “virtually indistinguishable from an authentic record of the speech, conduct, or appearance of an individual.” Unlike the DSA, the law is vague, but it would still require companies to create extensive processes and procedures to show compliance.

Ultimately, Citron and Waldman leave me pining for a younger, less corporate internet, one without a finger in the political wind. “Trust and safety practices developed endogenously, not necessarily in line with the public interest but rather in response to corporate self interest,” they write—raising the question of what a public interest internet would look like, and how different it would be from the spaces we log into today.

Authors

Dean Jackson
Dean Jackson is a Contributing Editor at Tech Policy Press and principal of Public Circle LLC. He was the analyst responsible for the January 6th Committee’s investigation into the role of large social media platforms in the insurrection. As a freelance writer and researcher, he covers the intersect...

Related

Perspective
A Realist Perspective on Trust & Safety (July 21, 2025)