Why Trust and Safety Is Not Just for Social Media
David Sullivan / Feb 28, 2024

In the wide-ranging Supreme Court oral arguments over Florida and Texas laws that inhibit digital services from moderating online content as they see fit, one particular line of questioning has set off alarm bells for trust and safety professionals.
Although the justices on the whole appeared to recognize that social media companies have a First Amendment right to use content policies as part of their editorial discretion to shape what is acceptable on their services, several justices questioned whether the same protection extends to other, ostensibly less expressive services, from e-commerce to ride sharing, and from email to payments.
As Justice Amy Coney Barrett said of the e-commerce marketplace Etsy, “It looks a lot more like a brick-and-mortar marketplace or flea market, you know, than, you know, a place for hosting speech.” And Justice Elena Kagan posited that a law targeted at such services might be constitutional: “with respect to Gmail and direct messaging and Venmo and Dropbox and Uber, with respect to all those things, a site could not discriminate on the basis of viewpoint, just as maybe a site couldn't discriminate on the basis of race or sex or sexual orientation or what have you.”
This line of reasoning not only misses the expressive nature of this much wider set of online products; it also poses real safety risks for the users of those services.
While it might seem straightforward to distinguish services that need content moderation from those that do not, the reality is more complicated. As journalist Casey Newton noted, the payment service Venmo has a social element: users can publicly share and comment on their transactions. With many digital services, the line between public and private features is blurred, which further complicates the exercise. At what size, for example, does a group chat cross the line from a private discussion into something more public?
Such intellectual acrobatics may be theoretically interesting, but they distract from the very serious safety matters at hand. Take ride sharing, for example. The justices were understandably concerned with a scenario in which someone could be denied access to transportation through inconsistent or even discriminatory application of a company's terms of service. But the ability of companies to set policies for the acceptable use of a ride-sharing platform is essential to the safety of a business that connects people for rides in strangers' vehicles.
Ride sharing has expanded access to transportation and reduced dependence on car ownership, but those benefits come with risks. Riders and drivers alike face the possibility of harassment, abuse, and hateful speech in the vehicle or on the platform. Trust and safety teams at ride-sharing, food and grocery delivery, and dating services need the freedom to set rules for the safe use of their services and the ability to enforce those rules.
To be sure, there is an urgent need for the public to have confidence that companies are accountable and transparent with their trust and safety operations. But there are far less restrictive means of providing this assurance than the laws that Texas and Florida (as well as other states) have enacted.
The organization I lead, the Digital Trust & Safety Partnership, has set forth a framework of best practices and a methodology for rigorous assessments that can be used by any company providing public-facing digital products and services. Rather than telling companies what types of content or conduct should be allowed on their very diverse services, we align industry around the practices that responsible companies can use to ensure they are thinking about safety across product development, governance, enforcement, improvement, and transparency.
At a moment when the Supreme Court appears to be considering curtailing companies' latitude to shape their content and conduct policies, it is all the more important that any provider of services involving user-generated content or conduct demonstrate seriousness of purpose by aligning with industry standards. This is how companies, from video games to marketplaces, and from podcasts to financial services, can show that the decisions they make are neither arbitrary nor discriminatory.
In the digital era, commerce and expression are intertwined. It is the rare purchase that does not come with a chance to leave a review. And our interactions with strangers are increasingly mediated by online services that find themselves responsible for ensuring safe and trustworthy experiences.
Whichever way the Supreme Court rules on the Florida and Texas laws, services will need to show how they are keeping their users safe, and aligning with industry standards is a good place to start.