Content Moderation

The openness of social media enables the public to speak and connect in unprecedented ways, but it also enables users to share harmful content on a massive scale. In response, social media companies deploy algorithms and human moderators to flag and remove unlawful content and to enforce community guidelines. Governments can wield significant influence over platforms' content moderation decisions through legislation, regulation, and public pressure. While government regulation may be necessary to hold social media platforms accountable, it can also be used to increase censorship and stifle free expression. Conversely, inadequate regulation may permit harmful content to flourish. Striking the balance between freedom of speech and content regulation remains an ongoing challenge in the ever-evolving digital landscape.

Reading the Systemic Risk Assessments for Major Speech Platforms: Notes and Observations
Policy Tracker
A tracker of content moderation policies (30 entries), listing each policy's Name, Type, Government, Date Initiated, Status, and Last Updated date.