Content Moderation

The openness of social media enables the public to speak and connect in unprecedented ways, but it also enables users to share harmful content at massive scale. In response, social media companies deploy algorithms and human moderators to flag and remove unlawful content and enforce community guidelines. Governments can wield significant influence over platforms' content moderation decisions through legislation, regulation, and public pressure. While government regulation may be necessary to hold social media platforms accountable, it can also be used to increase censorship and stifle free expression; conversely, inadequate regulation may permit harmful content to flourish. Striking a balance between freedom of speech and content regulation remains an ongoing challenge in the ever-evolving digital landscape.

Latest on Content Moderation
New Analysis Reveals Scope of 'Fake News' Referencing or Produced by AI in Brazil; Little Related to Elections or Democracy, For Now
Policy Tracker