
Online Safety

Social media and other platforms must grapple not only with the negative impacts of users’ content, but also with the use of their services to sell illegal products and facilitate other criminal activity. Children, for instance, face potential exploitation and exposure to damaging content that can harm their mental health. Tech companies respond to these concerns with content moderation and other mechanisms for identifying unlawful and offensive material and removing problematic users. Governments, increasingly impatient with the limits of these efforts, have enacted new laws and regulations that require internet platforms to protect minors from harmful content, excessive use, and predation; to prevent illicit online sales; and, more generally, to crack down on unlawful uses of their services. Many critics of these laws worry that imposing such strict obligations on tech companies will diminish freedom of expression, particularly for marginalized communities, and enable governments to target political speech and dissent online.

Reading the Systemic Risk Assessments for Major Speech Platforms: Notes and Observations
Policy Tracker

[Interactive table of 43 tracked policies, with columns for Name, Type, Government, Date Initiated, Status, and Last Updated.]