The Supreme Court's NetChoice Rulings Underscore the Importance of Free Speech Online

Vera Eidelman / Jul 18, 2024

Silhouettes of the states of Florida and Texas superimposed on an image of the US Supreme Court.

At the start of this month, the Supreme Court made clear that the government cannot control social media in an effort to impose its own vision of what online speech should look like. Adding to a long line of precedent, the Court again recognized that the First Amendment protects the right to disseminate and curate content—and that holds whether you’re publishing online or in print, whether you’re curating your own words or the words of others, and whether you moderate a lot or just a little.

The order came in Moody v. NetChoice, LLC and NetChoice v. Paxton, two cases challenging laws from Florida and Texas that would force businesses such as Facebook and YouTube to carry certain content that they do not want to feature. The Florida law would prohibit social media companies from banning political candidates, or from limiting the distribution or prioritization of posts by or about them or by “journalistic enterprises.” The Texas law would bar larger social media platforms from blocking, removing, or demonetizing content based on the users’ views.

Though legislators may claim that their goal is to prohibit censorship, these laws would replace private entities’ editorial voices with preferences dictated by the government. As the Supreme Court explained, such regulatory efforts constitute one of the gravest dangers to free expression—and they’re foreclosed by the First Amendment. The government cannot dictate the speech or curatorial choices of private actors in order to “achieve its own conception of speech nirvana,” period.

The First Amendment’s protection for editorial discretion isn’t new: the Supreme Court has applied it to curators as varied as newspaper publishers, parade organizers, and utility bill printers. But its application to social media and online speech is important. Prohibiting the government from imposing its editorial choices on online publishers—and, by extension, the public—is crucial to ensuring we can all speak our minds and access information on the internet.

Because the full scope of the laws—who they govern, and in what ways—was not evident from the record before the Court, it sent the cases back down to the Fifth and Eleventh Circuits to determine, first, the laws’ full reach and, second, whether any unconstitutional applications of the laws outweigh any permissible ones. It’s clear, for example, that the laws dictate YouTube’s choices about which videos to play and in what order, and it’s clear that that application is unconstitutional. But do the laws regulate Google’s decisions about Gmail, and, if so, do those applications violate the First Amendment?

Though there are many questions ahead, there’s no longer room for debate on two big ones: Does the First Amendment protect the rights of online publishers, including social media companies, to disseminate and curate speech? Yes. Can the government force social media platforms to curate their feeds according to the government’s preferences, notwithstanding those rights? Absolutely not.

These answers could—and should—dissuade federal and state lawmakers from seeking to prohibit or burden social media platforms’ publication of First Amendment-protected speech simply because the government doesn’t like what’s being published, or thinks it is too dangerous for the public.

These answers are right on the law, and they’re good for free speech in the digital age. Without curation and moderation, much of social media would be a mess, difficult to navigate and full of content users likely don’t want to see. Users would have to contend with speech that doesn’t fit the expressive goals of the platform or of the community of users, or they’d lose access to information on certain topics altogether, if platforms decided that distributing all viewpoints on those topics wasn’t worth it.

A platform should be able to welcome—and users should be able to access—posts celebrating diversity without having to host, or engage with, white nationalist views. Similarly, a social media site should be able to host posts questioning the scientific basis for climate change or affirming the existence of God without having to publish contrary viewpoints. People can seek any of this material out. But the government cannot force it upon either the platforms or the public that relies on them.

That’s not to say the curatorial choices these laws would have required are inherently bad. To the contrary, all people should have the ability to express themselves, and be able to access a wide range of views. Online platforms are vital to online speech, enabling us to discuss ideas and share perspectives. Ultimately, users should have as much control as possible over what expression they can access. Given their significant role, the major platforms should facilitate robust debate by erring on the side of preserving the public’s speech, and by giving users tools to moderate their own feeds. And if they remove protected content, they should offer clarity upfront as to why and, at a minimum, stick to their own rules. Platforms should also offer opportunities for appeal when they inevitably get things wrong. But the government can’t tell platforms, or you, what to say or promote.

What’s next in these cases specifically? Right now, the Texas and Florida laws continue to be blocked by the preliminary injunctions issued by the district courts in each case, which the Supreme Court’s order did not disturb. Meanwhile, the Fifth and Eleventh Circuits are tasked with deciding whether it is proper to continue to block the laws in their entirety, and NetChoice is tasked with laying out the full scope of the laws and showing that the bad applications outweigh the good ones. It’s possible that NetChoice will choose instead to focus its challenge only on the laws’ applications to content moderation and curation specifically—the pieces the Supreme Court has already clearly stated violate the First Amendment—or that the courts could themselves choose to limit the injunctions to that universe of applications.

Going forward, it is clear that the government cannot aim to suppress the freedom of speech online—whether that’s by requiring its preferred versions of content moderation policies, or imposing liability for publishing protected speech.

Authors

Vera Eidelman
Vera Eidelman is a staff attorney with the ACLU’s Speech, Privacy, and Technology Project, where she works on the rights to free speech and privacy in the digital age. She focuses on the free speech rights of protesters and young people, online speech, and genetic privacy. She has litigated cases in...
