Supreme Court Mulls NetChoice Cases: Reading the Room

Gabby Miller, Haajrah Gilani, Ben Lennett / Feb 27, 2024

Haajrah Gilani reported from the Supreme Court in Washington, DC.

The steps outside the US Supreme Court.

On Monday, the US Supreme Court appeared hesitant about what to do with a pair of state laws in Florida and Texas that would restrict social media companies’ ability to moderate content on their platforms.

In the Moody v. NetChoice, LLC and NetChoice, LLC v. Paxton cases, the Court is assessing whether the Florida and Texas laws are constitutional under the First Amendment. The two cases represent a circuit split in which lower courts interpreted federal and constitutional law differently. The Supreme Court took up two questions:

  1. Whether the laws’ content-moderation restrictions comply with the First Amendment.
  2. Whether the laws’ individualized-explanation requirements comply with the First Amendment.

Those arguing before the Supreme Court included:

  • Henry C. Whitaker – Solicitor General, State of Florida, on behalf of Ashley Moody, Attorney General of Florida
  • Paul D. Clement – Counsel, NetChoice
  • Elizabeth B. Prelogar – Solicitor General, US Department of Justice
  • Aaron L. Nielson – Solicitor General, State of Texas, on behalf of Ken Paxton, Attorney General of Texas

Observations from the oral arguments

Immediately following oral arguments, a representative for the Computer & Communications Industry Association (CCIA), a co-plaintiff with NetChoice, struck a celebratory tone during a news conference outside the courtroom. “We are confident that we're on the winning side of centuries of First Amendment jurisprudence. And based on today's argument, I look forward to a ruling in our favor,” said Matt Schruers, president of the CCIA. Yet the Justices did not clearly side with either the states or the tech companies during oral arguments. Below we discuss the key points of contention.

Are platforms like publishers?

Justice Samuel Alito drew laughter during his line of questioning with Clement, when he asked, “Let's say YouTube were a newspaper, how much would it weigh?” (During the press briefing after the hearing, CCIA’s Schruers told Tech Policy Press that this was the most unanticipated question NetChoice received.) NetChoice’s Clement responded by saying that a newspaper version of YouTube “would weigh an enormous amount, which is why, in order to make it useful, there's actually more editorial discretion going on in these cases than any other case that you've had before you.”

But because there is no consistent agreement on what a “social media platform” actually is, both the Court and counsel struggled to determine the bounds of “editorial discretion.” During oral arguments, these platforms were characterized in myriad ways: as “public forums,” “modern public squares,” and even newspapers, among other descriptions.

Moreover, the Supreme Court’s earlier decisions in Gonzalez v. Google, LLC and Twitter, Inc. v. Taamneh also came up repeatedly, in ways particularly unhelpful to NetChoice’s members. In those cases, lawyers for the social media companies, in both their briefs and at oral argument, defended against claims that the companies had done too little to take down terrorism content, or had in some cases promoted it. They framed their services as conduits for others’ speech and argued that they were incapable of screening all content on their platforms, relying instead on content-neutral algorithms.

Now, in the NetChoice cases currently before the Court, they appear to be arguing the opposite – something both the Texas and Florida solicitors general, as well as the justices, brought up. Justice Amy Coney Barrett asked counsel for the State of Florida, Solicitor General Henry Whitaker, to distinguish between the editorial discretion of a newspaper and the content moderation of platforms. “In Twitter v. Taamneh, the platforms told [the Court] that they didn't even know that ISIS was on their platform and doing things, and it is a strange kind of editor that does not even know the material that it is editing,” replied Whitaker.

Justice Barrett further asked the Florida solicitor general to clarify whether an algorithm boosting certain content would qualify as speech. “Well, it might be, Your Honor, but again, in Twitter and Gonzalez, the platforms told you that the algorithms were neutral methods of organizing the speech, much like the Dewey decimal system,” Whitaker explained. Justice Barrett replied bluntly: “Well, that's not what [the platforms] are saying here.”

“As far as we can tell, if the algorithms work, though, in the manner that this Court described them in Twitter v. Taamneh, they look more like neutral ways to reflect user choice, and I don't think there's expression in that,” Whitaker said.

When asked about the prevalence of questions related to Taamneh in Monday’s oral arguments, Keith Altman, legal counsel for the Taamneh family, told Tech Policy Press that it is hard to relate a case fundamentally centered on terrorism content to a non-terrorism case. He added, however, that he believes social media platforms should be able to exclude users who pose terrorist threats. Altman and his co-counsel submitted an amicus brief in the Florida case arguing that the Court does not need to decide it on First Amendment grounds, but should instead review NetChoice’s Section 230 arguments.

Where does Section 230 fit into these cases?

NetChoice’s counsel dismissed discussion of Section 230 of the Communications Decency Act of 1996, which shields website operators from liability for third-party content posted on their platforms, as “a distraction” from the crux of the case: whether platforms’ content moderation is constitutionally protected under the First Amendment. But the Justices didn’t seem to agree and spent some time drilling down on Section 230’s implications.

These questions mostly came from Justices Neil Gorsuch and Clarence Thomas. Justice Gorsuch asked how the Court could possibly answer the question presented in Moody without examining Section 230 at all, while Justice Thomas asked how NetChoice could reconcile its clients’ claim to engage in editorial discretion and expressive conduct with their claim to act merely as conduits. “Doesn’t that seem to undermine your Section 230 arguments?” Justice Thomas asked NetChoice counsel Paul Clement.

It was during a line of Section 230-related questioning that Justice Gorsuch entered a lengthy back-and-forth with US Department of Justice Solicitor General Elizabeth Prelogar, whose position largely supported NetChoice in its challenge to the Florida and Texas laws. Prelogar related Section 230 back to the argument that platforms behave like publishers due to the editorial discretion they exercise when moderating content. “Congress specifically recognized the platforms are creating a speech product. They are literally, factually publishers. And Congress wanted to grant them immunity,” she said.

Perhaps the toughest line of questioning around these issues came from Justice Alito, who seized upon an explanation from NetChoice’s counsel that, in the Taamneh case, the tech firms had “made quite clear” in their briefs that “some of the [platforms’] algorithms were as neutral as between rice pilaf and terrorism.” In response, Justice Alito questioned NetChoice’s framing of content moderation as editorial discretion. “I don't understand the rationale [here] for [Section] 230, if it wasn't that you can't be held responsible for that because this is really not your message. Either it's your message or it's not your message, I don't understand how it can be both,” Alito said. “It's your message when you want to escape state regulation, but it's not your message when you want to escape liability under state tort law.”

Does the Supreme Court view the laws differently?

While the Florida and Texas laws are somewhat similar, the Texas law is narrower: it excludes websites primarily focused on news, sports, and entertainment, and it more definitively categorizes platforms as common carriers. NetChoice counsel Clement noted these differences Monday afternoon in his opening statement in the NetChoice, LLC v. Paxton case, arguing that the Texas statute “operates more simply” because it bars “viewpoint discrimination.” Clement argued that while the government may not engage in viewpoint discrimination, it is an editor or speaker's First Amendment right to do so, and he raised a hypothetical: a viewpoint-neutral platform would have to allow material promoting suicide alongside material advocating suicide prevention. That would be a “formula for making these websites very unpopular to both users and advertisers,” Clement said.

However, several justices expressed concern over how broad both laws are, particularly the Florida statute. “I feel like there's a lot of indeterminacy in this set of facts,” Justice Ketanji Brown Jackson said to Florida Solicitor General Whitaker, asking him to clarify what the statute actually means in practice and whom it covers. Whitaker could not elaborate, arguing instead that because the preliminary injunction was litigated at “breakneck speed,” the state had not had the chance to answer these questions. Justice Jackson also commented on the statute’s susceptibility to multiple interpretations, both narrow and broad.

What is the proper role of the Supreme Court in deciding these cases?

The opposing parties generally took the same stance on this question: because the lower courts swiftly blocked the Florida and Texas laws with preliminary injunctions, the litigation has yet to reach questions beyond whether a social media platform engages in expressive conduct when it moderates content.

Justice Alito pressed NetChoice’s Clement on those unresolved questions and on whether the preliminary injunctions were good cause for skimming over them. “Mr. Clement, to what extent is it the result of your own litigation decisions?” asked Justice Alito. “You could have brought an as-applied challenge limited to the two platforms that you want to talk about, Facebook and YouTube,” Alito said. NetChoice instead brought a facial challenge. “You can't now shift and say it was a good preliminary injunction because it's fine as-applied to the platforms I want to talk about, and let's forget about all the other platforms that might be covered,” Alito said to Clement.

Concern over the undefined scope of the laws served as the basis for multiple questions from the Court. Justice Barrett asked whether Uber and Etsy would have to comply with the Texas and Florida laws. Justice Jackson wondered whether the laws could determine who can participate in a LinkedIn virtual job fair, which Justice Gorsuch suggested could again open the door to discrimination in access to services. The justices also pondered whether Gmail or WhatsApp, given their messaging functions and transmission of speech, could be affected by the Florida law. NetChoice’s Clement swiftly dismissed these concerns, saying that the statute is unconstitutional in all its applications.

The justices signaled that the lower courts may be better suited to answer questions about definitions and applications. Justice Brett Kavanaugh, during the oral argument over the Texas law, noted that Turner Broadcasting System, Inc. v. FCC, the landmark case concerning the must-carry rules imposed on cable television companies, was remanded to the lower courts and made its second appearance before the Supreme Court only after a trial. Texas Solicitor General Nielson, during the afternoon’s oral argument, indicated that Texas would be happy to do the same.

What’s next?

Although it’s unclear what direction the Supreme Court will take in determining how and to what extent the First Amendment applies to social media platforms and their content moderation decisions, Monday’s oral argument over the Florida and Texas laws provided a glimpse into what’s to come later this term. “I think the position we're offering here and the position this Court will consider next month in the Murthy case are entirely consistent,” DOJ Solicitor General Prelogar said Monday morning. She acknowledged that government coercion over a platform’s editorial decision-making – a question the argument in Murthy v. Missouri will explore in March, particularly as to disfavored views – could render the platform a state actor and subject it to First Amendment scrutiny, but she “vigorously disputes” that this is applicable with regard to the Texas law.

Authors

Gabby Miller
Gabby Miller was a staff writer at Tech Policy Press from 2023-2024. She was previously a senior reporting fellow at the Tow Center for Digital Journalism, where she used investigative techniques to uncover the ways Big Tech companies invested in the news industry to advance their own policy interes...
Haajrah Gilani
Haajrah Gilani is a graduate student at Northwestern University’s Medill School of Journalism in the Investigative Lab. She cares about the intersection between crime and public policy, and she covers social justice.
Ben Lennett
Ben Lennett is managing editor for Tech Policy Press and a writer and researcher focused on understanding the impact of social media and digital platforms on democracy. He has worked in various research and advocacy roles for the past decade, including as the policy director for the Open Technology ...
