Review of Amicus Briefs Filed in NetChoice Cases Before the Supreme Court
Gabby Miller, Ben Lennett, Justin Hendrix, Maria Fernanda Chanduvi, Divya Goel, Mateo García Silva / Feb 24, 2024
On Monday, the US Supreme Court will hear two cases, Moody v. NetChoice, LLC and NetChoice, LLC v. Paxton, which concern Florida and Texas laws restricting how large social media platforms moderate user content.
The Florida law was passed in May 2021, and Texas followed in September 2021. After enactment, both laws were immediately challenged by industry groups including NetChoice, an association that represents the social media companies, and the Computer & Communications Industry Association (CCIA). The US District Court for the Northern District of Florida issued a preliminary injunction preventing the Florida law from being enforced. The Eleventh Circuit Court of Appeals upheld this decision after an appeal by the Attorney General of Florida.
A US District Court in Texas also issued a ruling blocking Texas from enforcing its social media law. However, the US Circuit Court of Appeals for the Fifth Circuit ruled to vacate the injunction and reinstate the law on appeal from the Texas Attorney General. Groups representing social media platforms filed an emergency application to the US Supreme Court, which reinstated the district court’s injunction pending the Fifth Circuit’s decision. In a 2-1 split, the Fifth Circuit upheld the law and reversed the district court's preliminary injunction.
Citing the Fifth Circuit’s decision, the state of Florida appealed to the Supreme Court to reinstate its law. NetChoice also petitioned the Supreme Court to review the Fifth Circuit decision to uphold the Texas law. In September 2023, the Supreme Court granted a petition for review in both cases to determine the following:
- Whether the laws’ content-moderation restrictions comply with the First Amendment.
- Whether the laws’ individualized-explanation requirements comply with the First Amendment.
Briefs were submitted to the Court by the respective states and NetChoice, as well as by amici in support of the states, in support of NetChoice, or in support of neither party. To help Tech Policy Press readers better understand the arguments being made by the amici, we put together brief summaries of them. These summaries are intended to offer the broad contours of each brief, and thus do not capture every argument within them. If the reader wants the complete version of any brief, a link to the document is provided in the text.
Ashley Moody, Attorney General of Florida, et al. v. NetChoice, et al. (22-277)
- Supreme Court docket
Key events and legal documents
- Nov. 30, 2023: Brief of respondents NetChoice, LLC, et al.
- Dec. 7, 2023: Deadline for amicus briefs in support of respondents (NetChoice, et al.) or neither party
- Jan. 16, 2024: Brief of petitioners Ashley Moody, Attorney General of Florida, et al.
- Jan. 23, 2024: Deadline for amicus briefs in support of petitioners (Moody)
- Feb. 26, 2024: Oral arguments
Amicus brief summaries for Moody v. NetChoice
Economists in support of neither party
- The amici curiae are experts in the digital economy and fellows at the Technology Policy Institute. Their brief aims to assist the Court in understanding the importance of applying substantive economic analysis to decisions that rest on market power determinations.
- Their main argument is that the Florida and Texas legislatures based the challenged laws on assertions that “large social media platforms exert market power in the economic marketplace.” However, those assertions are not supported by economic analysis or reasoning. Rather, both legislatures and the Fifth Circuit, which upheld the Texas law, “presumed the existence of market conditions without engaging in any serious analysis.” The economists argue that given the extensive economic consequences of this case for the online landscape, the Supreme Court must apply “well-established economic principles as it determines whether and how marketplace considerations bear on the legal questions presented in these cases.”
- The amici curiae state that when determining whether social media platforms have market power and should be treated like common carriers, using economic techniques and explaining the reasoning behind the findings is indispensable. However, the Fifth Circuit’s decision in NetChoice, LLC et al. v. Paxton asserted that large social media platforms have “substantial market power” and, therefore, are or should be treated like common carriers. “Unfortunately, this decision relies on a conclusory finding of market dominance that lacks the benefit of a process for finding facts from objective analysis,” since there is no explanation of the economic study that led to that conclusion.
The Lawyers’ Committee for Civil Rights Under Law in support of neither party
- The Lawyers’ Committee for Civil Rights Under Law, a nonpartisan, nonprofit organization dedicated to pursuing racial justice, emphasizes the detrimental impact of the Texas law (HB 20) and the Florida law (SB 7072) on the rights and safety of Black people and other people of color online. The brief highlights how these laws would hinder online businesses from effectively moderating content that spews hate and disinformation, exacerbating the risks of hate speech and discrimination that disproportionately affect minority communities. The Committee advocates for the laws to be struck down under First Amendment strict scrutiny due to their prohibition of expressive speech, inhibition of expressive association, and content-based nature, while also calling for a narrow ruling that does not prevent future regulation aimed at prohibiting online discrimination.
- Although the brief does not explicitly categorize social media platforms as either newspapers or common carriers, it focuses on the essential role these platforms play in facilitating free expression and association among users, including Black people and other people of color. It argues against the Texas and Florida laws for potentially increasing online hate and discrimination by limiting platforms' abilities to moderate content, suggesting these platforms operate as modern public forums where a wide range of ideas and expressions should be freely exchanged.
- The brief argues that the regulations imposed by Florida and Texas are subject to and fail to meet First Amendment protections. By challenging these laws, the Lawyers’ Committee asserts that they inhibit the expressive speech and association rights of online platforms, and by extension, the users, especially minorities targeted by hate speech and disinformation. The Committee underscores the need for laws that allow for effective content moderation to protect against hate speech and discrimination while cautioning against broad interpretations that could limit future governmental efforts to regulate social media platforms to prevent online discrimination.
Reynaldo Gonzalez, Mehier Taamneh, et al. in support of neither party
- Reynaldo Gonzalez, a plaintiff in Gonzalez v. Google, and Mehier Taamneh, a plaintiff in Twitter v. Taamneh, along with the other individual plaintiffs in those two cases, filed in the case because the “Florida statutory scheme in this case would have the effect of limiting the ability of social media companies such as the defendants in Gonzalez and Twitter to remove, or refuse to recommend, posted material likely to incite terrorism or violence, and of requiring such social media companies to engage in the very type of conduct which the lawsuits in Gonzalez and Twitter contend is forbidden by federal law.”
- The brief points to “the principle of constitutional avoidance,” which “dictates that courts not decide constitutional questions” if a case can be resolved on other grounds. It argues that the Court need not decide these cases on First Amendment grounds, but should instead review NetChoice’s Section 230 arguments. “The parties clearly disagree, but the questions about which they disagree concern only the meaning and implications of section 230.”
Alan B. Morrison in support of NetChoice
- Alan B. Morrison is a constitutional law professor at George Washington University Law School. His brief agrees with respondents (NetChoice, LLC) and the Eleventh Circuit that there are “significant First Amendment problems with the Florida laws,” and argues that laws either mandating or prohibiting certain Internet postings are for Congress, not the states, to enact.
- He argues, "Florida’s choice is not limited to Florida, but binds everyone in the United States.” If Florida decides that the content must remain available to the public over the objection of the website’s host, that choice doesn’t only affect Florida but the entire country because “the World Wide Web does not respect state or even national boundaries.” Under Morrison’s argument, the “national impact of these decisions explains why only Congress can, consistent with our system of federalism and the First Amendment, choose what can and cannot be posted on the Internet.”
- Morrison believes that Section 230 and the Dormant Commerce Clause should also be evaluated since they define prohibited practices the hosting entities cannot evade. With Florida’s law, “the hosts must continue to include the posts covered by these laws on their websites even though the hosts' standards require it to take them down. The mandatory postings will continue for citizens in other states, including those states where the legislature might require the hosts to take down the posts that Florida insists must continue to be available.” The result is that Florida would have established the law of Internet postings for the covered materials for every state when that is a role that only Congress may undertake under the Constitution. Moreover, Section 230 supports the conclusion that Congress intended that website hosts, not the government - including a state government - should determine what may be posted on their websites.
- The amicus curiae supports the idea that the Court should first consider these non-First Amendment arguments under Section 230 and the Dormant Commerce Clause, before analyzing any First Amendment issue. Only if it concludes that Florida has the legal authority to regulate content on the Internet should it evaluate whether the First Amendment “is an absolute bar to legislation mandating what may, must, or may not be carried on the Internet.”
Donald J. Trump in support of Moody
- Arguing in favor of the Florida law, the former President’s lawyers say that social media platforms are common carriers. “Entities that do not make individualized determinations as to who may use their services are generally considered common carriers,” they argue, and thus they cannot engage in “unfair discrimination.”
- “This is no different from the requirement that air carriers and railroads sell a ticket to everyone who qualifies under their publicly disclosed terms of service.”
- The brief bases much of its argument on the special privileges it says are afforded to technology platforms by Section 230 of the Communications Decency Act. But it says, “NetChoice ignored Section 230’s pivotal role in creating the industry,” explaining why the existence of 230 makes social media more like railroads. Thus, the brief says, “The application of common-law principles to social media companies is perfectly consistent with Section 230’s framework.”
- The brief argues that the “disclosure and consistency requirements” in the Texas and Florida laws do not apply to individual messages and do not require Platforms to carry any messages.
- “Florida’s law is an attempt to ensure that Platforms state their censorship policies and apply them consistently. Sections (2)(a) and (2)(b)” of the law “are in perfect harmony with long-standing common-law prohibitions against unfair discrimination by common carriers.”
Freedom X in support of Moody
- Freedom X, a public interest law firm that says it is devoted to protecting freedom of thought, speech, and conscience, argues for the importance of preserving the ability to speak and exchange ideas freely on social media platforms. Freedom X contends that restrictions on speech by platform hosts can impede self-expression and the debate necessary for democratic self-government. Freedom X emphasizes the distinction between adding speech to and subtracting speech from public debate, advocating that adding speech serves the public interest more than subtracting it. They argue that the First Amendment protects the addition of speech to public discourse more than the suppression of speech.
- The brief does not directly classify social media platforms as newspapers or common carriers but focuses on the role of these platforms as modern public squares where free exchange of ideas should be preserved. Freedom X challenges the notion that social media companies have an inherent right to censor speech, suggesting instead that these platforms should facilitate open and robust debate akin to traditional public forums.
- According to Freedom X, the Florida law regulating social media platforms is aligned with First Amendment protections because it seeks to prevent viewpoint discrimination and promote a diversity of opinions online. Freedom X argues that laws preventing social media platforms from arbitrarily suppressing content based on viewpoint do not unduly burden the platforms' autonomy. Instead, such regulations ensure a free flow of information and ideas, which is essential for a functioning democracy.
PEN American Center and Library Futures in support of NetChoice
- PEN American Center (“PEN America”) is a nonprofit, nonpartisan public-policy organization with an interest in protecting free expression as the cornerstone of a robust and healthy democracy. Library Futures, a project of NYU’s Engelberg Center on Innovation Law & Policy, is a nonprofit organization that uncovers and confronts policy issues that threaten libraries in the digital age. Amici, both concerned about “government bans, attacks, and restrictions on the freedom to read and learn in both physical and digital libraries,” want the court to consider the Texas and Florida laws (the “Challenged Laws”) within the broader “nationwide efforts by state legislators to prescribe orthodoxy in the marketplace of ideas, and to punish those who violate that orthodoxy.”
- The amici argue that the First Amendment must provide “absolute protection against governmental efforts to ‘prescribe what shall be orthodox’ in public discourse.” In recent years, there has been a proliferation of laws and regulations across the US at both the state and local level that intend to “shape the contours of public debate” by correcting a “perceived ‘bias’ in the way certain social media websites treat political speech.” The challenged laws are an attempt by the government to “place a legislative thumb on the scales of whether and how certain content and viewpoints can or cannot be expressed,” and the fact that they target social media websites does not differentiate them from other governmental attempts to impose orthodoxy within “more traditional venues for public discourse.” Upholding the Florida and Texas laws would trigger a “race to the bottom,” intensifying government efforts to exercise control over public discourse and emboldening other states to pass more extreme speech-restrictive laws.
- The state cannot force websites to disseminate speech that violates their moderation guidelines in the same way the state cannot constitutionally compel book publishers, book stores, and private libraries to “distribute a controversial book against their will, or maintain some government mandated ‘balance’ of books.”
NetChoice, LLC, et al. v. Ken Paxton, Attorney General of Texas (22-555)
- Supreme Court docket
Key events and legal documents
- Sept. 9, 2021: Texas House Bill 20 (H.B.20) enacted
- Sept. 22, 2021: NetChoice, LLC and the Computer & Communications Industry Association (CCIA) file suit against Ken Paxton, Attorney General of Texas
- Dec. 1, 2021: The US District Court for the Western District of Texas blocks the Texas law from going into effect.
- Dec. 6, 2021: Texas Attorney General Paxton appeals the preliminary injunction.
- April 1, 2022: The 5th US Circuit Court of Appeals rules to reinstate the law. The groups representing social media platforms subsequently file an emergency application to the US Supreme Court.
- May 31, 2022: In a 5-4 decision, the Supreme Court vacates the stay of the preliminary injunction, preventing enforcement of the law while the lawsuit is decided.
- Sept. 16, 2022: In a 2-1 split, the 5th US Circuit Court of Appeals upholds the law, reversing the district court's preliminary injunction.
- Dec. 15, 2022: NetChoice, LLC et al. petition the Supreme Court to review the court of appeals decision.
- Sept. 29, 2023: The Supreme Court grants a petition for review. The review will be limited to Questions 1 and 2 as presented by the US Solicitor General's brief.
- Nov. 30, 2023: Deadline for brief of petitioners NetChoice, LLC, et al.
- Dec. 7, 2023: Deadline for amicus briefs in support of petitioners (NetChoice, et al.) or neither party
- Jan. 16, 2024: Deadline for brief of respondent Ken Paxton, Attorney General of Texas
- Jan. 23, 2024: Deadline for amicus briefs in support of respondent (Paxton)
- Feb. 26, 2024: Oral arguments
Amicus brief summaries for NetChoice v. Paxton
Phoenix Center for Advanced Legal & Economic Public Policy Studies in support of NetChoice
- The Phoenix Center is a “non-profit 501(c)(3) research organization that studies the law and economics of the digital age” and has “published significant academic work on the topics of telecommunications law and common carriage regulation.”
- Its brief addresses the Fifth Circuit’s argument that social media platforms are common carriers like telephone companies by arguing that “Internet platforms do not provide a ‘public good’” – namely they provide services that are both excludable and rivalrous. They also “do not act like telephone companies (which is why Internet platforms are currently not subject to common carrier regulation by the Federal Communications Commission).” It also contends that these platforms do not fit under the common carriage definition in the Communications Act, and that “social media is obviously not a service that ‘only governmental entities have traditionally provided,’” thus entitling platforms to the protections of the First Amendment.
- The brief also addresses potential unintended consequences of the Fifth Circuit’s decision. The FTC does not have jurisdiction over common carriers, so adopting the Fifth Circuit interpretation could remove their oversight of social media platforms on consumer protection and privacy matters. There could also be ripple effects applying similar arguments on any side of the political spectrum to cable and satellite providers, or “a dive down a very slippery slope toward government control over speech on the Internet.”
David Mamet in support of Paxton
- The brief submitted by David Mamet, represented by the Zachor Legal Institute, in support of the respondent (Ken Paxton, Attorney General of Texas) in case No. 22-555, presents a unique perspective on the regulatory landscape of social media platforms. David Mamet is a Pulitzer Prize-winning playwright, author, and filmmaker, and expresses his concern for the preservation of freedom of speech in the face of government-enabled censorship. He argues from the standpoint of a citizen and creator deeply invested in the free exchange of ideas and the integrity of information disseminated by major information conduits, which are both privileged and subsidized by the government. Mamet's argument is metaphorically framed through a short story about aerial navigation, illustrating the disconnect between the information provided by "maps" (or curated information from social media) and the observable "territory" (reality). He emphasizes the importance of basing decisions on direct observations rather than manipulated information, suggesting that reliance on distorted maps can lead to societal confusion and disorientation.
- Although the brief does not explicitly categorize social media as either a newspaper or common carrier, Mamet's metaphorical argument implies a critique of social media platforms acting as gatekeepers of information. By controlling the "maps" people use to navigate their understanding of the world, these platforms may distort users' perceptions of reality. The argument suggests that when information conduits restrict access to diverse "maps" or perspectives, they potentially act against the public interest, leading to a homogenized and possibly misleading representation of the "territory."
- The brief indirectly addresses the issue of whether the regulations in Texas are subject to First Amendment protections by highlighting the importance of safeguarding a diverse information ecosystem. While not directly engaging with legal arguments about First Amendment protections, Mamet's narrative suggests that ensuring access to a variety of maps (information sources) is crucial for a free society. The implication is that regulations should support the freedom to navigate information landscapes freely, without undue influence from government-privileged platforms that might restrict this diversity.
Law and History Scholars and the American Economic Liberties Project in support of Paxton
- The scholars are experts on “the First Amendment, regulation of addictive technologies, digital product design laws, antitrust, and the history of business, technology, communications, and American political development.” They include Richard John, Professor of History and Communications at Columbia Journalism School; Matthew Lawrence, Associate Professor of Law at Emory University School of Law; Lawrence Lessig, Roy L. Furman Professor of Law and Leadership at Harvard Law School; Zephyr Teachout, Professor of Law at Fordham Law School; and Tim Wu, Julius Silver Professor of Law, Science and Technology, Columbia University Law School. The American Economic Liberties Project (AELP) is an independent nonprofit research and advocacy organization dedicated to addressing the problem of concentrated economic power in the United States.
- Amici wish to preserve a “traditional state power,” namely “barring unreasonable discrimination by private industry in the exercise of its business operations,” regardless of industry or communication technology. They affirm the Fifth Circuit’s determination that HB 20, which “contains a facially neutral nondiscrimination provision” forbidding “treating users differently in the commercial spaces that serve as modern-day public squares, which their owners open to anyone with access to the Internet,” is constitutional. They claim NetChoice intends to upend the balance between state regulatory power and the judiciary, and risks granting “a broad and unjustified immunity to social media platforms from nearly any regulation in the public interest.”
- The general authority to pass HB20 is “well established” by PruneYard Shopping Center v. Robins, where the US Supreme Court “affirmed the States’ power to enact laws requiring operators of a commercial enterprise held open to the public to provide equal access.” It also held that these “generally applicable, neutral nondiscrimination laws aimed at ensuring equal public access to commercial spaces that are open to the public do not infringe their operators’ First Amendment rights and are thus presumptively valid and do not receive heightened First Amendment scrutiny.”
Life Legal Defense Foundation in support of Paxton
- Life Legal Defense Foundation (“Life Legal”) is a “California non-profit 501(c)(3) public interest legal and educational organization that works to assist and support” pro-life advocates. It believes that “pro-life voices have experienced viewpoint discrimination by social media” and argues in this brief that “the decision of the Fifth Circuit to vacate the preliminary injunction should be upheld.”
- First, Life Legal disputes the applicability to social media platforms of some cases NetChoice cites in its argument. It writes that a set of cases involving public access channels were not decided on First Amendment grounds and “left open the possibility that a legislative body could impose neutral rules on public broadcasting.” It also argues that social media platforms cannot be analogized to newspapers, both because they do not have the same space and time constraints that force editorial decisions and because Life Legal interprets the Court’s decision in Twitter v. Taamneh to say that “social media companies are passive conduits of the speech of others.” Life Legal goes on to dispute other NetChoice analogies, such as one that compares platforms to parade organizers.
- Next, the brief contends that “[e]ditorial discretion does not receive the same level of protection in every circumstance.” For example, the Court in Turner, “acknowledged that the law interfered with the editorial discretion of cable operators to a certain degree yet stated that the interference did not merit the same level of scrutiny in every situation.” Finally, Life Legal contends that the Texas law is content-neutral because it equally applies to speech of different political leanings, “does not alter social media companies’ speech,” and is justified due to the platforms’ market power. These features, it argues, mean that Section 7 of HB 20 satisfies intermediate scrutiny.
Professor Philip Hamburger in support of Paxton
- Philip Hamburger is a Professor of Law at Columbia Law School. He supports corporate speech rights and is directly interested in the case's outcome because he relies on social media for learning. His main argument is that Section 7 of the Texas free-speech statute – its antidiscrimination section – complies with the First Amendment, for the following reasons:
- The social media platforms have no speech or speech rights in their censorship because they don’t exercise initial choice, but leave the public to post what they wish.
- The social media platforms’ freedom of speech “impedes much speech and scientific knowledge”.
- “The platforms have no speech rights in allegedly private speech to the extent it is governmental.”
- Censorship by dominant private platforms is a temptation for the government to engage in public censorship.
- If the platforms have the right of expressive discrimination against users, it will be difficult to deny this right to other businesses.
- Hamburger argues that the platforms are common carriers and not speakers, and that they do not edit or curate in the sense of “initially choosing what appears on their sites” like newspapers do. On the contrary, “they indiscriminately allow the public to post on their platforms and then defenestrate some for disfavored views. This is discriminatory exclusion, not editing and curating. It shows that the platforms are not exercising the initial choice that would reveal any speech.” Precisely because they are not speakers, they can constitutionally be barred from discriminating on the basis of viewpoint.
- The amicus curiae states that platforms are merely a common carrier for the speech of others, and that has two legal consequences: (i) the platforms are not ordinarily legally responsible for the speech they carry; and (ii) they can be subject to antidiscriminatory requirements without any question of freedom of speech.
- From Hamburger’s point of view, if the Court treats the platforms as speakers, it would allow them to enjoy freedom from liability “while giving them constitutional protection from the corresponding duty against discrimination, thus enabling them to shut down dissent.” “The speech is not theirs, so they have no speech right against the Texas antidiscriminatory rule.”
- He states that the First Amendment should not be misconstrued to prevent antidiscrimination regulation of communications common carriers - especially the massive private carriers. Common carrier doctrine “is the foundation of our antidiscrimination law. Its carrier-speaker distinction for communications carriers has always been considered entirely aligned with the First Amendment. So there is every reason to apply it to censoring companies that are dominant.”
Senator Josh Hawley (R-MO) in support of Paxton
- Missouri Senator Josh Hawley is Ranking Member of the Senate Judiciary Committee’s Subcommittee on Privacy, Technology, and the Law. His brief “urges the Court to affirm the decision of the Court of Appeals and interpret the First Amendment in a manner consistent with the common-law legal principles that anchor the American constitutional framework.” The brief warns that “That [tech] sector is not, and never has been, entitled to blanket immunity from both regulation and liability.”
- The brief notes that “individuals who play an active role in disseminating others’ speech are liable for any unlawful harm that speech causes” but since platforms “could not exercise publisher-level control over the speech generated by third-party users,” Congress passed Section 230 to insulate them from civil liability. In this case, the platforms “claim that their content hosting and curation decisions are in fact expressive—expressive enough that they enjoy First Amendment protection.” The brief argues that “The Court should not bless the platforms’ contradictory positions, much less constitutionalize them. Doing so would effectively immunize the platforms from both civil liability in tort and regulatory oversight by legislators.”
- The brief notes that “Under Section 230, providers shall not be treated by courts as the publishers of others’ speech because, in fact, they are not. They are, in principal part, conduits.” It points to the Court’s recent decision in Taamneh where it noted that the platforms’ moderation was “passive and content-agnostic.”
Students at Columbia Against Censorship in support of Paxton
- Amicus curiae is a group of students at Columbia University who study freedom of speech and are committed to resisting censorship from both public and private entities.
- The Students argue that “in the modern world, an orthodox opinion today can become a ‘community standards violation’ tomorrow, and such standards are often enforced in an arbitrary or biased manner.” They ask the Court to uphold the right to “share and receive dissident, unorthodox, and even offensive opinions on major social media platforms,” as preserved by the State, so students can participate online “without fear of being silenced.”
- The Students rely heavily on liberal philosopher John Stuart Mill’s ideas on “the societal benefits of uninhibited discussion,” as well as precedent established in NYT v. Sullivan, in their brief. They argue that while the First Amendment and similar state constitutional guarantees normally protect only against government suppression of nonconforming views, “any effective suppression of dissent—even if done by private companies—is dangerous,” as it deadens inquiry and discussion and marshals social stigma against unpopular ideas. Regardless of whether the threat comes from government or private entities, there is a compelling state interest “in preserving freedom of discussion from any significant threat,” and the Court should hold Texas HB 20 constitutional.
Amicus briefs in support of NetChoice
American Jewish Committee in support of NetChoice
- The American Jewish Committee (AJC) is a global advocacy organization focused on countering antisemitism and promoting democratic values. Its interest in the case stems from a commitment to combating hate, including antisemitism, across social media platforms. AJC emphasizes the connection between online hate speech and real-world violence, demonstrating through examples how online radicalization has led to tragic mass shootings. The organization argues that content moderation is a crucial tool in minimizing the spread of hateful messages and that the laws in Texas and Florida hinder these efforts, potentially leading to real-world consequences.
- AJC does not categorize social media as either a newspaper or common carrier explicitly. Instead, it focuses on the role of social media in democracy and the right of platforms to exercise editorial discretion in content moderation. AJC highlights the importance of allowing social media companies to minimize hate speech through content moderation, asserting that these actions are protected under the First Amendment and are vital for preventing violence.
- According to AJC, the regulations imposed by Florida and Texas on social media platforms interfere with the platforms' First Amendment rights to engage in content moderation. AJC argues that these laws make it difficult for social media services to efficiently and effectively moderate content, thereby hampering their ability to address and minimize online hate speech and its potential to incite offline violence. The brief suggests that the laws could lead to an increase in unmoderated hate speech, contributing to a rise in real-world violence, and emphasizes the need for platforms to have the freedom to quickly and decisively moderate content.
Americans for Prosperity Foundation in support of NetChoice
- Americans for Prosperity Foundation (AFPF), a nonprofit organization that says it is committed to protecting freedoms of expression and association, argues against the Florida and Texas laws. They highlight the dangers of allowing the government to compel private parties to host third-party speech, emphasizing the constitutional protections against such compelled speech. AFPF asserts that these laws, intended to prevent social media platforms from silencing users, inadvertently threaten the freedom of expression by imposing undue constraints on the platforms' content moderation practices. The brief underscores the importance of preserving an open and diverse society by protecting private speakers from being compelled to deliver messages against their will.
- AFPF does not explicitly categorize social media platforms as either newspapers or common carriers. Instead, its argument centers on the constitutional principle that private entities, including social media platforms, should not be compelled by the government to host or promote speech that they do not endorse. The focus is on the First Amendment rights of these platforms to exercise editorial discretion without governmental interference, rather than on fitting them into a specific regulatory framework.
- Given AFPF's stance, the brief argues that the regulations imposed by Florida and Texas infringe upon the First Amendment by compelling speech from social media platforms. The foundation posits that these laws conflict with the constitutional protections afforded to private speakers, including social media platforms, by mandating them to host or explain content moderation decisions in ways that undermine their editorial judgment. The brief cautions against the dangerous precedent these laws could set, potentially leading to greater governmental control over speech on digital platforms.
Article 19: Global Campaign for Free Expression, et al. in support of NetChoice
- The amici curiae are organizations that aim to “ensure individuals around the world may participate freely in online expression and debate matters of public concern.” The lead organization was named for the corresponding article of the Universal Declaration of Human Rights, which recognizes freedom of expression as a fundamental human right, including in the digital environment. Joining the brief are the International Justice Clinic at the University of California, Irvine School of Law, and Open Net Association, Inc.
- The organization’s main argument is that Article 19 of the Universal Declaration of Human Rights requires that governments meet a strict three-part test to promulgate permissible speech regulations. “Any such limitation on expression must be (1) provided by law (legality); (2) necessary to protect (necessity and proportionality); and (3) a legitimate objective (legitimacy).” From the organization’s point of view, the Texas law (HB 20) and the Florida law (SB 7072) cannot satisfy Article 19’s three-part test because both laws impose “must-carry obligations on select major social media platforms, with the express purpose of placing their moderation of user-generated content under increased government control.”
- The amici believe both laws were written to achieve that particular political goal. They argue that the laws combine vague prohibitions and requirements for social media platforms with broad, discretionary enforcement authority vested in increasingly politicized offices of attorneys general, and that this combination creates precisely the kind of environment that Article 19 seeks to preclude.
- The amici believe both laws would have a significant negative impact on public discourse. “Facing uncertain liability and potential politicized enforcement, platforms have two practical choices: (i) self-censor and promote only content aligning with the preferred government view of the day; or (ii) engage in no content moderation whatsoever, resulting in a deluge of unmoderated information that cannot possibly be sorted through in any effective fashion.”
Bluesky, M. Chris Riley, Copia Institute in support of NetChoice
- The brief filed on behalf of Bluesky, Chris Riley and the Copia Institute, all of which operate platforms that publish user generated content, argues that the Florida and Texas laws directly undermine the First Amendment rights of platforms by imposing restrictions on how they can administer their sites, affecting both the platforms' and users' ability to express themselves online. It highlights the ways in which these laws would harm the social platform Bluesky, the Copia Institute and Techdirt (which hosts user comments), and Chris Riley’s Mastodon community. For instance, as the operator of a Mastodon server, “Riley directly personifies how providing a platform service is itself an expressive activity that the First Amendment protects, and why it must, because his experience shows how personal the choices are that he, like any platform provider – big or small, commercial or otherwise – must make in order to administer his service.”
- The brief says the constitutional harms posed by these laws are not limited to Florida and Texas, but rather threaten the broader ecosystem of online expression, creating a chilling effect on free speech across the internet. An “ecosystem of platforms is necessary in order for there to be meaningful choices in what expression Internet users experience online.”
- It asserts that these laws not only infringe on platforms' operational freedoms but also stifle innovation in the tech industry, particularly affecting entities like Bluesky “which, in addition to providing a platform service, also innovates on the technology others can use to offer their own.”
- The laws could create a “platform or algorithmic monoculture,” effectively stifling the diversity of expression. And they may reach beyond Florida and Texas, affecting the business of the amici, since “even if they do not reach them today, they easily could tomorrow, either as amici grow and evolve, or as more jurisdictions take their turn passing their own laws designed to target how any platform may serve their users.” The laws could also set a precedent for further arbitrary and expansive regulations by other jurisdictions. “There is also no assurance that, if these laws are allowed to stand, the next ones produced would not more directly target amici.”
- The brief says the laws could foreclose on the promise of decentralization, since “technologies like Bluesky’s offer an alternative to either corporate or government control: strong user control, supported by a competitive marketplace of interoperating platforms built by third-party developers, which laws like these threaten.”
The Cato Institute in support of NetChoice
- The Cato Institute is a nonpartisan public policy research foundation founded in 1977 and dedicated to advancing the principles of individual liberty, free markets, and limited government. Cato argues that the Fifth Circuit misapplied the PruneYard precedent in this case, and that the Court should consider overruling that wrongly decided precedent and “find that both laws at issue in these cases violate the First Amendment.”
- Cato argues that the earlier Wooley decision “recognized that being forced to platform, distribute, or amplify a message is itself a First Amendment harm, whether or not that amplification creates the false appearance of endorsement. And just as the drivers in Wooley were forced to become “mobile billboards” for someone else’s message, the PruneYard Shopping Center was forced to become an outdoor stage for someone else’s message.” As a result, a similar law that forces social media platforms to carry or distribute certain messages violates the First Amendment.
- In addition to the Wooley decision, Cato points to the Abood case, where the “Court established the foundational principle that the First Amendment prohibits states from requiring someone “to contribute to the support of an ideological cause he may oppose.” 431 US 209, 235 (1977).” Cato argues the PruneYard opinion “is inconsistent with both Wooley and the Court’s many compelled-funding cases” and “is a clear jurisprudential outlier among the Court’s compelled-speech cases.”
Center for Democracy and Technology in support of NetChoice
- The Center for Democracy & Technology (CDT) is a non-profit public interest organization that represents the public’s interest in an open, decentralized Internet and works to ensure that the constitutional and democratic values of free expression and privacy are protected in the digital age.
- CDT argues that the provisions in the Florida and Texas laws “regulating platforms’ content moderation are unconstitutional because they interfere with platforms’ exercise of editorial discretion in violation of the First Amendment.” Platforms exercise “substantial, active editorial discretion” over third-party content via their content moderation standards, which define what a platform prohibits, what type of community it seeks to build, and the messages or values it wishes to convey. This is similar to the standards newspapers adopt, which can and do change over time. Platforms additionally exercise editorial discretion by selecting third-party content that complies with their terms of service and organizing it for each user.
- Content moderation involves “expending substantial resources to identify violative content and take action against that content” and “involves more than just the binary decision [of] whether to take down content or allow it to remain on a service.”
- Platforms’ exercise of editorial discretion with respect to third-party content – a “quintessential activity of traditional content distributors” – is entitled to the same First Amendment protection that the Court has afforded to editorial discretion in other contexts, such as newspapers, corporations, and parade organizers. The state laws would unconstitutionally “force platforms to convey messages to which they object.” The provisions would also “impose viewpoint and content-based restrictions on platforms’ content moderation decisions,” making them unconstitutional.
Center for Growth and Opportunity, et al. in support of NetChoice
- Amici curiae are the Center for Growth and Opportunity, Freedom Foundation of Minnesota, Illinois Policy Institute, Independence Institute, James Madison Institute, Libertas Institute, Mountain States Policy Center, Oklahoma Council of Public Affairs, Pelican Institute for Public Policy, R Street Institute, Rio Grande Foundation, and The John Locke Foundation. They are “educational and research organizations committed to the faithful interpretation of the Constitution, the rule of law, market economics, individual rights, and limited government,” and argue that the Florida and Texas state laws in question “are both unconstitutional and unneeded,” and should thus be overturned.
- Amici argue that Texas and Florida have attempted to choose how much speech to protect or refuse at will, which is inconsistent with the First Amendment. Additionally, the US Constitution “mandates a national free-speech marketplace, unburdened by state interference.” This is especially true of the Fourteenth Amendment, which was designed “to prevent states from interfering with the free flow of ideas, as southern states had done with abolitionist speech before the Civil War.” That amendment, much like the Commerce Clause, and taken together with the First Amendment, “bars states from interfering with the sovereignty of other states and thus embodies an anti-balkanization principle.” The alternative would be a “splinternet” of 50 different speech codes, which may not even be technologically possible for websites.
- The Texas and Florida laws “are unnecessary because a free speech marketplace is best fostered—and is already being fostered—by market forces.” Websites express themselves using algorithms, amici argue. Companies have First Amendment rights to use algorithms to help them speak more effectively, and readers have First Amendment rights to read speech produced with help from algorithms, “on diverse platforms, each with its own distinctive speech mix.” These rights are held in both the First Amendment’s Speech and Press clauses, “which guards technologies that enable speech and serve readers.”
Chamber of Progress et al. in support of NetChoice
- Amici comprise the Chamber of Progress; Access Now; Consumer Technology Association; HONR Network; Information Technology & Innovation Foundation; Information Technology Industry Council; Interactive Advertising Bureau; IP Justice; LGBT Tech; Stop Child Predators; TechNet; and Washington Center for Technology Policy, all of which are “deeply interested in ensuring that Americans may participate in healthy online environments.”
- Amici argue that the Florida and Texas statutes threaten a strong, speech-affirming Internet that provides support networks to LGBTQ+ youth and parents, among other groups, by threatening websites’ ability to exercise editorial discretion. This would shut these communities out while amplifying the most extreme voices and promoting divisive and damaging material, “upend[ing] the Internet as we know it.”
- There is no basis for applying a reduced level of First Amendment scrutiny to websites, amici argue, given the unprecedented accessibility and abundance the Internet provides the average consumer.
- Popular social media companies build their websites and “brand” by considering the content they’d like to distribute – that is, by exercising editorial discretion – and the audience they’d like to reach, according to amici. Florida and Texas intrude on websites’ editorial discretion by forcing them to publish content they might not otherwise publish. There is no constitutionally acceptable justification for the government to compel a private party to convey the government’s own preferred message or to impose its political agenda on private media companies.
Developers Alliance and Software & Information Industry Association in support of NetChoice
- The Developers Alliance is a non-profit corporation that advocates on behalf of software developers to support the industry’s growth and promote innovation. The amici analyze the Texas and Florida laws from a software development point of view. Their main idea is that “websites express themselves through their choices about what content to display and how – and that remains true even when they use algorithms to make those decisions. When these websites disseminate speech to their users, they convey a message about the type of speech they find acceptable and the kind of community they hope to foster.”
- The amici defend the idea that Texas and Florida are mistaken when they argue that the curation and dissemination of speech online is not expressive and therefore does not trigger First Amendment protections. In particular, they challenge Texas’s argument that such content curation and moderation cannot be expressive “because websites use algorithms.” The use of algorithms does not replace human decision-making. “Rather, algorithms are tools for implementing human decisions. Computers can only do that which they are instructed, and so any content-moderation action taken by a computer reflects the editorial objectives of its human administrator, whether that is by removing certain specifically identifiable content or looking for patterns to statistically identify content likely violative of the websites’ rules. The fact that content curation and moderation decisions are now implemented by algorithms does not sap those decisions of their expressive nature.” Therefore, in the amici’s view, social media platforms’ content moderation is expressive and should be considered speech protected by the First Amendment.
- “The outputs of websites’ content-moderation algorithms are thus expressive in at least two regards”: (i) algorithms are tools for using and disseminating vast quantities of information, and websites take the information available and, through their algorithms, determine how best to use that information, which itself implicates the First Amendment; and (ii) “algorithms that downgrade or remove content or users shape the overall nature of the communications transmitted by the website, effectuating websites’ decisions about what ‘reason’ tells them should be disseminated.”
- Algorithms are capable of operating automatically only because they have been programmed by human software developers to “operationalize websites value judgments regarding what content to remove, demote or prioritize.”
Discord in support of NetChoice
- The brief argues that Discord’s structure is inherently incompatible with the Texas and Florida laws. Discord, a real-time messaging service whose users communicate within “servers,” moderates content centrally while providing groups the tools to manage and organize their communities. These communities lack the “resources, experiences, and scale” to moderate on their own or provide detailed disclosures of their decisions.
- If every act of moderation is attributable to Discord, the company is at risk for costly litigation and will be unable to assist communities in “protecting themselves from harmful or irrelevant content.”
- A government-mandated proliferation of “garbage” content on Discord that is hateful, dangerous, or merely unwanted drives off communities, violating users’ associational First Amendment rights. It’s long been recognized that the right to associate entails a right to exclude in order to carry out its expressive purposes.
Electronic Frontier Foundation, et al. in support of NetChoice
- “The First Amendment right to be editorially diverse does not evaporate the moment a site reaches a certain state-determined level of popularity,” says EFF. But in passing these laws, the states “take those protections away and force popular sites to ignore their own rules and publish speech inconsistent with their editorial vision, distorting the marketplace of ideas.”
- The First Amendment supports the existence of both moderated and unmoderated platforms, and “users are best served under current law, where the First Amendment preserves legal space for the emergence of a continuum of content moderation, from highly curated services to those not curated at all.” This produces a spectrum of options. “Some social media sites have special concerns for ensuring that the information they publish is accurate, and would be handicapped in these efforts by the Florida and Texas laws which would force them to publish posts regardless of their unreliability.”
- Users regularly choose platforms because of their editorial viewpoint, the brief argues, pointing to the exodus of many users from Twitter/X since Elon Musk bought the site. “Users who preferred Twitter’s old editorial viewpoint appear to have left X in large numbers, while X has undoubtedly gained new users who are attracted to its new editorial slant and features.”
- These laws could lead to the demise of many online communities that rely on content curation, from SmokingMeatForums.com, which has rules that ban “fighting or excessive arguing,” to The High Road, a firearms discussion forum that bans users from engaging in “discussions relating to the preparation for possible societal breakdown” or “foreign invasion.”
- The Florida law also violates the First Amendment and treats users unfairly by “mandating favoritism,” requiring platforms to treat “the posts of Florida political candidates and highly popular ‘journalistic enterprises’” more favorably than average internet users’ posts, while at the same time impossibly requiring that all content moderation decisions be “consistent.”
Engine Advocacy in support of NetChoice
- Engine Advocacy is a non-profit organization focused on technology policy, research and advocacy dedicated to bridging the gap between startups and policymakers. Engine submitted its brief to explain how the case may impact a range of burgeoning online businesses. The main argument is that Florida and Texas laws may “affect a more widespread set of smaller and independent companies.” Engine believes that there is a direct conflict between the laws and the Supreme Court’s precedents on the First Amendment.
- The amicus curiae argues the laws may discourage “socially beneficial startups from launching and developing into successful companies, which will be a significant loss for competition, innovation, and speech on the Internet. The laws may also lead to a different kind of Internet: a messier and less useful one with fewer startups creating new kinds of online communities where users can listen, speak, and be heard.” With this argument, Engine explains that startups and their investors rely on the protection the First Amendment offers for their editorial discretion in moderating content to keep websites “safe, healthy, and relevant for users.”
- The brief states that the Texas and Florida laws unconstitutionally burden smaller companies’ First Amendment rights “through their onerous notice and appeal requirements”: the Florida law requires social media platforms to notify content creators that their content might be taken down, while the Texas law requires a notice and explanation to the user “concurrently with removal,” along with the possibility of appealing the removal decision to the platform.
- Finally, Engine affirms that to build a successful company, startups need consistent and uniform laws that give them the confidence that they will not change. “Permitting content moderation restrictions and transparency requirements to chip away at the First Amendment rights of private entities will undermine that needed certainty.”
Professor Eric Goldman in support of NetChoice
- Professor Eric Goldman is a law professor and Associate Dean for Research at Santa Clara University School of Law who has been researching and writing about Internet Law for thirty years. His recent research focuses on “the censorial consequences when government regulators impose and enforce transparency obligations on content publishers’ editorial decisions.”
- Goldman argues that the state laws censor social media platforms by “stripping them of editorial discretion and otherwise distorting their editorial decision-making.” The laws directly interfere with online publishers’ content moderation, overriding their editorial freedoms, while also indirectly imposing censorship by “compelling publishers to disclose details about their editorial decision-making and operations.” This is a “censorship-by-transparency approach” that motivates online publishers to change their decisions to please regulators.
- Goldman’s amicus brief focuses on the statutory “explanations” requirements in both the Florida and Texas laws, which obligate online publishers to provide users with explanations regarding content moderation decisions (“explanations obligations”). Both the Fifth and Eleventh Circuits analyzed these explanations using the “relaxed test for constitutional scrutiny” in Zauderer v. Office of Disciplinary Counsel of Supreme Court of Ohio, 471 U.S. 626 (1985). While the courts reached opposite conclusions, Goldman argues they misunderstood Zauderer, and that the Zauderer test never should have been applied to the States’ Laws. Over-expansive application of this test jeopardizes “the freedoms of speech and press online because regulators are imposing a wide range of disclosure obligations with the intent and effect of dictating editorial standards to publishers.” Goldman asks the Court to establish that “legislatures cannot, consistent with the First Amendment, use editorial transparency obligations to censor publishers’ editorial choices.”
First Amendment and Internet Law Scholars in support of NetChoice
- Amici curiae are professors and scholars who are experts in the First Amendment and Internet law, including Anupam Chander, James Grimmelmann, Michael Karanicolas, Kate Klonick, Paul Gowder, et al. They say they share an interest in the healthy development of the Internet, in protecting the rights of Internet users, and in ensuring the rule of law online. Amici address only the constitutionality of the States’ Laws’ content-moderation restrictions and take no position on the constitutionality of the Laws’ individualized-explanation requirements.
- While the Texas and Florida laws raise questions about social media platforms' First Amendment rights, they also severely restrict platform users’ First Amendment rights to select the speech they listen to. This type of intrusion on listeners’ rights is “flagrantly unconstitutional” and “would force millions of Internet users to read billions of posts they have no interest in or affirmatively wish to avoid.”
- Amici argue that platforms’ content moderation is not “censorship,” but rather protection for users “from a never ending torrent of harassment, spam, fraud, pornography, and other abuse” and other unwanted speech. Without moderation, the internet would be “completely unusable,” where “users would be unable to locate and listen to the speech they do want to receive.” The Florida and Texas laws systematically favor speakers over listeners and convert speakers’ “undisputed First Amendment right to speak” into “an absolute right for speakers to have their speech successfully thrust upon users.” The freedom to listen equally includes the freedom not to listen, according to amici.
Former Representative Christopher Cox and Senator Ron Wyden in support of NetChoice
- Former Rep. Christopher Cox (R-CA) and Sen. Ron Wyden (D-OR), both US Representatives at the time, co-authored Section 230 of the Communications Decency Act. They argue the “Fifth Circuit erroneously invoked Section 230 in support of its conclusion that internet platforms are mere conduits without First Amendment rights to editorial discretion.” Instead, “Internet platforms are speakers with First Amendment rights to edit and moderate the third-party content they publish. That is why Congress enacted Section 230 in the first place.”
- Section 230 “was enacted in recognition of the unique characteristics of the internet that make online platforms especially vulnerable to collateral censorship via litigation. By offering internet platforms protections from lawsuits based on their moderation of the user content they publish, by freeing them from liability for making the wrong editorial choice, and by immunizing them from liability for making a moderation decision ‘too late’ to avert some alleged content-based harm, Section 230 enables them to exercise their First Amendment right to editorial control.” Accordingly, “Section 230 plainly confirms that internet platforms are publishers and speakers that select, arrange, edit, and moderate third-party content.”
- “The fact that the immunity provision extends only to third-party content, leaving services open to liability for publishing their own content, further demonstrates that under Section 230, internet platforms possess and exercise their First Amendment rights as publishers.”
Foundation for Individual Rights and Expression in support of NetChoice
- The Foundation for Individual Rights and Expression (FIRE) is a “nonpartisan, nonprofit organization dedicated to defending the individual rights of all Americans to free speech and free thought” that has litigated First Amendment cases, primarily related to college campuses.
- FIRE strongly analogizes platforms to newspapers and focuses on how “[t]he Free Speech Clause of the First Amendment constrains governmental actors and protects private actors.” Its brief is framed around why the 11th Circuit’s decision was correct and the 5th Circuit’s decision incorrect. FIRE argues that the 5th Circuit “confuses private editorial decisions with censorship,” drawing on inapplicable analogies to leafleting at malls and military recruitment at law schools, and rebuts the 5th Circuit’s argument that content moderation is not protected private editorial decision-making because it takes place after initial publication rather than before. Content moderation is protected speech, they argue; thus these statutes impact platforms’ speech, not just the speech of platform users.
- FIRE similarly asserts that requiring an appeals process and content removal explanations is unconstitutional under the First Amendment, and burdensome regardless. FIRE also takes issue with arguments about the platforms’ power as a reason to allow the Texas and Florida statutes to stand.
Francis Fukuyama in support of NetChoice
- The political scientist Francis Fukuyama, in a brief prepared with Stanford’s Daphne Keller and other prominent lawyers, says Florida and Texas should seek to address their concerns over social media content moderation through less restrictive means, specifically by empowering users with control over their online speech and content preferences. As it stands, the laws they passed do not pass First Amendment muster. “Though they purport to protect Internet users from private platform ‘censorship,’ the laws instead put control over speech into government hands.”
- The laws are unconstitutional under the First Amendment. Both fail under strict and intermediate scrutiny because they unjustifiably impose content- and speaker-based restrictions on speech. And both impermissibly treat platforms as “common carriers.”
- A better approach is encouraging the development of more substantial user controls. “The basic design of the Internet permits—and indeed has already led to—better solutions than those proposed by Florida and Texas.” Such user controls, enabled by middleware and interoperability, offer a constitutional and practical alternative to state-imposed content moderation, preserving user autonomy.
- The brief argues that the laws overlook the Internet's fundamental design and capacity to empower individual choice, unduly expanding government control over online speech. The internet itself retains the architecture for the free flow of information. “What has changed since the 1990s is not the basic technical design, but rather the significant concentration of users on a small number of websites and applications at the Internet’s content layer.”
- And, the laws may lead to extremes: either a torrent of awful content, or bans on controversial topics. “The laws effectively present platforms with two unappealing choices: They can open the floodgates to speech that most users do not like or want to see, or they can adopt broad bans on controversial topics to avoid losing users and advertisers.”
- The laws are also unconstitutionally vague, the brief argues. Florida’s law is “impenetrable,” while Texas includes “impermissibly vague requirements” in its rules restricting “censor[ship],” which it defines as “to block, ban, remove, deplatform, demonetize, de-boost, restrict, deny equal access or visibility to, or otherwise discriminate against expression.”
- “More fundamentally, the rules established by both Florida and Texas seem to assume the existence of a platonic ideal—a correct way of ranking, ordering, or exposing users to particular speech, departure from which would ‘deny equal access or visibility’ to or ‘discriminate against’ particular content.”
Goldwater Institute in support of NetChoice
- The Goldwater Institute (GI), a nonpartisan public policy and research foundation, filed the brief. GI is interested in advancing the principles of limited government, individual freedom, and constitutional protections. Despite experiencing what it describes as "censorship" by social media platforms, GI respects the property and speech rights of these platforms and argues that they have the right to decide how to operate. This argument is relevant to both the Florida and Texas laws, as GI supports the position that social media platforms, as privately owned businesses, have every right to set their own content moderation practices without compulsion by state law.
- GI views social media platforms neither as newspapers nor as common carriers but as privately owned businesses with the right to decide on their content moderation practices. This stance is rooted in the belief that these platforms have property and speech rights, which should not be overridden by state laws requiring them to host or promote certain content against their will. GI's argument is against the application of the "PruneYard" principle to social media, emphasizing that private property rights cannot be violated under the guise of expanding expressive rights of individuals.
- Based on GI's position that social media platforms are private properties with speech rights, the regulations imposed by Florida and Texas would violate the First Amendment. GI argues that states cannot expand the protections for individual rights in a manner that violates the rights of others, including the rights of social media platforms to decide how to manage content on their platforms. This assertion is grounded in the principle that any system of rights must be compossible, meaning different people must be able to exercise their rights without coming into conflict. GI posits that the laws in question fail this requirement by compelling platforms to host content they might wish to moderate or remove, thus infringing on their First Amendment rights.
International Center for Law & Economics in support of NetChoice
- The International Center for Law & Economics (ICLE) is a “nonprofit, non-partisan global research and policy center” that “promotes the use of law and economics methodologies and economic learning to inform policy debates.”
- They make an economic argument that social media platform business models operate around selling advertisements, so the companies are incentivized to maximize user engagement and must “optimize the benefits of speech while minimizing negative speech externalities.” This, they argue, renders the platforms “best positioned to serve users’ speech preferences” and “moderate content in order to provide such benefits to users.” Drawing heavily on comparisons to Turner Broad. Sys. v. FCC, ICLE argues that a strict scrutiny standard should be applied.
- ICLE goes on to argue that the First Amendment protects the “private ordering” of speech, drawing comparisons to a property owner’s right to exclude unwelcome speakers from their property and the editorial rights of newspapers. They rebut the argument that social media companies are providing public goods, and should thus be considered common carriers, because they “do not hold themselves out to all comers.” ICLE also argues that the companies structurally “lack gatekeeper monopoly power” in a manner comparable to cable providers and that many covered companies (e.g., Reddit, LinkedIn, Tumblr, Pinterest) do not have “substantial market power as measured by share of visits.”
Internet Society in support of NetChoice
- “Texas and Florida have enacted laws that seek to curtail or altogether eliminate the ability for platforms to engage in content moderation. These laws put at risk core benefits of the Internet.”
- These laws could disproportionately affect smaller platforms and stifle innovation by imposing burdensome compliance costs. “If this Court permits these laws to stand, we would likely see a proliferation of similar state laws, expanding to require compliance from smaller platforms with fewer users and less revenue available to hire legal compliance staff. This, in turn, would entrench the large platforms that can afford the compliance efforts across the various states.”
- “The potential costs associated with compliance would squeeze out smaller platforms that could not afford to comply or dissuade platforms from engaging in any content moderation whatsoever. Either outcome will be detrimental to users and online discourse.”
- And, there could be implications beyond the US. “If this Court allows these state regulations to stand, other countries likely will push the envelope even further—requiring platforms to follow even stricter and possibly conflicting rules.” It predicts that for “non-U.S.-based companies, the Texas and Florida [laws] (and potentially 48 other state laws) could prompt an exodus from the U.S. market.”
- The Internet Society says users respond to differences in the content moderation norms on certain platforms, and leave for new ones when they prefer different norms. Preserving competition is necessary for healthy online discourse. “It is axiomatic that competition leads to better innovation, and competition with content moderation and disclosures has (as a technological matter) provided for more usable websites and user interfaces.”
Internet Works, Glassdoor, Indeed, Mozilla, NextDoor, Pinterest, Trip Advisor, Tumblr, and Vimeo in support of NetChoice
- The amici parties argue that Florida and Texas threaten Americans’ opportunities to form communities, find inspiration, land jobs, plan vacations, and more, by penalizing an indeterminate number of websites for making basic decisions about what speech is acceptable and how to present it. Undifferentiated content moderation mandates burden smaller and mid-sized companies by impairing their ability to distinguish themselves and attract new users.
- The state laws would invite lawsuits over virtually any content moderation or curation decision, like the suit against Reddit over volunteer community moderators ejecting a user from their Star Trek-themed group over perceived uncivil conduct. The amici ask the Court to affirm the decision of the Eleventh Circuit and reverse the contrary decision of the Fifth Circuit.
Liberty Justice Center in support of NetChoice
- The Liberty Justice Center (LJC) is a nonprofit, nonpartisan, public-interest litigation firm focused on protecting fundamental rights such as free speech. LJC argues against the misuse of government power to control speech, emphasizing that the government should not dictate the "proper" exercise of free speech online, particularly in light of state attempts to legislate private platforms' editorial discretion. The brief expresses concern over the potential for both federal and state governments to misuse their power to influence or control the editorial decisions of social media platforms, challenging the laws enacted by Florida and Texas as violations of the First Amendment.
- LJC does not explicitly classify social media platforms as either newspapers or common carriers. Instead, the brief focuses on the First Amendment implications of government attempts to legislate or influence content moderation practices. It highlights the importance of preserving the autonomy of social media platforms to make editorial decisions without government interference, likening these decisions to the editorial discretion traditionally exercised by newspapers.
- Based on LJC's argument, the regulations in Florida and Texas that aim to control content moderation on social media platforms violate the First Amendment. The brief asserts that government-mandated rules for content moderation infringe upon the platforms' right to free expression by dictating how they should manage content and interact with users. By drawing parallels with case law that protects the editorial choices of newspapers, LJC argues that social media platforms should similarly be protected from governmental control over content moderation decisions.
Marketplace Industry Association, OfferUp, Etsy, and eBay in support of NetChoice
- The filers (Marketplace Industry Association, along with OfferUp Inc., Etsy, Inc., and eBay Inc.), representing a coalition of digital marketplace platforms, express their vested interest in protecting their First Amendment rights to operate, curate online marketplaces, and ensure safe, user-friendly experiences. They argue against the "misguided notions of 'fairness'" imposed by the Florida and Texas laws, which they claim threaten the foundational First Amendment rights of marketplace owners to moderate content contrary to their expressive values and vision. The filers contend these laws create impossible compliance burdens, especially for small and mid-sized companies not typically characterized as "big tech," and could inadvertently force marketplaces to host content or expressive products that violate their established community standards or policies.
- The filers do not explicitly categorize social media platforms as either newspapers or common carriers. Instead, their focus is on the right of website owners to moderate content in alignment with their platforms' mission and community standards. They emphasize the importance of content moderation in fostering specific site values, ensuring public safety, and promoting a productive environment free from unsafe, spammy, discriminatory, or offensive content.
- The filers argue that the regulations imposed by Florida and Texas infringe upon the First Amendment by compelling platforms to host content against their policies. They assert that such laws undermine the platforms' editorial discretion and their ability to communicate a chosen set of values to their user base. The brief emphasizes the constitutional protection of private platforms' rights to decide what content to host and display, warning against the dangers of government-imposed regulations that could extend beyond large platforms to affect smaller marketplaces as well.
Media Law Resource Center in support of NetChoice
- The Media Law Resource Center mainly argues that Texas and Florida violated the First Amendment by elevating the private interests of individuals seeking to expand their audiences at the expense of the public’s right to workable social media platforms. The marketplace of ideas, according to the MLRC, requires editorial discretion so that the public “is not overwhelmed by an incomprehensible flood of messages including disinformation, misinformation, and irrelevancies.”
- The MLRC is careful to define content moderation’s relationship to the First Amendment, stating that the public’s First Amendment interest overlaps with, but is separate from, the First Amendment protections afforded to expressive messages. The amici say that social media companies have the same expressive rights as media organizations, and that the public receives the benefit of a moderated platform whether or not it understands that the operator is expressing anything through its moderation decisions.
Moderators of r/law and r/SCOTUS in support of NetChoice
- As moderators on Reddit, the amici parties describe themselves as “self-professed censors” who maintain certain standards of substantive and constructive conversation by removing content that does not enrich, or actively harms, their subreddits.
- Amici argue that while the First Amendment protects even distasteful content against government actors, a subreddit is a private forum moderated by private actors. Users censored on Reddit are free to make their own websites to host their speech but are “not free to hijack amici’s websites.” The First Amendment cannot force a private actor to carry or subsidize another’s speech, according to the amici parties.
National Taxpayers Union Foundation in support of NetChoice
- The National Taxpayers Union Foundation (NTUF) is a non-partisan research and educational organization founded in 1973, focused on government spending and tax regulation. The main argument of the NTUF is that each website is private property, and the owners have the right to curate or moderate the experience of visitors, just like any other business owner. Protecting this right is essential to the First Amendment, which protects not only freedom of speech but also the freedom to associate or not to associate.
- The NTUF states that the Supreme Court should recognize that social media companies have First Amendment rights under the Press Clause “to curate their collection of information and opinion as they see fit. The NetChoice challengers have consistently asserted First Amendment claims to curate the information on their websites. This Court should recognize the First Amendment protects that right to curate.”
- From the NTUF’s point of view, the First Amendment protects the technology of mass media production, not only the institutional press, and that protection includes editorial control over what to say and what not to say. It extends to all sorts of expression, including the choice not to amplify one message over another.
Professor Christopher S. Yoo in support of NetChoice
- Christopher S. Yoo is a law professor at the University of Pennsylvania who has researched and published on the applicability of common carriage to social media platforms. He argues that “common carrier status is determined by functions, not by denominations, and firms that are common carriers remain subject to the same standards as other First Amendment protected activity.”
- The brief makes the case that “the most universally accepted definition of common carriage turns on whether the firm eschews exercising editorial discretion over the content it carries and instead holds itself out as serving all members of the public without engaging in individualized bargaining.” “Social media platforms do not hold themselves out in this manner,” and “Supreme Court precedent establishes that regulations that force a platform to carry speech that it would prefer not to carry constitute an impermissible intrusion on its editorial judgment.” Furthermore, the Texas and Florida laws do not mandate nondiscrimination, which is synonymous with common carrier status. Rather, they employ terms including “censor,” “deplatform,” and “shadow ban,” and courts have yet to determine whether these “terms coincide with or differ from nondiscrimination.”
- In addition, in the two contexts where the Court has upheld laws “requiring platforms to carry content balancing points of view that they chose to express,” television broadcasting and cable television, it “emphasized the physical (rather than economic) nature of” the service: namely, the “inherent scarcity of the [television] airwaves as a medium of communication” and “the ‘gatekeeper’ or ‘bottleneck’ control” possessed by cable operators.
Professors of History in support of NetChoice
- The amici curiae, consisting of professors who have extensively studied the history of journalism, the press, and the Free Press Clause of the First Amendment, aim to provide the Court with a thorough historical understanding of press freedom, including the exercise of editorial discretion. They argue against the Fifth Circuit's interpretation that editorial discretion was not understood to be among the freedoms enjoyed by the press at the Founding. By examining the historical role of printers and the freedom of the press, the brief demonstrates that printers' editorial discretion was an integral part of press activities, contributing to a broad understanding of press freedom that goes beyond mere freedom from prior restraints.
- Although the brief does not directly compare social media platforms to newspapers or common carriers, it emphasizes that the Founding generation's broad conception of press freedom encompassed editorial discretion, which is relevant to contemporary debates about social media content moderation. The historical perspective suggests that like early printers, social media platforms' decisions on content moderation involve a form of editorial discretion that should be protected under the First Amendment.
- Based on the historical analysis, the regulations in Florida and Texas are inconsistent with the First Amendment protections as understood by the Founding generation. The brief argues that the Founders' broad conception of press freedom, which included editorial discretion, would protect modern social media platforms from government-imposed requirements to host certain content or explain moderation decisions. This historical context underscores the importance of preserving the autonomy of platforms to make editorial decisions, akin to the discretion exercised by printers in the Founding era.
Professors Richard L. Hasen, Brendan Nyhan, and Amy Wilentz in support of NetChoice
- Richard L. Hasen is a Professor of Law and Political Science at UCLA School of Law. Brendan Nyhan is the James O. Freedman Presidential Professor in the Department of Government at Dartmouth College and co-director of Bright Line Watch, a watchdog group that monitors the status of American democracy. Amy Wilentz is the former Jerusalem correspondent for The New Yorker magazine and a long-time contributing editor at The Nation.
- The brief relies on the idea that “social media has greatly amplified the ability of average individuals to share and receive information, helping to further the kind of robust, wide-open debate that promotes First Amendment values of free speech and association.” However, the rise of “cheap speech” has negative consequences, like harassment, obscenity, violence or fraud.
- Florida and Texas’ laws would prevent platforms from moderating social media posts that risk undermining U.S. democracy and fomenting violence. And even though both laws contain certain exceptions to their bar on content moderation, “those exceptions seemingly would not reach much of the speech that could foment election violence and set the stage for election subversion.”
- The amici curiae state that both NetChoice and CCIA are correct that the Florida and Texas laws violate the First Amendment rights of platforms to exercise appropriate editorial judgment. “In a free market, consumers need not read or subscribe to social media platforms whose content moderation decisions they do not like; they can turn to other platforms with policies and views more amenable to them.” In the amici curiae’s view, social media platforms are not common carriers because they produce coherent speech products and produce public-facing content. They add that “even common carriers cannot be barred from recommending some speech over others without violating their First Amendment rights.”
- They also argue that the Supreme Court should not allow states to claim the platforms, “forcing them to equalize speech to include messages that could foment electoral violence and undermine democracy, simply because the states have objected to the platforms’ exercise of editorial discretion.”
- They emphasize two main points:
- It is absurd to claim that social media platforms are common carriers subject to viewpoint antidiscrimination provisions. Professor Eugene Volokh explains that whether an entity produces a coherent speech product “separates entities such as newspapers from entities such as phone companies. Those who do are entitled under the First Amendment to exercise editorial discretion.”
- The Florida and Texas laws seek to equalize political speech in violation of the Supreme Court’s First Amendment jurisprudence.
Protect the First Foundation in support of NetChoice
- Protect the First Foundation (“PT1”) is a “nonprofit, nonpartisan organization that advocates for protecting First Amendment rights in all applicable arenas and areas of law.” Its brief engages in a historical analysis that contends “online services are much more akin to Founding-era newspapers—a curated vehicle of information subject to editorial discretion.” As such, PT1 writes that the “Court should find the Texas and Florida laws to be unconstitutional under the First Amendment because this restriction of freedom of the press has no basis in the history and tradition of this nation and its Constitution.”
- First, the brief details how English law developed from treating newspapers as common carriers to “private speakers with broad rights to print whatever they chose” by the time of the Founding. It then cites pre-Independence writings by Benjamin Franklin about his editorial process and other debates illustrating “[e]arly Americans’ rejection of the common carrier model.” Franklin wrote that “morality permitted discrimination of what a printer chose to print,” and while he “emphasized the necessity of having opinions published to serve the marketplace of ideas and the value of having the opportunity to be heard, there [was] no suggestion that this could appropriately be regulated or mandated by the government.” Finally, PT1 contends that “the same pattern continued ‘[t]hrough most of the nineteenth century.’”
Public Knowledge in support of NetChoice
- The consumer rights organization Public Knowledge argues that social media platforms are not common carriers, but rather publishers who engage in First Amendment-protected editorial decision-making with regard to content moderation.
- Texas and Florida’s respective laws fail rational basis review because they unconstitutionally promote conservative viewpoints. This comes at the expense of social media users and advances an “openly-stated political agenda” that claims to promote free speech but instead suppresses it. That is in direct conflict with conservative principles like limited government and respect for constitutional rights, according to Public Knowledge.
- The bills undermine the free expression of social media users, who may be driven off a platform if it is mandated to carry objectionable content, and interfere with users’ rights to access information.
Reason Foundation, et al. in support of NetChoice
- The brief submitted by the Reason Foundation, Committee for Justice, Competitive Enterprise Institute, and Taxpayers Protection Alliance provides a detailed analysis in support of the respondents in case No. 22-277 and the petitioners in case No. 22-555, focusing on the First Amendment rights of social media platforms. These organizations, with a focus on promoting free markets, individual liberty, and limited government, argue for the protection of social media platforms' First Amendment rights. They contend that content and contributor moderation are expressive associational activities protected by the First Amendment, emphasizing that platforms, like newspapers or clubs, exercise their organizational values through content moderation policies. The filers challenge the Fifth Circuit's assertion that social media companies primarily serve as conduits for others' speech, arguing instead that these platforms engage in protected speech and association by determining their content moderation practices.
- The brief argues against viewing social media platforms as common carriers, public utilities, or public accommodations, regardless of their size or popularity. It emphasizes that these platforms' content moderation practices reflect expressive and associational values, and their decisions to include or exclude certain content or individuals are integral to their freedom of speech and association. The filers argue that the size or reach of a platform does not diminish its First Amendment protections and that social media platforms should not be subject to regulations that would otherwise apply to traditional common carriers.
- The brief contends that the regulations imposed by Florida and Texas infringe upon social media platforms' First Amendment rights by attempting to dictate how they should manage content and interact with users. It highlights that platforms' decisions on content moderation are a form of speech and association protected under the First Amendment. Furthermore, the filers argue that statutory benefits, like those provided under Section 230 of the Communications Decency Act, do not imply a waiver of First Amendment protections and that platforms should not be compelled to host content that goes against their editorial discretion.
Reddit in support of NetChoice
- Reddit was sued by a user who was sanctioned by other users for violating a community rule in a Star Trek forum. This experience led the company to file an amicus brief in support of NetChoice to show the “severe and immediate threat” of “meritless litigation” that the Texas and Florida bills pose. Reddit also argues that the two bills pose First Amendment risks via their content-moderation restrictions and individualized-explanation requirements.
- Reddit argues that users have the right to set their own forum rules, just like private book clubs have the right to set discussion rules without government control. This user-centric model and self-governance feature makes Reddit distinct, and is protected expression under the First Amendment as it advances the free-speech and free-association rights of Reddit and its users.
- The states’ requirements for detailed, technical, and individualized explanations regarding content moderation decisions would also place an “intolerable burden” on volunteer moderators core to Reddit’s model. The brief argues that states do not have the right to subvert Reddit’s approach to content moderation in favor of a government-defined model for online expression.
The Reporters Committee for Freedom of the Press, et al. in support of NetChoice
- Amici curiae includes the Reporters Committee for Freedom of the Press, the American Civil Liberties Union, the American Booksellers for Free Expression, the Authors Guild, the Digital Media Association, the Entertainment Software Association, the Media Coalition Foundation, Inc., and the Motion Picture Association, Inc. Amici are organizations that “defend the Constitution’s protections for editorial discretion by private speakers—news organizations, booksellers, film studios, video game publishers, and more.” They argue that upholding the Florida and Texas laws would intrude on platforms’ editorial autonomy and undermine the rights of publishers of all kinds.
- Amici use Miami Herald Publ’g Co. v. Tornillo to demonstrate how the First Amendment “guarantees ‘virtually insurmountable’ protection for a private entity’s expressive decision to share––or not to share––another speaker’s lawful expression with their own audience.” Florida and Texas fail “to offer a principled distinction” between new digital platforms today and “a litany of other speakers, from the traditional press to Hollywood studios.” Thus, upholding the Florida and Texas laws would “bulldoze” Tornillo.
- Editorial control cannot be shared with the state. For instance, the government cannot constitutionally define “what news is fit to print or which books are worth stocking.” Texas and Florida undermine the First Amendment’s safeguards for editorial independence by commandeering the audiences of a few large online platforms they believe made unfair or biased judgments. To enforce Florida and Texas’ “dangerous fantasy of a government-mandated balance of views” would subject social media companies’ editorial processes to official state examination.
TechFreedom in support of NetChoice
- TechFreedom is a nonprofit, nonpartisan think tank dedicated to advancing policies that encourage technological progress. Its brief provides a comprehensive argument against treating social media platforms as common carriers under the laws enacted by Texas and Florida. TechFreedom has actively participated in the debate over Texas’s HB20 and Florida’s SB7072, arguing that the laws would have catastrophic consequences for free speech and innovation on social media platforms. The organization’s interest lies in promoting a policy environment that allows for the free exchange of ideas and innovation online, and it contends the laws are unconstitutional because they infringe the First Amendment rights of social media platforms.
- TechFreedom argues against the classification of social media platforms as common carriers, highlighting that social media platforms are fundamentally expressive and editorial in nature, not mere conduits for information like traditional common carriers (e.g., telephone companies). The brief emphasizes the diverse, evolving nature of social media services, which involve extensive data processing and content curation, distinguishing them from the passive, indiscriminate service provision characteristic of common carriers.
The Trust & Safety Foundation in support of NetChoice
- In addition to technology, trust & safety (T&S) professionals are essential for moderating content on modern internet platforms, says the Trust & Safety Foundation. These professionals develop the detailed policies and practices intended to uphold digital community standards.
- Texas and Florida's laws “are likely to hamper the ability and willingness of platforms to host user-generated content and are likely to yield unintended consequences that will suppress free speech.”
- These constraints may force platforms to reduce content moderation efforts or adopt overly cautious policies, potentially leading to the removal of lawful content to avoid legal challenges. Indeed, “History teaches the most likely path these platforms will choose is the wholesale removal of any content that even remotely expands risk of liability.”
- “By imposing mandatory user notice and appeal requirements, along with consistency or viewpoint neutrality mandates, Texas and Florida burden platforms with skyrocketing compliance and litigation costs.” These costs will push platforms to suppress more speech, not less, since “many platforms will find it more practical to simply provide customers fewer opportunities to express themselves online, rather than incur this business risk thousands of times a day.”
- The brief predicts that under rules required to remain compliant with these laws, platforms may have to “generate billions of individualized notifications every year,” most of which provide no value to the user. It argues against moving in the direction of a “notice and takedown” regime similar to what is required in Europe. “European law and policy may permit imposing burdens that will reshape platforms’ editorial practices, but the First Amendment holds states to a higher standard. By the same token, European policymakers may be more willing to quash the next generation of startups, since the EU economy has seen few successful ones compared to the United States.”
United States in support of NetChoice
- The US Solicitor General filed an amicus brief given that the cases “present questions about whether and to what extent the First Amendment permits States to regulate social-media platforms” and that “Congress has enacted laws governing the communications industry, including social-media platforms (i.e., Section 230).”
- The Government brief argues that “Social-media companies are engaged in expressive activity when they decide which third-party content to display to their users and how to display it.” As a result, “social-media platforms are protected by the First Amendment because their websites are expressive compilations that reflect the platforms’ values, priorities, and viewpoints,” just as “publishers, editors, and parade organizers… shape third-party speech into compilations that constitute distinct expressive offerings reflecting the platforms’ own values, priorities, and viewpoints.” Specifically, “[t]his Court has long recognized that presenting a curated compilation of third-party speech is itself a form of speech,” and “[l]aws requiring platforms to present content they deem harmful, offensive, or otherwise objectionable thus implicate the First Amendment.”
- Even though the First Amendment applies to the platforms’ content-moderation activities, it “does not mean that the platforms are immune from regulation,” including “regulations targeting the platforms’ expressive activities.” These “could be consistent with the First Amendment if they are content-neutral and do not burden substantially more speech than necessary to further legitimate government interests.” However, the states fail to justify the laws’ “content-moderation requirements under any potentially applicable form of First Amendment scrutiny.” Moreover, the disclosure requirements in the Florida law violate the First Amendment because “they impose unjustified burdens on the platforms’ expressive activity” by compelling them “to provide an individualized explanation each time they choose to remove or otherwise moderate user content.”
US Chamber of Commerce in support of NetChoice
- The Chamber of Commerce of the United States of America is the world’s largest business federation, with members that include both social media companies and many businesses that rely on those platforms for commercial advertising. It argues that if the laws are “sustained, the approach taken by these states could lead to regulation across a range of internet businesses that will stifle commerce and place government in the position to regulate a great deal of private speech.” Specifically, “the challenged regulations burden expressive activity at the core of the First Amendment.” “Editorial discretion—regardless of the editor’s message, and irrespective of its motivation—has always been entitled to constitutional protection.”
- It also challenges the contention by Florida and Texas that social media are common carriers, as the “companies lack the factual and legal characteristics of common carriers.” It argues that social media companies “do not operate like common carriers” because they “require users to agree to explicit and detailed terms of service before joining and… reserve the power to impose standards on third-party content,” including an “ability to accept or reject particular content.” Furthermore, they “lack another key characteristic of common carriers: They are not mere conduits of information that third parties may employ.” Rather they “curate [messages], prioritize or deprioritize them based on user preferences, and actively remove content that violates their community standards.”
- Finally, it argues that the requirement that a platform provide an individualized explanation when removing user content falls outside the Zauderer test, which is “limited to laws aimed at preventing misleading commercial advertisements by requiring the advertiser to disclose ‘purely factual and uncontroversial information about the terms under which his services will be available.’” Decisions to remove or deprioritize content are not commercial advertisements; the individualized-explanation requirements “do not consist of ‘purely factual and uncontroversial information,’” and “in addition, the significant burdens these laws impose on platforms provide a further reason to apply ordinary First Amendment scrutiny.”
US Senator Ben Ray Luján in support of NetChoice
- United States Senator Ben Ray Luján (D-NM) emphasizes “the important role that social media sites have in the lives of his constituents, particularly marginalized communities” and how “Content moderation serves an indispensable role for organizations to ensure that their platforms remain viable forums for a diverse range of voices—from those in marginalized populations to those holding minority opinions.” “When state governments seek to impose their own preferred moderation policies onto companies for whom those policies may not fit, the functional availability of the platform to marginalized communities can shrink or disappear. Accordingly, content moderation is necessary to preserve freedom of speech and expression on social media.”
- The Senator also underscores “that regulation of the Internet, including social media platforms, is primarily a federal matter.” For example, Section 230, enacted as part of the Telecommunications Act of 1996, “made it the policy of the United States ‘to promote the continued development of the Internet and other interactive computer services and other interactive media’ and ‘to encourage the development of technologies which maximize user control over what information is received by individuals.’” Moreover, “the federal government maintains the authority to ensure [social media companies]... do not discriminate against vulnerable minority communities” or based on an individual or group’s race, color, religion, sex, or national origin. “Neither politicians nor political viewpoints are a protected class as the State laws attempt to establish.”
Washington Legal Foundation in support of NetChoice
- Washington Legal Foundation (WLF) is a “nonprofit, public-interest law firm and policy center” that “promotes free enterprise, individual rights, limited government, and the rule of law” and has submitted amicus briefs in prior cases related to compelled speech. Its brief, in support of NetChoice, is narrowly focused on arguing that the Court should find that the 5th and 11th Circuit “both erred by applying Zauderer to the individualized-explanation requirements” in the Texas and Florida statutes.
- Both the 5th and 11th Circuits applied the Zauderer standard, which requires that a law not be “unduly burdensome,” rather than a strict scrutiny standard, when analyzing the individualized-explanation requirements. WLF argues that this was incorrect because Zauderer applies only (1) to commercial advertising, (2) “when the compelled speech is uncontroversial,” (3) to “false or deceptive” speech, and (4) to “objective disclosures the government seeks to compel.”
- WLF asserts that the “reasons for taking editorial actions are not an uncontroversial topic” and that both states have not identified “false or deceptive statements (or even material omissions)” that the bills seek to correct. WLF goes on to argue against “excessive government entanglement in editorial judgments by private parties” that the individualized-explanation requirements could create. Finally, WLF alleges that both state governments are using these statutes to regulate political speech based on content, “seeking a backdoor way to regulate social media content.”
Wikimedia Foundation in support of NetChoice
- The Wikimedia Foundation operates Wikipedia, an “encyclopedic reference guide” with a distinctive governance structure in which users develop content and conduct policies; the Foundation typically stays out of content moderation decisions and blocks accounts only under “extraordinary circumstances.” While the Texas and Florida bills target major social media platforms like Facebook, YouTube, and X, Wikimedia projects risk being swept up by laws that violate the First Amendment rights of the Wikipedia user community and the Foundation.
- Both statutes rely on “impermissibly vague definitions that risk misapplication” and may be weaponized politically, particularly given the Florida bill’s sweeping definition of a “social media platform” and its allowance of private rights of action.
- The laws violate the First Amendment regardless of which platform they are applied to, but they also violate the rights of Wikipedia users. Just like publishers have the right to decide what material appears on their platform and the message it conveys, Wikipedia users have the right to remove viewpoints they don’t wish to be associated with as well as inaccurate and unverifiable information. Overriding that right is, essentially, government-compelled speech and cedes control to private actors.
Yelp in support of NetChoice
- Yelp, a popular platform for reviewing and finding businesses, would likely be covered by the Florida and Texas laws. In its brief, Yelp uses real examples to detail its content moderation practices, including its use and development of recommendation algorithms and employment of content moderation teams. It frames this explanation around potential harms to businesses and consumers from inauthentic reviews and benefits to those groups from curated reviews, analogizing its process to newspapers’ editorial decisions. Yelp emphasizes that user trust gained through their content moderation process is key to their business model. For example, “If Yelp had to display every submitted review, without the editorial discretion to recommend some over others, business owners could submit hundreds of positive reviews for their own business with little effort or risk of a penalty.”
- Yelp argues that the “Texas and Florida laws…appear to prohibit Yelp from taking any future action to address unreliable reviews” because it would be required to give “the same weight to political attacks as it does to genuine bakery reviews” or be liable for “viewpoint” discrimination and “shadow banning.” It also raises concerns about the “consistency” requirement in the Florida law because it “appears to bar treating two reviews differently based on factors about the trustworthiness of the source” and because Yelp’s recommendation algorithm changes over time.
- Yelp also takes issue with the individualized-explanation requirements of the laws, arguing they would “chill Yelp’s protected activity of taking proactive steps to protect consumers against deceptive review practices—and potentially reward people posting fraudulent reviews with $100,000 in statutory damages per claim.”
Amicus briefs in support of neither party
American Center for Law and Justice in support of neither party
- The American Center for Law and Justice (ACLJ) is an organization dedicated to defending constitutional liberties secured by law. The Center expressed its concern with both government control of communications media and the ideological totalitarianism of many big tech companies.
- Its main argument relies on three overarching principles that the Court must consider in the analysis:
- The “discriminatory exclusion of speech or speakers based on viewpoint is not ipso facto constitutionally protected free speech.” To do so would equip online giants to “establish ideological totalitarianism over vast and important swaths of daily life.”
- Government control over media platforms also raises concerns about imposed “ideological conformity.” Even setting aside whether the government may require viewpoint-neutral access, the Court should be careful about empowering the government to superintend private social media platforms.
- Different rules apply when the private entity is defined by a mission with ideological elements. Government cannot impair such an entity’s capability to maintain mission focus and integrity.
- NetChoice argues that the First Amendment protects its members’ exclusion of speech and speakers from their platforms, calling it “editorial discretion” that no government is entitled to review. Under that idea, the ACLJ warns, the tech titans could exclude speech and speakers over mere disagreements with a speaker’s viewpoint.
- The Center recommends that the Court should: (i) Reject the idea that a private entity’s viewpoint against third parties ipso facto represents constitutionally protected free speech; (ii) Reject any authority of government that would create dangers of ideological totalitarianism; and (iii) Affirm that “the First Amendment shields an ideologically mission-oriented entity from government interference.”
The Anti-Defamation League in support of neither party
- “ADL believes the state statutes at issue in these cases unconstitutionally deprive social media platforms of the content-moderation tools they urgently need to help stop the proliferation of hate and harassment online,” and that precedent confirms “the First Amendment forbids the states’ effort to compel private entities, like social media companies, to platform material they deem hateful, dangerous, or otherwise harmful.”
- ADL argues that these “laws strike at the heart of First Amendment freedoms this Court has long guaranteed: the freedom from state confiscation and cooptation of (physical or digital) communications media to deliver the state’s preferred messages; or, more simply, the freedom to choose what to say and what not to say.” Pointing to precedent, ADL says the Court “has long held it to be a bedrock principle of the First Amendment that no government may ‘compel’ private actors ‘to permit publication of anything which their ‘reason’ tells them should not be published.’”
- The brief says “A contrary holding would invite dire consequences.” ADL points to potential dangerous effects if the Court rules in favor of the states, based on examples of online harms having effects in the real world, from antisemitism to the genocide of the Rohingya in Myanmar, and “the deluge of hate and harassment that is thrust on members of minority and marginalized communities simply for existing online,” including Black Americans and LGBTQ+ people.
- Citing precedent, ADL says that “no matter whether strict or intermediate scrutiny applies to the statutory provisions at issue here, the First Amendment forbids the states’ effort to compel private entities, like social media companies, to platform material they deem hateful, dangerous, or otherwise harmful.”
- The brief argues that the state laws “impermissibly” contravene Section 230 of the Communications Decency Act. “S.B. 7072 and H.B. 20 are antithetical to the necessary empowerment to address harmful content that Section 230 guarantees.”
The Becket Fund for Religious Liberty in support of neither party
- The Becket Fund for Religious Liberty is a “nonprofit, nonpartisan law firm that protects the free expression of all religious faiths.” Their brief does not side with either party in this case, arguing instead that “the Court should distinguish the speech claims at issue in these appeals from Free Speech claims made by sincere religious speakers.”
- Becket points to precedent to assert that “religious speech has the highest level of protection available under the Free Speech Clause,” above commercial speech, and especially above obscene or threatening speech. Religious speech has additionally been protected by the Free Exercise Clause. Becket notes NetChoice’s use of cases involving religious speakers and cautions against taking these comparisons at face value, saying that a tech company covered by the statutes that was “operated on religious principles” would have a stronger claim.
The Center for Business and Human Rights of the Leonard N. Stern School of Business at New York University in support of neither party
- The Center for Business and Human Rights at NYU Stern examines the constitutional and statutory provisions relevant to the case, arguing that social media companies perform editorial judgment by curating third-party content using moderation standards and algorithms to filter objectionable material, providing a curated service to users. This editorial judgment is considered a form of commercial expression protected under the First Amendment against government regulation.
- The Center concurs with NetChoice that the content moderation provisions of the Florida and Texas laws are subject to strict scrutiny and that they “fail to satisfy such scrutiny.” But the Center seeks to distinguish its argument from NetChoice’s: “The editorial judgment that social media platforms exercise is without a doubt protected under the First Amendment, but that does not render social media platforms immune from all regulation.” The Court must be careful not to rule in a way that upends states’ ability to regulate data privacy or impose certain disclosure requirements.
- The Center discusses the application of the Zauderer standard in evaluating the constitutionality of certain laws that require companies to provide individualized explanations. It says the 5th Circuit and 11th Circuit Courts applied the Zauderer standard to the Texas and Florida laws, but in doing so expanded the scope of Zauderer beyond its original context. The Zauderer standard was established in a case involving an Ohio rule requiring attorneys to disclose potential litigation costs in contingency fee cases. It uses a form of rational basis review, asking whether disclosures are "purely factual and uncontroversial" and "reasonably related to the state’s interest in preventing deception of consumers" without being "unjustified or unduly burdensome."
- Since Zauderer, the brief says, courts have not directly addressed how the standard applies outside commercial advertising. However, circuit courts have broadly interpreted its applicability to various compelled commercial disclosures, indicating a significant expansion of the standard's original intent. These courts have upheld the use of Zauderer in cases not directly related to consumer deception in advertising, suggesting that regulations compelling factual and uncontroversial commercial disclosures should receive rational basis review.
- The brief emphasizes the importance of this broad application, noting that limiting Zauderer's scope could invalidate much existing First Amendment case law and create confusion regarding the evaluation of commercial individual disclosure mandates. “The infeasibility of providing individualized explanations that comply with Florida’s and Texas’ parameters will likely lead social media companies to eliminate broad categories of speech, such as all expression related to politics and public affairs, in order to avoid liability, especially in Florida where there is a risk of significant monetary liability.”
Center for Social Media and Politics at New York University, et al. in support of neither party
- This brief was composed in support of researchers who study social media data for insights into social and political issues.
- “Independent social science research has played a critical role in helping the public and policymakers understand the wide-ranging effects of these technologies,” but social media platforms “unilaterally control and limit access to their data, erecting significant barriers to rigorous research.”
- “As a result, independent researchers are limited in their efforts to study the causes, character, and scope of the various phenomena attributed to the rise of social media.”
- “This untenable status quo points to the need for, and overriding public interest in, meaningful platform transparency mandates. Although the Court has decided not to address directly the general disclosure provisions of the Florida and Texas laws at issue in these cases, the Court’s resolution of the remaining provisions—in particular the laws’ individualized explanation requirements—implicates fundamental questions about the power of governments to mandate platform transparency and access to data.”
- The brief argues the Court should craft any ruling in such a way as to leave open the possibility of regulation that mandates researcher access and transparency measures that may be passed by states or the federal government in the future. “Amici respectfully submit that the Court should craft rulings in these cases that leave ample room for responsible legislative and regulatory efforts aimed at mandating meaningful platform transparency and access to data. It is essential that such efforts survive constitutional scrutiny.”
Electronic Privacy Information Center in support of neither party
- EPIC argues that the argument made by NetChoice, “if accepted, would create a new and far-reaching right for social media companies to be free from meaningful oversight and regulation of their business practices.”
- The brief argues that what distinguishes “regulations that unduly interfere with expressive activities” from “permissible regulations of business conduct” is “the specific context of the regulated activity.”
- EPIC says that social media companies engage in three key activities: “hosting content, ranking content, and designing the user interface and user experience of their platforms.” An overly broad ruling that does not distinguish between these activities may preclude the ability to regulate harms such as “surveillance and addictive design.”
- “The Court would never rule, for instance, that ‘walking’ is protected speech generally, but it has recognized that walking in the context of a parade can be a form of protected speech…. In this case, NetChoice has urged the Court to issue the digital equivalent of a holding that all walking is protected speech.”
- “Regulations of social media companies’ hosting, ranking, and UI/UX design choices should not automatically trigger heightened First Amendment scrutiny. Instead, the Court should evaluate the regulations with sensitivity to whether they actually impose a must-carry requirement, whether they are content-neutral, and whether the characteristics of the regulated medium—social media platforms—supports intermediate or some higher level of scrutiny.”
- “Social media companies’ content-agnostic, engagement-maximizing activities are not expressive and in many cases can cause significant harm to users.”
Giffords Law Center to Prevent Gun Violence in support of neither party
- Giffords Law Center to Prevent Gun Violence (Giffords Law Center) is a non-profit policy organization serving lawmakers, advocates, legal professionals, gun violence survivors, and others who seek to reduce gun violence and improve the safety of their communities. The Center filed with the Court because the resolution of this “challenge may have consequences that reach beyond the two laws directly at issue.” More specifically, the Center seeks to “highlight the increasingly direct and troubling connection between the glorification of hate and violence on social media and hate-motivated mass shootings in the United States.”
- The brief takes “no position on how this particular dispute should be resolved, but instead asks the Court to consider the role that social media has played in fueling hate-based gun violence in the United States when considering this challenge.”
Knight First Amendment Institute in support of neither party
- The Knight First Amendment Institute at Columbia University is a non-partisan, not-for-profit organization that works to defend the freedoms of speech and the press in the digital age. They argue that “none of the parties in this case offers a compelling theory of how the First Amendment should apply to the regulation of social media.” The arguments from Florida and Texas, that “platforms’ content-moderation decisions do not implicate the First Amendment at all,” could lead to “sweeping [governmental] authority over the digital public sphere…” Conversely, the platforms’ argument that “any regulation implicating their content-moderation decisions must be subjected to the most stringent First Amendment scrutiny” “would make it nearly impossible for governments to enact even carefully drawn laws that serve First Amendment values.”
- Knight argues that social media platforms are not newspapers, but “social media platforms’ content-moderation decisions are protected by the First Amendment because they reflect the exercise of editorial judgment…” Thus, the “relevant inquiry is not whether a regulated entity exercises editorial judgment in some context, but whether the entity exercises editorial judgment in the specific context addressed by the regulation.” That “content-moderation is protected by the First Amendment, however, does not mean that any regulation that touches on it is unconstitutional”: content-neutral laws “are reviewed less stringently” because they “do not pose the same inherent dangers to free expression…” However, “content-based laws [like those in Florida and Texas] that interfere with editorial judgment are subject to strict scrutiny,” and in this case, these “must-carry provisions are unconstitutional because they override the platforms’ exercise of editorial discretion.”
- They also ask the court to reject the parties’ “most extreme arguments about the disclosure provisions” in the States’ laws. Relying on the Zauderer decision, which recognized that disclosures of “purely factual and uncontroversial information” are “evaluated less stringently than laws that compel the disclosure of other forms of speech,” Knight argues that Texas’s disclosure provision is constitutional because “it requires the disclosure of information that is (i) factual and (ii) uncontroversial, and that (iii) relates to commercial services provided to the public.” In contrast, Florida’s disclosure provision would “chill the platforms’ speech” because it is unduly burdensome, and “because it imposes potentially massive damages liability for violating [its] requirements.”
National Security Experts in support of neither party
- Amici are national security experts, including “former career and politically appointed officials, across Republican and Democratic administrations, from the National Security Council staff, Office of the Director of National Intelligence, Central Intelligence Agency, Department of Homeland Security, Department of Justice, Federal Bureau of Investigation (FBI), Department of Defense, and Department of State, as well as former members of Congress with national security credentials and other national security experts.” They are concerned about the threat of online radicalization in the US. They argue that US security depends on social media platforms being “responsible corporate citizens” who remove dangerous content by malign actors or mitigate its spread.
- The spread of violent online content has real world consequences, including from foreign “terrorist organizations” seeking to foment violence and political discord. While content moderation is “far from perfect,” it does succeed in removing some of the worst extremist and violent content online, according to amici. They argue the Texas and Florida statutes will disrupt these efforts to the detriment of US national security.
- Both statutes hinder platforms’ ability to exercise their constitutional right to moderate content, an activity the amici argue the First Amendment protects rather than restricts. Amici also argue that, under Section 230, private actors may remove content from their online platforms if they find it “objectionable” and act in good faith.
States of New York, Arizona, et al. in support of neither party
- The States filed in the case because they “have significant interests in regulating these platforms to protect their citizens.” They filed in support of neither party, but “principally to explain that any decision interpreting the First Amendment to foreclose most regulation of social media platforms would undermine States’ important objectives.” The States urged the “Court to make clear that States may regulate social media platforms consistent with the First Amendment.”
- They argue that “the Court must consider each challenged provision individually. Where a provision applies only to non-expressive conduct or otherwise has little or no effect on speech, the First Amendment does not limit States’ authority to regulate.” For example, “compelled commercial disclosures of factual, noncontroversial information, like the disclosure of a platforms’ policies, receive the lowest level of protection and should be upheld if they reasonably relate to an appropriate governmental interest and do not unduly burden speech.”
- Social media platforms further may not “claim special protection from governmental regulations” simply because they are in the business of hosting users’ speech. Courts must also consider whether a challenged speech restriction is content-based, that is, whether it applies to noncommercial speech “because of the topic discussed or the idea or message expressed.” Moreover, “a regulation that distinguishes between different platforms [as both laws do] is not inherently constitutionally suspect.” As the Court held in Turner Broadcasting, regulations that “distinguish[ed] between speakers in the television programming market” were not “presumed invalid under the First Amendment” because they differentiated based only on a particular characteristic of the regulated cable television services, not based on the content of the messages they carried.
Amicus briefs in support of the States’ Laws
American Principles Project in support of the States’ Laws
- The American Principles Project is a nonprofit corporation that advocates against policies detrimental to parents and children, including threats to free speech. The Project’s starting point is that social media platforms are regulable as common carriers “consistent with the First Amendment because they are businesses that carry customers’ messages - just like telegraphs and telephones.”
- The amici curiae endorse Justice Thomas’ summary of the test for classifying common carriers, which considers (i) whether the regulated entity is part of the transportation or communications industry; (ii) whether the industry is affected with the public interest; (iii) whether the platform has market power; (iv) whether the industry receives benefits from the government, such as liability protection; and (v) whether the platform “holds itself out as providing service to all.” In the Project’s view, social media platforms satisfy every prong of the test and are therefore common carriers.
- NetChoice argues that “just as the government cannot compel a platform to remain a common carrier, it cannot force it to become one.” The Project responds that this argument lacks historical support: telephones were not originally common carriers but became them, and they now must obtain government permission to end service.
- NetChoice also states that Congress “has gone out of its way to enable websites to weed out objectionable content” and exclude speech, citing 47 U.S.C. § 230(c), and has specifically disclaimed any intent to treat such websites as common carriers, citing 47 U.S.C. § 223(e)(6). The Project responds that the fact that Section 223, which addresses the transmission of child pornography and obscenity, does not make websites common carriers does not mean they cannot be treated as common carriers under other sections of the Telecommunications Act or by other jurisdictions.
Amicus Populi in support of the States’ Laws
- Amicus Populi is a coalition of former prosecutors who advocate for laws promoting public safety and effective crime prevention. The main argument expressed in the brief is that “Democratic self-government depends on expressing ideas, not suppressing them.” This preference for speech over silence rests on the idea that the government generally may participate in public debate by “speaking but not by silencing other speakers.”
- The amici believe that “the freedom to speak/publish deserves more protection than the freedom to keep others from speaking/publishing; subtracting speech imperils democratic decision making far more than adding it.”
- Amicus Populi states that social media platforms are common carriers and therefore not speech producers protected by the First Amendment. It distinguishes them from newspapers using the example of the printer Franklin, who published a book separately from his newspaper because, having “contracted with his subscribers to furnish them in the newspaper with desired content, it would breach that contract and harm his reputation to provide them with undesired content.” In the amici’s view, however, “social media function very differently. Franklin’s newspaper was the producer of content and the reader was the consumer, but social media served to let the user produce the content.”
- Another argument supported in the brief is that “the open, viewpoint-neutral access prescribed by S.B. 7072 and HB20 would not unconstitutionally infringe the Platforms’ “rights” because “it is agnostic algorithms that filter content.”
- In addition, Amicus Populi states that “the Platforms have agreed they are not the speaker or publisher of their users’ views and are now estopped from contending otherwise.” To evade liability, social media platforms disclaimed any responsibility for posted content, using Section 230 as a shield: “Section 230 forbids . . . treat[ing] Google as the ‘publisher or speaker’ of content posted by others;” but now, “to enable them to censor user speech, they demand the same legal treatment” as newspaper editors and authors. Consequently, in Amicus Populi’s view, “after successfully asserting one position, [a] party may not assert contradictory positions to obtain unfair advantage.”
- In conclusion, according to the amicus curiae, the Supreme Court should “confirm the superiority of adding speech over subtracting it in the constitutional hierarchy.”
Babylon Bee and Not the Bee in support of the States’ Laws
- The Babylon Bee (“The Bee”) is a Florida limited liability company (LLC) and website that says it “exposes foolishness, mocks absurdity, and highlights hypocrisy in faith, politics, and culture through satire, humor, and parody.” Not the Bee says it is “a Christian news website” that, alongside The Bee, points to “social media platforms’ viewpoint-based censorship of conservative groups, conservative leaders, and their own satire,” censorship it says it, too, has suffered.
- Amici argue that platforms, which host third-party speech and administer a service open to the public, unevenly enforce their standards and assert their “unlimited and unilateral right to censor, deplatform, or shadow-ban disfavored users, disfavored content, and disfavored viewpoints.” The Florida and Texas “consumer protection laws” merely restrict the “social media titans’ ability to do so” by modestly requiring platforms, which have “all the hallmarks of common carriers,” to disclose and evenhandedly apply the standards they voluntarily choose. The laws still allow platforms to remove “‘material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable.’”
- Platforms also do not qualify for civil immunity under Section 230 of the Communications Decency Act, which “shields only the ‘good faith’ removal of ‘objectionable’ ‘material,’ rather than the bad-faith removal of certain disfavored viewpoints or disfavored users.”
The Center for American Liberty in support of the States’ Laws
- Center for American Liberty (CAL) is a “nonprofit law firm dedicated to protecting civil liberties and enforcing constitutional limitations on government power” that has previously represented clients whose posts were removed from platforms for sharing “misinformation.” Its brief characterizes social media platforms as state actors that are more like cable companies than newspapers, which “moderate user content on their sites based on the government’s preferences” and thus, like the government itself, may not exclude speech.
- CAL explains two tests for evaluating when private entities should be treated as state actors, drawing on precedent that “a private company transforms into a state actor subject to the Constitution when its actions are “fairly attributable” to the state.” The joint action test focuses on agreements between Platforms and government “to limit protected speech” and they interpret the nexus test to mean that “when the state intimately relies on a Platform to disseminate and censor information for the government, it transforms a business transaction into a cooperative relationship necessary to control the dissemination of information through messaging and censorship.”
- CAL goes on to claim that the Platforms cannot prevail on facial challenges to the statutes because they would have to show there was no way the statute could be applied in a constitutional manner. Rather, they “should be required to bring as-applied challenges to demonstrate how the statutes allegedly infringe on each Platform’s constitutional rights.” CAL supports this point by asserting that the state statutes are not overly broad “because they do not chill protected speech” and that “any alleged chilling of the Platforms’ ability to regulate content is minimal when viewed against the preservation of First Amendment liberties for millions of monthly active users.”
Center for Constitutional Jurisprudence in support of the States’ Laws
- The Center for Constitutional Jurisprudence (CCJ) is part of the Claremont Institute, a conservative think tank, “whose stated mission is to restore the principles of the American founding to their rightful and preeminent authority in our national life, including the principle that freedom of speech is critical to a functioning republic.”
- Its brief argues that social media platform users are speakers, not the platforms themselves, and asserts that “federal government officials pressure social media outlets to suppress ideas and even truthful information that runs counter to the government-backed narrative.” It then makes two main arguments.
- First, CCJ claims that “the Free Speech Clause protects speech, not censorship.” It supports this with a historical analysis of the First Amendment’s passage, writing that “[t]he impulse to protect the right of the people to share their opinions with each other was nearly universal in the colonies.” It also reviews historical debates related to the Sedition Act of 1798, which was enacted and then allowed to expire a few years later. CCJ claims “this is the clearest indication we have [sic] that the people intended the First Amendment’s speech and press clauses to be much broader than a simple bar on prior restraints.”
- It goes on to argue that “prohibition on viewpoint discrimination does not constitute compelled speech” because the Texas and Florida laws “do not require platforms to create any speech for other parties.” CCJ also draws on Justice Thomas’ past writings that social media platforms have “concentrated control of [sic] speech in the hands of only a few private parties” and compares platforms to broadcasters.
Center for Renewing America in support of the States’ Laws
- The Center for Renewing America (CRA) is a conservative think tank that “looks to [Texas] HB 20 as an important step in preserving free speech in America.” Its brief focuses primarily on the Texas statute, asserting that “[t]he strongest defense for the Texas statute rests on the massive technological changes that have transformed every aspect of how these platforms do business relative to earlier operations under simpler models.”
- CRA repeatedly draws on monopoly power in its support of the Texas law, writing that social media platforms “have all the indicia of illicit monopoly control of speech, which justifies the imposition of a narrow remedy, such as the viewpoint discrimination prohibition.” It summarizes economic and legal theories about monopoly power and common carriers, with a particular focus on how network effects cause social media platforms to tend toward consolidating market power and reducing consumer choice. It notes that this justifies the Texas law’s coverage of only larger platforms. The brief also analogizes the platforms to transportation providers, which are common carriers that must allow any paying customer to board but can still remove disruptive customers.
- CRA contends that social media platforms may be colluding with each other and with web services providers. It supports this contention with examples of Apple, Google, and Amazon cutting off the social media platform Parler’s access to web services and the similar handling of COVID-19 and vaccine-related content across the major platforms. CRA goes on to reference arguments in Missouri v. Biden to establish that the federal government has been interacting with platforms’ editorial decisions such that they “no longer have meaningful independence.”
- It also argues that the major cases cited by NetChoice are outdated because they predate social media platforms. The brief reviews three of these cases to argue why they should be understood differently in a modern context. For example, it suggests that Tornillo may be understood in light of the Court’s note that the “modern press is often more complex and more concentrated, which raises the general issue of monopoly power.” Finally, CRA discusses how private monopoly power can be regulated under the First Amendment, drawing on examples of broadcasting companies where the Court has previously balanced regulatory action with the public interest.
Children’s Health Defense in support of the States’ Laws
- Children’s Health Defense (CHD) is a nonprofit organization dedicated to ensuring that people have access to complete, accurate health information for themselves and their families. CHD’s more than 70,000 members across the US are avid consumers of online health news that has been “repeatedly targeted for censorship” by social media platforms, particularly regarding COVID-related news.
- CHD is interested in the question of whether social media platforms are “common carriers.” It argues that the essence of this case is whether platforms are more like telegraph companies or newspapers. However, this “all-or-nothing view” presents two main issues: social media platforms don’t resemble either Western Union or the New York Times, and traditional common carrier doctrine was “developed in blissful ignorance” of the modern First Amendment. Instead of the Court choosing “between the zero First Amendment scrutiny evidenced in common carrier law and the maximal First Amendment scrutiny applicable to newspapers,” it should decide the case using its cable operator precedents established in Turner Broad. Sys., Inc. v. FCC and applying intermediate scrutiny.
- Social media platforms are sufficiently similar to cable operators, as the Fifth Circuit recognized; Turner did not turn on a common carrier analysis but rather engaged in First Amendment analysis. The Florida and Texas laws contain provisions “barring social media platforms from engaging in viewpoint-based censorship” that satisfy the Turner test: a must-carry law directed at cable operators is constitutional if it 1) is content neutral, 2) promotes a substantial governmental interest, and 3) does not burden substantially more speech than is necessary to further that interest.
Dr. Christos A. Makridis in support of the States’ Laws
- Dr. Christos A. Makridis is a professor and academic who “previously served on the White House Council of Economic Advisers managing cybersecurity, technology, and space activities.” He holds doctorates in economics and management science & engineering from Stanford University and his research focuses on the digital economy. His brief focuses on Texas’ statute, HB 20.
- The majority of Dr. Makridis’ brief focuses on the argument “that there has been a secular increase in market power and that social media firms hold and may be exercising that market power,” which would lead to the “lawful application of common carrier regulation” to these platforms. The Texas legislature found that the largest social media companies are common carriers by virtue of market dominance, and Dr. Makridis argues that “courts should uphold a legislature’s determination, unless the statutory determination lacks a rational basis.” He attempts to establish that rational basis by cataloging previous findings by academics and courts in the US, UK, and EU about social media platforms’ market power. For example, he discusses high-profile acquisitions such as Facebook’s acquisitions of Instagram and WhatsApp, and economic research pointing to increased concentration across multiple industries.
- Finally, he contends that facial challenges to statutes are difficult and disfavored, because they “rest on speculation,” “run contrary to the fundamental principle of judicial restraint,” and “can undermine democratic decision-making by preventing duly passed laws from being implemented as the Constitution requires.” He writes that “if the Court wishes to apply a higher standard to the factual questions about market power or whether H.B. 20’s viewpoint neutrality requirement would affect the message any social media user would or could perceive, the Court should allow legal and factual development in an as-applied challenge.”
The Digital Progress Institute in support of the States’ Laws
- The Digital Progress Institute, a Washington, DC-based think tank, argues for policies promoting robust competition in tech markets and a holistic approach to Internet regulation. The Institute contends that HB 20 in Texas, a non-discrimination law targeting major social media platforms, is constitutional and necessary for ensuring an open and competitive digital market. The Institute emphasizes that the law aims to prevent viewpoint-based censorship by these platforms, which, it argues, have amassed power that rivals or exceeds governmental power. The Institute's arguments are grounded in a historical perspective, suggesting that the Founders of the United States, particularly James Madison, recognized the dangers of concentrated power not only in government but also in private monopolies, which today could be analogous to large tech companies.
- The Institute argues that social media platforms should be regarded as communications platforms, similar to traditional telecommunications services, rather than as publishers or speakers with their own editorial content. It emphasizes that these platforms primarily serve as conduits for others' speech, facilitating communication rather than producing content. This classification underpins the argument that applying non-discrimination laws to these platforms does not infringe upon their First Amendment rights, as it does not compel speech from entities that do not engage in speech themselves.
- Based on a view of social media platforms as communications conduits, the Institute argues that the regulations imposed by Florida and Texas are compatible with the First Amendment. It asserts that non-discrimination laws like Texas's HB 20, which prohibit platforms from censoring content based on viewpoint, do not compel speech. Instead, these laws ensure that platforms provide a neutral space for all views, consistent with the platforms' stated purposes and the public's interest in a free and open exchange of ideas. The Institute's argument challenges the notion that such regulations compel speech by platforms, instead framing them as measures to prevent undue censorship and promote free speech online.
Donald W. Landry in support of the States’ Laws
- Landry is the Hamilton Southworth Professor of Medicine, Chair Emeritus of the Department of Medicine, and Director of the Center for Human Longevity at Columbia University College of Physicians and Surgeons. He writes in defense of the freedom of scientific debate.
- The crucial point in his brief is that “all theories, however widely accepted, must be open to being challenged by the publication of alternative theories and evidence.” In other words, his main concern is the restriction of public scientific debate: different perspectives and positions should be heard even if they are “offensively wrong,” because science depends on the freedom to publish observations and ideas that may be in error.
- In Landry’s view, social media platforms may “believe that a viewpoint is so clearly wrong that there is no harm, and even much benefit, in censoring it. But if the censored perspective aptly challenges a prevailing theory, the censorship may prop up a false theory, with profound consequences.”
Eric Rasmusen in support of the States’ Laws
- Social media platforms function as natural monopolies, similar to traditional utilities, due to network externalities and increasing returns to scale, which is what justifies their regulation as common carriers. “For centuries, economic and legal thinkers have understood that natural monopolies can—in some cases—be best managed through common carrier-style laws, like the HB 20 statute adopted by the Texas Legislature.”
- Common carrier-style laws have a long historical precedent in the US, and are well aligned with First Amendment jurisprudence and economic theory.
- State governments have the authority to enforce the First Amendment through reasonable regulations, including laws that treat social media platforms as common carriers. “Social media platforms have much in common with municipal water companies, telephone companies, or electrical power generators traditionally subject to common carrier laws.” And here, “the Texas Legislature, in exercise of its sovereign authority, has acknowledged the status of online platforms as a critical public forum, and implemented equal access rules to open that forum to lawful expression.”
- “If state common carrier laws affirming First Amendment rights in cyberspace are blocked, new, much lower thresholds for speech will be set from abroad through foreign regulations. The U.K., Canada, Australia, and European Union have much weaker protections for speech than the U.S., to say nothing of the strict speech restrictions that China seeks to globalize.”
- The Texas HB 20 statute is an appropriate regulatory response, aiming to mitigate the monopolistic tendencies of social media platforms without infringing on constitutional principles. The largest social media companies are natural monopolies, which have “obvious network externalities.”
- “The suggestion that if social media platforms allowed more dissenting voices, the bulk of their customers would leave is misplaced. No one has succeeded in entering and competing with the incumbent social media giants head to head, despite the technological ease of doing so and the large advertising profits that could be earned.”
- Rasmusen rejects the Fukuyama middleware argument. “Dismissing HB 20 as unnecessary because of middleware is like dismissing antitrust law for energy company mergers because with the impending advent of cheap solar power, those companies will have no market power. Maybe eventually— but not now.”
- Rasmusen further argues that the Texas law would prevent government censorship. “HB 20 would make it more difficult for the government to pressure a social media corporation, because it would prevent the corporation from censoring on the basis of viewpoint, as the government desires.”
The Heartland Institute in support of the States’ Laws
- The Heartland Institute says it is one of the world’s leading free-market think tanks, and it supports Texas HB 20, stating that free speech is America’s most vital right. Under the Institute’s argument, Texas’s response to social media’s threat of censorship is consistent with First Amendment precedent and proceeds from common carrier law.
- The Institute argues that social media platforms are common carriers under Justice Thomas’s test, and that NetChoice tries to “rewrite the test claiming that common carriers hold themselves out as affording neutral, indiscriminate access to their platform without editorial filtering.” Under the Institute’s argument, NetChoice cites no Supreme Court precedent supporting its version of the test. Under NetChoice’s approach, the Institute asserts, “any entity regulated as a common carrier, or for that matter, a public accommodation, could decide that they were going to impose an editorial filter and select customers according to their standards – and thereby evade common carrier or public accommodation law.”
- The major internet platforms covered by HB 20 are global, and the EU regulates their activities in foreign countries. The amicus curiae states that last year the EU adopted the Digital Services Act (DSA), an “offensively anti-free speech regulation” that requires platforms covered under HB 20 to censor “harmful” speech, a category that includes “disinformation.” Social media platforms want the DSA’s approach to predominate and be replicated in the United States. The Institute thus argues that the question is not whether social media platforms will be governed but by whom. The United States has an interest in protecting its citizens’ right to free speech, and HB 20 would render the EU’s “censorship regime – which discriminates based on speech disliked by bureaucrats – unlawful to apply in Texas.”
iTexasPolitics (The Texan) in support of Moody and Paxton
- iTexasPolitics, a political news organization, argues that it is not the federal court’s role to “decide what state law ‘should’ be.” The federal court must interpret a law according to precedent from a state’s highest court, and then can determine the constitutionality of that law. It also argues the federal Communications Act has no bearing on Texas and Florida common law.
- Imposing “content and viewpoint neutral” restriction on common carriers’ exercise of editorial discretion does not require platforms to speak or restrain speech, thus it does not violate the First Amendment.
- iTexasPolitics’ amicus brief focuses largely on Texas HB 20 Section 2. It argues it is a consumer protection measure that ensures transparency and fair content moderation practices, and treats all users equally, in line with First Amendment principles and international standards like the EU’s Digital Services Act.
Keep the Republic in support of the States’ Laws
- Keep the Republic is “a research project that investigates issues of national democracy, and develops models for action through executive policy, legislation, and legal theory.” Its brief argues in support of the Texas and Florida laws and contends that they would only be unconstitutional if they “attempt[ed] to reduce the protection of public free speech through common carriage.”
- The brief focuses substantially on why social media platforms should be considered common carriers. It argues that there is “a constitutionalized right of access to this public speech venue, that Internet public communications services are only the modern continuation of the carriage industry, and that where such services are large and general they qualify as common carriers.”
- Its argument includes a historical analysis and constitutional analysis of the Guarantee Clause, asserting that “[t]he enumerated Guarantee Clause must properly include the unenumerated obligation to assure the means for public communications.”
- It contends that common carriage applies based on purpose, not specific technology – and the common law understanding of common carriage should trump the FCC’s narrower definition.
- Keep the Republic does not believe that a demonstration of market power or countervailing benefit is necessary to establish common carrier status.
- It also engages in a three-pronged analysis concerning the relationship between a potential common carrier and the public to establish that “services such as user-review sites and single-subject discussion forums are excluded” from a common carrier definition. On the other hand, it claims that the Florida law can and should apply to infrastructure providers like ISPs and web hosting services.
- Though it notes Section 230 is outside of the case’s immediate scope, Keep the Republic argues that Section 230 has been misinterpreted by lower courts and should not be understood “as a protection for editorially-selective publishers.” Section 230, it writes, “produces a voluntary common carrier system with a limited allowance for restriction of material.” It also urges the court not to “fall back on Section 230 to save free speech in place of common carrier status.”
Legal Scholars Adam Candeub & Adam MacLeod in support of the States’ Laws
- Adam MacLeod is a Professor of Law at St. Mary’s University of Texas, while Adam Candeub is a Professor of Law at Michigan State University College of Law, where he directs its Intellectual Property, Information, and Communications Law Program. The amici curiae brief is filed in support of the States of Texas and Florida, and it is intended to aid the Court by “adding to the record a scholarly overview, with relevant legal and historical contexts, of common carrier and public accommodations doctrines.”
- The amici curiae’s main argument is that social media platforms are common carriers; therefore, they cannot evade the law by “asserting that their business conduct is inherently expressive and immune under the First Amendment from such common carrier regulation. Otherwise, any business could evade anti-discrimination laws prohibiting viewpoint discrimination by claiming their discriminatory conduct is expressive.”
- Candeub and MacLeod argue that social media platforms are common carriers because they satisfy the test the Supreme Court has used to assess States’ common carrier regulations, relying on Justice Thomas’ summary of the various bases for common carrier status:
- A firm exercises market power
- An industry is affected with the public interest
- The entity regulated is part of a transportation or communications industry
- The industry receives certain benefits from the government
- The firm makes or holds a public offering of carriage
- In response to NetChoice’s argument that social media platforms engage in editorial filtering, the amici curiae state that if excluding customers for expressive purposes takes an industry out of common carrier status, then telephone, mail, and package carriers “could discriminate unlawfully, characterize their policies as ‘editorial filtering’ and immunize themselves from such common carrier regulation.” Regarding First Amendment issues, Candeub and MacLeod defend the idea that First Amendment protection “only applies when the complaining speaker’s message was affected by the speech it was forced to accommodate.” However, these conditions are not met in the case of social media platforms, because:
- The platforms have the “bandwidth” to express their views to their users. The Texas law doesn’t restrict platforms’ speech.
- Users choose their followers and block others. It is users and not platforms that create online experiences. The platforms’ editorial actions are not a message.
- The acts of editorial discretion only have meaning if accompanied by “explanatory speech,” which lacks First Amendment protection.
Missouri, Ohio, 17 other States, and the Arizona Legislature in support of the States’ Laws
- The States and the Arizona Legislature argue that “States have a long history of regulating to protect citizens from abridgment of their free speech rights by dominant communication platforms.” The laws at issue “address companies that possess extraordinary market power,” and align with other regulations that are justified when “the service provider possesses ‘bottleneck monopoly power’” and can use it to “diminish[ ] the diversity and amount of content available.”
- The States view social media platforms as more like the telephone or telegraph than a newspaper, and thus subject to common carrier regulations. Given these platforms’ status as common carriers, the restrictions on content moderation at issue in the case do not “limit a social media company’s speech,” because courts have permitted “compelled hosting to preserve access to information” when “the company has power to obstruct access to information” or when “the statute would [not] ‘dampen the vigor and limit the variety of public debate.’” The States argue social media platforms meet both criteria.
Moms for Liberty and Institute for Free Speech in support of the States’ Laws
- Moms for Liberty (MFL) is a “nonprofit organization whose mission is to organize, educate and empower parents to defend their parental rights at all levels of government.” The Institute for Free Speech (IFS) is a 501(c)(3) that conducts scholarly work and represents litigants in cases related to the First Amendment. Moms for Liberty contends that an ideologically opposed organization lobbied Facebook to “censor the Moms’ content as ‘misinformation,’” after which many of its chapter Facebook groups were disabled. The amici argue that such actions “seriously disrupt the political process” and that the government, including states, should be allowed to “protect consumers against viewpoint discrimination in their use of social media platforms.”
- MFL and IFS write that platforms should be afforded First Amendment protections over their own speech, but not over the speech of others based on viewpoint. Rather than choosing a single analogy between platforms and newspapers, common carriers, or other discrete entities, they note that “[t]he key distinction between those who have a right to exclude speech and those who do not is whether the putative speaker acts primarily as a conduit for other people’s speech, or whether it would publish the speech of others as a means of expressing its own message.”
- They then go on to argue that the platforms are primarily conduits for speech – ordinary users do not view individuals’ posts to be espousing the platforms’ views, platforms do not curate speech to the point of showing all users the same viewpoints, and platforms themselves argued that they are conduits for speech in past cases. They note that the Court has also accepted this argument by agreeing platforms do not have the same legal obligations as publishers in its decision in Twitter v. Taamneh.
- MFL and IFS support the Florida and Texas laws’ viewpoint-neutrality provisions, writing that these “do not regulate the platforms’ own speech” or “prevent the platforms’ users from choosing what speech they receive and with whom they interact.” They note that private actors have the ability to negatively affect free functioning in society, so “the state’s curtailment of such threats from private power promotes rather than infringes on individual liberty.”
Open Markets Institute in support of the States’ Laws
- The Open Markets Institute, a non-profit organization dedicated to promoting fair and competitive markets, argues in “support of the states’ exercise of their police power to regulate internet platforms as common carriers if and when they determine it is appropriate.” They note, “Under established precedent, states and the federal government can impose common carrier obligations on certain classes of businesses, including communication firms.” However, they caution that those regulations do not strip “away First Amendment protections of those regulated” in all cases, and they take “no position on the wisdom of the two state laws.”
- Common carriers “historically possessed distinguishing features, such as generally holding themselves out as open to the public….” Platforms fall into this category and “are distinguishable from newspapers”: users are able to publish messages “of their own design and choosing,” and “these messages are not individually evaluated by social media platforms to ensure they are suitable for public display ahead of time.” Although platforms issue terms of service (“TOS”) that may require users’ adherence to community standards, this “does not insulate them from common carriage designation or obligations,” as “[e]ven for common carriers, ‘[a]ccess has always been qualified.’”
- In terms of regulatory parallels, Open Markets argues that internet platforms are more like shopping centers, where the Court found that “State may authorize access to a shopping center for expressive activity beyond that which the First Amendment recognizes.” Their use of algorithms and content curation to boost engagement does not make them “newspaper editors making considered decisions based on human judgment… about what content to include or exclude in a publication. Instead, they are more like shopping center owners anxious to connect shoppers with retailers.”
The Rutherford Institute in support of the States’ Laws
- The Rutherford Institute, a nonprofit civil liberties organization, supports the content moderation restrictions imposed by Texas and Florida laws, arguing they comply with and further the First Amendment's purposes. The Institute emphasizes the need to curb the suppression of diverse political discourse by social media platforms, sometimes under federal coercion, asserting that such laws are necessary to prevent viewpoint-based censorship and preserve the free exchange of ideas in the digital public square.
- The brief suggests that social media platforms function as common carriers rather than newspapers or traditional carriers, based on their role in offering services to the public without discrimination. It argues that, due to their market dominance and the public function they serve in facilitating free expression, these platforms should be subject to non-discrimination obligations akin to common carriers, ensuring they cannot restrict access to their services based on content or viewpoint.
- The Rutherford Institute contends that, even if the actions of social media platforms were considered speech, the regulations in Florida and Texas do not violate the First Amendment. It argues that these laws serve substantial government interests, such as preventing monopolies from suppressing speech and preserving social media as a vital forum for public discourse. The brief suggests that the laws do not burden more speech than necessary, thereby meeting the criteria for intermediate scrutiny and aligning with the principles of the First Amendment.
World Faith Foundation in support of the States’ Laws
- The World Faith Foundation (WFF), a California religious non-profit organization focused on preserving and defending religious faith and speech as guaranteed by the First Amendment, urges the Court to affirm the Fifth Circuit's decision and reverse the Eleventh Circuit's decision. WFF's argument hinges on the role of the internet, particularly social media, as the "vast democratic forums" essential for the exchange of views, emphasizing the need for freedom of expression online to support democracy and innovation. Its interest is directly related to ensuring these platforms remain avenues for free speech and diverse viewpoints, underlining the outsized and often unaccountable role that major social media platforms play in shaping public discourse.
- WFF argues that social media platforms should be treated as common carriers subject to public accommodation restrictions, contrasting with the view that they are either newspapers or common carriers in a traditional sense. This stance is based on the platforms' function as the modern "public square" open to a multitude of voices and viewpoints, comparable to a traditional public forum. WFF contends that these platforms, by virtue of their market dominance and the public function they serve in facilitating free expression, closely resemble traditional public forums subject to constitutional constraints, thereby supporting regulations that preserve free expression and reduce discrimination.
- Building on its position that social media platforms are akin to common carriers or public accommodations, WFF argues that the regulations in Florida and Texas are consistent with the First Amendment. WFF asserts that these laws aim to ensure that social media platforms, as central public forums for debate, do not engage in viewpoint-based censorship. By treating these platforms as common carriers or public accommodations, the regulations seek to preserve free expression and reduce invidious discrimination, thereby aligning with the First Amendment's goal of promoting a diverse marketplace of ideas and ensuring broad access to digital public squares.