Transcripts: Moody v. NetChoice, NetChoice v. Paxton Oral Arguments
Gabby Miller / Feb 28, 2024

On Monday, Feb. 26, 2024, the US Supreme Court heard oral arguments for Moody v. NetChoice, LLC and NetChoice, LLC v. Paxton. The cases concern similar but distinct state laws in Florida and Texas that would restrict social media companies’ ability to moderate content on their platforms.
Those arguing before the Supreme Court included:
- Henry C. Whitaker – Solicitor General, State of Florida, on behalf of Ashley Moody, Attorney General of Florida
- Paul D. Clement – Counsel, NetChoice
- Elizabeth B. Prelogar – Solicitor General, US Department of Justice
- Aaron L. Nielsen – Solicitor General, State of Texas, on behalf of Ken Paxton, Attorney General of Texas
What follows are lightly edited transcripts of oral arguments for case 22-277, Moody v. NetChoice, LLC, and NetChoice, LLC v. Paxton. Please also refer to the official oral argument audio and transcripts posted to the Supreme Court website.
Transcript for case 22-277, Moody v. NetChoice, LLC.
Chief Justice John Roberts:
We will hear argument first this morning in case 22-277, Moody v. NetChoice. Mr. Whitaker?
Mr. Henry C. Whitaker:
Mr. Chief Justice, and may it please the court. Internet platforms today control the way millions of Americans communicate with each other and with the world. The platforms achieved that success by marketing themselves as neutral forums for free speech. Now that they host the communications of billions of users, they sing a very different tune. They now say that they are, in fact, editors of their users' speech, rather like a newspaper. They contend that they possess a broad First Amendment right to censor anything they host on their sites, even when doing so contradicts their own representations to consumers.
But the design of the First Amendment is to prevent the suppression of speech, not to enable it. That is why the telephone company and the delivery service have no First Amendment right to use their services as a choke point to silence those they disfavor. Broadly facilitating communication in that way is conduct, not speech. And if Verizon asserted a First Amendment right to cancel disfavored subscribers at a whim, that claim would fail, no less than the claimed right to censorship failed in Pruneyard v. Robins and Rumsfeld v. FAIR.
Social networking companies, too, are in the business of transmitting their users' speech. Their users are the ones who create and select the content that appears on their sites. The platforms indeed disavow responsibility for that content in their terms of service. The platforms do sort and facilitate the presentation of user speech, but this court, just last term in Twitter v. Taamneh, and the platforms themselves in Gonzalez v. Google, described those tools as little more than passive mechanisms for organizing vast amounts of third-party content.
The platforms do not have a First Amendment right to apply their censorship policies in an inconsistent manner and to censor and deplatform certain users. I welcome your questions.
Justice Clarence Thomas:
Counsel, it would seem that this case is a facial challenge. And to some extent it relies on the overbreadth doctrine, but that seems to be an odd fit, since Respondent represents virtually all of the platforms, and it would be easy enough for a platform that's affected to bring an as-applied challenge. Would you comment on that, or at least address the fact that this is a facial challenge?
Mr. Whitaker:
Certainly, Your Honor. I do think that's a very significant aspect of this case. It comes to the court on a facial challenge, which means that the only question before the court is whether the statute has a plainly legitimate sweep. I actually don't understand them, Your Honor, to be making an overbreadth challenge, which as I understand it would rely on the effects on third parties. As I understand it, they're principally relying on the effects on their members. If they were bringing an overbreadth challenge, they would have to show various third-party statements.
Justice Thomas:
Well, I think how would they do that when they haven't shown that there are– there's no way that this statute can be applied that's consistent with the Constitution. Have they met that?
Mr. Whitaker:
They certainly have not, Your Honor. We think that the statute has, indeed, a plainly legitimate sweep. And certainly there are a number of the platforms that are open to all comers and content, much like a traditional common carrier. And just as a traditional common carrier, consistent with the First Amendment, would be subject to hosting requirements and nondiscrimination requirements, so too we think that the platforms that satisfy that characterization, and a number of them do, absolutely would give this statute a plainly legitimate sweep.
Justice Sonia Sotomayor:
This is such an odd case for our usual jurisprudence. It seems like your law is covering just about every social media platform on the internet. And we have amici who are not traditional social media platforms, like smartphone makers and others, who have submitted amicus briefs telling us that readings of this law could cover them. This is so, so broad, it's covering almost everything. But the one thing I know about the internet is that its variety is infinite.
So at what point in a challenge like this one does the law become so generalized, so broad, so unspecific really, that you bear the burden of coming in and telling us what exactly the sweep is, and telling us how there is a legitimate sweep of virtually all, or a meaningful swath, of cases that this law could cover, but not others? When does the burden shift to the state when it writes a law so broad that it's indeterminate?
Mr. Whitaker:
I don't think so, Your Honor. I still think it is their burden, as the plaintiffs challenging an action of a sovereign state legislature, to show that the law lacks a plainly legitimate sweep. But let me just say a word about the breadth of the law. The legislature did define the term social media platform, which is part of what triggers the law's application. That definition wouldn't cover every single website; it would cover large websites with large revenues and subscriber counts and the like. But the breadth of the law, apart from that definition, is significantly narrowed by the fact that the substantive provisions of the law regulate websites that host user-generated content. That's what the substantive provisions of the statute apply to.
Justice Sotomayor:
So let me talk about Etsy. Etsy is a marketplace, if I'm going to try to analogize it to physical space, which I think in this area is a little crazy. Because yes, in some ways this is like an online bookstore or an online magazine, online newspaper, online whatever you want to call it, an online supermarket. But it's not. Because even though it has infinite space, it really doesn't. Because viewers, myself included, or users, can't access the millions of things that are on the internet and actually get through them and pick the things we want, because there's too much information. So we're limited by human attention span. So are they.
So, our theories are a little hard, but let's look at Etsy. Etsy is a supermarket that wants to sell only vintage clothes. And so it is going to, and does, limit users' content. It's a free marketplace. It's open to everyone. But it says to the people who come onto its marketplace, "We only want this kind of product." They're going to have to censor. They're going to have to take people off. They're going to have to do all the things that your law says they can't do without all of these conditions. Why is that?
Why should we be permitting and under what level of scrutiny would we be looking at this broad application of this law that affects someone who all they want to do is sell a particular kind of product? And they have community standards and they tell you that they don't want you to curse, they don't want you to talk politics, they don't want you to do whatever. All they want you to do is sell your product. But if they're a public marketplace, which they are, they're selling to the public, this law would cover them.
Mr. Whitaker:
I think that's right, Your Honor. But let me just say a word about how the law might apply to Etsy. First of all, it wouldn't regulate the goods Etsy is offering. What our law regulates is the moderation of user-generated content. So it would only apply to Etsy to the extent that they... And I'm not sure to what extent it actually would apply to Etsy. I guess it would apply somewhat. But I guess people are uploading user-generated content in connection with the sale of goods, and that's the conduct it would regulate. It doesn't limit what goods Etsy can limit its marketplace to. Let me just say a word about that.
Justice Sotomayor:
Well, it opens it up for sale of goods and it tells its users, "Don't please speak about politics because that's not what our marketplace is about." That's viewpoint discrimination. This falls under a whole lot of your listings and bans and disclosure requirements. Why are we imposing that on something like this?
Mr. Whitaker:
Well, in Pruneyard v. Robins, Your Honor, this court held that the state of California could regulate the speech-hosting activity of a shopping mall, which was hosting speech, as an incident to-
Justice Sotomayor:
But not inside the stores. We said that they could come, but if they go inside the store, we didn't say anything about free speech, that someone could stand on a platform in the middle of the store and scream out their political message. We said in the common areas, where we're permitting others to speak, we're going to let this particular speaker say anything he or she wants. That's why I'm afraid of all of these common law rules that you're trying to analogize to.
Mr. Whitaker:
Well, Your Honor, I do think Etsy is similar insofar as it is in fact hosting speech and some expression as an incident to some other commercial enterprise. And I think that, if anything, makes Etsy's speech interest even weaker than the social–
Justice Sotomayor:
I'm out of–
Chief Justice Roberts:
Counsel, you began your presentation talking about – concerned about the market power and ability of the social media platforms to control what people do. And your response to that is going to be exercising the power of the state to control what goes on on the social media platforms. And I wonder, since we're talking about the First Amendment, whether our first concern should be with the state regulating what we have called the modern public square.
Mr. Whitaker:
Well, I think you certainly should be concerned about that, Your Honor. What I would say is that the kind of regulation that the state of Florida is imposing is one that is familiar to the law when you have businesses that have generally opened their facilities to all comers and content. This is the way that traditional common carrier regulation has worked for centuries. If you were an innkeeper and you held yourself out as open to the public, you could indeed be required to act in accordance with that voluntarily chosen business model.
So I certainly think the court should proceed carefully, but one thing I think it is important for the court to keep in mind is that when large, powerful businesses like these have undertaken to host massive amounts of speech and have the power to silence those speakers, the state has an important First Amendment interest in promoting, in ensuring, the free dissemination of ideas.
Chief Justice Roberts:
Is there any aspect of social media that you think is protected by the First Amendment?
Mr. Whitaker:
Yes, Your Honor. I can certainly imagine platforms that would be subject to this law that would indeed have First Amendment rights. We point out in our brief that if you had an internet platform that indeed had a platform-driven message, that was selective on the front end, Democrats.com, say, I think that would be a very different kind of analysis compared to a company like Facebook or YouTube, which is in the business of just basically trying to get as many eyeballs on its site as possible.
Justice Elena Kagan:
But why is it different? When we had the parade case we said, "They don't have a lot of rules, but they have some rules and we're going to respect the rules that they do have. Even though they let a lot of people come in, they don't let a few people come in. And that seems to be quite important to them." And similarly here, Facebook, YouTube, these are the paradigmatic social media companies that this law applies to, and they have rules about content. They say you can't have hate speech on the site. They say you can't have misinformation with respect to particular subject matter areas.
Somebody can say, "Maybe they should enforce them even more than they do," but they do seem to take them seriously. They have thousands and thousands of employees who are devoted to enforcing those rules. So why aren't they making content judgments, not quite as explicit as the kind in your hypothetical, but definitely content judgments about the kind of speech that they think they want on the site and the kinds of speech that they think are intolerable?
Mr. Whitaker:
Well, there's a lot in there, Your Honor. Maybe I can start with the Hurley case. I think what was going on in Hurley is that you had a parade that was-
Justice Kagan:
Maybe just start with the more general question.
Mr. Whitaker:
Sure, sure. For sure.
Justice Kagan:
I'm happy for you to talk about Hurley. I don't want to get in your way.
Mr. Whitaker:
I'll start wherever you want. It's your time, not mine, Your Honor. So, yeah. So certainly, the broader question about rules of the road and the like. Common carriers have always conducted their businesses subject to general rules of decorum. The platforms have these general rules of decorum, but for all that content moderation, that's really a product of the fact that they host so much content. The fact remains that upwards of 99% of what goes on the platforms is basically passed through without review. Yes, they have spam filters on the front end and the like, and that's not uniquely-
Justice Kagan:
But that 1% seems to have gotten some people extremely angry. The 1% that's like, "We don't want anti-vaxxers on our site," or, "We don't want insurrectionists on our site." That's what motivated these laws, isn't it? And that's what's getting people upset about them, is that other people have different views about what it means to provide misinformation as to voting and things like that. And that's the point. There's some sites that can say this kind of talk about vaccination policy is good and some people can say it's bad, but it's up to the individual speakers.
Mr. Whitaker:
The fact that some people are angry about the content moderation policies doesn't show that it's their speech. And my friends talk about their advertisers. Well, we don't know whether the advertisers think it's their speech or whether they just disagree with the speech, and advertisers and people who are angry with speech don't get a heckler's veto under Florida's law. But even more broadly than that, we know that the fact that a hosting decision is ideologically charged and causes controversy can't be the end of the game, because I think Rumsfeld v. FAIR would've had to come out the other way then. Because in Rumsfeld, certainly the law schools there felt very strongly that the military were being bigots and they didn't want them on campus. And yet this court did not look to the ideological controversy surrounding those decisions. Instead, it looked objectively at whether the law schools were engaged in inherently expressive conduct.
Chief Justice Roberts:
Well, it looked at the fact that the schools were getting money from the federal government, and the federal government thought, "Well, if they're going to take our money, they have to allow military recruiters on the campus." I don't think it has much to do with the issues today at all.
Mr. Whitaker:
Well, Mr. Chief Justice, it's difficult for me to argue with you very much about what Rumsfeld v. FAIR means, but let me just take a crack. Because I think, as I read your opinion for the court, you didn't rely actually on the funding aspect of the case to reach the conclusion that what was going on there was not First Amendment protected conduct. You were willing to spot them – that the question would be exactly the same if it were a direct regulation of speech as opposed to a funding condition. So I absolutely think that the analysis in that case directly speaks to this. And just –
Justice Brett Kavanaugh:
Can I ask you about a different precedent, about what we said in Buckley? And this picks up on the Chief Justice's earlier comment about government intervention because of the power of the social media companies. And it seems like in Buckley, in 1976, in a really important sentence in our First Amendment jurisprudence, we said that, "The concept that the government may restrict the speech of some elements of our society in order to enhance the relative voice of others is wholly foreign to the First Amendment," end quote.
And that seems to be what you responded with to the Chief Justice. And then in Tornillo, the court went on at great length as well about the power of newspapers at that time. And the court said it recognized the argument about vast changes that placed in a few hands the power to inform the American people and shape public opinion, and that that had led to abuses of bias and manipulation. The court accepted all that, but still said that wasn't good enough to allow some kind of government-mandated fairness right of reply or anything. So how do you deal with those two principles?
Mr. Whitaker:
Sure, Justice Kavanaugh. First of all, if you agree with me with our frontline position that what is being regulated here is conduct, not speech, I don't think you get into interests and scrutiny and all that. I do think that the law advances the First Amendment interest that I mentioned, but I think that interest, the interest that our law is serving, if you did get to a point in the analysis that required consideration of those interests, our interests-
Justice Kavanaugh:
Do you agree then if speech is involved, that those cases mean that you lose?
Mr. Whitaker:
No, I don't agree with that. And the reason I don't agree with that is because the interests that our laws serve are legitimate. And it's hard because different parts of the law serve different interests. But I think the one that sounds in your concern that is most directly implicated would be the hosting requirement applicable to journalistic enterprises. So one provision of the law says that the platforms cannot censor, shadow-ban, or deplatform journalistic enterprises based on the content of their publication or broadcast. And that serves an interest very similar to the interest that this court recognized as legitimate in Turner, when Congress imposed on cable operators a must-carry obligation for broadcasters.
And just as a broadcaster – and what the court said was there was not just a legitimate interest in promoting the free dissemination of ideas through broadcasting, but it was, indeed, a highly compelling interest. And so I think the journalistic enterprise provision serves a very similar interest. But there are also other interests that our law serves. For example, the consistency provision, Your Honor, is really a consumer protection measure. It's sort of orthogonal to all that. The consistency provision, which is really the heart of our law, just says to the platforms, "Apply your content moderation policies consistently." Have whatever policies you want, but just apply them consistently.
Justice Kavanaugh:
Could the government apply such a policy to publishing houses and printing presses and movie theaters about what they show? Bookstores, newsstands. In other words, be consistent in what kinds of content you exclude. Could that be done?
Mr. Whitaker:
I don't think so, Your Honor.
Justice Kavanaugh:
And why not?
Mr. Whitaker:
Well, I think that there is – the consumer – here, the social media platforms, their terms of service, their content moderation policies are really part of the terms under which they're offering their service to users. I don't think that that paradigm really fits what Your Honor is talking about. But look, we agree. We certainly agree that a newspaper, a book in a bookstore, is engaging in inherently expressive conduct. And our whole point is that these social media platforms are not like those, and why are-
Justice Ketanji Brown Jackson:
But doesn't it depend on exactly what they're doing? I guess the hard part for me is really trying to understand how we apply this analysis at the broad level of generality that I think both sides seem to be taking here. You say what is being regulated here is conduct, not speech. Well, I guess maybe if you're talking about Facebook's news feed feature, that might be speech, but then there might be other things that Facebook does that don't qualify as speech. So, don't we have to drill down more in order to really figure out whether or not things are protected?
Mr. Whitaker:
Actually, I don't think so. I think that precise line of analysis strongly favors our position, Your Honor. Because in the posture of this facial challenge, all you need to look at is whether there are at least some activities–
Justice Jackson:
No, but that's... No, no, no. I guess what I'm saying is, you mentioned the Pruneyard case, or the FAIR case, excuse me. We didn't say that law schools as a categorical matter are always engaged in unprotected speech. We looked at the particular thing. This was a job fair, and the law schools were saying, "We don't want these certain entities in it." I hear you suggesting that we can just say, "Facebook is a common carrier and therefore everything it does qualifies as conduct and not speech." And I don't think that's really the way we've done this in our past precedents. So can you speak to that?
Mr. Whitaker:
Sure. Certainly that's not what we're saying, Your Honor. I completely agree with you that it's very important to isolate what conduct each particular provision of the law is regulating.
Justice Jackson:
Not the law, the entity. What is the entity doing? We have to do an intersection of what the law says they can't do and what in particular they are doing, right?
Mr. Whitaker:
And I guess the right level of generality, the level of generality that's sufficient, I think, to conclude that the law has a plainly legitimate sweep, is that we are talking about the social networking companies' activities in content-moderating user-uploaded content. That, I think, is the relevant activity and that activity-
Justice Jackson:
All right, so what do you do with – LinkedIn has a virtual job fair and it has some rules about who can be involved. That seems to map on, I would think, to the FAIR case. Is that what you're saying?
Mr. Whitaker:
Well, I don't think so. I don't think it would map onto our theory in this case because it sounds like to me, and I'm not totally aware of all the facts of LinkedIn there, but if I understand-
Justice Jackson:
Yeah. I think that's a problem in this case. We're not all aware of the facts of what's happening.
Mr. Whitaker:
Well, exactly. And I think that that is one of the reasons why this facial challenge has been very confusing to defend, because we kind of don't know what to defend against.
Justice Neil Gorsuch:
Mr. Whitaker, on that score, in a facial challenge we have a bit of a problem because different legal principles apply in different factual circumstances, and there are many different defendants, or plaintiffs here, sorry, with different services. So that's a complicating feature on a facial challenge. But here's another one for you. What about Section 230, which preempts some of this law? How much of it, and how are we to account for that complication in a facial challenge?
Chief Justice Roberts:
Why don't you answer the question-
Justice Gorsuch:
Briefly.
Chief Justice Roberts:
–and then we'll move on.
Mr. Whitaker:
Well, I think that the court should answer the question presented, I guess.
Justice Gorsuch:
But how can we do that without looking at 230?
Mr. Whitaker:
Well, because I don't think that there's any, and some of this was briefed at the cert stage, Your Honor, I don't think that the Section 230(c)(2) preemption question is really going to dispose of the case. The district court actually reached the Section 230 issue, but concluded that it still had to reach the constitutional issue anyway.
Justice Gorsuch:
I'll get back to this in my turn. Thank you.
Chief Justice Roberts:
Thank you, Counsel. Justice Thomas, anything further?
Justice Thomas:
Mr. Whitaker, could you give us your best explanation of what you perceive the speech to be in this case, or alleged to be in this case?
Mr. Whitaker:
Well, as I understand their contention, it's this idea that the platforms, in having content moderation policies, are somehow creating a welcoming community, I guess. It seems to me at that level of generality, that can't really be a cognizable message. That seems to me more like a tautology than a message. Basically, we want the people on our sites that we want. And I think at that level of generality, certainly the Pruneyard case would have to come out the other way. Because in Pruneyard, the mall certainly wanted to ban leafleting because it wanted to create a certain environment, and yet this court said that they did not have a First Amendment right to do that.
Justice Thomas:
I think what I was more interested in is, we're using broad terms like content moderation, and throughout the briefs you have shadow-banning, deprioritizing, and all sorts of things. And I guess with these facial challenges, I always have a problem that we are not talking about anything specific. In an as-applied challenge, at least we know what's in front of us and what your interpretation, or at least the state's interpretation, of its law is in that case. Now we're just speculating as to what the law means. So I'm just trying to get more specificity as to what the speech is in this case that they are censoring. As far as I can tell, I don't know of any speech interest in censoring other speech, but perhaps there is something else.
Mr. Whitaker:
Well, I don't think that they do have, certainly not a speech interest. At most, I think that they would have some claim of allegedly inherently expressive conduct, and I take it my friends from the United States agree with that way of looking at it. But we do not think they have a message in censoring and deplatforming users from their sites any more than the law schools in FAIR had a message in booting military recruiters off campus.
Chief Justice Roberts:
Justice Alito?
Justice Samuel Alito:
Did the plaintiffs raise content, I'm sorry, overbreadth below?
Mr. Whitaker:
No, Your Honor. I couldn't find the word overbreadth in any of their pleadings.
Justice Alito:
Where in the record should I look to find a list of all of the platforms that are covered by the Florida statute?
Mr. Whitaker:
Well, Your Honor, I'm afraid that doesn't appear in the record because I think the platforms were fairly cagey about which of their members they thought the statute applied to. The record only contains three platform-specific declarations: Etsy, Facebook, and YouTube. So part of the problem in this case is that the record has not been fully developed to answer that question. We're kind of litigating in the dark here. And this was litigated on a preliminary injunction at breakneck speed without the state having a chance to take discovery. And that's part of the reason why some of these questions are difficult to answer.
Justice Alito:
Well, I'll ask Mr. Clement that question too. As to the platforms that are covered, where in the record would I look to find a list of all of the functions that those platforms perform?
Mr. Whitaker:
I'm not aware, Your Honor, of an all-encompassing list in the record of all the functions the platforms perform. There certainly are, as I mentioned, three platform-specific declarations, and also some more general declarations that talk about some of their members more generally, but it's not sort of all in one place. I apologize, Your Honor.
Justice Alito:
Does your law cover any websites that primarily or even exclusively engage in non-expressive conduct?
Mr. Whitaker:
I think it does cover websites that engage in primarily non-expressive conduct. I mean, we would characterize the social networking platforms as engaging in primarily non-expressive conduct insofar as they are hosting speech, just like a traditional common carrier is not engaged in expressive conduct in transmitting the communications of its subscribers. And we do think our law would apply to, certainly at a minimum, the largest social networking platforms.
Justice Alito:
What is the right standard for a facial challenge if we think that your law implicates a portion, a percentage of expressive conduct, and a portion of non-expressive conduct? How should we analyze that?
Mr. Whitaker:
I think that, so there's a–
Justice Alito:
So we need a numerator and a denominator there, I think. What would they be?
Mr. Whitaker:
Well, I don't think that the standard would have a numerator and a denominator, actually, Your Honor, in this context. We would view it as the question being whether the statute has a plainly legitimate sweep, without the need to compare applications. As I understand this court's precedents, the numerator-denominator comparison would be something you would do if there were an overbreadth claim in this case. But I don't understand my friends to be making an overbreadth claim. Maybe they'll say something different, but I could not find the word overbreadth in their pleadings. In the Texas case, they do have a footnote suggesting that they made an overbreadth claim in the alternative.
Justice Alito:
Thank you.
Chief Justice Roberts:
Justice Sotomayor? Justice Kagan?
Justice Kagan:
I just want to sort of understand your position, and I want to narrow this to the paradigmatic social media companies' newsfeed postings: Facebook, YouTube, Twitter slash X. So suppose that I say, just take this as a given. I mean, you can argue with the facts, but don't. Suppose that I say, for the most part, all these places say we're open for business, post whatever you like and we'll host it. But there are exceptions to that, and clearly content-based exceptions, which the companies take seriously.
So let's say they say we think that misinformation of particular kinds is extremely damaging to society, misinformation about voting, misinformation about certain public health issues. And so too, we think that hate speech or bullying is extremely problematic. And so we are going to enforce rules against this. If they're only going to apply to a small percentage of the things that people want to post, for the most part, they're open for business. But we are serious about those content-based restrictions. All right, so in that world, why isn't that a classic First Amendment violation for the state to come in and say, we're not going to allow you to enforce those sorts of restrictions, even though you're basically, it's like an editorial judgment, you are excluding particular kinds of speech.
Mr. Whitaker:
Well, Your Honor, I take your hypo to be assuming that it's First Amendment protected activity. And I think that what you would do in that instance, you would have to run intermediate scrutiny under Turner. And the analysis, regrettably–
Justice Kagan:
Don't say "I take it to be First Amendment activity." I mean, do you take it to be First Amendment activity?
Mr. Whitaker:
No, no, that's our whole point. I mean, again.
Justice Kagan:
Even though they're saying, yeah, we are a big forum for lots of messages, but not for those kinds of messages, we want to exclude those kinds of messages. Why isn't that a First Amendment judgment?
Mr. Whitaker:
I think the court held otherwise in Pruneyard, because there was an editorial policy against leafleting too. And again, I don't–
Justice Kagan:
No, that was just about leafleting and the small owner didn't have any expressive views. I'm taking as a given that YouTube or Facebook or whatever has expressive views. There are particular kinds of expression defined by content that they don't want anywhere near their site.
Mr. Whitaker:
But I think, Your Honor, you still would have to look at the objective activity being regulated, namely censoring and deplatforming, and ask whether that expresses a message. And because they host so much content, an objective observer is not going to readily attribute any particular piece of content that appears on their site to some decision to either refrain from or to censor or deplatform. And that makes-
Justice Kagan:
Do you think so as to this? This is a real-world example: Twitter users one day woke up and found themselves to be X users, and the content rules had changed and their feeds changed, and all of a sudden they were getting a different online newspaper, so to speak, in a metaphorical sense, every morning. And a lot of Twitter users thought that was great, and a lot of Twitter users thought that was horrible, because in fact there were different content judgments being made that were very much affecting the speech environment that they entered every time they opened their app.
Mr. Whitaker:
Your Honor, respectfully, that does not answer whether they have a message in their censorship, any more than, I'm sure, people objected quite strenuously to the fact that the military recruiters were permitted to interview on campus. I'm sure people wanted to ban leafleting at the mall in Pruneyard. And that does not give them a message in that. And I think the reason for that is, if they are not carefully selecting the content in the newspaper, they don't have a message in the existence, in the mere existence, of the content on their sites.
Justice Kagan:
Thank you, General.
Chief Justice Roberts:
Justice Gorsuch?
Justice Gorsuch:
Just wanted to give you a chance to finish up on the Section 230 point. I think it's section six of your law that says that the law is not enforceable to the extent it conflicts with Section 230.
Mr. Whitaker:
Sure.
Justice Gorsuch:
So why wouldn't we analytically want to address that early on in these proceedings, whether in this court or a lower court? Because that complicates our attempt to resolve things in a facial challenge.
Mr. Whitaker:
Sure, Your Honor. And I think that the reason is because the law is not facially preempted, at least, under 230(c)(2), which principally regulates takedowns. One reason for that is we understand 230(c)(2) not to sanction viewpoint-based content moderation under the rubric of "otherwise objectionable." And there's a very nice article that Professor Volokh has on this in the Journal of Free Speech Law where he lays this out, and we obviously haven't briefed this, Your Honor. The second point I would make about Section 230(c)(2) is that it only applies to good faith content moderation. So to the extent our law prohibits them from engaging in bad faith content moderation, that is absolutely not preempted by 230(c)(2). And one way to understand their constitutional claims in this case, because they have an expansive view of Section 230(c)(2), is that they are in essence asserting a constitutional right to engage in bad faith content moderation, because they already have the right to engage in a lot of moderation of illicit content under 230(c)(2) as long as they do so in good faith.
Justice Gorsuch:
And then just to follow up on Justice Kagan's line of questioning, you've analogized to common carriers and telegraphs in particular. Why is that an apt analogy here, do you think?
Mr. Whitaker:
I think it's an apt analogy, Your Honor, because the principal function of a social media site is to enable communication, and it's enabling willing speakers and willing listeners to talk to each other. And it's true that the posts are more public, but I don't think that Verizon would gain any greater right to censor simply because it was a conference call. I don't think that UPS or FedEx would gain a greater right to censor books because it was a truckload of books as opposed to one book. And so the analogy is indeed apt. Now, there's been talk of market power. Market power is not an element, I think, of traditional common carrier regulation. And indeed some entities that are regulated as common carriers, like cell phone providers, operate in a fairly competitive market.
Justice Gorsuch:
Thank you.
Chief Justice Roberts:
Justice Kavanaugh?
Justice Kavanaugh:
In your opening remarks, you said, quote, "the design of the First Amendment is to prevent suppression of speech," end quote. And you left out what I understand to be three key words in the First Amendment, or that describe the First Amendment: "by the government." Do you agree "by the government" is what the First Amendment is targeting?
Mr. Whitaker:
I do agree with that, Your Honor, but I don't agree that there is no First Amendment interest in allowing the people's representatives to promote the free exchange of ideas. This court has recognized that as a legitimate First Amendment interest in the Turner case, and going all the way back to the Associated Press case.
Justice Kavanaugh:
In the Turner case, the intervention was, the court emphasized, unrelated to the suppression of speech: the antitrust-type intervention there. So I'm not sure that when it's related to ensuring relative voices are balanced out, or there's fairness in the speech or balance in the speech, that is covered by Turner. Do you agree with that?
Mr. Whitaker:
No, I don't agree with that, Your Honor. Our interest in our law-
Justice Kavanaugh:
What did Turner mean by unrelated to the suppression of speech?
Mr. Whitaker:
We don't view our law as advancing interests that are related to the suppression of speech. We think that the interest, for example, in protecting journalistic enterprises from being censored, from MSNBC being censored because an internet platform doesn't like a broadcast it showed on its station the other day, that is just an interest in preventing them from being silenced. It's not an equalizing interest; it's giving them a chance.
Justice Kavanaugh:
On the editorial control point, you really want to fight the idea, and I understand that, that editorial control is the same thing as speech itself. And you've emphasized Pruneyard over and over again, but we have a whole other line of cases, as you're aware, of course: Hurley, PG&E, Tornillo, Turner, which emphasize editorial control as being fundamentally protected by the First Amendment. And I understood the line between Pruneyard on the one hand and those cases on the other to be whether you were involved in a speech communications business, as opposed to a shopping center owner, which is the other side of the line. Can you respond to those cases?
Mr. Whitaker:
Sure. I guess I don't dispute the general principle of editorial control. I just don't think that the social media platforms are engaged in editorial control. And again, the recruiters, the law schools, excuse me, in Rumsfeld v. FAIR argued that they were exercising editorial control when they booted military recruiters off campus, and they invoked Tornillo explicitly. And this court had none of it. So the court does need to draw a line, I think, between a selective speech host that is exercising editorial control and a speech host like a common carrier, or like the mall in Pruneyard, that can indeed be regulated and prevented from silencing its customers.
Justice Kavanaugh:
On the selective speech host point, I think you've made the point to Justice Kagan that they don't eliminate much speech. But didn't we deal with that in Hurley as well and say that the mere fact that the parade organizer usually took almost all comers was irrelevant to the First Amendment interest in essentially editorial control over who participated in the parade?
Mr. Whitaker:
Yeah. And I guess I think Hurley, Your Honor, really turned more on the fact that the activity there was a St. Patrick's Day parade with a particular expressive purpose. And so perhaps it could still be expressive and be a little bit more lenient. But I would note that this court in Hurley did, in rejecting the conduit argument, rely on the fact that there was front-end selection of the members of the parade, that the parade committee, the committee that was responsible for it, was doing front-end selection. So I do think Hurley fits our theory, but I also think that selectivity is totally relevant to who is the speaker. And we analogize in our brief to the government speech cases, where this court has made that exact point in a variety of cases such as Matal v. Tam and Shurtleff. And what you have said is that if the government is not exercising a ton of control over the speech that comes into a forum, it is not speaking and it can't censor. That's what this court held in Shurtleff.
Justice Kavanaugh:
Thank you.
Chief Justice Roberts:
Justice Barrett?
Justice Amy Coney Barrett:
Mr. Whitaker, I have a question about this editorial control, because really, when it comes to platforms that are the traditional social media platforms, like YouTube, Instagram, TikTok, Twitter slash X, it all rides, it all turns on editorial control. It seems to me that one distinction between this and FAIR is that here, these companies are speech hosts, right? I mean, the law schools in FAIR were hosting job fairs for this purpose, like online recruiting. They weren't gathering together a whole bunch of people and saying, here, present your ideas, present your posts. I mean, these social media companies are hosting speech, so why isn't that more like a newspaper in Tornillo?
Mr. Whitaker:
It is different, Your Honor. But I think that's why we've leaned also on the common carrier analogy, which I think reflects that you can't just say it's a speech host and go home. Because if that were true, Verizon could censor. Excuse me.
Justice Barrett:
Well, put aside common carrier for one second, just put common carrier to the side. Just tell me why this doesn't look like the same kind of editorial control we see newspapers exercise.
Mr. Whitaker:
Because the platforms do not review. It is a strange kind of editor, Your Honor, that does not actually look at the material that is going into its compilation. I mean, in Twitter v. Taamneh, the platforms told you that they didn't even know that ISIS was on their platform and doing things. And it is a strange kind of editor that does not even know the material that it is editing.
Justice Barrett:
Is it because it's not human eyes? I mean human eyes, not "humanized." Is it because it could be an algorithm that says we want to have, as Justice Kagan was pointing out with terms of service, this kind of site? Or some say that, for example, TikTok might have boosted pro-Palestinian speech and reduced pro-Israel speech. That's a viewpoint, right? And if you have an algorithm do it, is that not speech?
Mr. Whitaker:
Well, it might be, Your Honor. But again, in Twitter and Gonzalez, the platforms told you that the algorithms were neutral methods of organizing the speech, much like the Dewey Decimal System.
Justice Barrett:
Well, that's not what they're saying here. So let's assume that what they're saying here, that they're organizing it in ways that reflect preferences that are expressive of their terms and conditions. In that event, do you think it would be editorial control in a First Amendment sense?
Mr. Whitaker:
No, and I agree with Justice Jackson that it's important to separate the various functions, the organizing function from the hosting function. And this is a point that Professor Volokh has made in his article that we cite. I mean, simply because they are required to host certain speech, that does not actually meaningfully prevent them from organizing that speech. So I think the court has to separate out regulation of the organization from simply preventing them from censoring. And the reason, Your Honor, it is different from a newspaper, I think, comes down to two principal points. First, we've been talking a lot about selection; but second, space constraints. Space constraints are something that this court in FAIR and in Tornillo relied on as one factor that is relevant. And the social media companies don't have any space constraints, which means that a requirement to host an additional piece of content is a relatively less essential-
Justice Barrett:
Well, let me just interrupt you there. I mean, Justice Sotomayor pointed out that even though there may not be physical space constraints, there are the constraints of attention. They have to present information to a consumer in some sort of organized way, and in a limited enough amount that the consumer can absorb it. And don't all methods of organization reflect some kind of judgment? I mean, could Florida enact a law telling bookstores that they have to put everything out in alphabetical order and that they can't organize or put some things closer to the front of the store that they think their customers will want to buy?
Mr. Whitaker:
First, let me just take a step back, because one of the problems here is we don't have any information in this record on their algorithms. It's very difficult for us to pick apart what exactly the algorithms are doing. You certainly could imagine, I think, to be candid, an algorithm that could be expressive. As far as we can tell, though, if the algorithms work in the manner that this court described them in Twitter v. Taamneh, they look more like neutral ways to reflect user choice. And I don't think there's expression in that. Now, you can imagine a different kind of algorithm; if it were possible to have an algorithm that made a website look like a newspaper, that would be different. But again, I think the question of organization is analytically distinct from the separate question of whether they can be regulated in their hosting and censorship.
Justice Barrett:
Okay. So your argument that it's not expressive entirely depends on the hypothesis that the sorting and feed functions are solely some sort of neutral algorithm that's designed to reflect user preference, and that they reflect no kind of policy judgment on the part of the platform itself?
Mr. Whitaker:
No, no, not at all, actually, Your Honor. Because I think that preventing them from censoring does not meaningfully preclude them from organizing. If they're required to carry a piece of content, they can organize it however they want, generally. I mean, there are prohibitions on shadow-banning and the like, but they can generally organize it however they want. So a prohibition on censorship and deplatforming is not, I think, a meaningful interference with organizing. But again, on algorithms, I would just stress that this is a facial challenge. We don't have any particular information on what exactly the content of their algorithms is. And so I think the only question there is whether there's a possible state of the world under which the algorithms are non-expressive.
Justice Barrett:
Okay, let me just ask you one last question. It's about the facial challenge aspect of this. So Florida's law, so far as I can understand it, is very broad, and we're talking about the classic social media platforms, but it looks to me like it could cover Uber. It looks to me like it could cover just Google search engines, Amazon Web Services, and all of those things would look very different. And Justice Sotomayor brought up Etsy. It seems to me that, as they're arguing, Etsy has a recommended-for-you feed, but it also just has shops for handmade goods that you can get.
It looks a lot more like a brick-and-mortar marketplace or flea market than a place for hosting speech. Okay. So if this is a facial challenge and Florida's law indeed is broad enough to cover a lot of this conduct, which is farther away from expression than these standard social media platforms, why didn't you then, in your brief, defend it by pointing out, "Look, there's all this other stuff that's perfectly fine that Florida covers. We don't want some person who wants to sell their goods on Etsy to be suppressed because of their stuff, handmade goods that express a political view, for example"?
Mr. Whitaker:
I think we did defend the application of our law to Etsy, and I think I've defended that from the lectern, but I don't think you need to be with me at all–
Justice Barrett:
I mean pointing out, I mean, I can sit here and think of all kinds of applications of this law that really wouldn't hit expression, but I just don't understand you to have been defending the law in that way as opposed to countering the argument that the platforms are not engaged in expression.
Mr. Whitaker:
We're making both arguments, Your Honor, to be clear. As I was discussing with Justice Sotomayor, we view Etsy as not having a significant expressive interest in applying its content moderation policy.
Justice Barrett:
So is that enough to just make this whole thing fail, I guess is my question?
Mr. Whitaker:
Yes.
Justice Barrett:
If we agreed with you that Etsy, it's fine for it to apply to, or Uber, it's fine, or Amazon Web Services, if we agreed with you on all that, is that enough to just say, well, then this facial challenge can't succeed?
Mr. Whitaker:
Yes, because that would give the law a plainly legitimate sweep and that's all the court needs to address here to reject the facial challenge.
Justice Barrett:
Thank you.
Chief Justice Roberts:
Justice Jackson?
Justice Jackson:
So I feel like there's a lot of indeterminacy in this set of facts and in this circumstance, as Justice Alito, I think, tried to illuminate with his questions. We're not quite sure who it covers; we're not clear exactly how these platforms work. One of the things I wanted to give you the chance to address is the lack of clarity about what the statute necessarily means. You've talked about the consistency provision, for example, and you've represented what you think it means, but we don't have a state court determination interpreting that provision, do we?
Mr. Whitaker:
You do not, Your Honor. In fact, the law was not allowed to go into effect, so the Florida courts have not had an opportunity to construe this statute at all. And I think that counsels strongly in favor of rejecting the facial challenge, because this court has considered, in the Washington State Grange case, the fact that the state courts have not had an opportunity to construe a state law that's being attacked on its face as a reason to reject–
Justice Jackson:
Can I ask you, do you think this statute could be susceptible to multiple interpretations? I mean, I can imagine even the consistency provision. What does it mean that they have to do this consistently? They have to apply the same standards, or it has to substantively result in the same level of preference? I could imagine you could interpret that either more narrowly or more broadly.
Mr. Whitaker:
There certainly may be some interpretive questions, Your Honor. On that point, I don't think there is any ambiguity. And let me just read to you what the consistency provision says. It says, "A social media platform must apply censorship, deplatforming, and shadow-banning standards in a consistent manner among its users on the platform." And the censorship, deplatforming, and shadow-banning standards are the things that the social media company must, under a separate provision of the law, publicly disclose, which was a disclosure requirement that the 11th Circuit upheld.
Justice Jackson:
Yes, I understand. I mean, I appreciate that Florida's position is that our law is perfectly clear, but-
Mr. Whitaker:
Well, but I think that language I just read to you makes clear that the baseline for comparison is not some abstract notion of fairness.
Justice Jackson:
All right, well, let me ask you this about that. All right, so let's assume we get to the point where we disagree with you about whether or not expressive activity is covered, and we're actually applying or trying to determine which standard applies, that is, the level of scrutiny. What I'm a little confused about is how we evaluate, for example, the 30-day restriction with respect to determining whether it's content-based or content-neutral. I appreciate that on its face it doesn't point to a particular type of content, but I suppose it's applied in reference to content. I mean, that restriction is that a regulated entity can only change its rules, terms, and agreements once every 30 days. But we would have to look at what it was before and what it is now to determine if there's a change. So is that a content-based restriction or not?
Mr. Whitaker:
Certainly not. I mean, this court held a couple of terms ago in the City of Austin case that simply because a regulation requires consideration of content doesn't make it content-based, and there's nothing on the face of that provision that targets any particular message of the platforms. And I think, just to zoom out a little bit on the 30-day provision, I mean, that provision is really an adjunct to the consistency provision, as I understand it. And the point of it is that it wouldn't do much good to require the platforms to apply their policies consistently if they could just sort of constantly change them. And that, I think, is the point.
Justice Jackson:
I understand, but in the application of even the consistency provision, to determine whether or not they're doing it consistently, aren't we also looking at content to some extent? I think it's not necessarily as easy as it might seem to determine whether these provisions are actually content-based or content-neutral.
Mr. Whitaker:
Well, again, I don't think the fact that it requires consideration of content makes it content-based. I think you would look at whether it's targeting some kind of a message of the platform, and there's nothing on the face of the 30-day provision that does that, Your Honor.
Justice Jackson:
Thank you.
Chief Justice Roberts:
Thank you, counsel. Mr. Clement?
Mr. Paul D. Clement:
Mr. Chief Justice, and may it please the court. Florida's effort to level the playing field and to fight the perceived bias of big tech violates the First Amendment several times over. It interferes with editorial discretion, it compels speech, it discriminates on the basis of content, speaker, and viewpoint, and it does all this in the name of promoting free speech, but it loses sight of the first principle of the First Amendment, which is that it only applies to state action. Florida defends its law, as you've heard this morning, principally by arguing that there's no expressive activity being regulated. That blinks reality. This statute defines the targeted websites in part by how big their audience is. It regulates the content and display of particular websites, and it tries to prevent my clients from censoring speakers and content. If you are telling the websites that they can't censor speakers, you can't turn around and say you're not regulating expressive activity.
It's all over this law. And that brings it squarely within the teaching of Tornillo, PG&E, and Hurley. All three of those cases teach that you cannot have the forced dissemination of third-party speech, and they reject considerations of market power, misattribution, or space constraints. And Reno and 303 Creative make clear those principles are fully applicable on the internet. Indeed, given the vast amount of material on the internet in general and on these websites in particular, exercising editorial discretion is absolutely necessary to make the websites useful for users and advertisers. And the closer you look at Florida's law, the more serious the First Amendment problems become. It singles out particular websites in plain violation of Minneapolis Star. Its provisions that give preferences to political candidates and to journalistic enterprises are content-based in the extreme. I welcome the court's questions.
Justice Thomas:
Mr. Clement, if the government did what your clients are doing, would that be government speech?
Mr. Clement:
So it might be government speech, but I think it would be unconstitutional government speech, which is to say, when the government, I mean, obviously you have government speech cases, but what the government would be doing is exercising editorial discretion to censor some views or some speakers and not others. I think that plainly violates the First Amendment, and I think that's essentially the thrust of this court's decision in the Manhattan Community Access case against Halleck, which is that, in this area, looking for state action is absolutely critical. There are things that, if the government does them, present a First Amendment problem, and that, if a private speaker does them, we recognize as protected activity.
Justice Jackson:
Mr. Clement, you – Oh, sorry.
Justice Thomas:
Can you give me one example of a case in which we have said the First Amendment protects the right to censor?
Mr. Clement:
So, I don't know that the court used that particular locution, Justice Thomas, but I think that is the thrust of Hurley, that is the thrust of PG&E, that is the thrust of Tornillo. In all of those cases, a private party did not want to convey and disseminate the speech of a third party. And in every case, the government said, "No, we have some really good reason here why this private party has to disseminate the message of a third party."
Justice Thomas:
I've been fortunate or unfortunate to have been here for most of the development of the internet. And the argument under Section 230 has been that you're merely a conduit, which was the case back in the '90s and perhaps the early 2000s. Now you're saying that you are engaged in editorial discretion and expressive conduct. Doesn't that seem to undermine your Section 230 arguments?
Mr. Clement:
With respect, Justice Thomas, I mean, obviously you were here for all of it. I wasn't here for all of it. But my understanding is that my clients have consistently taken the position that they are not mere conduits. And Congress, in passing Section 230, looked at some common law cases that basically said, well, if you're just a pure conduit, that means that you're free from liability. But if you start becoming a publisher by keeping some bad content out, then you no longer have that common law liability protection. And as I understand 230, the whole point of it was to encourage websites and other regulated parties to essentially exercise editorial discretion, to keep some of that bad stuff out of there.
And as a result, what Congress said is, they didn't say, "And you're still a conduit if you do that." No, it said, "You shouldn't be treated as a publisher," because Congress recognized that what my clients were doing would, in another context, look like publishing, which would come with the kind of traditional defamation liability. And they wanted to protect them against that, precisely to encourage them to take down some of the bad material that if these laws go into effect, we'd be forced to convey on our websites.
Justice Jackson:
Mr. Clement, can I ask you about the facial nature of this? Because my understanding is that to strike down this statute as facially unconstitutional, we would have to conclude that there's no possible way for this law to govern these entities in their conduct. So first, do I have the standard right?
Mr. Clement:
With all due respect, I don't think so. In the First Amendment context, as my friend was indicating, the question is whether or not the statute has a plainly legitimate sweep. So it's not Salerno, where if there's one little application somewhere, that's enough to save the statute.
Justice Jackson:
But I mean, whose burden is that? I thought it was your burden to say that this statute, in almost all of its applications, or in most or a substantial number or something, would be unconstitutional, in order to get it facially stricken.
Mr. Clement:
So, two things, your Honor. I think our burden would be, it would be our burden to say that this statute doesn't have a plainly legitimate sweep. In fact, it is our position, and we did make this argument below and succeeded, that this statute actually has no constitutional application. And part of that is because none of this statute, at least none of the part that's in front of you today, applies unless you are a covered website. And the website-
Justice Jackson:
But wait, I don't understand. I'm sorry. So, no application, but we have so many different applications of the law in this situation, precisely because it is so broad. So, how can you say that?
Mr. Clement:
Because the statute only applies to a handful of websites that meet the viewership threshold or the total sales threshold. And it's not our only argument, obviously, but one of our arguments is, you can't regulate expressive activity in that kind of –
Justice Jackson:
And those websites only–
Justice Alito:
Mr. Clement, does the Florida law cover Gmail?
Mr. Clement:
The Florida law, I think by its terms, could cover Gmail.
Justice Alito:
All right. So, does Gmail have a First Amendment right to delete, let's say, Tucker Carlson's or Rachel Maddow's Gmail accounts if they don't agree with his or her viewpoints?
Mr. Clement:
They might be able to do that, your Honor. I mean, that's obviously not something that has been the square focus of this litigation, but lower courts will-
Justice Alito:
Well, if they don't, then how are we going to judge whether this law satisfies the requirements of either Salerno or overbreadth?
Mr. Clement:
So, again, I think it's the plainly legitimate sweep test, which is not synonymous with overbreadth. But in all events, this statute applies to Gmail, if it applies at all, because it's part of Google, which qualifies over the threshold, and it doesn't apply to competing email services that provide identical services. That alone is enough to make every application of this statute unconstitutional.
Justice Alito:
Doesn't apply to – go ahead.
Justice Kagan:
How could that be, Mr. Clement? It's not unconstitutional to distinguish on the basis of bigness, right?
Mr. Clement:
It is when you're regulating expressive activity. That's what this court said in Minneapolis Star. So, the statute in Minneapolis Star was unconstitutional in all its applications.
Justice Kagan:
You're saying, suppose there were no issue here that this is really a subterfuge, that they were trying to get at a certain kind of media company because of its views. And the only issue was, it's not worth it to regulate a lot of small sites; we only want to go after the big sites that actually have many millions of users. You think that's a First Amendment violation?
Mr. Clement:
I do. The way you're asking the question suggests you think that's a harder case than the one I actually have before you.
Justice Kagan:
I think that's a little bit of an impossible case to say you can't go after big companies under the First Amendment.
Mr. Clement:
All you have to do is go after all the social media websites or all of the websites. You don't have to draw these artificial distinctions that just so coincidentally happen to coincide with the websites that you think have a bias that you are trying to correct. And just to remind you of how the statute-
Justice Kagan:
Right. But I took that out of the question. Let's say that they weren't going after these companies because of bias or because they thought they had a slant. It was just, we're going after the biggest companies because those are the companies with the biggest impact and the most number of users. How could that be a First Amendment violation?
Mr. Clement:
Because Minneapolis Star says it is, because Arkansas Writers’ Project says it is. And because, if you actually got to analyzing their so-called consumer protection interest, the consumer protection interest would be exactly the same for a website with 99 million global users as it would be for a website with 100 million global users. And so, I think there are red flags over all of the distinctions drawn in the statute. And then, if you look at the statute more closely, I mean, my goodness, the political candidates provision says that you can't censor posts by or about a political candidate. I can't imagine anything more obviously content-based than that. That's unconstitutional in every one of its applications.
Chief Justice Roberts:
Is there any aspect of the service provided on the social platforms that is not protected under the First Amendment, or that is plainly valid under the First Amendment?
Mr. Clement:
I think it's all protected by the First Amendment. I mean, obviously-
Chief Justice Roberts:
Direct messages?
Mr. Clement:
I think direct messages are protected under the First Amendment. I think that the courts that have looked at things like whether Gmail is a common carrier have actually held, and there's a case involving the RNC that has a specific holding, that Gmail is not a common carrier. I think much of the logic of that would apply to direct messaging. Obviously, if this were a statute that tried to address my clients only to the extent that they operated a job board, this would be a lot closer to FAIR, and I might have a harder case.
Justice Gorsuch:
So, Mr. Clement, the government says your brief sometimes errs in suggesting that conduit-type activity is always expressive. And direct messages, Gmail, I take it your view then is that providers can discriminate on the basis of political views, religious beliefs, maybe even race?
Mr. Clement:
So, Justice Gorsuch, I think you have to distinguish between two things. One is sort of status-based discrimination, and the other is status as a speaker. And so, I don't think that our clients could discriminate and say, "You can't be on our service. You can't even get access to our service," on the basis of race.
Justice Gorsuch:
But in how they use it and their speech, I'm talking about the content of their speech. If it has something to do with religion or politics or race, you can editorialize and use that editorial power to suppress that speech, right?
Mr. Clement:
So, I think that gets to a very hard question. I think it would be speech, but I think it's just-
Justice Gorsuch:
So, the answer is yes, we can delete emails, we can delete direct messages that we don't agree with based on politics, religion, or race.
Mr. Clement:
Probably, in that application. But I do think, look, a bookstore, if it wants to have a display this month to celebrate Black history, can it limit that display just to African-American authors? I think the answer is probably yes.
Justice Gorsuch:
And so, it is here too, right?
Mr. Clement:
I think the answer is that there's at least First Amendment activity going on there, and then you would apply the equal protection clause to it, and then you would decide whether or not that's permissible or not. But obviously, I think this case involves editorial decisions at its heart. And one thing I just want to make clear on the facial challenge point just so you understand how this case came to be. As you heard today, my friend's principal argument is, this doesn't cover expressive activity at all. And in the lower court, when we sought a preliminary injunction, they put all their eggs in that basket.
And they specifically said, "Look, we don't want to do intermediate scrutiny at the preliminary injunction stage, so we really only have an argument to resist this preliminary injunction if you hold that this is not expressive activity." And they did the same thing in the 11th circuit. We have a footnote in our brief, making it clear on the pages exactly where they did this. So, they basically said, "We either want to win this on the threshold question that this is not expressive activity, or we don't want to get into the rest of it at this point. We'll have some discovery and we'll have the preliminary injunction."
Justice Alito:
Mr. Clement, does the Florida law apply to Uber?
Mr. Clement:
Its definition would seem to apply to Uber, yes.
Justice Alito:
So, you've told us that it's okay for your clients to discriminate on the basis of viewpoint in the provision of email services, or in allowing direct messages, messages from one Facebook user to another on a private facility. How about Uber discriminating on the basis of viewpoint with respect to people that its drivers will pick up?
Mr. Clement:
So, I think the way that-
Justice Alito:
Is that okay?
Mr. Clement:
I don't think that's okay, and I don't think Uber is interested in doing that. I think the way the statute would apply to Uber, just to make clear, is that it really would apply to comments on the drivers, or a comment section, something like that. And Etsy, I think, is the same way. Etsy has an ability for you to put comments on the seller, whether they did a nice job or a bad job, and Etsy doesn't want certain comments on there, and they want to clean that up to keep it a better place for people to come and look at materials.
So, when you think about the applications of this statute to some of the things that seem less obvious, it's really focused on that expressive aspect of it. But obviously, the core of the statute and the motivation for the legislation, and the examples that my friends from Florida include in their own petition appendix are about much more expressive activity by the YouTubes and the Facebooks of the world, excluding certain speakers, and they want to override that classic editorial decision.
Justice Barrett:
But Mr. Clement, that's one of the things that's hard for me about this case: let's posit that I agree with you about Facebook and YouTube and those core social media platforms. Don't we have to consider these questions Justice Alito is raising about DMs and Uber and Etsy, because we have to look at the statute as a whole? And I mean, we don't have a lot of briefing on this, and this is a sprawling statute, and it makes me a little bit nervous. I'm not sure I agree with you about DMs and Gmail. It's just not obvious to me, anyway, that they can't qualify as common carriers.
Mr. Clement:
Look, I agree you don't want to decide all of that today, but this is not here on sort of final judgment. It's here on a preliminary injunction. And the question is, do you want this law, with all of these unconstitutional applications, enforced by every Floridian? These provisions are enforceable by every Floridian, who can go into court and get $100,000 in civil penalties.
Now, do you want this law, completely antithetical to the First Amendment, to go into effect while we sort out all these anterior questions? Or do you want it to be put on hold while we litigate all of this stuff, and if it turns out there's a couple of applications that are okay, or somebody wants briefing just on the question of whether direct messaging is a common carrier, all that?
Justice Barrett:
Can you escape that in this posture?
Mr. Clement:
Absolutely you can escape that in this posture. You affirm this preliminary injunction, which is in place. If you want to, you can point to the clear litigation judgment that Florida expressly made below, which is, "We're not going to get into all of that intermediate scrutiny stuff. We don't want a record on that. We're going to put all our eggs in the expressive activity basket," and they could not have been more clear about that below and in the 11th circuit. And then you say, "This law, which has all of these First Amendment problems, this wolf comes as a wolf, we are going to put that on hold," and then we can sort out some of these tertiary-
Justice Alito:
Well, if that's the case, Mr. Clement, to what extent is it the result of your own litigation decisions? You could have brought an as-applied challenge, limited to the two platforms that you want to talk about, Facebook and YouTube. But instead, you brought a facial challenge, and you claim that it's also susceptible to analysis under overbreadth. So, to get a preliminary injunction, you had to show you had a probability of success on your facial or overbreadth challenge.
Mr. Clement:
And we did.
Justice Alito:
You can't now shift and say, "It was a good preliminary injunction because it's fine as-applied to the platforms I want to talk about, and let's forget about all the other platforms that might be covered."
Mr. Clement:
Well, Justice Alito, first of all, we did all that and we won. Second of all-
Justice Alito:
Did you bring an as-applied challenge?
Mr. Clement:
No, we didn't bring an as-applied challenge because we think this statute is unconstitutional in all its applications.
Justice Gorsuch:
Exactly. And so, you suggested it could be sorted out on remand. But on remand, it's still a facial challenge.
Mr. Clement:
It is still a facial challenge, you're right.
Justice Gorsuch:
And so, again, you think all of the applications are unconstitutional?
Mr. Clement:
I do because the definitions are problematic. The terms-
Justice Gorsuch:
So, there's nothing to sort out on remand. It's done. If you should prevail on a preliminary injunction here, I mean, for practical purposes, it's finished. And so, there is no opportunity to sort out anything on remand.
Mr. Clement:
There's the whole merits. What we've shown is a likelihood of success on the merits. We haven't won on the merits yet.
Justice Gorsuch:
All or nothing.
Justice Jackson:
Can I try it another way? I mean, I asked you before what was the standard, and now you're saying that you think that all applications are unconstitutional, which I think is your burden to establish. So, if we come up with some scenarios in this context, in which we can envision it not being unconstitutional, why don't you lose?
Mr. Clement:
First of all, that's not the standard, with all due respect. I mean, this court has never applied the Salerno standard in a First Amendment case. And this would be the worst First Amendment case in this court's history if you started down that road, because you can always put some provision into a statute that's innocuous, and then you say, "Well, there's a couple of fine things in there." You look at it section by section, and these sections are pernicious from a First Amendment standpoint. You can't censor content about a political candidate? There's no constitutional application to that.
Chief Justice Roberts:
Thank you, counsel. Just so I understand precisely, your position is that the only issue before us is whether or not what is regulated qualifies as, not to beg the question, expression, whether the activity before us is speech?
Mr. Clement:
I think that's one way to put it. Obviously, you have two questions presented. You're going to be able to decide whatever you think is fairly included in those questions presented. I'm just pointing out that as an artifact of the way my friends litigated the case, you do not have a record on everything that might be interesting for intermediate scrutiny, and it's not my fault. It is based precisely on their representations to the courts below, that they did not want to get into the intermediate scrutiny thing. They wanted to tee up the expressive activity issue.
Chief Justice Roberts:
If the appropriate standard is not Salerno, could you articulate what you think is the appropriate standard?
Mr. Clement:
I think the appropriate standard is whether the statute that implicates the First Amendment has a plainly legitimate sweep.
Chief Justice Roberts:
Okay. Justice Thomas?
Justice Thomas:
Could you, again, explain to me why, if you win here, it does not present a Section 230 problem for you?
Mr. Clement:
If we win here, we avoid Section 230 problems, I think, your Honor, and the reason is that 230 is a protection against liability. It's a protection against liability because Congress wanted us to operate as publishers. And so, it wanted us to exercise editorial discretion, so it gave us liability protection. But liability protection and First Amendment status don't go hand in hand. I don't think the parade organizer in Hurley was responsible for the parade floats that went into its parade. Historically, newsstands and others aren't responsible for the materials they carry. So, I don't think you have to say it's one or the other. I mean, I think the 230 protection stands alone.
Justice Thomas:
So, what is it that you are editing out that fits under Section 230?
Mr. Clement:
So, in some of these, I mean, it depends on, in some cases, it is terrorist material. In other cases, it's kids that are telling other kids, "Hey, you should do this Tide Pod challenge." In some cases, it's kids that are encouraging other kids to commit suicide. There's a whole bunch of stuff that we think is offensive within the terms of 230 that we're exercising our editorial discretion to take out.
Justice Thomas:
Well, but 230 does not necessarily touch on offensive material. It touches on obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable. Do you think-
Mr. Clement:
It's that last one.
Justice Thomas:
Well–
Mr. Clement:
I mean, we can have a fine debate about that last one, about how much work it does, sort of, what's the Latin for the company you keep, and all of that. I mean, we could have that fine debate in some other case, but we would certainly take the position that we're protected in those judgments.
Justice Thomas:
Well, I think you'd make that ejusdem generis doctrine do a lot of work. But let's put that aside. Tell me again exactly what the expressive conduct is that, for example, YouTube engages in when, oh, I'm sorry, Twitter deplatforms someone. What is the expressive conduct, and to whom is it being communicated?
Mr. Clement:
So, when they, let's say deplatform somebody for violating their terms of use or for continuing to post material that violates the terms of use, then they are sending a message to that person and to their broader audience that that-
Justice Thomas:
How would you know someone's been deplatformed? Is there a notice?
Mr. Clement:
Typically, you do get a notice of that, and there's a provision-
Justice Thomas:
No, I mean the audience, the other people.
Mr. Clement:
Well, they're going to see that they're not there anymore. They're no longer in their feed. And presumably–
Justice Thomas:
Well, but the message could be they didn't want to be there anymore. They're tired of it, they're exhausted.
Mr. Clement:
Well, and here's the thing. I mean, that message is then going to be carried over. This isn't just about who gets excised from the platform. It's all about what material people see on their individualized feeds when they tap into Facebook or Twitter or YouTube. And what they're not going to see is material that violates the terms of use. They're not going to see a bunch of material that glorifies terrorism. They're not going to see a bunch of material that glorifies suicide.
Justice Thomas:
Is there any distinction between action or editing that takes place as a result of an algorithm as opposed to an individual?
Mr. Clement:
I don't think so, your Honor. These algorithms don't spring from the ether. They are essentially computer programs designed by humans to try to do some of this editorial function-
Justice Thomas:
Well, but what do you do with this deep learning algorithm, which teaches itself and has very little human intervention?
Mr. Clement:
You still had to have somebody who kind of created the universe that that algorithm is going to look at.
Justice Thomas:
So, who's speaking then? The algorithm or the person?
Mr. Clement:
I think the answer in these cases would be that Facebook is speaking, that YouTube is speaking, because they're the ones that are using these devices to run their editorial discretion across these massive volumes. And of course, they're supplementing it with lots and lots of humans as well. But the reason they have to use the algorithms is the volume of material on these sites, which just shows you the volume of editorial discretion.
Justice Thomas:
Yeah. And finally, I'm sorry to keep going, Mr. Clement. Exactly what are they saying? What is the algorithm saying? I don't know, I'm not on any. But what is it saying? Is it a consistent message? Usually, when we had Hurley, it was their parade and they didn't want certain people in their parade. You understood that. What are they saying here?
Mr. Clement:
They're saying things like, "Facebook doesn't want pro-terrorist stuff on our site."
Justice Thomas:
We're not talking about terrorists here. Those aren't – Terrorists aren't complaining about it.
Mr. Clement:
Well, I think actually, we are talking about terrorism here, because I think if these laws go into effect-
Justice Thomas:
I thought that was a crime. I mean, as I understood Florida, they said that the one provision in the act says, "Nothing that's inconsistent with Section 230." It seems to me that it is consistent with Section 230.
Mr. Clement:
So, your Honor, there are things, like if you have a video on how to build a bomb to blow up a church or something, maybe that's prohibited by sort of that kind of illegality provision. But if there's something glorifying the attacks of October 7th and one of these companies wants to keep that off of its sites, or if there's something on there that sort of glorifies incredibly thin teenage bulimia and they want to keep that off their site, they have the right to do that. And that's an important message. And just like in Hurley, the message that they are sending is a message about what they exclude from their forum.
Chief Justice Roberts:
Justice Alito?
Justice Alito:
There's a lot of new terminology bouncing around in these cases, and just out of curiosity, one of them is content moderation. Could you define that for me?
Mr. Clement:
So, look, content moderation to me is just editorial discretion. It's a way to take all of the content that is potentially posted on the site and exercise editorial discretion in order to make it less offensive to users and advertisers.
Justice Alito:
Is it anything more than a euphemism for censorship? I'm going to just ask you this. If somebody in 1917 was prosecuted and thrown in jail for opposing US participation in World War I, was that content moderation?
Mr. Clement:
So, if the government's doing it, then content moderation might be a euphemism for censorship. If a private party is doing it, content moderation is a euphemism for editorial discretion, and there is a fundamental difference between the two.
Chief Justice Roberts:
For editorial discretion, are you affirmatively saying... Never mind. No further questions. Justice Sotomayor?
Justice Sotomayor:
Mr. Clement, I'm now sort of trying to take all of this in, and I think that I came into this very differently than you have. I came into this thinking there are different functionalities by websites. So, some host news, like the newsfeed in Facebook; some host, like Justice Barrett and others were talking about, Gmail, or where they're just letting people contact each other, direct messaging. And I was thinking that, since I think, rightly, this law seems to cover all of that, that it's so broad, it might have some plainly legitimate sweep.
It might be okay to require direct messaging to give you notice, to be consistent, to pay attention to the 30-day restriction. Some of these provisions might be okay for those functions. But you're saying to me, that's not true. Can you articulate very succinctly why you think, at this stage, on a facial challenge, we can say there is no plainly legitimate sweep under which this particular law, after we sort it all out below, will still survive? Now, I think the court below said, and you tried to take that out in answering Justice Kagan, maybe I don't want to. Okay. Is it because this law was passed with viewpoint discrimination in mind? That's what the court below said.
Mr. Clement:
The court below said that, and that would be a sufficient basis to take out the whole law. The law is also shot through with content-based provisions. I think that's enough to take out the whole law. The entire law, every provision we challenge, is also speaker-based in its limited reach. And what this court's cases clearly say, including NIFLA, which my recollection is was a facial challenge, is that when you look at speaker-based distinctions, you can then open the lens a little bit and see if those speaker-based provisions are infused with viewpoint discrimination or other discriminatory influences.
And if you do that here, I mean, you don't have to get past the governor's official signing statement to understand the restrictions in this statute. I mean, it's one thing to say, "Well, they're only getting the big companies," but when the governor's telling you we're going after the viewpoints of the Silicon Valley oligarchs, then all of a sudden, limiting it to the biggest companies starts to tell you that this is targeted like a laser beam at the companies whose editorial discretion they don't like.
Chief Justice Roberts:
Justice Kagan?
Justice Kagan:
I mean, let me ask the same kind of question in a different way. Suppose that, instead of this law, you had a law that was focused, it excluded the kind of curated news feeds where your argument about editorial discretion sort of leaps out. So, this law didn't touch those. But it said, with respect to Gmail and direct messaging and Venmo and Dropbox and Uber, with respect to all of those things, a site could not discriminate on the basis of viewpoint, just as maybe a site couldn't discriminate on the basis of race or sex or sexual orientation or what have you. So, it just added viewpoints to the list. Wouldn't that be all right?
Mr. Clement:
I actually don't think it would be all right, because all of those things are still in the expressive business. And I also think-
Justice Kagan:
Well, do you think that, suppose it didn't say viewpoint, it just said, "You can't discriminate on the basis of all the usual protected characteristics." Is that all right?
Mr. Clement:
That would probably be all right, but it wouldn't save the whole statute from being-
Justice Kagan:
Well, so this is just on this statute. It's a statute about, it excludes YouTube and Facebook, the Facebook newsfeed. But it's just direct messaging, Venmo, all of those kinds of things. And it just said, "We're not going to let you exclude on the basis of race and sex, and we're also not going to let you exclude people on the basis of viewpoint."
Mr. Clement:
So, I mean, the first part of that statute, I don't think my clients would even challenge. I mean, whether there's an abstract First Amendment right to have the Black authors table for Black History Month.
Justice Kagan:
And also on the basis of viewpoint.
Mr. Clement:
When you throw viewpoint into there, then I think, I'd have to ask my clients whether they challenge that statute. But obviously, that's not the statute we have here. And if you think about-
Justice Kagan:
I guess, what I'm saying is, in part, it is the statute you have here, and that gives you your plainly legitimate sweep. Because all it's saying is that when you run a service where you're not speaking, unlike in Facebook feed, where your editorial discretion argument is good because the platform is engaged in speech activities, well, when you're running Venmo, you're not engaged in speech activities. And so, when a state says to you, "You know what? You have to serve everybody irrespective of whether you like their political opinions or not," then it seems you have a much less good argument. But this statute also says that, doesn't it?
Mr. Clement:
Not really, Justice Kagan. I think we're in danger of losing sight of the actual statute. So, let me take you to Petition Appendix 97A and the definition of "censor" used in the statute. Censor includes any action taken by a social media platform to delete, regulate, restrict, edit, alter, inhibit the publication or republication of, suspend a right to post, remove, or post an addendum to any content or material posted by a user. The term also includes actions to inhibit the ability of a user to be viewable by or to interact with another user of the social media platform. Censor is all about the expressive activity. Post-prioritization is all about it. It specifically talks about a newsfeed, a feed, a view, search results. And they give, essentially, political candidates and journalistic enterprises a right to sort of non-discrimination. So, they're going to pop up there even though, like, "I have no interest in politics, I just want to look at feeds about Italian bicycles. And I'm still going to get these Florida politicians popping in there?"
That's what this statute does. And then, you go to shadowban. Shadowban is not about any of the things you're talking about. Shadowban is all about content. And then we go to journalistic enterprises; they get pride of place. Then we talk about post-prioritization; that's all about how you display the content. So, maybe the 30-day provision, you could sort of say that, well, that applies to Uber. But even then, if Uber wants to change its comment policies because, all of a sudden, they did one thing to try to deal with one set of issues, and then a problem comes up and there's a whole bunch of people using the comments in a really rude way, why couldn't they change their editorial policy on the comments? I just don't understand it.
And then, all of the duty to explain provisions, the duty to explain provisions are all driven by decisions to exclude conduct and content. And that happens a billion times a quarter at YouTube. So, it's a crushing blow. It has nothing to do with some of the other things you're talking about.
Justice Kagan:
Thank you.
Chief Justice Roberts:
Justice Gorsuch? Justice Kavanaugh?
Justice Kavanaugh:
Just to pick up on the word censorship, because I think it's being used in lots of different ways. So, when the government censors, when the government excludes speech from the public square, that is obviously a violation of the First Amendment. When a private individual or private entity makes decisions about what to include and what to exclude, that's generally protected editorial discretion, even though you could view the private entity's decision to exclude something as "private censorship."
Mr. Clement:
Absolutely. That was the whole thrust of this court's decision in Halleck, and I suppose the Hurley case might've been a completely different case if that had been an official City of Boston parade and the City of Boston had decided to exclude the group. The whole reason that case came down the way it did, unanimously, is because it was a private organization exercising its First Amendment right to say, "We don't want GLIB in our parade."
Justice Kavanaugh:
How does 303 fit into that?
Mr. Clement:
Well, I think 303 is just further evidence. I mean, where 303 is most relevant is that Colorado in that case tried to rely on FAIR, much the way my friends here rely on FAIR. And this court made clear in 303 Creative, no, it doesn't work that way. This is expressive activity. And the fact that my friends' best case is FAIR, I think, just shows how radical this statute is, because this targets expressive activity at its core. If the Solomon Amendment had said to the law schools, "You have to give the military equal time in the classroom," I think the case would've been 9-0 the other way, and that's essentially what Florida's trying to do here.
Justice Kavanaugh:
And then, on the procedural posture, I think this is important to try to understand what's exactly before us. And you've gotten questions on this, but I want to nail it down for my benefit, which is, you said that they came in and opposed a PI solely on the ground that what was involved here was not expressive activity or speech, but instead conduct. Is that accurate?
Mr. Clement:
That's accurate. It came up in the context of how much discovery we were going to have before we had the preliminary injunction hearing. And in that context, the state said, "Look, we're going to sort of rest on this kind of threshold question," as my friend said, and, "That will limit discovery on both sides." And then, in the 11th circuit, it was even more clear, because in the 11th circuit, the position of the state of Florida was, "We're not going to really engage on intermediate scrutiny at all. We're putting all our eggs in the expressive activity basket."
Justice Kavanaugh:
So, if we think that the statute does target expressive activity in some respects, and we affirm in this case, what is left, to Justice Gorsuch's question? What's left to happen? That just means it can't go into effect for the next year or two until a final judgment. What will happen in the litigation?
Mr. Clement:
So, there'll be litigation on the merits. I don't even think we're past the point where we could amend. So, if this court tells us we'd sure better have an as-applied challenge in there, I suppose we could do that. But the point is, the litigation will go on, there will be discovery, unless Florida decides at that point that the writing's on the wall and tries to pass a narrower statute. But otherwise, there would be discovery, there would be essentially the whole nine yards. But in the interim, and I just can't emphasize enough, particularly that $100,000 civil penalty provision.
Justice Kavanaugh:
All that's before us then is what should happen in the interim before final judgment, and it comes back to us potentially a year or two from now. Should it be in effect or not be in effect until it comes back to us, correct?
Mr. Clement:
Yeah, if it comes back to you. Yes.
Justice Kavanaugh:
If it came back to us or it goes to the Court of Appeals. And what will happen, I mean, you've alluded to it, but what will happen in that year, do you think? Because I don't think we've heard much about exactly what you're concerned about. In other words, you're very concerned about this, that's obvious. But what are the specifics of that?
Mr. Clement:
Well, I mean, honestly, if this statute goes into effect, we'd sort of have to fundamentally change our business models. And I think each company is going to make its own judgment about how it would come into compliance. I think part of the irony here is that they say this is going to promote speech, but they also allow us to discriminate on the basis of content as long as we do it consistently. So, what we might do in the interim, at least some of these companies might do, is just, well, let's do only puppy dogs, at least in Florida, until we can get this straightened out.
Because that's the one way that – because these same companies are getting hammered by people who say we're not doing enough to keep material that's harmful to children off of these sites. And yet, these laws make it impossible for us to keep material that's harmful to children off of our sites unless we take so much material off of our sites that nobody can say that we're being inconsistent or discriminating. In Texas, it's viewpoint discrimination.
Justice Kavanaugh:
Could you just say a word about the word consistency, what you think that entails?
Mr. Clement:
I have no idea, and one of the other arguments we have in this case, it's just not part of the preliminary injunction you have before you, is a vagueness challenge. And I think, when you're targeting editorial discretion, to put a consistency requirement on it, I mean, if you tried to tell the New York Times to be consistent, I mean, I haven't met anybody who thinks the New York Times is 100% consistent in its editorial policy. But if you put a state-law requirement on them that they editorialize consistently, where somebody can sue them for $1,000 or the state can haul them into court, I think that would be the most obvious First Amendment violation in the world.
Justice Kavanaugh:
Thank you.
Chief Justice Roberts:
Justice Barrett?
Justice Barrett:
I have a practical question. So, let's assume that I agree with you about YouTube and Facebook feeds, newsfeeds, but that I don't want to say that Facebook Marketplace or Gmail or DMs are not within the statute's plainly legitimate sweep. If I asked you the question, can you still win, I know that you'll say yes. But how would we write that opinion, given the standard, without having to canvass whether all of those things would be within the plainly legitimate sweep?
Mr. Clement:
Honestly, I'm not, well, I'm not sure you could reach that result without definitively holding that that stuff is within the plainly legitimate sweep of the statute. You don't have the record for that, in part because of litigation decisions that were made by the state of Florida. So, I think what you would do is you would affirm the preliminary injunction, and then you would perhaps lament the fact that the record here is somewhat stunted. And then, you would make clear that there might be a possibility to modify the preliminary injunction on remand. And at that point, I think, when the lower court sees all the details about how these things actually operate, it might not have the same skepticism that you are starting with.
But I think there's lots of ways to write the decision that keeps the... And again, what's in place right now is a preliminary injunction for the benefit of my clients. So, people that haven't sued yet, I mean, the statute in theory could apply to them. But my clients have the benefit of a preliminary injunction while this litigation goes forward. And obviously, anything this court says in its opinion that suggests what the future course of that litigation should be is going to be powerfully effective in terms of how this case gets litigated in the district court.
Justice Barrett:
Thank you.
Chief Justice Roberts:
Justice Jackson?
Justice Jackson:
So, Mr. Clement, I just want to push back for a minute on the private versus public distinction. I mean, I think we agree that the government couldn't make editorial judgments about who can speak and what they can say in the public square. But what do you do with the fact that now, today, the internet is the public square? And I appreciate that these companies are private companies, but if the speech now is occurring in this environment, why wouldn't the same concerns about censorship apply?
Mr. Clement:
So, two reasons, your Honor. I mean, one is, I really do think that censorship is only something the government can do to you. And if it's not the government, you really shouldn't label it censorship. It's just a category mistake. But here's the second thing. You would worry about this if websites, like the cable companies in Turner, had some sort of bottleneck control where they could limit your ability to go to some other website and engage in speech. So, if the way websites worked was somehow that if you signed up for Facebook, then Facebook could limit you to only 19 other websites and Facebook could dictate which 20 websites you saw, then this would be a lot more like Turner.
But as this court said in Reno, in 1997 when it was confronted with an argument about the then fresh Turner decision, this court basically said the internet is the opposite of Turner. There's so much information out there, it's so relatively easy to have a new website come on. And reality tells us that, right? X is not what Twitter was, and TikTok came out of nowhere.
Justice Jackson:
All right, I think I get your point. Let me just ask you about the illegitimate sweep point. So, what is illegitimate about a government regulation that attempts to require these companies to apply their procedures consistently? I guess I don't understand why the enforcement of sort of anti-discrimination principles is illegitimate.
Mr. Clement:
So, consistency as a government mandate, when what is being regulated is expressive activity, is, I think, a clear First Amendment violation. And I don't think, I mean, some of these judgments are very tricky judgments. Okay, well, we're going to take some of the stuff sort of celebrating October 7th off, but we want to have some-
Justice Jackson:
All right. Well, what about a straightforward one, right? I understood that one of these was, no candidate can be deplatformed. That seems pretty straightforward.
Mr. Clement:
Right. And I think it's great–
Justice Jackson:
Right. And so, why isn't that enforcing anti-discrimination principles? If somebody's a candidate for office, they can't be deplatformed.
Mr. Clement:
So, that means they can't be deplatformed no matter how many times they violate my clients' terms of use. No matter how horrible their conduct, no matter how much they misrepresent in their speech, we still have to carry it. And not just carry it; under this statute, we have to give it pride of place. And it doesn't take much to register as a candidate in Florida. And so, this gives a license to anybody, even somebody who's only going to poll 2% in their local precinct, to post anything they want. They can cause us to fundamentally change our editorial policies and ignore our terms of use.
Justice Jackson:
Thank you.
Chief Justice Roberts:
Thank you, counsel. Solicitor General Prelogar?
Solicitor General Elizabeth B. Prelogar:
Mr. Chief Justice, and may it please the court, the First Amendment protects entities that curate, arrange, and present other people's words and images in expressive compilations. As this court's cases have held, those principles cover newspaper editors, parade sponsors, and web designers. They also cover social media platforms. Those platforms shape and present collections of content on their websites, and that inherently expressive activity is protected by the First Amendment. That doesn't mean, though, that every business that transmits speech can claim First Amendment protection for that conduct. For example, telephone and delivery companies that carry speech from point A to point B aren't shielded by the First Amendment when they provide that service.
But that's because they're not producing any expression of their own, not because there's some kind of common carrier or communications-company exception to the First Amendment. None of this is to say that social media platforms are immune from government regulation, and governments at every level obviously have an important interest in facilitating communication and the free exchange of ideas. But in promoting that interest, governments have to stay within the bounds of the First Amendment, and these state laws, which restrict the speech of the platforms to enhance the relative voice of certain users, don't withstand constitutional scrutiny. I welcome the court's questions.
Justice Thomas:
Normally, you are defending regulations, but if the US government did exactly what these petitioners or respondents are doing, would that be government speech?
General Prelogar:
So, if I'm understanding the hypothetical correctly, Justice Thomas, if you're suggesting that the government itself would open a forum and allow users to post messages on that, I think that that would implicate First Amendment principles because the government might be creating something like a public forum where it would itself be bound by the Constitution. I don't think that that would all necessarily qualify as the government's own speech. But the critical difference here, of course, is that these platforms are private parties, they're not bound by the First Amendment as an initial matter.
Justice Thomas:
Mr. Clement said the difference is that if the government does it, it is censoring; if a private party does it, it is, I forget, content moderation. These euphemisms bypass me sometimes, or elude me. Do you agree with that distinction?
General Prelogar:
Yes. I mean, the critical difference is that, as Justice Kavanaugh observed, the government's bound by the First Amendment. And so, if it were to, for example, dictate what kind of speech has to appear and in what order, that could create a First Amendment violation. But here, it's the private platforms themselves that are making that expressive choice. And our recognition here is that they're creating their own expressive product in doing so. These are websites that are featuring text elements, speech elements, photos, videos, and the platforms which are private parties not bound by the Constitution, are deciding how they want that to look, what content to put on it and in what order. That's an inherently expressive activity.
Justice Thomas:
What are they saying?
General Prelogar:
So, it depends on the platform, the various value judgments that are embodied in its content moderation standards. I think there's a wide variety in the kind of content that the platforms deem objectionable, the kind of content they think might be harmful or will drive away users and advertisers. There's no one single message that each platform is conveying. But I guess, if you wanted to look at the lowest common denominator, at the very least, it seems like their content moderation policies embody a judgment of, "This is material we think might be of interest to our users, or that the users will find interesting and worthy of looking at."
So, it's a lot like the parade in Hurley in that circumstance, where the court specifically said, "Maybe you're lenient, you let a lot of content in. You can't identify a single discernible message from the parade as a whole, but there is still the baseline of the parade sponsor signaling, this is something that's worthy of looking at in my parade."
Justice Gorsuch:
General, you indicate in your brief that NetChoice sometimes errs by suggesting that the dissemination of speech is always expressive activity. And I just wonder how we're supposed to deal with that, if I agree with you, in this facial challenge context? And particularly when many of the platforms, while reserving the right to prohibit various kinds of posts, most of which are consistent with Section 230, also say and guarantee users a right to express their ideas and opinions freely, I'm quoting from one of them. And even if the platform disagrees, and they say that they do not endorse and are not responsible, again, I'm quoting from some of these terms of service. Sure sounds a lot like a conduit, doesn't it?
General Prelogar:
So, I think there is a big difference between a pure conduit, the kind of company that is quite literally engaged in carrying speech, transmitting it, whether that's across the telephone wires or via telegraph or on a delivery truck like UPS and FedEx, a big difference between that kind of conduit and what the platforms are doing here, because they're not just literally facilitating users' ability to communicate with other users. Instead, they're taking that speech and arranging it and excluding it.
Justice Gorsuch:
But some of them are promising that they're not going to interfere, and they're promising you get to express your views freely and openly, and they're representing that your views don't represent theirs, and everybody understands that. And those are their terms of service. And this is a facial challenge, again, and I just think separating the wheat from the chaff here is pretty difficult. Can you help us with that?
General Prelogar:
Sure. And I think, looking at their terms of service, it's certainly true that many of the platforms have generally indicated that they welcome a wide variety of views. But it would be incorrect to say that they are holding themselves out as forums for all possible speech. Those same terms of service contain the kind of editorial policies that are at issue here. And the state laws are narrowly targeted on the kind of speech the platforms want to include. So, it wouldn't be implicated-
Justice Gorsuch:
Yes, I acknowledge that their terms of service also include the right to exclude certain speech, but those are usually like the Section 230 things, the way they discuss it, the lewd, lascivious, obscene, blah, blah, blah, blah. And after that, they do seem to promise a whole lot of latitude. And when you look at classic common carriers, it's very similar. They don't give up the right to exclude certain activities or speech that might be detrimental to their business or that might be otherwise regulated. That holds true for telegraphs, it holds true for telephones even. But, beyond that bare minimum, they're open to all comers. And that seems to be how a lot of them are representing themselves, to the public at least.
General Prelogar:
The key difference, though, with common carriers, the kinds of industries that have traditionally been regulated, those in the transportation sector, railroads, some of the communications companies, and so forth, is that they're not creating any kind of expressive speech product in providing their service. And so, government regulation that says, "Don't discriminate based on policy"-
Justice Gorsuch:
Well, the telegraph companies argued just the opposite back in the day.
General Prelogar:
But I think that those things–
Justice Gorsuch:
And they lost.
General Prelogar:
–fail because, although they are transmitting the messages, they aren't themselves creating any speech on the side.
Justice Gorsuch:
Oh, they said they were. In fact, they curated a lot of the speech, or tried to, including political speech, which they didn't agree with.
General Prelogar:
I think it's wrong to call that curation. It's certainly true they tried to adopt certain discriminatory calls.
Justice Gorsuch:
Well, whatever euphemism one wishes to choose.
General Prelogar:
But they weren't taking that speech out and putting it into a compilation that's expressive. That's the difference here.
Justice Gorsuch:
On that... Okay, okay. So, if the expression of the user is theirs because they curate it, where does that leave Section 230? Because the protection there, as I understood it, and Justice Thomas was making this point, was that Section 230 says, "We're not going to treat you as publishers, so long as you are not, it's not your communication in whole or in part," is what the definition says. And if it's now their communication in part, do they lose their 230 protections?
General Prelogar:
No, because I think it's important to distinguish between two different types of speech. There are the individual user posts on these platforms, and that's what 230 says that the platforms can't be held liable for. The kind of speech that we think is protected here under the First Amendment is not each individual post of the user, but instead the way that the platform shapes that expression by compiling it, exercising this kind of filtering function, choosing to exclude–
Justice Gorsuch:
Let me interrupt you there, I'm sorry. I understand that it's not their communication in whole, but why isn't it their communication in part, if it's part of this larger mosaic of editorial discretion and the whole feel of the website?
General Prelogar:
Well, I don't think that there is any basic incompatibility with immunizing them as a matter of Congress' statutory choices and recognizing that they retain First Amendment protection.
Justice Gorsuch:
Isn't the whole premise, I'm sorry, the whole premise of Section 230, that they are common carriers? That they're not going to be held liable in part because it isn't their expression? They are a conduit for somebody else's?
General Prelogar:
No, not at all, Justice Gorsuch. I think, to the extent that the states are trying to argue that Section 230 reflects the judgment, that the platforms aren't publishing and speaking here, there would've been no need to enact Section 230 if that were the case. Congress specifically recognized the platforms are creating a speech product. They are literally, factually publishers, and Congress wanted to grant them immunity, and it was for the purpose of encouraging this kind of editorial discretion. That's the whole point of the Good Samaritan blocking provision, 230(c)(2)(A).
Chief Justice Roberts:
General, there's been a lot of talk about the procedural posture of the case, how it was litigated below, what's available if it goes back when it goes back. I'd like your views on that.
General Prelogar:
Yes. So, we presented our arguments in this case taking the way it had been litigated at face value. And what that means is that, below, Florida treated this law as though its central provision and scope were focused on the true social media platforms, the websites you have in mind when I use that term, things like YouTube and X and Facebook. And Florida's presentation to the lower courts was, this law isn't a regulation of their speech at all, and so it's valid. So, I understand the force of the questions that the court has been asking today about whether there are other types of websites that might be covered. Could this extend to direct messaging? We don't really have a dog in that fight to the extent that there are those other applications of the law out there. That's not how Florida sought to defend it.
And to Justice Barrett's question, what should the court do with this? It's been litigated one way, and now it looks like maybe there are other possible applications you would have in mind. I would urge the court to take a really narrow approach here. Florida defended this law on the basis that it could control what the true social media platforms are doing with respect to their expressive websites. And if I were the court, I would really want to reserve judgment on the application to e-commerce sites, to companies like Uber, which don't seem to be creating a comparable type of expressive product. And I think the court could save those issues for another day or for further factual development in this case, while looking at the decision on the record that was created based on those litigation judgments by the parties.
Chief Justice Roberts:
Justice Thomas, anything further? Justice Alito?
Justice Alito:
Yeah. I'm baffled by your answer to the Chief Justice. Didn't Florida argue that a preliminary injunction should not be issued because the plaintiffs had not shown that they were likely to succeed on their facial challenge? Did they not make that argument?
General Prelogar:
They made that overarching argument, but they didn't go further and say, "And the reason for that is because here's direct messaging."
Justice Alito:
All right. Well, do you think that issue is not before us?
General Prelogar:
I think it would be hard for the court to figure that issue out, because there's a real lack of clarity.
Justice Alito:
Oh, well, it may be hard for us to figure out, but my question was, is the issue before us?
General Prelogar:
I think that the way Florida litigated this case makes it difficult to say that the issue is properly before you. Usually, the court holds a party to the arguments that were pressed below and that were passed upon below, and no court in this case has considered questions about other types of platforms or other types of functionalities.
Justice Alito:
If the record is insufficient to allow us to comfortably decide whether the facial challenge standard or an overbreadth standard is met, isn't that the fault of the plaintiffs? And isn't the remedy to vacate and remand, for all of that to be fleshed out? And that would not mean – it would not say anything necessarily about what will happen in the near future. It would mean that it would be litigated, and perhaps if the plaintiffs developed the record in the way that Florida thinks they should, and provided a list of all of the NetChoice members who are covered by this, and went through all of the functions that they perform, and assessed whether the law is unconstitutional in every application or whether it has a legitimate scope that is constitutional, then they would be entitled to a preliminary injunction.
General Prelogar:
So I certainly don't want to resist the idea that, if this court thinks those issues are properly before it and affect the analysis of the facial challenge, notwithstanding the way the parties litigated the case – I don't want to stand in the way of that. I do think there would be a lot of value, though, in the court making clear that, with respect to Florida's defense of this law in the lower courts – namely, the idea that the state really can control the curation and editorial function of the true social media platforms with respect to their expressive product – that seems to me a type of provision that is invalid in all of its applications with respect to those platforms.
Justice Alito:
Could I just ask you to comment on a few things? I understood Mr. Clement to say that access to the email function could be denied on the basis of viewpoint, and that direct messaging could be denied on the basis of viewpoint. Do you agree with that?
General Prelogar:
No, we disagree with that. We think that both direct messaging and email service seem a little more like the pure transmission of communications. So we would likely put those in the box of the phone company, the telegraph company, internet service providers, and so forth. We don't think that that's an inherently expressive product in the same way as the main website that has the newsfeed and that's curating the stories and deciding how to prioritize them.
Justice Alito:
Do you agree that discrimination on the basis of bigness violates the First Amendment?
General Prelogar:
No, I don't think that on its own, simply trying to regulate, based on the size of a company, is always a First Amendment problem.
Justice Alito:
Do you agree that a private party cannot engage in censorship? Let me give you an example. Suppose that a private law school says that any student who expresses support for Israel's war with Hamas will be expelled. Would that be censorship or would that be content moderation?
General Prelogar:
So I think–
Justice Alito:
It's a private party.
General Prelogar:
Yeah. So I guess the first order question would have to be, is there some regulation that prohibits the law school from acting in that way? So if you're thinking about a public accommodations law, for example-
Justice Alito:
No, I'm just talking about terminology.
General Prelogar:
Colloquial terminology?
Justice Alito:
That's not censorship, that's content moderation.
General Prelogar:
I think that–
Justice Alito:
Because it's a private party.
General Prelogar:
–the semantics of it don't matter. You could say that the parade in Hurley was censoring the GLIB contingent that wanted to march or that the newspaper, in Tornillo, was censoring the candidate who wanted to publish his speech. I think that the particular word you use doesn't matter. What you have to look at is whether what's being regulated by the government is something that's expressive by a private party. And here we think you have that.
Justice Alito:
Well, I mean, the particular word that you use matters only to the extent that some may want to resist the Orwellian temptation to re-categorize offensive conduct in seemingly bland terms. But anyway, thank you.
Chief Justice Roberts:
Justice Sotomayor?
Justice Sotomayor:
General, I think I'm finally understanding the argument, but let me make sure I do, okay? When I came in, I had the reaction Justice Alito did, which is we should vacate and remand. And I have been thinking about what that does to the preliminary injunction. Because I agree with you: as I understand it, what the state did below was to say, "We don't have to offer you any justification for any part of our law, because every one of these social media companies is a common carrier." And I think what's clear from our questioning is that that's not true, that there are many functions that are expressive that we can't say are common carriage.
But even if we did say they were like common carriers, the issue would be one of what's the level of scrutiny. And the state said, "There's no level of scrutiny we're going to address." They basically said, "We can do anything we want to common carriers," and to any of the expressive platforming or deplatforming things. But I don't even think that's true. I'm not sure they can do any of these things, or some of these things, even to common carriers, if it is a content or viewpoint exclusion.
So a common carrier doesn't have to permit unruly behavior; it can throw somebody off the train if they're threatening somebody else or doing other things. So I guess what you're saying is: keep the injunction in place – affirm on the preliminary injunction – but vacate and remand on the application of this law, based on what level of scrutiny applies given the function that's at issue, correct?
General Prelogar:
So we do think that the court should hold the parties to the way they litigated this case and teed it up for the court's review. And it's uncommon for the court to start considering new arguments that weren't presented by the party defending its law below. But, if I can respond for a moment on the common carrier point, Justice Sotomayor, because I think you've put your finger on a really important response here to many of the arguments that Florida's making.
They suggest that the designation of a platform as a common carrier or not has some talismanic significance, but it's completely irrelevant to answering the First Amendment question, because it's not like companies that are treated as common carriers have no First Amendment rights with respect to their expressive activities. You can take a railroad, like Amtrak, and you can regulate it as a common carrier with respect to the transportation of passengers, but if it creates some kind of magazine for those passengers to peruse, that's entitled to full First Amendment protection.
And the reason that the non-discrimination mandate in the common carrier scenario usually poses no problem under the First Amendment is there's no speech or expressive activity in carrying passengers or in carrying communications. It's entirely different with respect to the activity that Florida is seeking to regulate, because that is inherently expressive. It's putting together, literally, a website with pictures and video and text and arranging it. And that looks just like the protected editorial and curatorial activity the court has recognized in other cases. So whether you say they're a common carrier or not, we think is entirely beside the point.
Chief Justice Roberts:
Justice Kagan?
Justice Kagan:
I think I want to try again on this question of where does this leave us? Because suppose that I agree with pretty much what you said. Let's just take that as an assumption, which is, when Florida is trying to regulate Facebook newsfeed, well it can't do that because Facebook newsfeed is itself providing a speech product. But, when Florida is trying to regulate Gmail, well maybe it can do that, because Gmail is not in the business of providing that sort of speech product.
And if you take it, and if we again assume that this statute covers a variety of things that are Gmail-like – direct messaging, and Uber, and things that are not creating speech products. And we have this First Amendment doctrine that says if you can find a legitimate sweep, we can't strike something down facially. But you don't really want to allow this law to go into effect, because of the unconstitutional applications that you're talking about, with respect to all these companies that are creating speech products. What do we do?
General Prelogar:
So, I guess, if you were confident that the state law had these applications and that the particular provisions would regulate the kinds of companies that you're referring to, that aren't creating an expressive speech product, then I think that that would poke holes in the theory of facial invalidity. But I don't think you can have that certainty because that's not how Florida litigated this case below. It's not as though it said, "This statute is not invalid on its face because it applies to Gmail."
Justice Kagan:
I take the point, we could just say, "Gosh, we can't even think about those questions, because this was litigated in a certain way." So that's one option. But suppose we think it's pretty obvious that this covers a lot of stuff that does not look like Facebook's feed, and suppose we can take notice of that – then what?
General Prelogar:
Okay, so I think, at that point, what I would do if I were the court is make clear that, with respect to the issues Florida did present and that the 11th Circuit and the district court resolved, Florida's wrong to say that it can apply these provisions to the social media companies that are engaged in creating an expressive product – make that much clear. Otherwise, I think if the court just vacates and sends it back, it'll be right back up here in an emergency posture again, on an as-applied basis with respect to one of those companies. So I think the court can decide that much. That was the issue that was litigated below and decided.
And then if you think that there are some additional questions about the scope of the Florida law and whether it might have valid applications, along the lines we've been discussing, I don't have a particular interest on behalf of the United States in what you do with the preliminary injunction. In the meantime, I think there's a lot of force to the idea that this is backed up by $100,000 in penalty, per violation, and that could have a huge chilling effect on any protected speech out there that's occurring. But, I think the court could say there are some unresolved issues about concrete applications of this law and await further factual development on that.
Justice Kagan:
Thank you.
Chief Justice Roberts:
Justice Gorsuch?
Justice Gorsuch:
This is a facial challenge, right? It's an all-or-nothing deal. How is a court supposed to make as-applied rulings in a facial challenge on remand?
General Prelogar:
I would do it based on the party presentation principle and the fact–
Justice Gorsuch:
No, I got the first point.
General Prelogar:
Yeah. I might run out of options beyond that, Justice Gorsuch.
Justice Gorsuch:
Yeah. After the first one–
General Prelogar:
I agree that these are hard questions.
Justice Gorsuch:
Right. It's the first one you–
General Prelogar:
No, I suppose you could certify to the Florida Supreme Court the unresolved issues of Florida law, if you think that that is necessary to actually reach a disposition in this case.
Justice Gorsuch:
Okay, thank you.
Chief Justice Roberts:
Justice Kavanaugh?
Justice Kavanaugh:
I just want to follow up on Justice Alito's questions, and he'll have the opportunity, since this is continuing, to follow up on mine if he wants to. But I think he asked a good, thought-provoking, important question and used the term Orwellian. When I think of Orwellian, I think of the state, not the private sector, not private individuals. Maybe people have different conceptions of Orwellian, but the state taking over media, like in some other countries. And in Tornillo, the court made clear that we don't want to be that country, that we have a different model here and have since the beginning, and we don't want the state interfering with these private choices.
Now, Tornillo then dealt with, and this is my question, Tornillo dealt with the idea, well, newspapers have become so concentrated and so big that maybe we should have a different rule. And Tornillo, in the court's opinion, Chief Justice Burger's opinion for a unanimous court, talked about those changes. I mentioned those before. He says those changes "have placed in a few hands the power to inform the American people and shape public opinion. The abuses of bias and manipulative reportage are said to be the result of vast accumulations of unreviewable power in the modern media empires. In effect, it is claimed, the public has lost any ability to respond. The monopoly of the means of communication allows for little or no critical analysis of the media."
And then, though, he says, "From this premise, it is reasoned that the only effective way to ensure fairness and accuracy and to provide for some accountability is for government to take affirmative action." And then he goes on and explains, no, we're not going to do that, the First Amendment stands against that: "However much validity may be found in these arguments, at each point the implementation of a remedy calls for some mechanism, either governmental or consensual. And if it's governmental, this at once brings about a confrontation with the express provisions of the First Amendment.
"Compelling editors or publishers to publish what that which reason tells them should not be published is what is at issue in this case." And so he says, for the court in 1973, "No, we don't have a big exception to the idea that the First Amendment distinguishes the state from the private sector and private individuals." Now, here's my question. We're 50 years later, how does that principle, articulated in Torneo, apply to the current situation, the current bigness?
General Prelogar:
So I think that Tornillo does establish a bright-line proposition that the state, even if it has these concerns about market power, and dominance, and control, cannot directly overtake the editorial function and prevent a private party that's creating an expressive product from making those kinds of judgments about how to present that product. But, at the same time, I think that there are legitimate concerns here about the power and influence that social media platforms wield.
And I want to emphasize, it's not like the government lacks tools to deal with this, it's not as though it can't regulate at all. There is a whole body of government regulation that would be permissible, that would target conduct, things like antitrust laws that could be applied, or data privacy, or consumer protection, things that we think wouldn't come into any conflict with the First Amendment at all. And even in a situation where the government does think that it's necessary to regulate in a manner that's going to affect protected speech rights, that's not the end of the inquiry.
You still have a chance as the government to establish that your regulation can pass constitutional muster, like it did in the Turner case that you were referring to earlier. So I want to be very clear that we are not suggesting that governments are powerless to respond to some of the concerns that Justice Alito mentioned. I think one natural place to go as a government is to disclosure – to ensuring that, if you think that platforms have Orwellian policies, you at least make sure users have information about how they're acting and what their policies are. The generalized disclosure requirements here were not invalidated by the lower courts and aren't before this court.
Justice Kavanaugh:
On Turner, the key was content neutral there, right?
General Prelogar:
Yes. So Turner concluded that the interest, the governmental interest-
Justice Kavanaugh:
Or one key.
General Prelogar:
... that was asserted there, as you put it, was unrelated to the suppression of expression. And the problem here, my friend suggested that Florida has precisely the same interest, but here, the interest that Florida has asserted in affecting these content moderation choices is to change the speech on the platforms. It doesn't like the way that the platforms are moderating content and it wants them to create a new expressive product that reflects the state's judgments about what should go on the website, whether that's candidate speech, or speech by journalistic entities, or otherwise. And that is just not an interest that's unrelated to the suppression of expression. So we think the court should apply intermediate scrutiny here and find that the state can't get out of the starting gate with that interest.
Justice Kavanaugh:
Thank you.
Chief Justice Roberts:
Justice Barrett?
Justice Barrett:
General, I asked Mr. Clement, at the end, this practical question, which Justice Kagan also asked you. And so, I just want to be sure that I'm understanding, maybe, exactly your answer to Justice Kagan. It was different than Mr. Clement's to me. You were pointing out to Justice Kagan that if we just vacate and send it back, it's going to be right up here in an emergency posture, on an as-applied challenge. So you are encouraging us to address at least this question of whether the Facebook newsfeed or YouTube, et cetera, is expressive.
But if I think there are real problems with some of these other applications which may be legitimate, do you think it's an option to say that we think that some of these editorial applications would be unconstitutional? But, because we don't know about these other applications, they might be within the statute's legitimate sweep, that we're going to vacate and remand anyway, and send it back for the court to sort out all those other applications?
General Prelogar:
So I think that would be one possible approach here. I want to express strong agreement with the instinct that I think underlies that question – that the court shouldn't do more than is necessary here with respect to the types of applications that we've been discussing, e-commerce, Gmail, or websites, or email servers, and that kind of thing. I do think they present a really distinctive set of issues. And so, if you think that those issues are properly in this case, I don't think the court has received the briefing, frankly, to try to take a stab at resolving them, but it seems like it would be a reasonable thing to do to send it back for further factual development and consideration by the lower courts.
Justice Barrett:
Okay. And one other question, and this is about Section 230. When you were talking to Justice Gorsuch, you were pointing out the distinction between the post and the post's content, for which the platform would not be liable, and then the feed. And you were saying, well, the speech that is the platform's is not what's on the post, and the platform can't be liable for that. So could a platform be liable, then, say, if its algorithm or its feed boosted things like, say, the Tide Pod challenge? That's different. Is that within Section 230?
General Prelogar:
Yeah. So I think that this is a difficult issue about how 230 might apply with respect to decisions that the platform is making itself, with respect to how to structure its service. And I want to be careful here, because I have to confess that I haven't gone back recently to look at the brief we submitted in the Gonzalez case last term, that I think touched on some of these issues. But I do think that there are circumstances where, of course, if the thing that's causing harm is the platform's own conduct in how it structures its service, that's something that might not be immunized under Section 230.
I think all of this is separate and apart from the First Amendment issue in this case, though. Because, here, whether or not you think that recognizing that they have a speech product affects the proper interpretation of the statute under 230 – meaning that there are some situations where they won't have immunity – that is a completely distinct question from whether they are creating a speech product that warrants First Amendment protection.
Justice Barrett:
I totally agree, but I also think there are a bunch of landmines here. If what we say about this is that this is speech that's entitled to First Amendment protection, I do think that then has Section 230 implications for another case. And so it's always tricky to write an opinion when you know there might be landmines that would affect things later.
General Prelogar:
Yes, and I certainly would think the court could try to carefully cabin that and make clear that it's not opining on the specific statutory terms in Section 230 or on whether this First Amendment characterization of the expressive compilation fits within the provision that Justice Gorsuch cited earlier about creating speech in whole or in part. And the court could very clearly outline that in its decision, to try to caution lower courts away from conflating those two issues.
Justice Barrett:
Thank you.
Chief Justice Roberts:
Justice Jackson.
Justice Jackson:
General, I hear you struggling valiantly to set aside other kinds of applications, in response to a number of the questions. And, I guess, I can't figure out why those other applications aren't in this case. I mean, I think Florida defended the law as NetChoice challenged it and NetChoice brought a facial challenge. And I had understood that to mean, I mean, first, I was a little surprised that the government's brief didn't focus on that, but I had understood that to mean that NetChoice, number one, bears the burden in this case.
And number two, that NetChoice has to, I guess, Mr. Clement and I had a difference of opinion as to how you say it, but that burden is to show that there's either no valid application of this law or that the law has a legitimate sweep. So if we can identify other valid applications, if we see worlds in which Uber and money services or whatnot could be regulated, I don't understand why that just doesn't mean that NetChoice has not met its burden. And so that's the answer.
General Prelogar:
Well, I think you would have to conduct it at a more granular level, Justice Jackson. Because it's not just about what the universe of platforms out there is and what functionality they offer. You'd really have to parse the challenged provisions of the Florida law and ask, "Are those platforms engaged in any of the relevant conduct?"
Justice Jackson:
I agree with you–
General Prelogar:
And I think-
Justice Jackson:
–100%. But the question is, isn't it NetChoice's burden to have presented the case to us in that way? If we don't have that information, again, I say, "Don't they lose?"
General Prelogar:
So, I want to say again that we don't have a particular stake in how you think about their own litigation decisions, on both sides, but this case very much was teed up in the lower courts as being all about what they called the big three social media companies. That's clearly the central aim of this law. It was focused not on the Ubers of the world and their comment boxes, but on the core function of creating an expressive website that principally contains user-generated components, the text and the photos and so forth.
And the provisions that are challenged here are the ones that are focused on the type of editorial discretion that those types of platforms are engaged in. So I don't think it's as easy to say maybe we can look in the dark recesses of this law and peek around a corner and find some possible valid application. That's not how Florida sought to defend the law, and I think it would go down a complicated road to allow the core provisions of the statute to effect–
Justice Jackson:
I understand, General, but the confusion I think is that the law on its face is really broad. We've said that, and other people, many people, have noticed that it could apply to all sorts of things. And yet, you say it was litigated below as if it was narrow. I appreciate that, but we have a facial challenge on the table. And to the extent the entire law goes, then I suppose maybe these other lawful applications would go, too. And isn't that problematic when you're talking about facial challenges?
General Prelogar:
Well, you are looking at this in the posture of a preliminary injunction. So I don't think that the court is definitively resolving and issuing the final say on exactly what the status of this Florida law is. But, look, I want to agree I have some sympathy here. In preparation for this argument, I've been working with my team to say, "Does this even cover direct messaging? Does this even cover Gmail?" And we've been trying to study the Florida law and figure it out ourselves.
We think there's a lot of ambiguity about exactly what the state law provisions require. I don't think, though, that that's a basis not to resolve the central issue in the case, which is with respect to what we know the state law does. It would require these social media platforms, which are creating the compilation of third-party speech, to fundamentally alter the product that they're offering. We think that's an infringement of speech, and the court should say so.
Justice Jackson:
Thank you.
Chief Justice Roberts:
Thank you, Counsel. Rebuttal, Mr. Whitaker?
Mr. Whitaker:
First, on the procedural posture: the fact that there's no record in this case is entirely NetChoice's fault. It was NetChoice who insisted in district court on litigating the PI very fast. In fact, we actually wanted to slow it down and take discovery. And we actually even offered to voluntarily stay the law while we did that, and NetChoice said, "No, we want to go fast." And the district court obliged them, and went fast. There was no meaningful opportunity to take discovery. And in fact, when we appealed, we tried to say, "Hey, let's litigate this case while it's on appeal and do discovery." And they said, "No, we want to stay discovery even while it's on appeal," and the district court obliged. So the fact that there's no record in this case is not Florida's fault. It is NetChoice's fault.
Second, there are clearly constitutional applications of this statute. And contrary to what my friend said, it does apply to Uber. He read you the definition of censorship at 97a, and right before that is the definition of deplatforming. And if Uber deplatforms a user, that is covered by our law. If Uber says to a journalistic enterprise, "I don't like the cut of your jib, the broadcast you did last week," that is covered by our law. And so that is something that is there.
And there are also messaging functions – it's not just Gmail, it's also WhatsApp – and those are constitutional applications. And the consequences of my friend's argument are really quite sweeping. My friend seems to think that even a traditional common carrier has a First Amendment right, I guess, to censor anything. I guess that means that Verizon can turn around tomorrow and claim a First Amendment right to kick all Democrats or all Republicans off of the platform. And that would have sweeping consequences that I do not think are supported, because Verizon has no message in deplatforming or censoring its users.
And that principle is distinct from what my friend from the United States is saying. Because she's talking about, "Oh, well, they arrange material on the site in various ways," but that doesn't speak at all to whether they have a constitutional right to censor. Just because you have to carry content or carry a particular user, you could still arrange it. And I think that's the fundamental conflation that the United States makes in its brief. It ignores the distinction between the hosting function and the organizational function. And that's something that I think the court needs to keep separate in its mind. And I would commend to the court Professor Volokh's article, cited on page 24 of our brief, that makes this distinction. Thank you.
Chief Justice Roberts:
Thank you, counsel. The case is submitted.
--
Transcript for case 22-555, NetChoice, LLC v. Paxton
Chief Justice Roberts:
We will hear argument next in case 22-555, NetChoice v. Paxton. Mr. Clement.
Mr. Clement:
Mr. Chief Justice, and may it please the court. I don't want to proceed as if I wasn't here for the first argument, so let me focus on what's different about Texas. One thing, fortunately, that's different about Texas is that its definition of social media platforms excludes email. So we can just put that Gmail issue to one side when we're talking about Texas. The other thing it excludes, of course, is websites that are primarily focused on news, sports, and entertainment. In the First Amendment business, we call that content-based discrimination. And that's just one of the many reasons that this statute is, dare I say it, facially unconstitutional. The other thing that's different is that, in some respects, this statute operates more simply, because it forbids my clients from engaging in viewpoint discrimination.
Now we're used to thinking that viewpoint discrimination is a bad thing and the government shouldn't do it. And of course when governments do it, it is a bad thing. But when editors or speakers engage in viewpoint discrimination, that is their First Amendment right. It is also absolutely vital to the operation of these websites, because if you have to be viewpoint neutral, that means that if you have materials that are involved in suicide prevention, you also have to have materials that advocate suicide promotion. Or if you have materials on your site that are pro-Semitic, then you have to let onto your site materials that are anti-Semitic. And that is a formula for making these websites very unpopular with both users and advertisers. So it's absolutely vital.
The other thing that makes Texas a little different is at least in passing the law, Texas was even more explicit in relying on the common carrier analogy as if simply labeling websites common carriers makes the First Amendment problems go away. And that is fundamentally wrong for two basic reasons. One, these companies don't operate actually as common carriers. They all have terms of use that exclude varying degrees of content. And second, Texas can't simply convert them into public common carriers by its say so.
I welcome the Court's questions.
Chief Justice Roberts:
Mr. Clement. If these laws go into effect, how would your clients–what steps would they take to comply?
Mr. Clement:
So I mean, one thing that they would–
Chief Justice Roberts:
Including, I'm sorry, just in particular addressing the situation of compliance in Texas and Florida as opposed to nationwide?
Mr. Clement:
Sure. So one of the things that they would contemplate, at least with respect to Texas in the first instance, is whether there's some way to just withdraw from the market in Texas and Florida. And of course Texas had that in mind in the statute and specifically said we essentially have to do business in Texas and we can't discriminate against users based on their geographic location in Texas. So if we lose this, including on the idea that we can be forced to engage in expressive activity in Texas, then I think we would fundamentally have to change the way that we provide our service. In order to provide anything like the service that we want to while not engaging in viewpoint discrimination, we'd basically have to eliminate certain areas of speech entirely. So we just couldn't talk about suicide prevention anymore, because we're not going to talk about suicide promotion. I guess we couldn't have pro-Semitic speech, because we're not going to have anti-Semitic speech.
So we'd have to figure out some way to try to engage in even more content moderation or editorial discretion to try to get us to a level where we're more benign and somehow we don't run afoul of Texas's law. And then on the disclosure provisions, the record here reflects that YouTube would have to basically increase its disclosure and appeal process basically a hundredfold in order to comply with Texas law.
I mean, I'm happy to talk more about the common carrier issue. I do think it's a central part of their defense. There was an allusion earlier to the idea that somehow Section 230 treats my clients, the websites, as common carriers. To the contrary: Congress specifically – and this is 47 U.S.C. 223(e)(6), which we cite in our briefs – provided in the same act of Congress that interactive computer services should not be treated as common carriers. And I think, more broadly, the whole thrust of 230 is: don't just be a common carrier, don't just put through all of this material. We don't want that. We want you to exercise editorial discretion in order to keep some of the worst of the worst off the site. Now–
Justice Gorsuch:
All that's true, and I acknowledge all that. But it also says that's true only if it's not your speech. And that seems to be in tension a bit with your suggestion that everything is your speech. And I think Justice Barrett pointed out an interesting feature of that, which is that these algorithms arrange, sort, and promote certain posts by users and not others. And is that not your – not yours, but your client's – speech?
Mr. Clement:
So I don't think it's our speech in the way that Section 230 talks about the speech. And I think for these purposes you have to distinguish between the speech that is the editorial function and the underlying user's speech.
Justice Gorsuch:
I understand that, and I didn't mean to suggest otherwise, but there is some editorial speech, your term, going on, right?
Mr. Clement:
I think that's right. And–
Justice Gorsuch:
So the carrier would be liable for its editorial speech?
Mr. Clement:
I don't think so. I mean, I did actually reread the brief that I filed at least in the Gonzalez case, and I think that you could make a strong argument based on the text of that statute that that kind of editorial sort of functioning is not something that causes you to lose your 230 protection.
Justice Gorsuch:
So it's speech for purposes of the First Amendment, your speech, your editorial control, but when we get to Section 230, your submission is that isn't your speech?
Mr. Clement:
Yes, as a matter of statutory construction, because otherwise Section 230 ends up being self-defeating, because, again, the whole point of Section 230 was to promote that editorial discretion, and this court wrestled with these issues. They're hard issues, and I certainly applaud the instinct that you shouldn't resolve them here. But I don't think that just by recognizing that my clients are engaged in editorial discretion when they make those decisions about what's going to ultimately go to the individualized screen that a user is going to see when they tap into their website or their application – I don't think that's the kind of speech that you're talking about in the 230 context. And if you did, I think you would defeat the fundamental purpose of 230, because they wanted you – they wanted my clients and others – to exercise that editorial discretion to keep the bad material out.
Justice Gorsuch:
With respect to other people's speech. So it seems like we have speech and then we have speech.
Mr. Clement:
You literally, and again, I'm happy to argue that case right now if we want to, but you can't have Section 230–
Justice Gorsuch:
Well, no. It's a really hard question for us and it's perfectly relevant here and very important because of course 230 preempts things, and we don't know how much of this law it preempts.
Mr. Clement:
Absolutely, but this law is unconstitutional in all its applications and certainly it has no plainly legitimate sweep. So you don't have to reach the 230 question directly here. And I would simply say that when you're reading those statutory terms in 230, you wouldn't sweep in editorial discretion because if you do, you will defeat the fundamental purpose of Section 230, which is to empower editorial discretion.
Justice Gorsuch:
Well, I just wanted to raise with you the question I raised with the Solicitor General, who offered a thoughtful response. Many of your clients' terms of service, while reserving some editorial discretion – and I think of most of them as speaking about the things covered by 230, obscenity, et cetera – go out of their way to promise an open forum to all members of the public, and go out of their way to say we don't endorse what other people say on this site, and go out of their way to say all views shall flourish. Now, that's not true for all of your clients, but it's true for some of them, indeed many of them. What do we do about that?
Mr. Clement:
So I would say that – and it's true of some of my clients, and some more than others – I think all of those terms of service, as the General said, go on to say, "And there are certain things, though, that are out of bounds." And I do think it's just a factually true thing that my clients, in the main, as long as you stay within the lines, actually do want to promote an open dialogue and a fair dialogue. And if you look at the Center for Growth and Opportunity brief, it shows you that actually some conservative voices have really flourished on these websites. Ben Shapiro and the Daily Wire are killing it on Facebook. And that shows you that we do want a broad discussion, but there's some stuff that is just out of the lines.
And I don't think it's as simple as saying, "Well, that's just the 230 stuff," because, again, we had a debate about what "otherwise objectionable" means. But I also think that my clients are getting a lot of pressure to be particularly careful about things that are damaging to youths. And I think in that context they want to sort of err on the side of keeping some bad material off.
Justice Gorsuch:
Well, you mentioned that a few times. Let me just press the other way, though. Doesn't it also hold that, on your view, part of the editorial discretion of a platform would be that it could use algorithms designed specifically to try to attract teens to addiction or suicide, depression, those kinds of things? That would be part of their editorial discretion too?
Mr. Clement:
So a website, I don't think my clients, because my clients are working hard–
Justice Gorsuch:
I don't mean to cast aspersions on anyone, but I think it's a natural consequence of your position, isn't it?
Mr. Clement:
There would be protected First Amendment activity by that very different website, with a business model that I don't think would stay in business very long. And it is possible, as the United States has pointed out in its brief, that if you have a different concern and you identify a different government interest, then maybe the government might be able to do something, particularly if it does it in a content-neutral way, to address some of those concerns.
But to get back to something Justice Kavanaugh pointed out before, I mean I actually think that both Texas and Florida have been pretty aggressive about their government interest here being something that is not just not a legitimate interest in the First Amendment context, but is affirmatively prohibited, which is the idea that we're going to amplify some voices and we're going to put burdens on private parties so that some voices can be louder than others or some people can get a boost from what they're getting in the marketplace of ideas.
And the only place this court has ever allowed that was in Turner, and I mean Justice Kavanaugh, you pointed out that one of the key things there was it was content neutral, but I actually think the critical thing in Turner is that bottleneck or chokehold on the content that went into individual houses. And I think that's what made what was otherwise an impermissible government interest a legitimate government interest in that narrow context. And maybe you could say the same thing, I mean I don't know if Red Lion is still good law, but that's the same idea that there's a scarcity rationale. But there's no scarcity rationale in the internet and this court said that in 1997 in the Reno case where–
Justice Kagan:
I'm sorry, can I ask you about a distinction between two possible kinds of applications of the Texas law? So one is the application that prevents you from keeping out certain speech that you want to keep out. You said anti-Semitic speech; it could be any of a number of things. As I understand it, the Texas law also prevents you from doing something else, which is – suppose you wanted to prevent anti-Semites from posting anything, you just wanted to say that they're a class of people, and we're not even going to let them post cat videos. Should we think about that set of applications differently?
Mr. Clement:
I don't think you should think of it radically differently. I mean, it's a different application, but I think it's the same idea, which is that there are some speakers – and I think this is going to be very few – there are some speakers who are so associated with a particular viewpoint that it informs essentially all of their speech. And it also affects the speech of other people in the forum. If you have a white supremacist on your speech forum and they're posting there, it's going to cause a lot of other people to say, "What is that person doing? What's going on here? Why are all the dog photos white?" I mean, it's going to fundamentally change the dynamic on the website. And I think a website that's trying to promote a particular discussion has a First Amendment right to exclude those people.
And in practice, this is what is used to exclude sexual predators, which is something, again, that the government can't do under Packingham, but Facebook does. And there are certain other people with just very distinct viewpoints where, in a sense, we know the viewpoint, and the viewpoint is problematic even if the particular post is not.
Justice Barrett:
But Mr. Clement, I just wanted to follow up on that because it seems to me that Justice Kagan's question kind of gets to the distinction in 303 Creative between turning people away and the speech that you have. And so if you think about it as silencing someone who you let on your platform, then that seems more like speech or content moderation to the extreme, for example. But I assume that the implication of your answer to Justice Kagan is that you could tell the anti-Semite, we're not open for business to you, right?
Mr. Clement:
You can tell that person that our speech forum is not open to you. And I think that's what makes it different – that Texas is focused really on these speech-oriented platforms. And so I think, if you are in the business of speech and you have somebody – and again, this is not about other prohibited statuses, this is viewpoint – you can say: you are a notorious anti-Semite, we do not want you to participate in this conversation.
Justice Barrett:
Religion then?
Mr. Clement:
Sure. Say I want to have a Catholic website. I can keep off somebody who's a notorious Protestant. I mean, I want to preserve the nature of the discussion on my forum, and it's a private forum, and the government can't tell me as a private party, "Let the Protestant into the Catholic party." I don't think so.
Justice Alito:
Mr. Clement, can I ask you about Section 2? I don't think anything has been said about it so far. So you say that Section 2's individualized explanation requirements violate the First Amendment because they impose a massive burden, right? That's your argument? I mean, it seems to me that the European Union has imposed exactly the same... pretty much the same individualized explanation requirement on anybody who operates there that Texas has imposed. And I'm not saying that whatever the European Union says is okay is constitutional here, but just on the practical question of whether it's too much of a burden, if it's not too much of a burden for your clients to do it in Europe, how can it be too much of a burden for them to do it here?
Mr. Clement:
So, as I understand the requirements, they are different. They're materially different. And in a sense, the European Union provision has a sort of built-in reasonable-practicality qualifier right in what you have to do: you only have to do what's reasonably practical. This is an absolute requirement to respond to every takedown, and that's over a billion takedowns of comments in a quarter for YouTube. And then there's also this appeal process, which I don't think is coextensive with the process in Europe.
So just as a practical matter, I think this is more burdensome. But, as you said, the First Amendment does not apply in Europe. And I think that having this kind of disclosure requirement on what is really an editorial discretion decision is potentially hugely problematic. I mean, imagine if you took this and said, The New York Times, you have to tell us why you rejected my wedding announcement – you only take 10% of the wedding announcements, you have to tell me why – even if you automated that and sort of said, one, you weren't rich enough; two, you weren't connected enough in New York social circles; and three, we just didn't like the way you looked.
Justice Alito:
And some of your clients are humongous. And if you want to say this is unduly burdensome, you have some obligation in the district court to try to– is it enough for you to just say, this is a huge burden, so knock this out? Didn't you have to provide something to show what resources would be required and why that would be too much for these megaliths?
Mr. Clement:
I mean, we did. There's more of a record in the Texas case than in the Florida case. The witness for YouTube in their declaration specifically said this would be a hundred times more burdensome than their current process. And so there is a record on this. It is incredibly burdensome.
Chief Justice Roberts:
Justice Thomas, anything further? Justice Alito?
Justice Alito:
The 230 argument is intriguing to me, and the distinctions that you're drawing somehow, to some degree, escape me. So is it your position that you are exercising editorial discretion as to everything – let's take YouTube – as to every video that is placed on it, you have exercised editorial discretion that you want that on YouTube?
Mr. Clement:
I would say that we have exercised some editorial discretion in not eliminating that from the site entirely. And as to an individual user, we've used what are typically, in many cases, neutral algorithms, but some of them are not neutral. And even in Taamneh, the briefs I think made quite clear that, although at a certain point some of the algorithms were neutral as between rice pilaf and terrorism, there were other efforts to affirmatively get terrorist stuff off of those sites. So there–
Justice Alito:
I mean, if you were a newspaper and you published the content that appears in every single one of the videos on YouTube that you allow to be included, you would be liable potentially for the content of that material. And I don't understand the rationale for 230 if it wasn't that you can't be held responsible for that because this is really not your message. Either it's your message or it's not your message. I don't understand how it can be both. It's your message when you want to escape state regulation, but it's not your message when you want to escape liability under state tort law.
Mr. Clement:
I don't really think we're being inconsistent. I would try to draw the analogy to a good old-fashioned anthology. If I put together an anthology of 20 short stories, everybody understands that the underlying short stories are still the product of the individual authors. But as the anthologist, as the editor of this compilation who decided which 20 got in and which ones didn't, I'm responsible for those editorial decisions. Both are protected by the First Amendment; you can distinguish between the underlying material and the editorial decisions. Now, at common law, the publisher was responsible for both, and so they were still liable for republishing the author's work. And that's precisely what Congress wanted to get rid of in 230. They wanted to essentially give our clients an incentive to weed out of the anthologies the stuff that was harmful for children and problematic. And that's why I don't think it works to say, "Oh, well, then that's your speech, so you're liable under 230," because it's that editorial control, the weeding out of the bad stuff, that was the whole point of 230 – to empower that.
Justice Alito:
Well, I don't know how a publisher could be liable for – well, I take that back. For fiction, but certainly – I mean, back in the day when some written material was considered to be obscene, if you put together an anthology that included obscene material, you could be sued. Today, if you put together an anthology of essays, nonfiction writing, and there's defamation in there, then the publisher could be sued. Even a publisher–
Mr. Clement:
I agree. I mean –
Justice Alito:
"Well, we exercised editorial discretion" – that doesn't shield you from liability.
Mr. Clement:
Not at common law. And that's why Congress had to come in with 230. But what Congress did is it looked at the common law and it said, "Oh, this is problematic." Because the only way you can avoid liability at common law is if you act as a conduit and let everything out. And once you start keeping out a little bit of porn, then you're responsible for the porn that slips through. And that's not practical on the internet and that's why we have 230.
Justice Alito:
I don't want to belabor the point. Let me just say something about the analogies that both sides draw to the issues that were presented in prior cases. So you say this is just like a newspaper, basically it's like the Miami Herald, and the states say, no, this is Western Union. It's like a telegraph company. And I look at this and I say, it's really not like either of those. It's worlds away from both of those. It's nothing like a newspaper. A newspaper has space limitations. No matter how powerful it is, it doesn't necessarily have the same power as some of your clients. But put that aside, newspapers overtly send messages. They typically have an editorial. They may have an editorial 365 days a year, or more than one, but that's not the situation with even the most prominent of your clients. So I don't know how we can decide this case by jumping to one side or the other of this case law.
Mr. Clement:
Well, Justice Alito, let me offer two thoughts. One, this isn't the first time you're wrestling with the internet. You wrestled with it in Reno, you wrestled with it last term in 303 Creative. And I think the gist of those cases is that this is more like the newspaper or the parade organizer than it is like a common carrier. And then, as to the cases, whether you think that this is different from a newspaper – I mean, the arguments that you're pointing to, to say this is different, are the arguments that those cases wrestled with and said didn't matter.
So I know you know this, but in Tornillo, there was all this language about it being a monopolist, and that was in the context of a local political election where, if you couldn't get into the Miami Herald, where else were you going to go? And yet this court said that didn't matter. And also in Tornillo this court said, yes, space constraints, there are some, but our decision doesn't turn on that. And then in Hurley, there's a lot of language in the court's opinion that says this is not much of a message, and they let some people show up even if they get there the day of, and the only thing they're doing is excluding these groups. But of course the exclusion was the message that they were sending, and it's the message the state was trying to prohibit. And that's kind of the same thing here, which is–
Justice Alito:
If let's say YouTube were a newspaper, how much would it weigh?
Mr. Clement:
Well, I mean, it would weigh an enormous amount, which is why, in order to make it useful, there's actually more editorial discretion going on in these cases than in any other case that you've had before you. People tend to focus on the users that get knocked off entirely and end up on the cutting room floor, but both these statutes also regulate the way that these websites sort content – the way they get you down to something that's actually usable to an individual user. And in fact, if you tried to treat these entities like a true common carrier – so, first in, first out, just in order – you'd open up one of these websites and it would be gobbledygook. Half the stuff wouldn't even be in a language you understood. And even if you controlled for that, you'd get all this garbage you didn't want.
Justice Alito:
All right, thank you.
Justice Sotomayor:
I'd like to go back to the individualized explanation requirement. And then please remind me, what did the district court do here? Did it grant you an injunction here and it was the circuit court who didn't?
Mr. Clement:
Yeah.
Justice Sotomayor:
So it was the district court who looked at the amount of material you submitted, and I know your declaration – YouTube said it would be a burden a hundred times more than it bears now. I don't know whether that was quantified or not. Was it a hundred times more, a hundred times more costly, a hundred times more what?
Mr. Clement:
A hundred times more of its current effort – its current efforts that are dedicated to–
Justice Sotomayor:
Yeah, but we still don't know what the cost of that is, and – there's a lot of unknowns. But this was a facial challenge with respect to that. And Texas seems to say you don't need to do much: you just need to have the computer spit out one through ten reasons, and if you have a few individualized ones, you could just explain those individually. What do we do with that dispute?
Mr. Clement:
So first of all–
Justice Sotomayor:
Because it is a facial challenge.
Mr. Clement:
It is a facial challenge; it is a preliminary injunction. We've obviously been over some of that. Here there weren't just declarations; there were depositions taken. There was a record that was put together on all of this, and Texas was taking a slightly different view of what the burdens of Section 2 were there. And so I think, on that, if you just look at the record that was before the district court, you should affirm the district court's preliminary injunction. What I would say, though, is I also think that even what they say on page 44 of their red brief is that you can do this in a relatively less burdensome way as long as your editorial policies are sufficiently specific and particularized. And what they're basically saying is you could change your editorial policies a little bit to make it easier to comply with this disclosure obligation. And that seems–
Justice Sotomayor:
That begs the question, right, because they're affecting–
Mr. Clement:
Yeah, exactly.
Justice Sotomayor:
Okay.
Chief Justice:
Justice Kagan? Justice Gorsuch? Justice Kavanaugh?
Justice Jackson:
I just have a quick question. So part of the dynamic that I think is going on in these cases is the fact that this regulation is enacted by the democratically elected representatives of a state, and I suppose that if the state's regulation of these platforms gets too burdensome, then presumably the platforms can say, "Forget it, we're not going to operate in your state," and then the citizens of the state would have the chance to determine if that's what they really wanted. That's sort of how I'm looking at this at a meta level. So what caught my attention was your response to the Chief Justice, when you suggested that your client couldn't withdraw from the state of Texas because you read the provision related to censorship and geography as preventing you from doing so. I had not read that provision that way, so can you say more about why that's your interpretation?
Mr. Clement:
Sure. I think that's the obvious interpretation of that provision, particularly when it talks about, this isn't like don't discriminate against Texans, or Texans wherever they are. The fact that it's specifically preventing us from discriminating against somebody based on a geographic location in Texas is basically telling us that we can't try to geofence our service and try to essentially explain to the people–sometimes if your cable service has a dispute with a provider and you can't get your football game, they tell you, if you're hacked off about this, call this number and complain. We can't do that in response to this law. And I think the legislators in Texas were able to tell their constituents, "Don't worry, if you like your website, you can keep it. We're not going to threaten it. They can't pull out of here based on the way that we're regulating them."
Justice Jackson:
So even if we could read it a different way, you're saying this necessarily–I mean I guess this also dovetails with my concern about us not having sort of state interpretations or an application here to really understand, because I could read this differently. It seems to me it's fitting into the whole set of things you're not allowed to do, you can't censor people on the basis of the viewpoint of the user. You can't censor them on the basis of the viewpoint that is being expressed. And you can't censor them based on their location in your state or another part of the state. And so I guess I don't necessarily see that in the same way. I mean you can't just automatically do that, I guess, I don't know.
Mr. Clement:
It seems to me quite clear that it's designed essentially as a poison pill, or, as somebody described it, the Hotel California provision: you can't leave Texas even if you want to try to do that. And that's a way of showing that this is an impermissible way of regulating our expressive activities.
And so I do think that is the right reading. I do think the fact that it's geographic location in Texas is kind of a clue to that. This is not something where, if you're a Texan, you're protected no matter where you go in America. This really is designed to sort of say that you can't do the kind of geofencing that you might otherwise do to comply with an idiosyncratic state law. I should mention, just for the sake of completeness, that in the lower courts, not part of the preliminary injunction, there are dormant commerce clause challenges to these provisions and to the way that this is just one state trying to regulate everybody, and so that's part of the case that will be there-
Justice Jackson:
But it's not here yet?
Mr. Clement:
But it's not here. All that's here is a preliminary injunction that runs to my clients. So I mean this statute has a smaller universe of people, but if there's somebody else out there who isn't one of my clients and isn't covered by this preliminary injunction, the statute could take effect as to those people. And the same is true in Florida.
Justice Jackson:
Thank you.
Chief Justice:
Thank you counsel. General Prelogar?
General Prelogar:
Mr. Chief Justice, and may it please the court. I want to pick up with the question that Justice Alito asked in the seriatim round to my friend about the idea that the social media platforms don't perfectly fit into either analogy or paradigm here, and I want to acknowledge the force of that intuition. They obviously operate at a massive scale that goes beyond any particular parade or beyond any particular newspaper. I think the right thing to do with that intuition is to recognize that it's not like you can just exempt them from the First Amendment. They are obviously creating something that's inherently expressive in taking all of this quantity of speech on their websites and curating it and making selectivity decisions and compiling it into a product that users are going to consume. So the First Amendment applies, but I think that those kinds of concerns about how the social media platforms and how they look somewhat different from the other kinds of expressive products this court has reviewed in prior cases can come into the question of whether the First Amendment is satisfied with respect to any particular regulation.
Now here we think it's not satisfied because of the way that Texas has designed this law. I'd urge the court to rule narrowly. It's not necessary here to try to figure out how the First Amendment applies to new technology in general or to every possible website or the internet in particular. This law has a very clear defect. What Texas has done is tried to countermand the protected editorial speech decisions of the platform and the only justification it's offered to the courts below is that it wanted to essentially amplify the voice of users on that platform by suppressing the platform's own protected speech. That is a defect that is clear under the First Amendment, and the court could say only that and resolve this case. I welcome the court's questions.
Justice Thomas:
General, when I asked you about the difference in treatment of a private party as opposed to the government engaged in similar conduct, your answer was of course that it would be different, that the government would be bound to comply with the First Amendment. There was some discussion in a number of the amicus briefs about instances in which the government and the private party, say, petitioners here, are coordinating efforts. How would you respond to that?
General Prelogar:
So let me respond to that by saying I think the position we're offering here and the position this court will consider next month in the Murthy case are entirely consistent. We of course acknowledge that if the government actually coerces the platforms and takes over their editorial decision making, then the platforms could be deemed a state actor and that would be subject to First Amendment scrutiny. We vigorously dispute that that has actually happened and that the federal government has engaged in that kind of coercive conduct, and we further dispute the legal standards that were applied in that case.
But there's no inherent tension here. The federal government obviously can act and criticize the social media platform's content moderation decisions. That's just using the bully pulpit to express views. And if the states disagreed with how the platforms were exercising their content moderation standards, it could have done the same, it could have criticized them, it could have urged them or tried to influence them to adopt separate standards. But here what the state did is said, we're going to pass a law that actually takes over their content moderation and dictates that it has to be done in a different way.
Justice Kagan:
General, Texas's law, even more than Florida's, can be understood as an expansion of public accommodations laws. And the United States is often in a position of defending public accommodations laws and insisting that they be vigorously enforced. And how do you see what Texas is trying to do as consistent with that broader stance about public accommodations laws?
General Prelogar:
Yes, so I want to be very clear and stake out potentially some separate ground from my friend representing the platforms in this case with respect to generally applicable public accommodations laws that protect based on a particular status. We think of course those laws are valid on their face and that they serve compelling governmental interests. And so to the extent that you're looking at how an ordinary public accommodations law operates, the refusal to deal, the refusal to serve, as Justice Barrett said, we think that's a regulation of conduct and that ordinarily there would be no First Amendment problem with the application of that law. Now, I acknowledge that it gets more complicated when those laws are applied to a business that is providing an expressive product and cases like Hurley or 303 Creative show that in certain applications sometimes the public accommodations law has to give way to First Amendment interest.
But I think the court has drawn a clear line. It has never suggested that the mere refusal to deal or serve based on status, even with respect to an expressive association, would fail under First Amendment scrutiny. Instead, you look at a case like 303 Creative, and there the concern was about changing the message, or a case like Hurley, where gay and lesbian individuals could march; you just couldn't change the message by holding up a particular sign. So we recognize that there are going to be some applications where you'd have to conduct that kind of First Amendment analysis, but the relevant question is, could you just bar people on the basis of a protected status from creating an account when it's not going to affect your message, when they just want to lurk on X and read other people's posts? I think that that kind of law would certainly be valid.
I want to briefly address, Justice Gorsuch, the question you asked about the scope of CDA preemption under Section 230. Just to be clear on this one, I want to say there are unresolved issues here. I would warn the court away from trying to resolve exactly how much conduct CDA 230 protects and exactly how that interacts with the Texas law here. The only point I would make is that there are questions about what it means to act in good faith, questions about what it means for the platform to take down content that is otherwise objectionable, but however those interpretive disputes might shake out in a particular case, surely Texas here isn't saying that its entire law is preempted and has no effect whatsoever, and that CDA 230 fully takes care of the problem.
So I think what the court could do, not knowing exactly how that preemption issue might be resolved, is to say: whatever exists in that category of speech that Texas is prohibiting, the editorial decisions it's countermanding, on the one hand, versus what CDA 230 would authorize, on the other hand, whether that's a big category or a little category, all of the things in that category constitute protected decisions by the platform that haven't been adequately justified. And I think that's all you need to say about the preemption issue in this case.
Justice Alito:
If a legislative body enacts a law requiring viewpoint neutrality in some area, and it does so because it is concerned that people who express a particular viewpoint are suffering discrimination, is that law unconstitutional on the ground that the intent of the legislative body was to benefit a particular group?
General Prelogar:
No, I don't think that that kind of law would immediately be unconstitutional. And again, I think if it's structured like a generally applicable public accommodations law, there might be important or significant governmental interests in being able to protect against that kind of discrimination.
Chief Justice:
Unless there are any further questions.
Justice Kagan:
Can I do one more?
Chief Justice:
Sure.
Justice Kagan:
The government has spent a lot of time defending net neutrality, so maybe I should have asked you this with respect to Florida's law, just given the breadth of that law. Why are internet service providers in your view so different? And what if an internet service provider wanted to make certain content distinctions?
General Prelogar:
Internet service providers are fundamentally different because they are engaged in transmitting data in order to make websites accessible, and that is not inherently expressive. They're certainly providing the infrastructure, the cable, the fiber optics and the service to make sure that you can log in on your home computer and access the internet writ large, but along the way, they're not compiling that speech into any kind of expressive compilation of their own. So we would put them in the same category as telephone and telegraph companies or UPS where you could say, sure, they're literally facilitating the transmission of speech, but they're not creating an expressive product that could implicate the First Amendment principles at stake.
Now, then you might ask, okay, well what if they want to start discriminating with respect to the service they're providing for particular types of websites? The kind of quintessential example of this is an internet service provider that decides to slow down service to a streaming site, let's say Netflix because it wants to direct internet traffic to some other website of its own choosing, maybe its own streaming service. We think net neutrality could come in there and say you're not allowed to discriminate based on content in that way, but that's because again, there would be no expressive speech or compilation that you could attribute to the internet service provider itself. People don't sign up with Comcast or Verizon to give them some kind of limited curated access to the internet. They're engaging in service with those companies because they need someone physically to transmit the data so they can get access to the whole internet.
Justice Kavanaugh:
Can I ask one? I don't have to buy anything you just said to rule for your position in this case, anything you just said on net neutrality, right?
General Prelogar:
You do not have to agree with me, Justice Kavanaugh. I hope someday if it comes to it to persuade you, but–
Justice Kavanaugh:
I'm not saying, but I just want to make sure that's walled off.
General Prelogar:
Nothing about the Court's decision in this case would at all affect the net neutrality issue. We think that here the platforms are engaging in expressive activity that's protected by the First Amendment. And you can leave for another day all of the kind of conduit questions that come up in the net neutrality context.
Justice Kavanaugh:
Thank you.
Chief Justice:
Thank you. Counsel. Mr. Nielson?
Mr. Aaron L. Nielson:
Thank you. It's been a long day. Mr. Chief Justice, and may it please the court, this is not the first time that new technology has been used to stifle speech. Telegraphs also discriminated based on viewpoint, prompting a national scandal. Yet under the platforms' theory, Western Union was just making editorial choices not to transmit pro-union views. Today, millions of Americans don't visit friends or family or even go to work in person. Everybody is online, the modern public square. Yet if the platforms that passively host the speech of billions of people are themselves the speakers and can discriminate, there will be no public square to speak of. We know this because Twitter has admitted that their theory of the First Amendment would allow them to discriminate, not just based on what is said on the platform, but "on the basis of religion or gender or physical disability."
That's not the First Amendment. That's Lochner 2.0. And as more than 40 states warn the court, the implications are gravely serious. For example, as New York explains, if these algorithms are constitutionally protected, platforms may be able to continue selling advertisers the ability to discriminate based on race or, as Professor Lawrence Lessig, Zephyr Teachout and Tim Wu, who do not typically file briefs in support of Texas, caution, not just states, but Congress may be powerless to address the social media crisis devastating the lives of kids. HB 20 is a modest effort to regulate such power in the context of viewpoint discrimination. Platforms can say anything they want under HB 20 about anything. There's no limit. They can say anything they want. Users can block anything they don't want. There's no limit on that. All that's left is voluntary communications between people who want to speak and people who want to listen. This law is just nowhere near the heartland of the First Amendment. Instead, this is democracy and federalism, not a facial pre-enforcement injunction. I welcome the Court's questions.
Justice Thomas:
If this was so clearly within a common law tradition as you suggest, why hasn't Congress seen fit to act as Texas has? And it appears that Mr. Clement suggests that actually Congress has acted in the opposite direction. Would you comment on that?
Mr. Nielson:
Yeah, I don't see how, with all respect to my friend, their reading of 230 is at all consistent with what Congress said. They have all sorts of policy arguments about how 230 ought to work, but if you actually just read the words of the statute, it doesn't work. So his suggestion that Congress somehow kicked out Texas, or said that that's not how it wanted this to be, I don't think is consistent with the text of the statute. I didn't hear a lot of textual argument coming from Mr. Clement there. So that'd be my first-line answer. My second-line answer is I have no idea why Congress does or does not act. But I do know that Texas has the ability to protect Texans, and that's what Texas has done here.
Chief Justice:
Counsel, you began by saying the platforms, they want to keep out this person and that person on the basis of race or sex. And then you said that's not the First Amendment. Well, the First Amendment doesn't apply to them. The First Amendment restricts what the government can do and what the government's doing here is saying, "You must do this. You must carry these people. You've got to explain if you don't." That's not the First Amendment.
Mr. Nielson:
Well, respectfully, Your Honor, the First Amendment is big. It applies in a lot of different ways. So it's true for us like we're saying, because this isn't speech, it's conduct, we can require viewpoint neutrality. But in other cases, the same companies are saying when New York or some other state says, "Hey, you can't have algorithms that try to hook kids." They say, "Well, we have a First Amendment right to do that." It's the same First Amendment, the same First Amendment that says... I mean, if it's all First Amendment, then I guess it's going to be hard for Texas to say you have to be viewpoint neutral, but it's also going to be hard for California and Illinois or anybody else to say you can't have an algorithm that hooks kids because it's all the same.
Chief Justice:
Yeah, I'm sure it's the same for all the other states; the question is, they don't have the obligation to act in the same way that you as the state have the obligation to do. They can discriminate against particular groups that they don't like, whether it's a group that encourages kids to take the Tide Pod challenge or something else, and you have different obligations.
Mr. Nielson:
I guess there are a couple of ways I could respond to that, Your Honor. The easiest one I'm going to talk about, if I may, is common carriage. My reaction coming to this case was the same as yours. My reaction was, well, wait a minute, it's their own platform. You can't censor. They're private. But that's the exact same scenario that came up with the telegraph. The idea that the telegraph was dumb pipes is not true. Instead, what the telegraphs had was the technological ability to say we're not going to let this type of speech through.
Chief Justice:
No, you're absolutely right, but it's kind of begging the question; you're assuming that they're like the telegraph. It seems to me that that's a big part of what the case concerns. And the telegraph had a particularly compelling type of monopoly. I mean, if you didn't want to use the telegraph that was there, you usually didn't have an alternative choice, whether you're talking about railroads or other types of common carriers. I'm not sure the same thing applies with respect to social platforms.
Mr. Nielson:
So let me give you my theory for why common carriage is important here. As I look at the cases, and I agree, it's really hard to figure out where conduct starts and speech ends and all of that, and you look at all the various cases this court has decided, some commentators say they can't be reconciled. I'm not sure about that. But I think a helpful way to think about it is we know that there is a line between speech and conduct, and we know that common carriage has always been on the non-speech side of the line, the conduct side of the line. So if this falls within the common law tradition of what is common carriage, nobody has ever thought that falls on the speech side of the line. So we can't make them say something otherwise that they didn't want to say. The whole point of it is that's a signal to the court. That's a way that the court can figure out which side of the line we are on.
Chief Justice:
Well, that turns on who you want to leave the judgment with about who can speak or who can't speak on these platforms. Do you want to leave it with the government, with the state, or do you want to leave it with the various platforms? The First Amendment has a thumb on the scale when that question is asked.
Mr. Nielson:
It does, and that's why I said it's important to go back and look at the history on this, because at some point the First Amendment has to end or everything is covered by the First Amendment. This court has said that the way that we tell the difference is whether it's inherently expressive. And the court has said what they mean by inherently expressive; they talked about in Miami Herald whether you're a passive conduit. They talked about in Hurley whether you're intimately connected. Well, this court last year had a case in Taamneh where they talked about what these platforms do, and they say that they are passively connected to the speech on their platforms and that they're agnostic about the content; it's just one big algorithm that's matching things together. And I think that's important.
But I also want to stress, if I may, again, this is a facial posture, and if you look at the breadth of our statute, there's the talk about whether you have to host somebody's speech. There's also a provision about when you just want to read Facebook. That is one of the provisions of our statute. You go online in the morning and you want to see what's going on in the world; according to their theory, they can stop you from doing that too. And that's surely public accommodation law, the idea that they don't like somebody because of their race or their disability or something like that, and they're going to say, "We're not going to allow you onto our platform." That surely cannot be constitutional. That's what I mean by that's Lochner; that's gone beyond any content of those platforms themselves on their page to saying, "We're not going to let people even look at what we're selling." That's a bookstore saying we won't sell you our book. That's different from saying we won't publish your book.
Justice Kagan:
Do you think that there are any unconstitutional applications of your law?
Mr. Nielson:
I mean, that's a hard question. I suspect that there might be.
Justice Kagan:
What would they look like?
Mr. Nielson:
So the one that comes to mind, and this comes up in their brief, they pick the most vile example, and they say, imagine a publisher that didn't want to publish the book written by the Proud Boys; that was the example that they used. I think you might very well have an as-applied challenge to that. But the problem for them is they pick the most vile example, when I think all of them would say, well, wait a minute, surely you can let them on Facebook, and you can't kick them off because their grandma said something outrageous. Right? So there's got to be a limit there, and that's why a facial resolution of this case doesn't work. And if it is in fact-
Justice Kagan:
And how do they separate the one from the other? Where's the line?
Mr. Nielson:
That's hard, right? I would say this court struggled with that in 303 Creative, because it's really hard to know when something becomes inherently expressive, and the court's cases like Dale, all of those are hard cases. But in all of them, this court has had facts. They've actually looked at the facts of the case and tried to figure out as applied whether that makes sense. In this situation, there's a million applications of this law that are perfectly fine, and they pick some of the most vile possible hypotheticals, ignoring, by the way, the provision of Texas law, which they never address, which says that under Texas law, if you don't want to hear content, they're allowed to make sure you never hear that content. So all you have left, I mean, again, they never mention it at all, that's like a focal point of our brief, and they never respond to it. But that means all that's left is: I don't want to hear this type of speech; I just want to hear this type of speech. And it's just voluntary communication. That's a telephone.
Justice Barrett:
Mr. Nielson. You heard during the prior argument a lot of conversation about how broad Florida's law was. I read Texas's law to be more narrow in its coverage, that it wouldn't sweep in some of the examples we were using in the last argument, like Uber, Etsy. Am I correct?
Mr. Nielson:
I think that's fair, Your Honor.
Justice Barrett:
So what platforms does Texas's law cover? Am I right that it covers only the classic social media platforms like YouTube, Facebook?
Mr. Nielson:
So that's what their deponent has said. The only ones that they were sure were covered are Facebook, Twitter, and YouTube.
Justice Barrett:
But that's their deponent. Presumably Texas is the one who can authoritatively–it was in the Texas courts, I mean, it's not them. They're not the ones that get to decide authoritatively what the scope of the law is.
Mr. Nielson:
Well, correct. I mean, we would have to prove it at trial that they're subject to it.
Justice Barrett:
What's Texas's position about the scope of the law?
Mr. Nielson:
Well, the law says that it applies to any platform with more than 50 million active users per month. So I'm not sure where some of the other platforms fall on that. The ones, like I said, that we know are the three biggest ones fall–
Justice Barrett:
So you're making that judgment based on size? So it's nothing about the definition. I mean, in the last argument, we were pointing out that the Florida law, in defining what a platform does and how it works, would encompass Uber, for example. But you're saying that you're just distinguishing this based on numbers?
Mr. Nielson:
No, I apologize, Your Honor. There's also a separate provision which defines a social media platform as a website open to the public that allows the user to create an account and enables users to communicate with other users for the primary purpose of posting information, comments, and so on.
Justice Barrett:
And so is it Texas's position that that definition then covers the classic social media sites? And by classic social media sites, I mean sites like Facebook and YouTube?
Mr. Nielson:
Yes, Your Honor.
Justice Barrett:
And that it would not sweep more broadly to some of these other things like Etsy?
Mr. Nielson:
I don't think so, Your Honor. But the court–
Justice Kagan:
The district court thought it covered WhatsApp. Do you think that it doesn't?
Mr. Nielson:
I don't know. I don't know the answer; that's the best I can give you. I don't know. You don't have discovery into that. We had the deponent; their own witness said these are the three that we are sure are covered. It might very well be. That's another reason why it's hard to do this on a facial basis, because it might very well be that WhatsApp, which sure looks like a telephone to me, would be covered by our–
Justice Jackson:
But what about, I mean, within the big three, there are some email-looking functions, aren't there? I mean, I appreciate that. It's hard to do this because we don't have a record, but I understood that Facebook, for example, which you say would be covered, has a messenger function, which looks like email.
Mr. Nielson:
Yes, your Honor.
Justice Jackson:
So we have to do this at the level of the functionality of these various platforms rather than at the kind of entity level?
Mr. Nielson:
Yes, Your Honor, you would. And it's not just that; you'd also have to go through the different types of verbs included in our statute for censoring, including the one that they keep ignoring, which is the ability to receive the expression of somebody else. That's why I say you should look at the text of the statute: their theory would mean that even if you just want to lurk and just listen and see what other people are saying, they can kick you off for any reason at all. So if you have somebody who has never posted anything, or their speech is identical to the speech of somebody else, their theory is, well, we can kick you off. That seems to me pretty far into the world of public accommodations. 303 was a narrow case. If that's what 303 means, like, boy, now we're really, really, really big. Hence Lochner 2.0, the idea that everything can't be protected by the First Amendment. At some point, there's lines of content–
Justice Gorsuch:
Counsel, during the prior argument, which I'm sure you listened to attentively.
Mr. Nielson:
Yes, Your Honor.
Justice Gorsuch:
There was some discussion about how difficult life will be if these injunctions are dissolved and a parade of horribles and expenses and difficulty geofencing Texas or Florida. Can you address some of those concerns?
Mr. Nielson:
Yes. Two answers, if I may. First, there is some suggestion that the prohibition on discrimination against Texas or a part of Texas is somehow a trap to keep companies in. That's not true. You read the statute; that's not what it says. There's a separate provision in the statute, which is the jurisdictional hook, which is if you're doing business in Texas. And by the way, even if Texas tried to do that, there's something called personal jurisdiction; you can simply just leave a forum. That's this court's decision in Ford. So that argument, it's just not true.
But the other part I think is really important about this is, under Texas's law, what is the remedy here? It's an injunction. There's no damages here. It's an injunction. And in fact, we know that it's not going to flood the courts, because the injunction against the attorney general is limited to the attorney general. There's private enforcement of Section 7, and we'd have a handful of cases because you don't get damages. So it's hard, unless you have a really darn good case, to be able to go to court if nobody's going to get damages for prevailing, which I think matters a lot in terms of what are the real-world consequences here. They're going to have some lawsuits by the attorney general for injunctions, and if we can't prove it, if we can't prove viewpoint discrimination, they will prevail.
Justice Kavanaugh:
Did you say they could stop doing business in Texas under this law?
Mr. Nielson:
Yes, Your Honor. Of course. I mean, it's true under the law, but it's also just true as a matter of personal jurisdiction. Anybody can get out of any jurisdiction.
Justice Kavanaugh:
I just meant under the law.
Mr. Nielson:
Correct. Yes, under the law, yes, Your Honor.
Chief Justice:
How does that work? If you're talking about Facebook, I mean if somebody emailed and all that, if they send something into Texas, are they doing business in Texas?
Mr. Nielson:
No, Your Honor, though that would be a fun personal jurisdiction case. The answer, as I understand it, is you have to purposely avail yourself of the forum. So merely because somebody can look at your website, if you're not having some purposeful direction towards the forum, that's generally not sufficient.
Chief Justice:
No, no, it's a worldwide sort of thing, and people are going to be sending stuff left and right, and you know that as the company. I don't see how they can wall off Texas from the activities of the social media platform.
Mr. Nielson:
Well, I mean, two answers. One, they can; they have the technological abilities, it's called geofencing, with which they can carve off... I mean, if they wanted to, they could probably carve off this building itself. They have the ability all the way down to that granular level.
But again, more than that, it isn't just that it shows up there. If you want to have an account with Facebook or Twitter or any of the others, there's a contractual relationship between the two. So they have customers that are in these places, and people say, well, they don't have any customers because they're not charging any money. Well, we know that if they're not charging any money, you're the product. So they're taking your data and they're selling it to the advertisers, which is why it's so important that we recognize that if this algorithm is protected by the Constitution, then they can take that data and sell it to people and have highly targeted ads based on socioeconomic characteristics. The New York brief explains that on page 12, which I think is important and shouldn't get lost in this. They pick, again, the most vile examples, which are the fanciful things that we don't usually do in a facial posture, and they try to say, well, that means the whole law should fail. There's a whole lot of perfectly fine applications that the court needs to remember and not lose sight of.
Justice Kavanaugh:
What about terrorist speech? How's that handled?
Mr. Nielson:
Yeah, so a few ways. First response that I would have to that is the provision of the statute that they ignore, which is: no user has to receive anything they don't want.
Justice Kavanaugh:
Right. That still allows the communication of it. So that's not–
Mr. Nielson:
Sure. Okay. All right, let's go through that then. Now most of the universe is gone, but the next level of this: under Texas law, if it's illegal, they don't have to do that either. So I'm assuming that a lot of the terrorism is going to be like, we're inciting you to come join Hamas, or something like that-
Justice Kavanaugh:
No, no, no, no. Just pro-Al-Qaeda kind of messages that were common pre-9/11, post-9/11, not necessarily incitement, but advocating.
Mr. Nielson:
Okay, sure. All right, so we put aside the first two levels here. Third, they're allowed under the statute to pick any categories they want. So if they want to keep the category in which that speech falls, that's their choice. If they want to cut that category out, they're free to do so. They just can't do so on a viewpoint basis. At the end of the day–
Justice Kavanaugh:
That last clause, they can't do it on a viewpoint basis. How does that work with terrorist speech?
Mr. Nielson:
Sure. It's hard to say with terrorist speech, because you'd have to pick the category, but assuming that it is one, you can't very well say you can have the anti-Al-Qaeda speech but not the pro-Al-Qaeda speech. If you just want to say no one's talking about Al-Qaeda here, they can turn that off. And as a last point, this is at the very end of the game. So you've gone through all of those things. All you have left are voluntary people wanting to talk to each other, and, I mean, people say horrible things on the telephone, and I don't think we've ever thought, well, you know what, we're going to turn that off, because we don't want the telephone providers to have that sort of right to censor.
If I may, with some hesitance, I want to talk about Orwell a little bit, and I say that with some hesitance. But my reaction coming to this case was very similar to yours. I looked at this and I'm like, wait a minute, these are companies, they have their own rights. We don't generally think of censorship as something from private people. That's the government. Here's how I came around on this. Maybe it'll persuade you, maybe it won't. I came around on this to say this is something further up the food chain than that ordinary level of political discourse. This is just the type of infrastructure necessary to have any kind of discourse at all. That's why I keep going back to the telegraph. This isn't the level of discourse where they're making the content decisions that we make our decisions based on. This is the infrastructure that we need to have any sort of discourse at all.
So if we say we want to have that type of infrastructure and not have censorship on it, that would mean we would have to have a massively increased federal government, because it would have to control all the infrastructure. And then we would have, okay, now you can't discriminate based on this kind of infrastructure of how things work. I mean, that is Orwell, right? So for me, the answer is, for these kinds of things, like telephones or telegraphs or voluntary communications on the next big telephone or telegraph machine, those kinds of private communications have to be able to exist somewhere. The expression like, "Sir, this is a Wendy's," there has to be some sort of way where we can allow people to communicate.
Justice Jackson:
And is that just because of the modern public square? I mean, Mr. Clement has said many, many, many times that there's a distinction between public and private and that that's sort of driving his analysis as to when and under what circumstances this kind of regulation can be done. And are you just rejecting that because you're suggesting that they merge in this situation, given the nature of the communications?
Mr. Nielson:
I'm not doing that, and again, I'll try to be careful because these are complicated concepts, but I think about the common carrier as a really useful tool for this court, because we know that there are hard lines to draw. It's really hard to tell the difference between FAIR and Miami Herald in the application, especially when you kind of get down to the granular level; it's really kind of hard to tell. I think it would be helpful if the court had a compass that could kind of give it some direction of where to draw those lines. And common law common carriage is that compass.
Justice Jackson:
But are you suggesting that a common carrier, as the Solicitor General pointed out, could never have First Amendment protected activity? I mean, that's why I keep going back to doesn't this have to be not at the level of entity, but at the level of what exactly are they doing in a particular circumstance? Because you just seem to say, well, these are common carriers, so everything they do is conduct and therefore we can regulate it. And I don't know that that's the way we've ever thought about this.
Mr. Nielson:
Well, this is how the court thought about [it] with telegraphs, which I think is a useful way of thinking about it. I mean, my friend in the government says, well, they're just transmitting speech. But that's totally question-begging, because they have the technological ability not just to do that. The reason that cell phones don't screen your calls or telegraphs-
Justice Gorsuch:
Well, Mr. Nielson, I'm sorry to interrupt.
Mr. Nielson:
Sorry.
Justice Gorsuch:
But I think you'd agree with Justice Jackson, though, that there might be some speech of these carriers, even as common carriers, that would be their own?
Mr. Nielson:
100%. Yes, Your Honor.
Justice Gorsuch:
And you do have to take that function by function.
Mr. Nielson:
Yes, and that's the other part of this law, which I think is so important to recognize is we don't say one word about what they can say. So I would kind of disaggregate the functions of what's going on here. They have the one function, which is they are creating a message. We do nothing about that. They can say whatever they want about specific posts or anything, and that's fine. But there's a separate thing that they do, which is facilitate conversations between two people, which is like a phone.
Justice Gorsuch:
I understand that. Now, one of the things that we've sometimes looked at in the past, this court, I mean in the common carrier world, is market power.
Mr. Nielson:
Yes, Your Honor.
Justice Gorsuch:
And how do you analyze that here? On the one hand, there are network effects that one would take account of in any analysis of market power, and that might help you. On the other hand, this is a bit unlike a telegraph, in the sense that there might be only one right of way to run the wires; there might be serious practical barriers to more than one set of wires there. One can start a new platform, at least in theory, anytime.
Mr. Nielson:
Yep. So I guess–
Justice Gorsuch:
Fewer barriers to entry, but network effects.
Mr. Nielson:
Sure. So the first answer is, if we are not talking about speech, if we're just in the world of conduct, then we're not talking about market power at all. And we know that because cell phones are intensely competitive markets, and yet they're still all common carriers. But let's move that aside; now we're saying there's some sort of reason to focus on market power. It's true, this is not like the market power of there's just one bridge. But as an economic matter, there's really no difference. And here's a simple kind of way to look at it. Twitter has its platform. There are a lot of competitors for Twitter, would-be competitors, including Threads from Meta, which is backed by one of the largest companies in the world. They invested massive amounts of money to try to break up the Twitter monopoly, and they failed miserably.
Justice Gorsuch:
So what do we do about, I mean, there's some legislative findings here about market power. What deference do we owe those, if any?
Mr. Nielson:
I would think considerable deference, Your Honor. This is a sovereign state. We don't usually treat states like the FTC [Federal Trade Commission] where we subject it to arbitrary and capricious hard look review. The state is entitled to make determinations as a matter of law as to how things are. And obviously at some point it might be so far afield, but I sure hope that the states get some deference on such important questions from this court.
Justice Barrett:
Mr. Nielson, can I just – Oh, sorry. Go ahead, Chief.
Chief Justice:
This may be the same question that Justice Gorsuch was asking, but does the nature of the economy at issue matter to us? I mean, the social media platforms, the internet, all of that stuff, it's an incredibly dynamic market; the government, maybe not so much. And yet it's sort of an inflection point to say that the government has the authority, by categorizing the participants in this dynamic market as common carriers, to take over extensive regulation of them. Not just with respect to communication, but all sorts of things. I mean, whether you're talking about railroads or telegraphs, it's not just moving things, transportation; it's what the railroads look like, what safety measures they have to have, a whole range of things that in the wild west economy surrounding the social media platforms and the internet may be totally inapt. Now, I don't know if it comes at a time when you need to make that transition or not, but that is a very big step when it comes to the extent of government regulation.
Mr. Nielson:
I certainly think that's fair. My response is going to be, this is a facial pre-enforcement injunction. We should at least be able to make our showing on the facts. We're quite confident that we'll be able to show not just market power, but durable, extensive market power here. I actually don't think it would even be all that difficult to make that showing. So to the extent that market power is a requirement, I think that they haven't shown that they're likely to prevail on the merits as to that, which is another reason why a facial injunction was just simply inappropriate. Bring an as-applied case, and we're happy to litigate that. It's really hard to do this facially; they pick a few examples and then say the whole thing fails.
Justice Barrett:
Mr. Nielson. What besides market power? I want to give you a chance to elaborate on your definition of common carrier. I mean, you've said conduct, market power, what else?
Mr. Nielson:
Sure. So the main requirement of common carriage, and this is where common carriage and public accommodation are, if not cousins, maybe twins, is it has to be open to the public, which means that it's not a private associational group or something like that. You hold yourself out open to the public with non-differentiated contracts; you have one contract with everybody. So that's the very first one. The second is it has to be the type of industry that has traditionally been regulated as such. So for public accommodation, that's your inns and your restaurants. For common carriage, that's where you're talking about things like bridges and telecommunications.
Justice Barrett:
But then you get into the problem of having to draw the analogy, right? I mean, the Chief Justice called the internet kind of like the wild west. And the internet looks a lot different; even each of these platforms has different functionalities within it. You've got grist mills and then railroads and cable companies. Each time you encounter something new that might qualify as a common carrier, you have to make a decision: does it fit the bill or not?
Mr. Nielson:
Sure. So I guess I can keep going further. That's why some courts have said, "well, maybe there are additional requirements that we put on common carriage." One is market power, which not everybody says – I don't know how that works with cell phones, but they said, "well, you need market power." And the other was it has to be somewhat affected with a public interest. And here, under that, we know that if it's state action to block somebody from your Twitter account, how can that not be affected with a public interest?
Justice Barrett:
Thank you.
Chief Justice:
Justice Thomas? Justice Alito? Justice Sotomayor?
Justice Sotomayor:
I have a problem with laws like this that are so broad that they stifle speech just on their face, meaning, I think that's what the government's been trying to say. If you have a particular type of speech that you want to protect against or promote, it would be one thing to have that kind of law. But we have a company here, Discord, which is also a direct messaging app, and there's no question that your law covers them, but they tell us that their whole business model is to promote themselves to a particular message and groups of messages. So they're not doing it indiscriminately. You're basically saying to them, if they're out there and they're a common carrier, they can't have this kind of business model?
Mr. Nielson:
I mean, two responses, if I may, Your Honor. The first is as to the particular company, we're only talking about the three largest, maybe more depending on who falls within the 50 million, the largest telecommunications companies on earth. We're not talking about everything else.
Justice Sotomayor:
Okay.
Mr. Nielson:
But as to the second point–
Justice Sotomayor:
You're agreeing with them that basically this law is aimed towards them?
Mr. Nielson:
Yes, as to the largest. We've never disputed that. But even if you agree with all of that, and I disagree with you, but I understand, there are still applications of this law that should be allowed to go into effect. I don't see how they can say that they can kick somebody off for the off-platform speech of their grandmother. That can't be. Or because they don't like where you live in Texas: you live in El Paso and not Dallas, so you're not as valuable to the advertisers, so we're going to kick you off. Surely that can't be okay.
Chief Justice:
Justice Kagan.
Justice Kagan:
No sir.
Chief Justice:
Justice Kavanaugh?
Justice Kavanaugh:
Two very quick ones. On the deference to the legislative findings point, my memory is that there was a trial in Turner Broadcasting.
Mr. Nielson:
Yes, Your Honor, that's Turner II. So maybe there'll be a Paxton II. I'm not sure how that plays out.
Justice Kavanaugh:
Right. But there wasn't just, “Congress said this, that's good to go.” There was a trial about that, right?
Mr. Nielson:
Sure, Your Honor, and like I said, we're happy to go to trial, but the court–
Justice Kavanaugh:
That's all I wanted to ask there.
Mr. Nielson:
Of course. Of course.
Justice Kavanaugh:
And then on common carrier, if a company says, we're not a common carrier, we don't want to be a common carrier. We're carrying a lot, but we're not a common carrier. Can the state make them into a common carrier?
Mr. Nielson:
That's a great question, and that was the first question I had when I came to this case. The answer is no. If you are not a common carrier, you can't suddenly become a common carrier. That's why I think it's important to think of it as a compass to kind of tell you where the line is. But I would urge the court, if you're interested, again, to read Professor Volokh's article. One thing that really struck me as strange was, well, wait a minute, they have terms of service, so how can they be a common carrier, because you have terms of service saying you can't do this. And this court addressed that very thing; the case that he cited is New York Central v. Lockwood from 1873, where the court said you can't just get out of the duties of common carriage by contract. If you're a common carrier, you're a common carrier unless you stop opening yourself up to the public.
Justice Kavanaugh:
Seems a little circular, but I'll end there.
Mr. Nielson:
Yeah, sure.
Chief Justice:
Justice Barrett?
Justice Barrett:
I just want to get a clarification. So you said that Facebook could geofence and just pull out of Texas? Was that–
Mr. Nielson:
Of course. Of course, Your Honor.
Justice Barrett:
Okay, because I was just confused. Mr. Clement was pointing out that according to the provisions of the law, you couldn't, and I'm looking at 143A.002, and it says that you can't censor a user's expression, ability to receive information, et cetera, "based on a user's geographic location in this state or any part of the state." So you don't understand that to say, well, based on your location in Texas, we're not going to let you post content?
Mr. Nielson:
Your Honor, this is one of the prohibitions of the law, that they can't–let me say it a different way, if I may. There is a provision of the law, which is the jurisdictional hook, that says who is subject to this law at all. If you choose to do business in Texas, then this provision kicks in, and you can't discriminate against people, after you've chosen to do business in Texas, based on the status of their being in Texas. But if you don't want to do business in Texas at all, that's a separate provision, and you can get out of Texas. This is the prohibition on what you can't do: if you choose to do business in Texas, you can't darn well discriminate against somebody because they're in El Paso.
Justice Barrett:
And doing business in Texas is what? Just allowing Facebook users to sign up in Texas? Or is it Facebook accepting ad money from Texas corporations?
Mr. Nielson:
That question has not been resolved by any of the Texas courts, because none of these cases have been brought yet. But as I read it, it is that you have to have customers in Texas. You've entered into contractual relationships with Texans.
Justice Barrett:
Thank you.
Chief Justice:
Justice Jackson?
Justice Jackson:
So Justice Barrett had exactly my same thought, and I just want to clarify, so this doesn't speak in your view to a business decision not to offer services in Texas because, for example, their requirements are too burdensome. Instead, this is your offering business in Texas and everywhere else, but you are prohibiting them from discriminating against people on the basis of their geography, meaning they're in Texas?
Mr. Nielson:
Yes, Your Honor.
Justice Jackson:
Thank you.
Mr. Nielson:
Thank you.
Chief Justice:
Thank you, counsel. Rebuttal, Mr. Clement?
Mr. Clement:
Thank you, Mr. Chief Justice, just a few points in rebuttal. First of all, as to the common carrier point, the two classic elements of common carrier status are missing here. One is that you just transmit or carry messages from point A to point B. That's not what's going on here. We use the word, in our brief and from this court's cases, disseminate. Disseminate means to spread broadly. That means you are in the expressive enterprise business. There's zero tradition of treating entities in the expressive enterprise business as common carriers. And then the other factor is that there really is an essential facility. The telephone wires used to go, the copper wire, the last mile to every house in America. So if you were kicked off Ma Bell, you were really out of luck. This is the opposite situation on the internet, where you have lots of other choices. This is just not a common carrier, not that that really is talismanic under the First Amendment anyway. Justice Thomas made that point back in the Denver Area case, and he had it exactly right there.
Now second, public accommodation. I wouldn't be worried about any other public accommodation law; no other public accommodation law prohibits discrimination on the basis of viewpoint and applies exclusively to speakers. That is a First Amendment red flag, that you're trying to limit speakers' ability to discriminate on the basis of viewpoint. That's just a frontal assault on editorial discretion. Every other public accommodation law that I'm aware of works differently.
Third point, protecting kids. If you're at all concerned about protecting kids on the internet, that should be a vote in our favor in this case, because if you can't do viewpoint discrimination, that disables us from doing many of the things that our companies try to do to protect youths online. I mean, the idea that if we have suicide prevention, we have to have suicide promotion to avoid viewpoint discrimination, that should be a non-starter. And protecting kids is important even as to the disclosure provision. There is a record in this case: at page 161 of the joint appendix, a witness from Stop Child Predators testified and said these disclosure provisions give a road map to predators to figure out why their messages aren't getting to children, so they can figure out why they got bounced, and they can try again and sort of work their way around.
So the last point, and I think this is an important one to end on, is this idea that somehow we're behind the eight ball because we brought a facial challenge. There is a proud tradition of facial challenges to vindicate First Amendment rights in this country. That's how many of these cases have been brought. There's an equally proud tradition of getting a preliminary injunction against a law that is chilling speech. And as the general pointed out, I mean, the party presentation rules have to be foundational here. If we had gone into the district court and said, "This is unconstitutional on its face," and they said, "No, it's not, because of Gmail," we could have had a fair debate about that. We could have modified our complaint if necessary. That's a difficult issue. As I said, the only court that I've seen that deals with it directly said Gmail is not a common carrier. But in all events, we could have litigated all of that. But the plaintiff's burden is not to think of any theory the government can come up with on appeal and then foreclose it in the district court. Thank you.
Chief Justice:
Thank you, counsel, all counsel. The case is submitted.