Transcript: US Supreme Court Oral Argument on TikTok

Justin Hendrix / Jan 11, 2025

On Friday, January 10, 2025, the United States Supreme Court heard oral arguments to decide whether to uphold or reverse a decision of the US Court of Appeals for the DC Circuit declining to block the Protecting Americans from Foreign Adversary Controlled Applications Act from going into effect. If the Supreme Court upholds the DC Circuit's decision, app stores such as those operated by Google and Apple will be required to block access to TikTok for US users by January 19, 2025.

Arguing before the Court were:

  • Noel J. Francisco, on behalf of petitioners TikTok, Inc., et al.
  • Jeffrey L. Fisher, on behalf of petitioners Brian Firebaugh, et al.
  • US Solicitor General Elizabeth Prelogar, Department of Justice, on behalf of the government.

What follows is a lightly edited transcript. Compare to the official Supreme Court transcript when quoting.

Chief Justice Roberts:

We will hear argument this morning in Case 24-656, TikTok versus Garland, and the consolidated case. Mr. Francisco.

Mr. Francisco:

Mr. Chief Justice, and may it please the court. Under the act, one of America's most popular speech platforms will shut down in nine days. That shouldn't happen for three reasons. First, TikTok Incorporated is a US company speaking in the United States. The act requires it to go dark unless ByteDance executes a qualified divestiture. Whether you call that a ban or a divestiture, one thing is clear: It's a burden on TikTok's speech, so the First Amendment applies.

Second, the act is content-based from beginning to end. It applies only to social media platforms that have user-generated content except for business, product, and travel reviews. Within that content-based universe it singles out a single speaker for uniquely harsh treatment, and it does so because the government fears that China could, in the future, indirectly pressure TikTok to disseminate foreign misinformation and propaganda.

Finally, the act can't satisfy any standard of scrutiny. The government has no valid interest in preventing foreign propaganda, and its fallback that it seeks merely to prevent covertness makes no sense since that could be addressed with a risk disclosure. The government's real target, rather, is the speech itself. It's fear that Americans, even if fully informed, could be persuaded by Chinese misinformation. That, however, is a decision that the First Amendment leaves to the people.

The government's data security rationale cannot independently sustain the act, either. It is grossly underinclusive and ignores the most obvious less restrictive alternative: simply banning TikTok Incorporated from sharing any sensitive user data with anyone.

In short, this act should not stand. At a minimum, you should preliminarily enjoin it, which will allow you to carefully consider this momentous issue and, for the reasons explained by the President-elect, potentially moot the case. I welcome your questions.

Justice Thomas:

Exactly what is TikTok's speech here?

Mr. Francisco:

TikTok, Your Honor, uses an algorithm that in its view reflects the best mix of content. What the act does is it says TikTok cannot do that unless ByteDance executes a qualified divestiture. That's a direct burden on TikTok's speech, much less of a burden than the one that this court struck down in the Simon & Schuster case, where all the author had to do was take a certain amount of proceeds and put it into an escrow account for a short period of time to satisfy a civil judgment.

Justice Thomas:

So why is a restriction on ByteDance, which is not a citizen and is not located in the US, a restriction on TikTok?

Mr. Francisco:

Because what the law says to TikTok is that TikTok, you cannot use the algorithm that you prefer to use unless ByteDance executes a qualified divestiture. So the law therefore falls directly on TikTok itself. It imposes a burden on TikTok speech. Again, a much more significant burden than the one that was struck down in Simon & Schuster.

Justice Thomas:

So you're converting the restriction on ByteDance's ownership of the algorithm and the company into a restriction on TikTok speech. So why can't we simply look at it as a restriction on ByteDance?

Mr. Francisco:

Because I think the burden falls directly on TikTok, and I can use a hypothetical that helps illustrate the point. Suppose that China used its leverage over Jeff Bezos's international empire, including his Chinese businesses, to force the Washington Post to write whatever China wanted on the front page of the Post. Surely the government couldn't come in and say, "Jeff Bezos, you need to either sell the Washington Post or shut it down." That wouldn't just violate Mr. Bezos's First Amendment rights. That would also violate the Washington Post's First Amendment rights because they're ultimately the one that's suffering the burden under that law because they have to go dark and close up their books.

Chief Justice Roberts:

Counsel, you began by saying this is a US company operating in the United States.

Mr. Francisco:

Yes, Your Honor.

Chief Justice Roberts:

But the ultimate company that controls it, ByteDance, was found by Congress, and I quote this, "to be subject to Chinese laws that require it to assist or cooperate with the Chinese government's intelligence work" and to ensure that the Chinese government has the power to access and control private data that the company holds. So are we supposed to ignore the fact that the ultimate parent is in fact subject to doing intelligence work for the Chinese government?

Mr. Francisco:

Well, Your Honor, I don't think you are supposed to ignore that at all, but I also don't think that it changes the analysis for a couple of reasons. Look, TikTok Inc-

Chief Justice Roberts:

Well, just hold on a second. Well, as I said, you began by saying this is a US company operating in the United States, and it seems to me that you're ignoring the major concern here of Congress, which was Chinese manipulation of the content and acquisition and harvesting of the data.

Mr. Francisco:

Sure. And I'll start by saying that TikTok Incorporated is the United States subsidiary operating in the United States with its own set of free speech rights.

Chief Justice Roberts:

Do you dispute the fact that ByteDance has ultimate control of TikTok in its corporate organization?

Mr. Francisco:

Yes, Your Honor, I do dispute that, but I also don't think that it matters because even if China could exercise overwhelming power over TikTok through ByteDance, I don't think it would change the analysis.

And I can take that Washington Post hypothetical and ratchet it up a little bit to help illustrate the point. Let's suppose that the Chinese government had actually taken the Bezos children hostage and it was using that leverage in order to force Bezos and the Washington Post to publish whatever they wanted on the front page of the Post. So China effectively has total control. I still don't think that Congress could come in and tell Bezos either sell the Post or shut it down because that would violate Bezos's rights and the Washington Post's rights. Maybe what they could do is come in and say, "You need to disclose the fact that you're under this amount of coercion so that the people who are looking at the paper understand it and can make their own assessment." But I think the First Amendment rights of both Bezos and the Post would be directly implicated, notwithstanding that China in that scenario has effectively total control over what gets printed in the Washington Post.

Justice Sotomayor:

Counsel, let me break this down. I understand your argument that there is a First Amendment right that the US company has. I'll go that far with you. Okay?

Mr. Francisco:

I'll take it.

Justice Sotomayor:

Because we're affecting their ability to talk in whatever way they choose. The Washington Post could choose without any influence or threat against the children of Mr. Bezos to promote Chinese policy and our First Amendment would permit them to do that if they chose it independently. Correct?

Mr. Francisco:

Yes.

Justice Sotomayor:

Now the question becomes, so it's not... That's just a given that they have a First Amendment right. The next question is, assuming they do, what's the level of scrutiny we apply? Isn't that the issue here?

Mr. Francisco:

That is certainly one of the issues, Your Honor.

Justice Sotomayor:

All right, so if we get to that side of the issue that TikTok USA has some sort of First Amendment right, taking your example, if the government said, "No speaker is free to speak under a criminal compulsion by someone else because of extortion, because of kidnapping, we are doing this because it is the only way to ensure the safety of people that they are not going to be kidnapped or threatened, their lives threatened."

You don't think that the government has a compelling state interest in saying if there is a threat, a physical, criminal threat against someone to do some activity that the government couldn't say... I'm not questioning whatever the content is of that activity. I'm simply saying we in our governmental powers have a right to say you can't do that. You can't speak.

Mr. Francisco:

Sure, Your Honor. So to take your question in pieces, I do think that they would have a compelling interest in that scenario to do something. But what I don't think is that they could simply target speakers and speech. Take for example, generally applicable laws like the trade-

Justice Sotomayor:

So you think in that situation that the only thing the government could do is tell the Washington Post, "Disclose to the public that you are saying this because you are being forced to."

Mr. Francisco:

So, sure-

Justice Sotomayor:

That's the only remedy the government could undertake?

Mr. Francisco:

No, Your Honor, but I want to make sure I understand the hypothetical, the compelling interest is in preventing this kind of compulsion, coercion, and ultimately harm to children. And I think that the government has a lot of different ways they can address that through speech-neutral laws. And I was going to point to things like the Trading with the Enemy Act or Russia sanctions you can broadly say and attack problems-

Justice Sotomayor:

They haven't been very effective.

Mr. Francisco:

Well be that as it-

Justice Sotomayor:

We're still having people kidnapped. We're still having coercion.

Mr. Francisco:

And be that as it may, you can say to Americans, "You cannot collaborate with our enemies at all, and if you do that, you're going to be severely punished for doing that." But what I don't-

Justice Sotomayor:

All right. We can go on to the effectiveness of the remedy, but the point is, I believe, that even if your First Amendment rights are impinged and there is some protection, the question is at what level of scrutiny?

Mr. Francisco:

Yes, Your Honor.

Justice Sotomayor:

And whether the action is content-neutral or not.

Mr. Francisco:

I agree that that is the way that the analysis proceeds. Here we believe that the level of scrutiny should be strict scrutiny but-

Justice Kavanaugh:

What is the relevance of the history? Chief Judge Srinivasan in his opinion in the DC Circuit emphasized that there is a long tradition of preventing foreign ownership or control of media in the United States...

Mr. Francisco:

Sure.

Justice Kavanaugh:

Going back... Radio, television, and what have you. I would think no matter the level of scrutiny, that history has to be important, and I want to get your response to it.

Mr. Francisco:

Mm-hmm. I don't actually think it's important in this context because that history all arises in the context of bandwidth scarcity. And in that context, you have the government that's in the position of doling out a limited number of licenses, and when you have to dole out a limited number of licenses, you by definition have to pick winners and losers. And when you have to do that, you get a certain amount of discretion. I think that's the whole basis of those cases. You can't really take those cases and-

Justice Kavanaugh:

Well let's... Keep going.

Mr. Francisco:

You can't really take those cases and extend them to an area where there is no scarcity like the World Wide Web because once you do that, there's really no limiting principle. There's no reason why it wouldn't also apply to really popular books or magazines or newspapers or chains of newspapers. The bandwidth scarcity, I think, is really what justifies the greater discretion that the government gets in that area.

Justice Alito:

Mr. Francisco, let me see if I can break this down. Suppose that TikTok were outright owned by the People's Republic of China. Would you make the same argument?

Mr. Francisco:

I wouldn't be making the same argument, Your Honor. There, you would-

Justice Alito:

Why not?

Mr. Francisco:

Because there you would have to confront a very different question, whether a foreign government that was speaking in the United States has First Amendment rights, and I don't know that the court has ever addressed that. But here we've got a US company-

Justice Alito:

No, I understand that. I just want to see where you draw the line. So it's true, the court has never held that a foreign government has free speech rights. And if we were to hold that, I would think it would be because speech by a foreign government, particularly one with enormous resources, is not protected on the ground that it does not serve the underlying interests of the First Amendment, which are, among other things, fostering democratic self-government and furthering the search for truth.

So let's assume that we start with that, all right? What if TikTok were then not owned by the foreign government, but it was undisputed that TikTok was totally controlled by the foreign government, could not do one thing without the approval of the foreign government. That's different?

Mr. Francisco:

I do think that it is different, Your Honor. For example, I've given the hypothetical that I've given, but there are a lot of companies in this country that have foreign owners, not just companies like Politico, which is German-owned or Al Jazeera, which is partly owned by the government of Qatar-

Justice Alito:

Well, I understand that, but what would be the reason for drawing that line?

Mr. Francisco:

Sure if you can-

Justice Alito:

If there's a good reason for saying that a foreign government, particularly an adversary, does not have free speech rights in the United States, why would it all change if it was simply hidden under some kind of contrived corporate structure?

Mr. Francisco:

Because it is a US speaker... I'll give you another example. AMC movie theaters used to be owned by a Chinese company. Under this theory, Congress could order AMC movie theaters to censor any movies that Congress doesn't like or promote any movies that Congress wanted. And I think the reason is that here where it's conceded, you actually have a bona fide US company. It is not simply a Chinese cutout that is the Chinese government speaking itself-

Justice Alito:

All right, let's say that's not a complete-

Mr. Francisco:

... but an independent United States Company.

Justice Alito:

Let's say this is not a complete answer to your First Amendment argument, but would you be willing to concede that this is a very important factor that should be taken into account in deciding whether there's a First Amendment violation?

Mr. Francisco:

Well, Your Honor, I think that it does help supply a compelling governmental interest, but I still think you have to march through the strict scrutiny analysis and analyze their interests. I do not think that they have a compelling governmental interest in the manipulation of content. I think that is in the teeth of the First Amendment, and if you look at the government's brief and the rest of the record in this case, that's really what it's focused on. Their complaint is the fear that the content could be critical of the United States government or could undermine our democracy. Yes, Your Honor.

Justice Gorsuch:

Mr. Francisco, I just wanted to follow up on that line of questioning with just some fact questions because it seems to me there are a couple of things that the parties still dispute about facts in this court, which is a little unusual.

The government says that TikTok US has no authority or ability to alter the algorithm or recommendation engine, but must simply follow ByteDance's directives. You disagree with that in your reply brief.

Mr. Francisco:

Yes, we do.

Justice Gorsuch:

Somebody has to be right and somebody has to be wrong about that. What does the record show on that?

Mr. Francisco:

Well, Your Honor, we are here on a record, and there is nothing in the record that says that TikTok, like any other subsidiary, doesn't have its own independent decision-making authority. If you look at their record cites, what they point to is the ordinary types of control that a parent company has over a subsidiary company, but it doesn't change the fact that-

Justice Gorsuch:

All right, what is the fact? Are you prepared to make a representation?

Mr. Francisco:

Yes, Your Honor. The fact is that TikTok Incorporated as a US company does have a choice over the algorithm. Now, it would be an incredibly bad business decision for them to abandon this algorithm, and it's very doubtful they would ever do it, but they have that authority.

What they clearly have the authority to do is shut down the platform in the face of Chinese pressure. That's actually what they agreed to do in the National Security Agreement. I think that underscores why TikTok Incorporated as a US company does have its own set of First Amendment rights.

Justice Gorsuch:

Okay, and then another fact question. Before the DC Circuit, you argued that the Chinese government has made clear in public statements that it would not permit a forced divestment of the recommendation engine. Does that mean that some key component of the recommendation engine is under Chinese control?

Mr. Francisco:

No, Your Honor, what it means, and this might warrant a little more explanation, what it means is that there are lots of parts of the source code that are embodied in intellectual property that the Chinese government would restrict the export of, much as the United States restricts the sale of those types of things to foreign governments. It doesn't alter the fact that this is being operated in the United States by TikTok Incorporated. So-

Justice Gorsuch:

Okay. I got it.

Mr. Francisco:

Okay.

Justice Gorsuch:

I got it. And then you represent that the divestiture is not feasible within the act's timeframe. I'm sorry for these fact questions.

Mr. Francisco:

Sure.

Justice Gorsuch:

I just want to understand what's before us.

Mr. Francisco:

Yeah.

Justice Gorsuch:

Would it be feasible in any timeframe? I take it the government doesn't dispute that it's infeasible in the 270 days provided by law. Would it be feasible at all?

Mr. Francisco:

Your Honor, I think at least as we understand how they've interpreted the qualified divestiture provision, it would be exceedingly difficult under any timeframe for two principal reasons. The first is that there's a global team of engineers that are some in China, some in Europe, some in the United States that maintain and update the original source code. And as we understand their interpretation, a qualified divestiture would prohibit any kind of coordination with that global team of engineers.

The other reason is because as we understand how they're interpreting it, a qualified divestiture would divorce the US platform from the global content. So for example, there are videos created in the United States, there are videos created in Ireland. In order to get global content, we need access to the Irish videos. They need access to the US videos.

Justice Gorsuch:

I got that.

Mr. Francisco:

We understand that couldn't happen.

Justice Gorsuch:

Okay, so you think it's probably not feasible in any timeline?

Mr. Francisco:

Well, Your Honor, I think it'd be extraordinarily difficult.

Justice Gorsuch:

Okay. Last fact question, then I'll yield the floor here. The government admits that it has no evidence that TikTok has engaged in covert content manipulation in this country, but says that ByteDance has responded to PRC demands to censor content outside of China in other countries. Again, you deny that in the reply brief. Somebody has to be right about that.

Mr. Francisco:

Well, Your Honor, the problem there is everything that follows what you just read is redacted, and so I don't know what it says. What the record shows is two things. The record shows first what you just said. They haven't done anything here in the United States with respect to TikTok Incorporated.

And second, the record also shows through our transparency reports that we haven't removed or restricted content on the TikTok platform in other parts of the world. And TikTok doesn't operate in China. It operates in other parts of the world. We haven't removed or restricted content at the request of China. That's what we put out in our regular transparency reports.

Justice Gorsuch:

Removed or restricted doesn't necessarily cover covert content manipulation though, right?

Mr. Francisco:

Well, Your Honor, I'm limiting my response to what's in the record-

Justice Gorsuch:

To what's in the record? Okay.

Mr. Francisco:

It's very difficult for me to respond to things where I don't know what the accusation is.

Justice Gorsuch:

I have other questions about the secret evidence in this case, but we'll get to that later.

Mr. Francisco:

Yes, Your Honor.

Justice Gorsuch:

Thank you.

Justice Barrett:

Mr. Francisco, can I ask you a question about the relevant speech here? So it strikes me that this is a little different than your Bezos example because there it's clearly content discrimination because we're talking about the ability to post particular articles versus other articles. Am I right that the algorithm is the speech here?

Mr. Francisco:

Yes, Your Honor. Well, I would say the algorithm is a lot of things. The algorithm has built within it... It's basically how we predict what our customers want to see.

Justice Barrett:

The editorial discretion?

Mr. Francisco:

Yeah....

Justice Barrett:

Yeah.

Mr. Francisco:

The editorial discretion. It also has built within it the moderation elements. All of this kind of comes together when the source code is translated into executable code in the United States. In the United States, that executable code is then subject to vetting, review, moderation through content moderation algorithms. And so it ultimately lands on the TikTok platform.

Justice Barrett:

Got it. But what we're talking about, as in NetChoice, is the editorial discretion that underlies the algorithm. And I just want to be clear, a lot of your examples talk about, including the Bezos one, the right of an American citizen to repeat what a foreign entity says... Or say, you know, "I'm hitching my wagon to China. I want to say everything China does." Here the concern is about the covert content manipulation piece of the algorithm. That is something that ByteDance wants to speak, right?

Mr. Francisco:

Well, Your Honor, I think that ultimately it's TikTok's choice whether to put it on the platform-

Justice Barrett:

And you don't want that? Is your client disclaiming any-

Mr. Francisco:

Well, we absolutely resist any kind of content manipulation by China at all. But what I do want to focus in on is what their asserted interests are here. They do talk about covertness, but it can't possibly be that all they're concerned about is mere covertness. If all you were concerned about was the covertness untethered from the underlying content, that's something that could be easily addressed through a risk disclosure.

Justice Barrett:

But that goes to scrutiny. The level of-

Mr. Francisco:

Yes, Your Honor.

Justice Barrett:

... the application. I mean, let's say that I agree with you the First Amendment is implicated, and I'm trying to figure out what level of scrutiny applies.

Mr. Francisco:

Sure.

Justice Barrett:

And I'm trying to figure out what content, if any, discrimination is going on here. You know, there's a disproportionate burden.

Mr. Francisco:

Right.

Justice Barrett:

Let's say that I agree with you about that. No one is preventing you... I mean, you're seeking access to a particular source code engineering, the recommendation feature. It's the technology that you want. You're not trying to repeat, as in the Bezos example, if we take the speech that the government's concerned about to be the covert content manipulation rationale, you're not seeking to utter that speech.

Mr. Francisco:

Well, what we're... that's correct, Your Honor. What we are seeking to do is use an algorithm that displays the combination of content that we prefer our users to see on the platform-

Justice Kagan:

But is that....

Justice Barrett:

And the government doesn't care about that. I mean, the government is fine with you doing that. You can invent it yourself. It doesn't even care what content it displays, cat videos or whatever.

Mr. Francisco:

But I think that the way that the analysis has to unfold is first you ask, "Is this law burdening our speech?" I think we agree

Justice Barrett:

Yeah.

Mr. Francisco:

That the law is burdening our speech. Then you have to look at whether the law itself is somehow content-based, not just what their motivations are, but whether the law is content-based. And here, the trigger for this law, the one thing that gets it going is if you operate a social media platform that has user-generated content, unless that content takes the form of a product, travel, or business review. Then within that universe of content it says there's one speaker we're particularly concerned about, and we're going to hammer home on that one speaker. And then just to make the rubble bounce, they come in and tell us that one of the reasons they're targeting that speaker is because they're worried about the future content on that platform, that it could in the future somehow be critical of the United States or undermine democracy, to pull examples from the government's brief.

So I think there's no way to get around the fact that this is a content-based speech restriction, and you do have to go directly to what their interests are. Now, their principal interest is-

Justice Kagan:

Could I... Could I... because I think I'm a little bit surprised by one of the answers that you gave to Justice Barrett. I had understood that TikTok's essential complaint here is that they wouldn't be able to use the algorithm that ByteDance has invented and that they want to use the algorithm that ByteDance has invented.

Mr. Francisco:

100%. And if I was unclear on that, Your Honor, I apologize.

Justice Kagan:

Okay, because I think...

Mr. Francisco:

That is absolutely the core of the point.

Justice Kagan:

...what Justice Barrett was saying to you is like what's the problem here? Because ByteDance is a foreign company. Or maybe this isn't what Justice Barrett says, it's just what I say. (Laughter.) ByteDance is a foreign company, and you started off with Justice Alito saying, "Well, we would be making a different argument." And of course that's true. I mean, I would think that Alliance for Open Society makes it pretty clear that you have to be making a different argument with respect to a foreign state or a foreign company.

So let's say that they don't have First Amendment rights. The only First Amendment rights lie in TikTok, which does have First Amendment rights. And I guess my question is how are those First Amendment rights really being implicated here? This statute says the foreign company has to divest. Whether or not that's feasible, however long it takes, TikTok still has the ability to use whatever algorithm it wants, doesn't it?

Mr. Francisco:

No, Your Honor, and their rights are implicated at a most basic level. In 10 days, TikTok wants to speak. In 10 days, because this law was passed, TikTok cannot speak unless ByteDance executes a qualified divestiture. That's not just ByteDance's choice. That is a condition that's imposed by law.

Justice Kagan:

Well, I realize that it definitely has effects on TikTok if ByteDance acts in the way that you are assuming it will act. So this is not to say that the First Amendment isn't involved because TikTok is going to suffer some pretty severe incidental effects, but they are incidental, aren't they? Because the statute only says to this foreign company, "Divest or else," and leaves TikTok with the ability ...

Mr. Francisco:

Right.

Justice Kagan:

...to do what every other actor in the United States can do, which is go find the best available algorithm.

Mr. Francisco:

Yeah. I very much disagree that the effects are incidental because the way that this law works is it is only triggered if somebody is engaging in speech based on their content, user-generated content, except for business, product, and travel reviews. It then singles out a single speaker, and you have the concession from the government as to one of the reasons they've singled out that speaker.

Justice Kagan:

That puts a lot of emphasis on the idea of just... I think what you're basically saying is that all speaker-based restrictions generate strict scrutiny. I'm not sure that we've ever said anything like that.

Let's put aside your argument that this is facially content-based. It seems to me that your stronger argument, or at least the one that most interested me, was this argument of look, if the government is doing something specifically for the purpose of changing the content that people see, that has to be subject to strict scrutiny. But I don't see that as affecting TikTok, as opposed to as affecting ByteDance.

Mr. Francisco:

Well, no, no. I very much do see it as affecting TikTok because they choose this algorithm because it reflects the mix of content. The government's fear is that China could come in and pressure TikTok, through ByteDance to TikTok, to alter that mix of content to make it too pro-Chinese or too anti-American. That is very much directly a content-based charge straight at TikTok. The other point I would like to-

Justice Kagan:

I hear you that it might very well have that effect. I guess what I'm suggesting is that the law is only targeted at this foreign corporation, which doesn't have First Amendment rights. Whatever effect it has, it has. Maybe ByteDance will figure out a way to put this on open source, and then TikTok will be able to use the algorithm.

Mr. Francisco:

So, Your Honor, if I could take that on directly, because I think TikTok has First Amendment rights. To the extent ByteDance is speaking in the United States, it, I believe, has First Amendment rights. If you conclude that neither has First Amendment rights, then surely the creators have First Amendment rights.

But if you take a step back, what their position is is that none of these entities, this is the universe of entities affected by this law, none of these entities have the authority to assert First Amendment rights, which means that the government really could come in and say, "I'm going to shut down TikTok because it's too pro-Republican or too pro-Democrat or won't disseminate the speech I want," and that would get no First Amendment scrutiny by anybody. That cannot possibly be the case, yet that is the effect of their position.

The last point I'd like to emphasize though is this law, like the Playboy case, like the Hobby Lobby case, has built within it a less restrictive alternative, which is the general provision, by definition designed to protect against the very harm the government is identifying. Suppose New York State passes an asbestos abatement law. They say, "These types of buildings have to abate asbestos. In addition, New York Times, you have to abate asbestos in your building." And they say there are two reasons for this. One, we want to abate asbestos. Two, we hate the New York Times editorial page. Surely, at the very least, what you're going to say is you can't target the New York Times directly. What you can do is throw them into the general process.

PART 1 OF 5 ENDS [00:30:04]

Chief Justice Roberts:

Thank you, Counsel.

Mr. Francisco:

We think that's the minimum that should be done here.

Chief Justice Roberts:

Thank you, Counsel. We've been talking about the connection between the regulation of TikTok and the burden on expressive conduct. And your basic position is that interfering with the ownership of TikTok constitutes a direct regulation of the expressive conduct of other people. What is your best example in our precedent of a situation where we've treated a regulation of corporate structure, or something else, as a direct regulation of expressive conduct?

Mr. Francisco:

The regulation of a corporate structure as a-

Chief Justice Roberts:

Yeah.

Mr. Francisco:

Your Honor, I don't have a case at my fingertips. I can consult and have that done on rebuttal-

Chief Justice Roberts:

Well, I don't have one at my fingertips or any other part of my body.

Mr. Francisco:

But I think it's quite clear, though, that if you're saying to a company, "You have to stop talking unless somebody else does something," and that's imposed by the force of law, it directly affects that company's speech. Well, that's-

Chief Justice Roberts:

Again, I don't know if it's directly affecting the company's speech or the speech of third parties. And I'm not sure where your emphasis is. But again, I'm not sure there's another case where we've said that regulating a company, and thereby others' expression, should be treated as a direct imposition on their speech, in terms of the standard of review, for example, when it's based on derivative regulation of the corporate structure of somebody else.

Mr. Francisco:

Well, Your Honor, I would concede that this is a pretty unprecedented case. I'm not aware of any time in American history where the Congress has tried to shut down a major speech platform. But I think that if a law imposes a direct regulation on a third party, that in turn results in shutting down somebody else's speech, and they do it for content-based, viewpoint-based reasons, and in particular on this record, because the speaker that is ultimately being shut down, they don't like the speech of that particular platform, that's a real problem.

Chief Justice Roberts:

It may be a real problem or may not, but I just am wondering if there's any precedent where we have that same connection, and that it affects the standard of review, for example. You would treat it as a direct restriction on expression, even if the only thing the law does is say, in this case, somebody other than the Chinese government has to own TikTok.

Mr. Francisco:

So we don't have any direct precedent along the lines that you're citing, but we do have precedents. We have cases like Arcara. And what Arcara says is, if the law is totally speech neutral, then that's one thing.

We have cases like O'Brien, which say if the law doesn't care about speech but happens to draw in speech, that's another thing. What both of those cases make clear, however, is that when the law is concerned with the content of the speech, when the justification is based on the content of the speech, that's cases like Reed, then you do trigger strict scrutiny.

Chief Justice Roberts:

So then I think your argument comes down to, is this a direct concern with speech? Or is it a concern with the potential for Chinese interference, where the interference with speech is indirect?

In other words, they're not coming at the content. The Chinese government doesn't care what the people are saying on TikTok. That's not the concern. The concern is that they are regulating a particular channel of communication. And I just wonder if there's any precedent for that type of thing.

They're not saying, "We're going to restrict this content and that content, but not this." They're just saying, "We're going to be in a position where we can control what happens," whether it's based on the expression, whether it's based on anything else.

Mr. Francisco:

So Your Honor, I disagree. And I think if you take a step back and look at this record, I think it is quite clear that it is focused on both current and potential future content on TikTok, TikTok Incorporated. Here you don't have just an act that is based on speakers and speech. It's triggered by speech. It's focused on a single speaker, TikTok Incorporated.

Chief Justice Roberts:

All right. Justice Thomas, anything further? Justice Alito?

Justice Alito:

What if Congress, if there were nothing in this act about content moderation or covert manipulation, what if it was just about preventing what Congress viewed as an enormously powerful popular application from gathering an arsenal of information about American citizens? And they said, "This is the worst offender and we're going to require divestiture by this offender"? Would there be a First Amendment problem there? And if you think there would be, what would the level of scrutiny be?

Mr. Francisco:

Yes, there would be a First Amendment problem if you had a law like this that was only focused on speakers, those who use user-generated content, other than product, travel, or business-

Justice Alito:

Well, Congress concludes that this particular entity is the worst. This is the worst offender, and it happens to be an entity that is involved with speech.

Mr. Francisco:

If all you had... So I want to make sure I understand the hypothetical. The only provision you have is one that says, "This company has to shut down..."

Justice Alito:

Right.

Mr. Francisco:

"...because of data security."

Justice Alito:

Right.

Mr. Francisco:

I would have a different set of arguments. I think it would still implicate the First Amendment, particularly where you have strong evidence that they were being targeted, in part, at least because of their speakers and speech. Suppose Congress passed the law that you possibly-

Justice Alito:

Well, all right, but you're changing that. You're changing the hypothetical by injecting congressional concern about the content of the speech.

Mr. Francisco:

Okay, well I'll put that to the side.

Justice Alito:

So what would your argument be? It would be an equal protection argument?

Mr. Francisco:

Nope. Nope. I'd still be saying-

Justice Alito:

Based on rational basis-

Mr. Francisco:

I'd still be saying that Arcara itself makes clear, that where a law disproportionately burdens just a speaker, we have to subject that to scrutiny to suss it out, to suss out whether the asserted interest is the actual interest.

There, the asserted interest is in data security. I think I would have a couple of arguments under whatever form of scrutiny you wanted to apply, whether it is strict scrutiny or intermediate scrutiny in that context. I would say first that that law is dramatically underinclusive, because it categorically exempts e-commerce apps that this record shows have comparable ties to China and collect-

Justice Alito:

All right, you said... I don't want to prolong this too much. You say this is not like Arcara, I think primarily because you say that the divestiture requires the new company to cease using the algorithm. Right?

Mr. Francisco:

No, I think it's not like Arcara in a much more fundamental sense. Arcara involved a totally speech neutral law. It didn't go after speakers at all. If you had a law in Arcara that said we're going to prohibit prostitution in book stores only, then I think that Arcara would have come out differently. There would've at least been some kind of intermediate scrutiny, potentially strict scrutiny.

Justice Alito:

All right, what you're continuing-

Mr. Francisco:

That's the law that I think is your hypothetical.

Justice Alito:

... you're continuing to walk away from the hypothetical-

Mr. Francisco:

I don't think so, Your Honor.

Justice Alito:

... that I've posed for the purpose of narrowing in on what your argument is. I understood you to say that that would not be a solution to the problem, because one of Congress's motivations was based on the content of TikTok. Am I wrong in that? Did I read your argument incorrectly?

Mr. Francisco:

Well, I think the...I want to make sure I understand what you're saying. I certainly think that because one of the motivations was content, that is an enormously important fact. I was trying to answer your hypothetical, where we were trying to take that out of the mix. And the reason why Arcara is different is because Arcara didn't just simply say, "No prostitution in book stores." That's what your hypothetical effectively says. It says, "No data security problems in speakers," or in this particular speaker. And I think that that would trigger, at the very least, intermediate scrutiny.

Justice Alito:

All right-

Mr. Francisco:

And then-

Justice Alito:

Thank you. Thank you.

Chief Justice Roberts:

Justice Sotomayor?

Justice Sotomayor:

That goes to my question, which is, the Chief Justice asked you whether or not we've ever had a case where pure ownership was at issue and not speech. And I don't think we had one like that, you're right. But I don't think that question gets to the essence of your argument, does it? The essence of your argument is you're being asked to divest because of speech. Correct?

Mr. Francisco:

Correct.

Justice Sotomayor:

All right. So if I get past that, if I go to Justice Alito's point, which is, I don't think it's just about speech, it's about data control. If it's about data control, and assume for the sake of argument that I believe intermediate scrutiny applies to the control provision, then your arguments would be different, wouldn't they? They would be underinclusiveness, they would be other arguments, correct?

Mr. Francisco:

Well, Your Honor, I think they'd be very similar, because I think the nature of our arguments work just as well under intermediate and strict scrutiny. If I could unpack that a little?

Justice Sotomayor:

No, I'm not going to-

Mr. Francisco:

Sure.

Justice Sotomayor:

Because we're going to run out of time, because we're going to need to figure out what intermediate scrutiny means. But I'm not sure it means what you say it does, which is, I don't think any of our cases have ever suggested that we have to use the least restrictive means under intermediate scrutiny. In fact, our cases have said we have to use a reasonable means.

Mr. Francisco:

And if I could respond to that point specifically, I completely agree, it's not a least restrictive means alternative, Your Honor. But you do have to at least consider alternatives. Here, if the concern, let's take the data security concern, which you put your finger on-

Justice Sotomayor:

Well, I know you want to keep going on, but I can't let you, because I can't monopolize the argument. Okay? But let me just get to the bottom of that. All right? You seem to suggest that Congress has to actually look at all of the alternatives, and say no. I don't think we have a case that says that.

Mr. Francisco:

I am not suggesting-

Justice Sotomayor:

If from the record it's clear that alternatives won't be adequate, for whatever set of reasons, isn't that enough?

Mr. Francisco:

If the record were clear on that, that might be enough.

Justice Sotomayor:

Oh, okay. Now I take that-

Mr. Francisco:

But here on the key-

Justice Sotomayor:

Now, let me go to the next question, and the last.

Mr. Francisco:

If I could, Your Honor, just one sentence?

Justice Sotomayor:

Mm-hmm.

Mr. Francisco:

If on the key less restrictive alternatives, they had actually considered them, and said what you suggested, this would be a different case. But our point is that on the key most obvious less restrictive alternatives, a law, for example, that simply prohibits TikTok Incorporated from sharing any sensitive user data with ByteDance or anyone else, there's nothing in the record that suggests they even considered it.

Justice Sotomayor:

That's because they're-

Mr. Francisco:

That's why it would fail under even intermediate scrutiny.

Justice Sotomayor:

We have a different problem, which is that the record shows that there is no sharing that could happen that wouldn't put the data at risk. But we can go past that.

Mr. Francisco:

That's incorrect, actually.

Justice Sotomayor:

No, because the NSA doesn't. What's very-

Mr. Francisco:

I'm not talking about the NSA.

Justice Sotomayor:

Or even anything else. But putting that aside, one last question. Assuming that the covert manipulation issue is one, I think that what remains is, to the Chief's question and Justice Alito's questions, if the covert manipulation is a concern, then the question becomes, what kind of burden does it put on TikTok USA? And I think your point is that that requires strict scrutiny because it doesn't permit them to speak to the Chinese government through the algorithm and promote whatever speech it wants to promote through the algorithm. Correct?

Mr. Francisco:

It doesn't permit them to speak to the American public through the algorithm-

Justice Sotomayor:

Right.

Mr. Francisco:

... and promote whatever type of speech they want to promote on the algorithm. And I also think that this covert manipulation is a little bit odd. They're not concerned just with covertness. If all you were concerned about was the secrecy-

Justice Sotomayor:

I'm going to ask the Solicitor General about that. How do you disentangle the two things?

Mr. Francisco:

Thank you, Your Honor.

Chief Justice Roberts:

Justice Kagan? Justice Gorsuch? Justice Kavanaugh?

Justice Kavanaugh:

Just on the data collection interest, I think Congress and the President were concerned that China was accessing information about millions of Americans, tens of millions of Americans, including teenagers, people in their twenties, that they would use that information over time to develop spies, to turn people, to blackmail people, people who, a generation from now, will be working in the FBI, or the CIA, or in the State Department.

Is that not a realistic assessment by Congress and the President, of the risks here?

Mr. Francisco:

Your Honor, I'm not disputing the risks. I'm disputing the means that they have chosen. One way, the most direct way to address that, all of this user data sits on data servers in Virginia, controlled by Oracle. I'm not talking about the National Security Agreement. What I'm talking about is a law that simply says, "To TikTok Incorporated and its US employees, you cannot share that user data with anybody. You can't give it to ByteDance, you can't give it to China, you can't give it to Google, you can't give it to Amazon, you cannot give it to anybody under threat of massive penalties."

They never even considered that most obvious alternative. And so whether you apply intermediate scrutiny or strict scrutiny, it's not a least restrictive means test, but you got to at least consider the most obvious alternative.

Justice Kavanaugh:

So you acknowledge the risk that Congress and the President were concerned about. You're just saying the means they chose to address that risk were incorrect, not permissible?

Mr. Francisco:

I mean, I certainly acknowledge the risk, but I think there are lots of reasons. Not just the one I just gave, but there are lots of reasons why that risk still can't justify the law. When it sits alongside of the impermissible covert manipulation risk, I think it falls under Mt. Healthy. It's no different if they came in and said, "We passed this law, one for data security-"

Justice Kavanaugh:

I understand that. But just on the data collection, that seems like a huge concern for the future of the country.

Mr. Francisco:

And Your Honor, again, it is a concern. Two responses. First, it is a concern that can be addressed directly. The reason why there's no evidence in this record about whether that kind of direct prohibition on TikTok Incorporated from sharing sensitive user data with anybody, including ByteDance, the reason why the record is devoid of any evidence of that, is because Congress never considered the other side of the balance.

And that's the minimum that Congress has to do under the First Amendment. It's got to at least consider the consequences of shutting down a speech platform used by 170 million Americans, against the benefits of an alternative, like simply saying to TikTok's employees, "You're essentially going to get massive fines, potentially jail sentences, if you share any of that sensitive user data with anybody. Not ByteDance, not China, not anybody else in the world." Yet there's nothing in this record that suggests they even considered that alternative.

Justice Kavanaugh:

What happens after January 19th, if you lose this case? Can you just spell that out?

Mr. Francisco:

At least as I understand it, we go dark. Essentially, the platform shuts down.

Justice Kavanaugh:

Unless there's a divestiture?

Mr. Francisco:

Unless there's a divestiture, unless-

Justice Kavanaugh:

Presidential extension?

Mr. Francisco:

... President Trump exercises his authority to extend it. But he can't do that on January 19th. On January 19th, we still have President Biden. And on January 19th, as I understand it, we shut down. It is possible that come January 20th, 21st, 22nd, we might be in a different world. Again, that's one of the reasons why I think it makes perfect sense to issue a preliminary injunction here, and simply buy everybody a little breathing space. This is an enormously-

Justice Kavanaugh:

What do you mean by shut down too? Can you just spell that out?

Mr. Francisco:

So-

Justice Kavanaugh:

If you can.

Mr. Francisco:

One, the app is not available in the app stores. That's at a minimum. But in addition, what the act says is that all of the other types of service providers can't provide service either. Now, there's enormous consequences for violating that for the service providers. So essentially, what they're going to say is that, I think, we're not going to be providing the services necessary to have you see it.

So it's essentially going to stop operating. I think that's the consequence of this law. Which again, is why a short reprieve here would make all the sense in the world. It's an enormously consequential decision, and I think all would benefit if it weren't necessary.

Justice Kavanaugh:

Thank you.

Chief Justice Roberts:

Justice Barrett?

Justice Barrett:

Just following up on Justice Kavanaugh's questions, let's say I agree with you that some level of scrutiny applies, and I'm trying to figure out which level of scrutiny applies. And I'm trying to figure out if there's content discrimination. And let me ask you different question than I did before about the algorithm.

You keep saying shut down. The law doesn't say TikTok has to shut down. It says ByteDance has to divest. If ByteDance divested TikTok, we wouldn't be here. Right? If ByteDance was willing to let you go, and willing to let you take the source code with you, that would be fine, right? We would not be here.

Mr. Francisco:

Well, Your Honor, if ByteDance divested, then the law wouldn't fall on TikTok. But the law will. The law, not ByteDance.

Justice Barrett:

But that's because of ByteDance's choice.

Mr. Francisco:

The law requires TikTok to shut down.

Justice Barrett:

Right?

Mr. Francisco:

Well, it's-

Justice Barrett:

This is like Justice Kagan's point. I mean, I'm trying to figure out how we account for the reality of third-party choices. And the choices of a third party, is that the whole reason for the law being passed in the first place?

Mr. Francisco:

Yeah, Your Honor, I think that the way the analysis works is step one, is there a First Amendment violation? Step two, you get to the question that we're grappling with. What standard of scrutiny do you apply? Typically, what you do is you ask, is this law content-based? Is it content-based on its face? Is it content-based in its motivation? Here, we know it's content-based on its face, because it says what it says. We know it's content-based in its motivation, because the government concedes it's content-based in its motivation.

Justice Barrett:

Well, that's not quite what I'm asking.

Mr. Francisco:

I think-

Justice Barrett:

I mean, that's the dispute between you and the government. Is it content-based if it's about divestiture, and not about telling TikTok what content it can display on the platform?

Mr. Francisco:

And I think it has to be, because I think that that really goes to the first question, does the burden fall on the speaker? If the burden falls on the speaker, that triggers the speaker's First Amendment rights. But the law is in fact content-based, whether it comes in the form of a divestiture or something else.

When the law specifically says it's content-based, we're worried about the content on the platform. And when the government tells you that one of our reasons, one of the things that we're worried about, is that TikTok, not ByteDance, but TikTok Incorporated, and TikTok in the United States will, absent the divestiture, have a mix of content that we find objectionable. They will mix around their videos in a way that is too pro-Chinese, or too anti-American. And that is TikTok the platform.

Justice Barrett:

Okay. Let me just ask you one last question. Why is it impossible to divest in the 270 days, even assuming that the Chinese government hadn't said you couldn't?

Mr. Francisco:

Sure. And this is the exchange I was having with Justice Gorsuch. There are two basic reasons. The first is that the underlying source code, that's the source code that comes in here and then has to be converted and executed and....

Justice Barrett:

But that's what Justice Gorsuch said, just not ever. So it's not really that you can't do it within the timeframe, it's that you really couldn't ever divest because you never are going to get the source code?

Mr. Francisco:

Well, let me unpack that a little bit. No, it's that the underlying source code, it takes a team of engineers to update and maintain that. It would take us many years to reconstruct a brand new team of engineers to do that with respect to the source code. With respect to the sharing of content, that was the different reason. In theory, we could send our salesmen around the world, go to Ireland, go to Finland, go to every country, and say, "Look, you used to automatically get our content, but now you got to separately sign up for our platform."

Justice Barrett:

Okay, so last point, let me make sure I understand what you're saying. It's not that you couldn't execute the disentanglement. You could say, "We're independent," you just can't recreate TikTok in any kind of way?

Mr. Francisco:

Well, I think that any new TikTok would be a fundamentally different platform, with different content, which is yet another reason why I think this is a content-based restriction that falls directly on TikTok Incorporated itself and our platform.

Chief Justice Roberts:

Justice Jackson?

Justice Jackson:

So I guess I'm back to some of the questions that Justice Barrett and Justice Kagan asked about, the threshold issue that you point out, which is, is there a burden on the speaker? I'm trying to understand what the burden is that you are articulating, and whether it really is about association, and not speech.

You have in your brief some cases that talk about American speakers being free to choose whether to affiliate with foreign organizations, and the colloquy you had with Justice Kagan made me think that what you're really complaining about is the inability to associate with ByteDance and its algorithm; that if TikTok came up with its own algorithm, or bought an algorithm from some other company, or devised it, or whatever, this law would have nothing to do with them, from your perspective.

But the problem I think you're articulating, and I'm seeking your clarification.

Mr. Francisco:

Sure.

Justice Jackson:

The problem I think you're articulating is that you want to use ByteDance's algorithm, and therefore associate with ByteDance. And Congress has prohibited that by requiring divestiture. So isn't this really a right of association case under the First Amendment?

Mr. Francisco:

I think it's both, Your Honor. I do think that that is a component of it. We want to use the algorithm that we think reflects the best mix of content. That's the algorithm that reflects the best mix of content. What this law says is we can't do that unless ByteDance executes a qualified divestiture. But I also think more directly what this law does is it says that TikTok Incorporated, if ByteDance doesn't execute a qualified divestiture, you have to go mute. You cannot speak at all. Full stop. Period.

Justice Jackson:

No, I don't think it says that though. I mean, if TikTok were to, post-divestiture, or whatever, pre-divestiture come up with its own algorithm, then when the divestiture happened, it could still operate.

Mr. Francisco:

I think-

Justice Jackson:

It doesn't say, "TikTok, you can't speak."

Mr. Francisco:

I think that's theoretically correct, Your Honor.

Justice Jackson:

Right, but-

Mr. Francisco:

But I think that also underscores the content-based nature of the restriction.

Justice Jackson:

No, but the-

Mr. Francisco:

We have to change our speech.

Justice Jackson:

Excuse me. The fact that that's true suggests that you're wrong about the statute being read as saying, "TikTok, you have to go mute." Because TikTok can continue to operate on its own algorithm, on its own terms, as long as it's not associated with ByteDance. So isn't this really just all about association?

Mr. Francisco:

Your Honor, I think it is partly about association, but I'm going to take another shot at explaining why it's not just about association.

Justice Jackson:

Okay. Well, let me just take you down the association path for a second, because if it is about the association of TikTok with ByteDance, then don't we have cases that seem to undermine your view that Congress can't do this? I mean, I thought we had cases about Congress prohibiting association with terrorist organizations, prohibiting association with foreign adversaries. And so why doesn't this fall into that kind of group of our jurisprudence?

Mr. Francisco:

Well, at least as I understand all of those cases, they applied strict scrutiny. The material support statute most definitely applied strict scrutiny.

Justice Jackson:

And ultimately upheld the law. So fine.

Mr. Francisco:

But sure, and I think if we go down the strict scrutiny road here, I don't see how this law can possibly be sustained under the interests that they assert here.

But I do want to emphasize why this is also about TikTok speech. Even if your hypothetical were to work, theoretically, they can say something different than they're saying today. That in and of itself is a direct restriction on TikTok speech. They can't engage in the speech they want to engage in, they have to engage in a different kind of speech, the speech they don't want to engage in. That is a direct burden on TikTok Incorporated speech-

Justice Jackson:

All right, I think I understand that argument-

Mr. Francisco:

... wholly apart from association.

Justice Jackson:

Let me ask you a question about your colloquy with Justice Kavanaugh. Did I understand you to concede that there is a compelling interest, and that the problem is really tailoring? I mean, you said, "I understand the risks." I don't hear you suggesting that the risks don't exist. So it sounds like we've gotten past, even if we're in strict scrutiny world, we've gotten past the compelling interest part of this?

Mr. Francisco:

No, Your Honor, what I was saying is that if all you had standing alone were the data security, that would be a different case. Here, when you have the content manipulation sitting right alongside of the data security, that taints the data security rationale. If Congress came in and said, "We're passing this law for two reasons. One, we really care about data security. And two, we hate the speech on TikTok," the data security wouldn't alone sustain that law. Under cases like Mount Healthy, it would take them-

Justice Jackson:

I understand, but why... You're equating, we don't want foreign adversaries to be able to manipulate the content on this platform. You're equating that with, we hate the content. And I'm just trying to understand why?

Mr. Francisco:

Sure. Because content manipulation is by definition a content-based distinction. Look, everybody manipulates content. There are lots of people who think CNN, Fox News, The Wall Street Journal, The New York Times are manipulating their content. That is core protected speech. That's why they put so much weight on this mere covertness.

Justice Jackson:

Right. But that analysis is just about content-based versus content-neutral, and therefore whether you apply strict scrutiny. I'm in the strict scrutiny world. Okay? I'm assuming that you're right, that strict scrutiny applies. And now prong number one in that world is, does the government have a compelling interest?

And so I'm trying to understand why the government's argument that, we have data manipulation concerns, which I understood you in colloquy with Justice Kavanaugh to say, is a risk. And we are concerned, based on what Justice Gorsuch says when he's looking at the facts, that the government contends that there's this real problem with this foreign adversary doing manipulation in other places. Are you saying those are not compelling government interests?

Mr. Francisco:

I am 100% saying that content manipulation is not just not a compelling governmental interest, it is an impermissible governmental interest. You could not go to CNN or Fox News and say, "We're going to regulate you because you're manipulating the content in the way that we don't like." That is, per se, impermissible. That is why-

Justice Jackson:

Okay. Can I just ask you one last thing? You say, with respect to the tailoring issue, that disclosure, you think, is a possible more narrowly tailored way of handling some of this. And I guess I'm just wondering whether disclosure, under this court's case law and the law of other lower courts, doesn't carry its own First Amendment complications. Wouldn't we have compelled speech problems if disclosure were required in this situation?

Mr. Francisco:

Sure, Your Honor. Now look, I might think so because I think that the factual predicate is wrong. But they think the factual predicate is right. And if the factual predicate is right, then there are no First Amendment problems at all, under Zauderer and the cases that you're suggesting.

And that underscores the larger problem here. Not all disclosures are perfect. I'm not here to argue that they are. But you've always got to consider what the alternative is. And here the alternative is shutting down one of the largest speech platforms in America. The reason there's no evidence in this record as to disclosures is because Congress never even undertook that balancing in the first place-

Justice Jackson:

Thank you.

Mr. Francisco:

... the bare minimum that has to be done before we take an unprecedented step of shutting down the voices of 170 million Americans.

Justice Jackson:

Thank you.

Chief Justice Roberts:

Thank you, Counsel. Mr. Fisher?

Mr. Fisher:

Mr. Chief Justice, and may it please the court, wholly apart from the company's legal interests here, the act directly restricts the First Amendment rights of American creators to participate and speak in what the court, a little less than a decade ago, called the modern public square, and what you might say today is the most vibrant speech forum in the United States of America. And the act therefore is inescapably subject to strict scrutiny, because of the First Amendment implications. And the act fails that test, and indeed, any level of scrutiny under this court's case law, because the act and the reasons behind it defy our history and tradition as well as precedent. American creators have long and always enjoyed the right to speak in conjunction with foreign speakers or work with foreign publishers. Americans even have the right, under the Lamont case, to receive information from foreign speakers, indeed, foreign governments. So that leaves the government with its implication in its use of the phrase national security in this context, but that just simply doesn't change the calculus.

Throughout our history, we have faced ideological campaigns by foreign adversaries, yet under the First Amendment, mere ideas do not constitute a national security threat. Restricting speech because it might sow doubt about our leaders or undermine democracy are the kinds of things our enemies do. It is not what we do in this country. And so we think the court should reverse, and I would welcome the court's questions.

Justice Thomas:

How exactly is the creators' speech being impeded?

Mr. Fisher:

So two ways, Justice Thomas. First, I just point you to the text of the statute, which directly regulates text, images, real-time communications, videos. My clients, the creators, are the ones creating that speech and posting it to speak to others.

Justice Thomas:

But it doesn't say anything about creators or people who use the site. It's only concerned about the ownership and the concerns that data will be manipulated or there will be other national security problems with someone who's not a citizen of this country or a company that's not here.

Mr. Fisher:

So there's two ways, and I think the Sorrell case is where you look for the analysis of the First Amendment burden here. As I said, the text of the statute regulates our speech. And then you point out ownership, and this was talked about a lot in the first part of the argument here, so let me be very clear. The American creators have a right to work with the publisher of their choice.

So imagine somebody wanted to post speech on Twitter, now known as X. And Congress passed a law saying, "We don't like the current owner of X. The current owner of X has to sell that platform or else it has to shut down." People who post on that platform, and who indeed, some of them, make a living commentating and engaging on current events, news, politics, would have a First Amendment claim to work with that particular publisher.

Justice Thomas:

But using that argument, you could have said that about the breakup of AT&T. You could say that about the limitations on foreign ownership of broadcast companies.

Mr. Fisher:

Well, no. Look, I think that you have to dig a little deeper than that, Justice Thomas. It's not mere foreign ownership, and I'll certainly get to the broadcast cases in a moment, but it's foreign ownership because of a particular perspective. If you boil it down to an essence, the owner of a print media or online media publication is the essence of the viewpoint of that publication. The current owner of X or the current owner of Fox News or the current owner of MSNBC has a particular perspective, and working with that particular platform is shot through with the ownership from top to bottom.

Justice Jackson:

But why couldn't Congress prohibit Americans from associating with certain foreign organizations that have interests that are hostile to the United States? I mean, I thought that's what Holder v. Humanitarian Law Project allowed. So, I don't really understand what you mean.

Mr. Fisher:

Right. So, I'm glad you're bringing that up.

Justice Jackson:

Yes.

Mr. Fisher:

So when it comes to this national security, you are right that Congress can prohibit Americans, to use that case as an example, from associating with terrorist organizations or other organizations that pose a clear and present danger to this country.

This case, Justice Jackson, is fundamentally different. What the government tells you, in its own brief, that it is worried about here are the ideas that might be expressed on TikTok. We might undermine US leadership, we might sow doubts about democracy, we might have pro-China views. And so if you look to whether that is a legitimate interest, my fundamental submission...

And this, I think, goes to the last colloquy you were having with Mr. Francisco, is that is an impermissible government interest. And you look throughout our history and tradition, and I think the place I would point you most directly would be the opinions of Justice Brandeis in Whitney, and Justice Holmes in Abrams.

Justice Jackson:

I guess, I don't understand how that's distinguishable from what's happening in Holder. Can you just say a little bit more?

Mr. Fisher:

It goes to the nature of the national security threat. So my position is the government just doesn't get to come in and say "national security" and the case is over, or, "You don't get to associate." You have to dig underneath what is the national security claim.

And what Justice Holmes said in his Abrams dissent... And I know that was a dissent. These are hard issues, but that has been vindicated over time, is that it's not enough to say national security. You have to say what is the real harm. Is it terrorism? Is it where our battleships are located? Is it war?

Justice Jackson:

Justice Kavanaugh presented a number of potential risks with foreign adversaries using covert manipulation of the data platforms that are being used by youths today, that would then make it more likely that people would turn into spies and do terrible things to the United States. This is a hypothetical, but you know what I'm saying?

Mr. Fisher:

I get it. I think if I understood Justice Kavanaugh correctly, he was talking about the data security argument. So, let me just pull these apart. You first have an argument, and the government itself separates these two arguments in its brief. The first argument, the one I'm focusing on initially is the content manipulation argument. And that argument is that our national security is implicated if the content on TikTok is anti-democracy, undermines trust in our leaders. They use various phrases like that in their brief. So, my primary submission is that is an impermissible government interest that taints the entire act.

Now, there's a secondary argument the government makes. And we say, you don't even get to that because once you have an impermissible motive like that, the law is unconstitutional. But even if you could get to that, Justice Jackson, I do grant that data security, in the way Justice Kavanaugh spelled it out, is compelling. That is compelling. But that's not the question. You just don't ask in the air, "Was Congress worried about data security or could it reasonably be worried about data security?" You say, "Can this act, the act before you, be sustained on data security grounds?" And our answer to that has to be no. Don't have to look any further than the divestiture provision itself, which says that the content recommendation algorithm cannot be used in the future. Well, that has nothing to do with data security.

So the core feature of the divestiture provision is going at content manipulation, which I said is impermissible. You can't uphold that under data security grounds. And the rest of the act, when you look at the covered companies provision, Justice Jackson, if this were primarily a data security law, what you'd think you'd find is: what kind of data is procured? How is it stored? Is it shared? Those are the things you'd think you'd find under covered companies, but you don't find that. What you find is: are text or images shared? Is content being shared between users? Is it being created and posted on the social media platform?

So I don't dispute for one second that data security is a very important thing, and Congress, in this very law, regulated data security in other ways with data brokers. That's perfectly permissible. But the question before you today was narrower. The question is, is this law before you sustainable on data security grounds? And that answer has to be no.

Chief Justice Roberts:

This Congress doesn't care about what's on TikTok. They don't care about the expression. That's shown by the remedy. They're not saying TikTok has to stop. They're saying that the Chinese have to stop controlling TikTok. So it's not a direct burden on the expression at all. Congress is fine with the expression. They're not fine with a foreign adversary, as they've determined it is, gathering all this information about the 170 million people who use TikTok.

Mr. Fisher:

Again, Mr. Chief Justice, if I may, let me separate where you started, which was the content manipulation, and then go to the data security part of it. So, I understand-

Chief Justice Roberts:

Well, the first part, I'm not talking about the content manipulation. I'm talking about the content harvesting.

Mr. Fisher:

When you say content harvesting, do you mean people data or that-

Chief Justice Roberts:

They've got all the information, whatever algorithms they want, that has access to the personal information or at least information that is not readily available about 170 million Americans. And whether they're going to use it in 10 or 15 years when those people grow up and have different jobs in different places, or whether they're going to use it now. That, at least, as I look at the congressional record, is what Congress was concerned about.

Mr. Fisher:

I think they're-

Chief Justice Roberts:

And they're not concerned about the fact that it is available. As I said, the remedy is just somebody else has to run TikTok.

Mr. Fisher:

Right.

Chief Justice Roberts:

So they're not concerned about the content, they're concerned about what the foreign adversary is doing.

Mr. Fisher:

So if I may, I think I still... To answer your question properly, I think you have to separate two things. One is the content recommendation algorithm, and that's what I was speaking about a moment ago. That has nothing to do with data security, that doesn't itself procure data. That just determines what videos people see on their feed on TikTok. As to that, I think the answer is inescapably that the government and Congress itself was worried about content. The government itself is here saying national security. So, a mix of cat videos or dance videos doesn't affect national security. No matter what happens, the only thing that can affect national security is the substance of those videos. And when the government's pressed in its briefing, it outright tells you that. It says, "What we're really worried about is sowing doubts about US leaders, et cetera."

So, let me turn then to data security. Yes, there were various congresspersons, and in the record that we have in the DC Circuit, there were conversations about the problem of data security here. As I said, I don't dispute that that is a valid governmental interest, so I think you address whether that alone can sustain the act in two steps. First, you would ask, if you have an impermissible motive and a permissible one, can we sustain the act based simply on the permissible motive? And I think for the reasons Mr. Francisco said, and we lay out in our brief, the answer to that alone is no under Hunter against Underwood and other cases.

Even if you could get just to the data security question, again, you'd have to ask the question, would this law have been passed by Congress for data security reasons? Because you're being asked to uphold a law based on that single governmental interest. And when you look through the provisions like the content recommendation algorithm provision, like the covered company provisions, the answer is no. And if you're still in doubt on that, just go back to the underinclusiveness problem. Would a Congress really worried about these very dramatic risks leave out an e-commerce site like Temu that has 70 million Americans using it and every bit the connection to the world of Chinese...

Justice Kavanaugh:

Does Congress have to go all or nothing on that? I mean-

Mr. Fisher:

It doesn't have to go all or nothing.

Justice Kavanaugh:

Didn't they isolate a particular problem and they might be getting to what you're talking about next. Who knows? But you're really sitting up there and saying Congress would not pass the divestiture law if data security were the only interest.

Mr. Fisher:

I'm saying it would not have passed this divestiture law if data security were the only interest. It's very curious why you just single out TikTok alone and not other companies with tens of millions of people having their own data taken in the process of engaging with those websites, and equally, if not more available to Chinese control.

So, I'm not trying to say that Congress has to do everything at once. I'm trying to say that once you've concluded that content manipulation, for the reasons I've said, as a matter of our history and tradition, has to be impermissible.

Justice Sotomayor:

Is there another site like this one that covers half the American population?

Mr. Fisher:

I don't think just by way of sheer numbers, Justice Sotomayor, that the answer has to be no. But 70 million seems like a lot.

Justice Sotomayor:

170 million is a lot. But put that aside and then go to the next question, which is how many of these sites have all of the data collection mechanisms that TikTok has? From what I understand from the briefs, not only is it getting your information, it's asking for, and most people give it, permission to access your contact list, whether the people on that contact list have permitted it or not. So, they can now have data about all of your contacts and anything you say about them.

How many other sites gather information by keystrokes to be able to do voice and finger ID information, if they choose? And there's a whole lot of data stuff that was discussed in the brief that I don't think any other website gathers. So, wouldn't this be a unique site? If I viewed the evidence that way, how would this be underinclusive?

Mr. Fisher:

Justice Sotomayor, I don't think a lot of the suppositions you're making actually bear out. And as Justice Gorsuch was pointing out, one of the real challenges in this case, obviously, is that it comes to you without an ordinary trial record having been compiled like all the rest. So, we have only limited amounts of information. But absolutely, these other websites are taking much the same kind of information, if not more. And as to the contact list thing, I think you also... That points out one other aspect of this. That is a voluntary decision by an American user to share that information.

In the Riley case-

Justice Sotomayor:

But not informed. And even if informed-

Mr. Fisher:

Well, that could be solved-

Justice Sotomayor:

But even if informed-

Mr. Fisher:

If you don't think it's informed, that could be solved by a warrant or disclosure.

Justice Sotomayor:

But no, they can't be, because for the United States, the threat of using that information is what is at issue. It's not whether the user thinks it's okay, it's whether the US believes that it could put us at risk. But let me ask you one last and fundamental question. Assuming that content-neutral data collection concerns are one of the reasons Congress required divestiture, why can't we separate that out from how we analyze the algorithm question? And couldn't we sever the two provisions, to say the divestiture is right, but you can't force them not to use the algorithm?

Mr. Fisher:

I think the reason why you can't do that is as Mr. Francisco explained. I thought I'd direct you to a case like Hunter against Underwood and just analogize it to this situation. If what you had is the government saying, "We are shutting down TikTok or requiring divestiture for two reasons. One, because we think it helps the Democratic Party too much. And number two, because we're concerned about data." I think that first interest would be a poison pill. That would be an impermissible... Or because we think there's too much pro-Catholic content on TikTok. I think there are some interests that are just so constitutionally verboten that I think that just makes the act unconstitutional, and you can't go looking for other interests.

You send it back to Congress, "Look, if you want to pass the data security law free and clear of this impermissible interest, you go ahead and do it."

Justice Sotomayor:

Thank you, Counsel.

Mr. Fisher:

Can I say one other thing, Justice Sotomayor? Just because I think it is also telling here that even if you didn't buy that poison pill argument and you just asked whether Congress would've passed this law, something else that I think you might notice is even if all this act goes into effect and the law goes through, TikTok gets to keep all of the data. So, wouldn't a data security law require them to expunge that data or get rid of it or something?

I mean, it's a very weird law if you're just looking at it through a data security lens. Maybe, Congress could do better.

Justice Gorsuch:

Mr. Fisher, often we require divestiture for antitrust reasons, for example. And as I take it, your argument here... And we don't think of those as normally implicating the First Amendment interests of users or people who might speak or associate with editors. And the difference here is, as I understand it, in your mind, that this law is motivated by a content-based interest. Is that a fair summary?

Mr. Fisher:

I think that, and the only thing I would add to it is just the prior step, which it is regulating the speech itself for content-based reasons, yes.

Justice Gorsuch:

Yeah. We don't do that in the antitrust area-

Mr. Fisher:

Exactly.

Justice Gorsuch:

... because we say this law does. Okay.

And on the covert content manipulation side, do you think that's a compelling interest or not? Forget about the tailoring for a moment.

Mr. Fisher:

No. My point is that preventing content manipulation, whether it's covert or not-

Justice Gorsuch:

Is simply not compelling.

Mr. Fisher:

... is impermissible if what you mean by content manipulation are the kinds of interests the government is saying, like undermining trust in our leaders, undermining trust in democracy.

Justice Gorsuch:

Yep. That's Whitney and Abrams in your mind.

Mr. Fisher:

That's Whitney and Abrams. And those cases-

Justice Gorsuch:

Got it. I got it.

Justice Kagan:

So, Mr. Fisher-

Justice Gorsuch:

Just a couple more. I'm sorry, I'll finish up real quick. And so that would take us to the tailoring question, and there you say disclosure, alerting Americans that there is a possibility of covert content manipulation, putting aside the data collection part of it.

Mr. Fisher:

Yeah.

Justice Gorsuch:

Telling Americans that there is covert content manipulation going on in TikTok, or at least it's possible. And the government says that's just simply not enough, and the DC Circuit did, too. And I wanted to give you a chance to respond to that.

Mr. Fisher:

Right. I think that's the only aspect of the governmental interest that could be permissible, the covert part. And my answer, as you just said, is that disclosure solves that problem. And you have a long-standing law, which we haven't talked about yet today, that gives you an example of that. Again, under a history and tradition test, you look not just at precedent, but at the laws and traditions of our country. Look at the Foreign Agents Registration Act, passed in the run-up to World War II, where the concern was Americans would be controlled by foreign agents to speak and advocate certain causes.

Justice Gorsuch:

We didn't ban them. We just required the disclosure.

Mr. Fisher:

You did not ban them. All you did is require... You, Congress. All Congress did is require a disclosure.

Justice Gorsuch:

Yeah. Certainly, I wasn't around for that. (Laughter.)

On the secret evidence point, I'm concerned about the government's attempt to lodge secret evidence in this case without providing any mechanism for opposing counsel to review it. I expressed that concern in Zubaydah, and I noted that there are mechanisms for reading in counsel and that other countries, including our allies, often do that. I just wanted to give you a chance to give me your thoughts on that.

Mr. Fisher:

Yes, Justice Gorsuch. We made all those arguments in the DC Circuit, so there was a flurry of motion practice about whether or not the government could rely on classified evidence. Those motions were never resolved, but the DC Circuit did, as I think you probably noticed from the decision, say, "We're going to decide this case solely based on the public record." And my understanding is that's how it comes to this court.

Justice Gorsuch:

It's interesting that-

Mr. Fisher:

If the court were ever-

Justice Gorsuch:

It's interesting that Congress hasn't acted in this field. We have, in the FISA area, lots of opportunity. They have regulated this area and it does seem like an area that Congress might want to pay attention to, given the increased appeals to secret evidence that the government has made in recent years.

Last question for you. Could the new administration, after January 20th... Mr. Francisco suggested that it might be able to extend the deadline even if you were to lose here, after January 19th. Is that possible as you read the law?

Mr. Fisher:

I'm not sure it is. I'm not sure. Maybe, that's a question for the Solicitor General, but-

Justice Gorsuch:

Oh, it certainly is. I thought maybe I'd give you a chance, too.

Mr. Fisher:

As I understand the law, it's 270 days unless extended. And once that time runs, I'm not sure you're talking about an extension anymore.

Justice Gorsuch:

Okay.

Mr. Fisher:

There's ex post facto law that kind of does this stuff.

Justice Gorsuch:

Yeah, got it. Thank you.

Justice Kagan:

Can I take you back, Mr. Fisher? Let's say I agree with you that if you're talking about content manipulation, that's an inherently content-based rationale for acting. So if Congress had passed a law that says, "We hate the content manipulation that TikTok is doing," that's strict scrutiny land, and I don't know that the government can do that however important the interest. But that's not what Congress is doing here, and this is the same kinds of questions that I asked Mr. Francisco. Because if a...

Let's take it as a given that Congress actually can do whatever it wants with respect to a wholly foreign corporation or a foreign government. And so, Congress could act with the intent to interfere with the content manipulation that a foreign corporation is doing. And so now we're in this strange world where we're saying they can't act with respect to TikTok, they could act with respect to ByteDance. Why isn't this Congress acting with respect to ByteDance in the sense that all it's doing is saying ByteDance has to divest? And then TikTok can go about its business, use whatever algorithm it wants, use whatever content moderation policies it wants, just like everybody else does, choosing from everything that's available on the open market.

Mr. Fisher:

Let me answer that question in two parts from the perspective of the creators, Americans who want to use this platform to speak to other Americans. So, the first thing is, what the act does, as you said, Justice Kagan, is prevent us from working with an application that is owned by ByteDance that uses this algorithm. Well, that's exactly what we want to do. That's our editor and publisher of choice that we think best disseminates our speech.

Justice Kagan:

Yeah. But what I'm saying to you is, if you just assume a world without TikTok, where it's only ByteDance. And you were trying to say, "Well, we really want to work with ByteDance," and Congress was saying, "We think ByteDance presents national security interests and they don't have First Amendment rights. They're just a foreign corporation." I think that in that case, the government...

I mean, tell me if you think this is wrong. It doesn't matter that you have creators who want to work with ByteDance because ByteDance is a foreign corporation with no First Amendment rights. Is that what you're contesting?

Mr. Fisher:

That is what I'm contesting. You said two things, though. So I could be clear, there's two aspects. Do we have a First Amendment right to work with a foreign company or even a foreign country to publish our speech? And then, there's a national security part that you put into that which goes to the justification.

Justice Kagan:

Forget that.

Mr. Fisher:

Forget that. Yes, let's do that. So if that is right, Justice Kagan, then American creators have no right to make documentaries with the BBC. They can't work with Al Jazeera, if Congress wants to prevent that, or with any number of other publications that are state-owned wholly or partially. And even under Lamont, remember, you're not even creating speech, you're just listening. That was speech from China that the court said you have a First Amendment right to receive.

Justice Kagan:

So, would I be right to say that your position is that because of the users who want to associate and want to partner with this foreign corporation, the foreign corporation ends up having, in your view, the exact same First Amendment rights as your users do. In other words, it's irrelevant that the foreign corporation doesn't have First Amendment rights.

Mr. Fisher:

I don't think it's irrelevant because you could imagine a situation where no American distributor or speaker wants to work with that. But let me put it to you this way, The Communist Manifesto written by Karl Marx has no First Amendment standing on its own in America. But if a bookstore wants to sell that publication, I don't think Congress can prevent it from doing so.

Justice Barrett:

Well- oh, sorry. Go ahead.

Mr. Fisher:

No, I'm fine.

Justice Barrett:

No. It’s...

Justice Kagan:

I'm good.

Justice Barrett:

Okay. But I want to press you a little bit on the distinction because in Lamont, the prohibition worked directly on the American. You had to specifically request this information for it to come. As Justice Kagan's questions were pressing you, this is working on ByteDance. It's not saying to your creators, "You can't post on ByteDance." That's indirectly going to happen if ByteDance chooses itself not to permit TikTok to walk away with the code.

So, does that matter that distinction between Lamont and this case?

Mr. Fisher:

No, for two reasons. One, under the Sorrell case, you look not just to the law itself but to its practical operation, and the practical operation is to prevent us from working with ByteDance. So, that's one answer. And you bring up Lamont, and Lamont's actually a very important case, as I'm sure you all recognize. It's important to look not just at the court's opinion, but at the briefing in that case. The government itself never came in and argued there's no right to receive this information. That's the greater argument. All the government argued was, of course, Americans have a right to receive this, but it's just not so much of a burden to require them to raise their hand to get it.

So Archibald Cox, when he was the Solicitor General said to the court, quite explicitly in the brief, "We're not even going to make this argument because we think it's so contrary to history and tradition. All we're going to argue is the burden isn't enough." Now, what happened is the DC Circuit turned that upside down and said, "Oh, Lamont's just a case about the burden." Well, that's because that's the only argument the government was even willing to make in this court. There was no argument that Americans didn't have the right to hear that speech.

Justice Barrett:

What about... So I think this goes to Justice Gorsuch's questions about antitrust divestiture. Let's say that for antitrust reasons or... Or let's even say not for that. Let's say for suspect First Amendment reasons, Congress tells Jeff Bezos that he has to divest the Washington Post. He can no longer own the Post. And let's say that neither Bezos nor the Post challenges that, but let's say that you represent clients who really like the Post as it was, who really want to keep receiving the Post, who really want to publish op-eds in the Post. Would you have standing? What kind of a claim would you be making there?

Mr. Fisher:

I believe so, Justice Barrett. And the Court has cited Lamont in other cases in more recent years to say, "We've recognized a right of American listeners to receive information from others." And remember, even that is a lot. That's only a small part of the argument I'm making on behalf of the creators. I don't mean to diminish Mr. Francisco's arguments on behalf of the company and ByteDance, but the core speech in front of you in this case are the videos and other forms of communication that people, like my clients, are posting by the millions every day on this platform to share with other Americans.

Justice Barrett:

Is it possible for you to win and Mr. Francisco to lose? Or do you rise or fall together?

Mr. Fisher:

No, I think it's possible.

Justice Barrett:

How?

Mr. Fisher:

I mean, I don't think we should. (Laughter.) But...

Justice Barrett:

Is it possible for you to win and him to lose? I mean, you want to win.

Mr. Fisher:

Let me put it this way. If you were to conclude that something about the corporate ownership structure, and I think there was some conversation about this earlier, impeded Mr. Francisco from being able to assert full-throated First Amendment rights in this case, I would step in and say, "Well, certainly we can do that and get you to strict scrutiny." And then, the arguments pretty much line up. Then you're in a question of, can the government satisfy strict scrutiny?

And, Mr. Chief Justice, I think you asked about whether we have cases for this and that. I think that the idea is, yes, we have cases that say once you're in strict scrutiny, that regulating the content because you don't think it's going to be pro-American enough or it's going to be too pro-foreign interest is just verboten under the First Amendment. That's the history and tradition.

And Justice Kavanaugh, when you asked about the broadcast cases, they're grounded not just in scarcity, but they're grounded in scarcity in a particular way. It has to do with the absolute need Congress has for licensing in a world of scarce resources. And so, that's the very small carve-out that exists for broadcast licensing, which even in Turner the court wouldn't extend to cable television. And if you look at the 200-plus years of our country for any other example of foreign ownership of media being regulated by Congress, let alone being permitted in the case law, you are not going to find it. And I think the reason why is because everybody has understood that if you're not in a world of scarcity where licensing is necessary, you cannot give the government, and in this more extreme example the president himself, unbridled discretion to choose who is a proper owner of a speech platform in this country, because it is so hand-in-hand with viewpoint.

As I said earlier, any number of owners of big media enterprises, whether they be Americans or foreign citizens, could be accused of having a particular viewpoint. But speakers who engage in those platforms have choices they can make. And so on behalf of our creator clients, we find it not at all satisfactory to be told, "Look, just go post somewhere else." It's not enough to tell a writer, "You can't publish an op-ed in the Wall Street Journal because you can publish it in the New York Times instead." Just like here, to say, "You can publish it on Instagram or some other platform, just not TikTok." TikTok has a distinct editorial and publication perspective, and it particularly benefits people like my clients, who are not famous people. They're not actors from Hollywood who have a lot of people following them. They're ordinary American citizens whose content on the platform gets privileged by way of the quality of that content.

And that's what's so powerful about the platform. So whether you're an ordinary American citizen or, I might add, whether you're a presidential candidate in our last election, if you want to reach new and different audiences, TikTok is the place people go.

Justice Alito:

This may not make any difference for constitutional purposes, but just out of curiosity, I'd like you to explain what the practical consequences would likely be for your clients if TikTok went dark, as Mr. Francisco put it. There is, I assume, a great demand for what TikTok provides. And if TikTok was no longer there to provide what your clients really want, is there any reason to doubt that some other social media company would jump in and take advantage of this very lucrative market?

Mr. Fisher:

There are two reasons, Justice Alito. One is, many of the declarations from my clients actually explain this: they've tried on other platforms to generate the kind of audience and engagement they've been able to on TikTok, and they've fallen dramatically-

Justice Alito:

Yeah, I know they haven't so far, and I'm just wondering whether this is like somebody's attachment to an old article of clothing. I really love this old shirt because I've been wearing this old shirt, but I could go out and buy something exactly like that, but no, I like the old shirt. Is that what we have here, or is there some reason to think that only ByteDance has this? ByteDance has devised this magical algorithm that all of the geniuses at Meta and all of these other social media companies, no matter how much they put their minds to it, couldn't come up with this magical thing.

Mr. Fisher:

I think empirically the other companies have been trying for a few years to catch up with TikTok and replicate it and have been very unsuccessful, and so that ought to tell you something, and so just imagine the algorithm here as a collection of thousands of editors. Imagine the floors of an office building being filled with a collection of editors. You can imagine a situation where that collection of genius that is on a particular floor cannot be replicated by another group of people, and that's kind of what you have here.

Justice Alito:

Okay. All right. I understand that.

Chief Justice Roberts:

Thank you, Counsel. Justice Thomas? Anything further, Justice Alito?

Justice Alito:

Yeah, one other question. I'm intrigued by your Mount Healthy, Hunter versus Underwood argument. I mean, maybe you're right, but Mount Healthy arose in an entirely different context where you're trying to get to an employer's motivation. Hunter versus Underwood involved an extreme situation where the court looked at the records of a state constitutional convention and came to the conclusion, apparently, that racism was the only motivation for what was done, but it does seem to me to be potentially quite unworkable, and contrary to what we've generally said about legislative intent, to apply the Mount Healthy framework to a congressional enactment. Or do you recognize or do you acknowledge that that would be very difficult? Because when an act of Congress is passed, there could be more than 250 different motivations for the votes that were cast by the members.

Mr. Fisher:

Yeah, I totally understand that, and in Hunter, the court actually engaged with that problem to some degree, and what Hunter said is, "To avoid that problem, we're going to look just to two things." One is the state's brief, which I'd say is the Solicitor General's brief by comparison here, and the text of the law. And here, that's the only thing I need to rely on to get you to the place that they wouldn't have enacted this-

Justice Alito:

Well, it gets you to the place that this was part of what motivated Congress, but why does it get you home, particularly when there's a severability clause in this act?

Mr. Fisher:

It can't be only part of it. It has to be enough to sustain the entire act or at least the parts that you wouldn't sever from the act, and so I think the reason why is because it's not just the content recommendation algorithm part that can be theoretically, I guess severed out. It's also the covered company provisions and it's just the whole approach of the statute that is based on content, not on data security.

Justice Alito:

All right. Thank you.

Mr. Fisher:

Okay.

Chief Justice Roberts:

Justice Sotomayor?

Justice Sotomayor:

No. I'll save it for the issues.

Chief Justice Roberts:

Justice Kagan? Justice Gorsuch? Justice Kavanaugh? Justice Jackson?

Justice Barrett:

One quick question. You repeatedly say that from your perspective, the government's motivation is that the content might be too anti-American or too pro-China, et cetera, so that's why you think this is a content-based restriction, but I guess I'm curious if you would say the same thing if the government had articulated its rationale as saying our motivation is to limit foreign interference in American social media platforms or discourse. Isn't that a different motivation from the standpoint of how we characterize this?

Mr. Fisher:

I agree, but then the question I would ask, if the government said that, which I think maybe the government kind of does say in the reply brief, is how on earth are you then serving a national security interest if all you're doing is just saying, "We don't like a foreign country rearranging cat and dance videos"? It's hard to come in and make a national security argument. So the only way you get to national security, which is the government's own argument, is to look at the substance that's being rearranged and say, "We don't like the way the substance is going to be rearranged and curated differently."

Justice Barrett:

Thank you.

Chief Justice Roberts:

Thank you, Counsel. General Prelogar?

General Prelogar:

Mr. Chief Justice, and may it please the court. The Chinese government's control of TikTok poses a grave threat to national security. No one disputes that the PRC seeks to undermine US interests by amassing vast quantities of sensitive data about Americans and by engaging in covert influence operations, and no one disputes that the PRC pursues those goals by compelling companies like ByteDance to secretly turn over data and carry out PRC directives. Those realities mean that the Chinese government could weaponize TikTok at any time to harm the United States. TikTok collects unprecedented amounts of personal data and as Justice Sotomayor noted, it's not just about the 170 million American users, but also about their non-user contacts who might not even be engaging with the platform. That data would be incredibly valuable to the PRC.

For years, the Chinese government has sought to build detailed profiles about Americans, where we live and work, who our friends and coworkers are, what our interests are, and what our vices are. TikTok's immense data set would give the PRC a powerful tool for harassment, recruitment, and espionage. On top of that, the Chinese government's control over TikTok gives it a potent weapon for covert influence operations, and my friends are wrong to suggest that Congress was seeking to suppress specific types of content or specific types of viewpoints. Instead, the national security harm arises from the very fact of a foreign adversary's capacity to secretly manipulate the platform to advance its geopolitical goals in whatever form that kind of covert operation might take.

The act addresses the threat of foreign adversary control with laser-like focus. It requires only divestiture of TikTok to prevent Chinese government control, and that divestiture remedy follows a long tradition of barring foreign control of US communications channels and other critical infrastructure. So no matter what level of First Amendment scrutiny applies, this act is valid because it's narrowly tailored to address compelling national security threats. Now, my friend Mr. Fisher just emphasized, and I acknowledge, that millions of Americans enjoy expressing themselves on this platform, but the important thing to recognize is that the act leaves all of that speech unrestricted once TikTok is freed from foreign adversary control. The First Amendment does not bar Congress from taking that critical and targeted step to protect our nation's security. I welcome the Court's questions.

Justice Thomas:

Is there any difference between content manipulation by a non-US company as opposed to a US company? I didn't hear Mr. Fisher make a distinction between the two.

General Prelogar:

Yes, and I think the important thing to recognize is that the act here is targeting covert content manipulation by a foreign adversary nation. Now, I understand my friends to say--

Justice Thomas:

What difference does that make?

General Prelogar:

The difference is that there is no protected First Amendment right for a foreign adversary to exploit its control over a speech platform.

Justice Thomas:

No. I mean the difference between covert and non-covert.

General Prelogar:

So I think that Congress's concern with the covert operation was that a foreign adversary could effectively weaponize this platform behind the scenes in order to achieve any number of geopolitical goals. Here are some of the examples that come to mind. One of the pages out of the playbook here is for a foreign adversary to simply try to get Americans arguing with one another to create chaos and distraction in order to weaken the United States as a general matter, and distract from any activities that the foreign adversary might want to conduct on the world stage.

Justice Kagan:

What do you mean by covert, though? I mean, does covert just mean it's hard to figure out how the algorithm works? Because we could say that about every algorithm.

General Prelogar:

No, the covert nature of it comes from the fact that it's not apparent that the PRC is the one behind the scenes pulling the strings here and deciding exactly what content is going to be made to appear on the site, and another way that the PRC-

Justice Kagan:

It's just because we don't know that China's behind it? That's what covert means?

General Prelogar:

Well, I think they-

Justice Kagan:

It doesn't have anything to do with the difficulty of figuring out what the algorithm is doing? It's just because people don't know that China is pulling the strings? That's what covert means?

General Prelogar:

What it means is that Americans are on this platform thinking that they are speaking to one another and this recommendation engine that is apparently so valuable is organically directing their speech to each other, and what is covert is that the PRC, a foreign adversary nation is instead exploiting a vulnerability in the system to suppress and silence-

Justice Kagan:

Well, if that's all it means, that people don't know that China's behind it, everybody now knows that China is behind it.

General Prelogar:

No, but it's that the specific content that's being manipulated would be unapparent, and so I think that's-

Justice Kagan:

Well, that's true of every search engine. I mean, you can take any of these algorithms, whether it's X or whether it's you name it. What are the new ones? Blue Sky. I mean none of these are apparent, right? You get what you get and you think, "That's puzzling," and it is all a little bit of a black box. So you can't just mean it's a black box. It's covert. They're all black boxes, and if you just mean what's covert is the fact that there's China behind it, I mean honestly, really, like everybody does know now that there's China behind it. So I just don't get what this covert word does for you.

General Prelogar:

I think the problem with just saying as a general matter, China has this capability and might at some point be able to exercise it and manipulate the platform is it doesn't put anyone on notice of when that influence operation is actually happening, and therefore, it doesn't guard against the national security harm from the operation itself.

Justice Gorsuch:

General, isn't that a pretty paternalistic point of view? I mean, don't we normally assume that the best remedy for problematic speech is counter-speech? And TikTok says it could even live with a disclaimer on its website saying this can be covertly manipulated by China, in case anybody were left in doubt after today about that possibility. So you're saying that won't work because?

General Prelogar:

That won't work because it is such a generic, generalized disclosure that it wouldn't put anyone reasonably on notice about when it's actually happening.

Justice Gorsuch:

That's your best-

General Prelogar:

The example I've been thinking about is-

Justice Gorsuch:

That's your best argument is that the average American won't be able to figure out that the cat feed he's getting on TikTok could be manipulated, even though there's a disclosure saying it could be manipulated?

General Prelogar:

But imagine if you walked into a store and it had a sign that said, "One of 1 million products in this store causes cancer." That is not going to put you on notice about what product is actually jeopardizing your health, and I think that's roughly equivalent to the type of disclosure they're contemplating here. They brought up the example of the Foreign Agents Registration Act, FARA. There you have to disclose the actual content.

Justice Gorsuch:

If that's true, then wouldn't that be true for all social media companies, for all content? I mean, every editor, every newspaper in its editorial room makes decisions about what it's going to run and how it's going to say it, and every algorithm has preferences, whether it's domestic or foreign, and nobody really knows exactly when those editorial decisions are being made or how, but they're generally aware, and we think that that's enough.

General Prelogar:

I think though that there is a real risk that when a foreign adversary has control of that kind of mechanism and a speech platform in the United States, it could weaponize that platform to harm United States interests, and one of the key ways that the PRC flexes this muscle is to suppress speech.

Justice Gorsuch:

General, I'm sorry to interrupt you, but again, we're not arguing about the compelling interest. We're arguing about the tailoring.

General Prelogar:

Right, and so I guess what I would say is, you began by saying the cure for concerning speech is counter-speech. Here, I dispute the premise that Congress was specifically concerned about any particular subject or any particular viewpoint. It wanted to close off the capability of a foreign government, but in any event, it's very hard to engage in counter-speech when you don't know it's happening, because someone is secretly manipulating the platform behind the scenes, and in particular what the PRC has the capability to do is simply silence American voices.

Justice Gorsuch:

Wouldn't the same thing be true with a newspaper owned by a foreign company and a foreign government? You wouldn't know when it's exercising editorial discretion about this article or that article or how it's doing it. So maybe we just need to shut down the Oxford University Press in America or you pick it. Any other foreign owned... Politico, I was told today, is owned by Germany. That would all be okay on your theory, so long as Congress designates that country a foreign adversary?

General Prelogar:

We are not asking the court to articulate bright line rules to govern all kinds of hypothetical situations.

Justice Gorsuch:

I understand that, but I am testing the argument.

General Prelogar:

Yes, and what I want to acknowledge is that sometimes the court has recognized that a speaker-based preference might reflect a content-based preference. In the context of ownership of a newspaper, for example, in part because a newspaper is a one-way channel of communication and is generally understood to represent to some extent its publisher's views, maybe the court would more readily infer that a regulation targeting that is actually aiming to target content, but I don't think the court could draw the same conclusion here.

Justice Gorsuch:

I'm not talking about the compelling interest or any of that.

General Prelogar:

Right.

Justice Gorsuch:

I'm talking about the tailoring, and you're saying we have no alternative but to stop this speech altogether. We can't rely on disclosure, but you say that wouldn't apply to Politico or to the Oxford University Press because?

General Prelogar:

In the circumstance where you have a newspaper that is understood to reflect its publisher's views, then you might think that disclosure would be a more adequate remedy there because it's not just holding itself out as a forum for speech between other people. I think social media platforms do raise distinct interests in this regard, because what people think when they're engaging with TikTok is that it's organically feeding them videos based on the recommendation engine, and if actually China is behind the scenes engaging in this kind of covert operation, it does present a distinct national security risk. Of course, the other big difference with the newspaper is it's not likely to be collecting sensitive personal information about 170 million-plus users and then having the capacity to send that back to a foreign adversary.

Justice Barrett:

General Prelogar, can I-

Chief Justice Roberts:

Now, so-

Justice Barrett:

Go ahead.

Chief Justice Roberts:

I was just going to say, did I understand you to say a few minutes ago that one problem is that ByteDance might be through TikTok trying to get Americans to argue with each other?

General Prelogar:

That it might be just trying to foment disruption-

Chief Justice Roberts:

If they do, I say they're winning.

General Prelogar:

... or dissent. That might very well be true, Mr. Chief Justice, and I think the point I'm trying to make is that China is a foreign adversary nation that looks for every opportunity it has to weaken the United States and to try to threaten our national security, and if it has control over this key communications channel, it's hard to predict ex ante exactly how it's going to use that as a tool to harm our interests, but we know it's going to try, first and foremost by seeking to get the data of these American users, which would be of a piece with all of the activity the PRC has already undertaken to breach our laws: hacking OPM, for example, and exfiltrating the background files and security clearances of 20 million government employees, the breach of Equifax to get sensitive financial data, and Anthem to get sensitive healthcare data.

We know that the PRC has a voracious appetite to get its hands on as much information about Americans as possible, and that creates a potent weapon here because the PRC could command that ByteDance comply with any request it gives to obtain that data that's in the hands of the US subsidiary.

Chief Justice Roberts:

Thank you.

Justice Coney Barrett:

General Prelogar-

Justice Alito:

Suppose-

Justice Coney Barrett:

Go ahead.

Justice Alito:

Suppose that TikTok had no connection whatsoever with any foreign government. It was owned instead by an immensely, immensely rich multinational corporation, and Congress concluded that this multinational corporation really has it in for the United States, and is going to use this extremely popular platform to do everything it can to undermine the United States in all the ways in which you think that TikTok may pursue at the direction of the PRC, would that be the same case?

General Prelogar:

I think there would be a first-order question of whether the multinational corporation itself has First Amendment rights.

Justice Alito:

All right, it's an American corporation.

General Prelogar:

So if it were an American corporation and Congress disagreed with the viewpoints or content the corporation would display, obviously that's a direct regulation of protected speech and it would trigger strict scrutiny. I think that's different in kind from what Congress was worried about here, which was not regulating speech as such, but instead regulating foreign adversary control.

Justice Alito:

So your argument depends on the fact that what is at bottom here is the People's Republic of China using TikTok. That's what your argument depends on. If this were an American corporation, it'd be an entirely different thing.

General Prelogar:

Exactly, and the reason we know the statute is different is because all of the same speech that's happening on TikTok could happen post-divestiture. The act doesn't regulate that at all. So it's not saying, "You can't have pro-China speech. You can't have anti-American speech." It's not regulating the algorithm. TikTok, if it were able to do so, could use precisely the same algorithm to display the same content by the same users. All the act is doing is trying to surgically remove the ability of a foreign adversary nation to get our data and to be able to exercise control over the platform.

Justice Barrett:

Oh, sorry.

Justice Alito:

I'm sorry.

Justice Barrett:

I just wanted you to respond to Mr. Fisher's argument about the rights of Americans to receive information, say from the PRC or anyone else, and that even if ByteDance did not itself have First Amendment rights, that Americans would have a First Amendment right to receive that information in the Lamont sense.

General Prelogar:

Yes. So I think that Lamont reflected a principle that there can be a right of American listeners to receive information, and if Congress is directly regulating that based on disagreement with the speech that's being sent into this country, that's obviously going to trigger heightened scrutiny under the First Amendment, but here I think the users have to be asserting a different type of interest because what Congress was safeguarding against was not the ability of TikTok to continue to operate or the users to post content.

It was focused only on foreign adversary control, and so the users would have to demonstrate that they have some unqualified First Amendment right to post on a platform that's controlled by a foreign adversary, which could use that access to then threaten our nation's security by gathering data on tens or hundreds of millions of Americans, and also use it for covert influence operations of whatever form, and I don't think there's a First Amendment right to do that.

Justice Kagan:

I was trying to think of whether there's a historical analog here, and this is what I came up with and you can tell me whether it's fallacious. In the mid-20th century, we were very concerned about the Soviet Union and what the Soviet Union was doing in this country, and the Communist Party of the United States at that time was integrally attached to the Communist International, which was essentially a Soviet operation. So if Congress had said, "Well, it's very nice. We can have the Communist Party USA, but it has to divest, it has to completely divorce itself from the Comintern and from any international ties that it has." Do you think that that would've been absolutely fine? And so if the answer is yes, it would've been fine, is it just like this case? Or if the answer is no, why is it not like this case?

General Prelogar:

So I guess I think I would need to know more information about how the international organization is able to exercise control over the American affiliate, and whether it had the capacity, for example, to gather data in an unqualified fashion from that affiliate in a way that was going to jeopardize our nation's security.

Justice Kagan:

I'm talking more about sort of the content. Let's put the data collection piece of this aside, which seems not very pertinent to my 1950s analog. We were very concerned about the kind of speech that the Communist Party was making in the United States, and it turns out that that content was pretty well scripted someplace else.

General Prelogar:

I think if it was specifically a concern about the content, then that would trigger heightened scrutiny under the First Amendment. We're not trying to run away from that principle here. Instead, we're making I think a narrower argument.

Justice Kagan:

Well, then I think you've just given your thing away because content manipulation is a content-based rationale. We think that this foreign government is going to manipulate content in a way that concerns us and may very well affect our national security interests. Well, that's exactly what they thought about Communist Party speech in the 1950s, which was being scripted in large part by international organizations or directly by the Soviet Union.

General Prelogar:

I disagree that the concern with covert content manipulation is itself content-based or that it looks anything like the kinds of laws this court has previously said are content-based. The court most recently in City of Austin said you only have a content-based law when Congress is setting out to discriminate against particular subject matters or particular viewpoints. So it's not enough that the law is regulating in the space that involves content in some way. You have to have this motive by Congress to actually want to suppress speech on certain topics or certain viewpoints. Here, Congress just wants to cut the PRC out of the equation altogether and all of the same speech could continue to happen on the platform. It's like patching up a backdoor vulnerability that the PRC has that we can't totally see around all the corners to imagine how it could use it against our interests.

But we know the PRC will do whatever it can to try, and I think that is different in kind from imputing to Congress some motive to specifically get more speech on certain topics or with certain viewpoints. This law was passed by broad bipartisan majorities in both houses of Congress, and our legislators don't always agree on everything. I think it's unlikely that all of them had exactly the same views about what's good content on TikTok or what are good viewpoints. They weren't united on that. What they were united around was the idea that it is a grave threat to our nation if the PRC can itself behind the scenes be controlling how this platform operates.

Justice Alito:

Why doesn't this act classify on the basis of speaker?

General Prelogar:

I do think that when it comes to the PRC and ByteDance, you could treat this as a speaker-based restriction.

Justice Alito:

And aren't speaker-based restrictions almost always viewpoint-based restrictions, content-based restrictions?

General Prelogar:

The Court has said it depends. It hasn't applied an inflexible rule that anytime you are regulating certain speakers, you are invariably regulating based on content. Instead, the Court has said it warrants closer consideration, and here if you look at the US entities, TikTok US and the users, none of them are being regulated in a way that suggests disagreement with their content. It's all about what our foreign adversary is doing with respect to the platform.

Justice Alito:

It's hard for me to think of situations, maybe they exist, where a classification based on speaker is not a viewpoint-based or content-based restriction. I mean, somebody says, "Joe can't talk anymore. We're going to shut Joe up and we don't know what he's going to say tomorrow or two weeks from now. We don't know what he's going to discuss, but whatever he says is bad because Joe is a bad person." I mean that's viewpoint and content-based, isn't it?

General Prelogar:

I think when it comes to a foreign adversary, it's not right to view it that way, and the reason for that, again, is this is a sophisticated adversary nation, and we can't just simplistically say, "Oh, what the PRC is going to want is to see more pro-China content on this app." As Chief Judge Srinivasan observed, there are various ways that the PRC could try to create some kind of false flag operation and actually promote anti-China content not to dictate how Americans should think about things, but simply to create some trumped-up justification for a military or economic action that the foreign adversary wants to take against us, and I don't think a concern with trying to ward off that capability-

Justice Gorsuch:

Why isn't that viewpoint or content still? We don't know what the content's going to be, but we know that Joe is bad?

General Prelogar:

Because I think the better classification is to recognize that what we're trying to prevent is not the specific subject matter, the specific viewpoints, but the technical capability of a foreign adversary nation to use a communications channel against us.

Justice Gorsuch:

I guess I'm just struggling how covert content manipulation isn't content-based restriction.

General Prelogar:

So again, it's because-

Justice Gorsuch:

It's kind of hard to avoid the word content-

General Prelogar:

I don't-

Justice Gorsuch:

... and it's kind of hard to avoid the word viewpoint here, isn't it?

General Prelogar:

I don't dispute that it's related to content, but I don't think it reflects Congress seeking to set out in advance what kind of speech we should have reflecting certain views on certain topics. Instead, it's about trying to close off a vulnerability that our foreign adversary nation could exploit, and I would be remiss if I didn't point out that even if you thought this was content-based, all that means is that we're in strict scrutiny and as the DC Circuit recognized here, we think that this law serves compelling national security concerns that sound in some of the same arguments I'm making here and that have a long-standing correspondence to history and tradition of trying to prevent foreign control.

Justice Gorsuch:

And then we get to the question whether there's less restrictive means. I get that, and whether disclosure might suffice. On the data security point, your friends on the other side make the argument that if that were the concern, Congress could ban TikTok US from sharing data with anyone on pain of penalties that would put people in prison and shut the company down in the future, as the government did for example with Arthur Andersen. Why isn't that a less restrictive means available?

General Prelogar:

So I was surprised to hear petitioner offer that up today because there was a long course of discussion between the Executive Branch and ByteDance and TikTok leading up to Congress's enactment of this act that spanned over four years, and extensive conversation about what limitations could be placed to protect Americans' data, and there was never a suggestion that there would be any way to create a true firewall that would prevent the US subsidiary from sharing data with the corporate parent. The reason for that sounds in the technological features of this application. I think there can be no reasonable dispute that the source code development and the maintenance of this algorithm rest in China, which is why China has sought to apply export control restrictions with respect to the algorithm.

What that means is you need substantial data flows between the companies in order to continue to modify that algorithm, refine it and so forth. So I don't think that was an option ever on the table, including with respect to the national security agreement, which was insufficient in addressing our data privacy and security concerns.

Justice Sotomayor:

That didn't come across enough in the briefs. If we are in the world of data protection as opposed to content control, I think it's hard to get around the post-divestiture provision that says you can't do business with them on the algorithm because that very much is content-based. It's a content-based restriction, but what you're saying is you can't do it for a data control reason, meaning that you can't really run their algorithm without sharing the very data that we are concerned about as a threat. Correct?

General Prelogar:

That's right, Justice Sotomayor, and you don't have to take my word for it. You can look at the specific terms of the national security agreement that ByteDance itself proposed. The relevant definition of the excepted data is at JA 239 to 240, and it references categories of information that would, of necessity, technological necessity and business necessity, have to flow back to China, and the relevant categories are in the sealed appendix, but I would really encourage the court to look this up because it's eye-opening. It is at the Court of Appeals sealed appendix 249 to 252 and 254.

If you look at that information, it was a wealth of data about Americans that was going to have to go back to China in order for the platform to just continue its basic operations, and there's a legitimate commercial justification for that, but it creates this gaping vulnerability in the system because once that data is in China, the PRC can demand that ByteDance turn it over and keep that assistance secret. And the one final point on this is that ByteDance was not a trusted partner here. It wasn't a company that the United States could simply expect to comply with any requirements in good faith, and there was actual factual evidence to show that even during a period of time when the company was representing that it had walled off the US data and it was protected, there was a well-publicized incident where ByteDance in China surveilled US journalists using their location data, the protected US data, in order to try to figure out who was leaking information from the company to those journalists.

Chief Justice Roberts:

General, you want us to look at that and you get to look at it, but your friends on the other side don't get to look at it. That doesn't seem fair.

General Prelogar:

That's the sealed appendix, Mr. Chief Justice. So it's their information, they can look at it. It's just under seal to protect their proprietary business information.

Chief Justice Roberts:

Okay.

Justice Barrett:

General, so I want to go back to the discussion about content discrimination and "we're going to shut Joe up." Here, it seems to me like we are saying to ByteDance, "We want to shut you up." And so let's say that I think that that is content discrimination based on speaker. Tell me, if I think that, whether I have to conclude that it is also speaker-based discrimination and content-based discrimination for TikTok?

General Prelogar:

No, it is not. And the reason for that is because it would be an anomalous principle to say that an entity outside the United States that can't assert its own First Amendment rights can somehow manufacture that right through the expediency of forming a US subsidiary, especially one that it wholly controls.

Justice Barrett:

So you don't have to stand on that argument that you were having with Justice Alito and Justice Gorsuch to still have your point about content discrimination?

General Prelogar:

That's right, and I think if you're focusing in on the relevant US entities here, TikTok US and the users themselves, this Act isn't regulating them in any way. It's not trying to dictate the algorithm that TikTok US can use. And in fact, Congress I think was doing everything it could to preserve access to TikTok in the United States in recognition that Americans enjoy expressing themselves and building community on the site.

Justice Barrett:

One last quick question.

Justice Alito:

Well I don't know, General...

Justice Barrett:

Sorry, just one last quick question.

Justice Alito:

No, no, no. Go ahead.

Justice Barrett:

Justice Gorsuch had asked your friends on the other side whether the new administration on January 20th could extend the deadline. What's your position on that?

General Prelogar:

So I think it tees up a statutory interpretation question of whether there can be an extension after the time period for divestiture has lapsed. I would think the court might start with its decision in the HollyFrontier case, which did recognize the ability to get an extension after a lapse like that.

Justice Barrett:

So it's your position that they could?

General Prelogar:

We have not run it to ground, in part because it's simply not presented here and I'm not prepared to take a position on that statutory interpretation question. I do want to emphasize though that my friends have pointed to January 19th or nine days from now as a moment when TikTok might go dark. At the outset, of course, Congress was hoping to prompt a divestiture, but I think the more important thing to focus on now is that even if that were to happen, Congress specifically anticipated it and provided authority to lift these restrictions as soon as there's a qualified divestiture.

And the reason for that is because foreign adversaries do not willingly give up their control over this mass communications channel in the United States, and I think Congress expected we might see something like a game of chicken. ByteDance saying, "We can't do it. China will never let us do it," but when push comes to shove and these restrictions take effect, I think it will fundamentally change the landscape with respect to what ByteDance is willing to consider, and it might be just the jolt that Congress expected the company would need to actually move forward with the divestiture process. So it's not irrevocable.

Justice Alito:

That's an interesting point and I hope Mr. Francisco or Mr. Fisher, whoever's delivering the rebuttal, will address it. So if we were to affirm and TikTok were forced to cease operations on January 19th, you say that there could be divestiture after that point and TikTok could again continue to operate?

General Prelogar:

That's exactly right. There's nothing permanent or irrevocable that happens on January 19th, and I think that Congress might've thought that we get in a situation here where a foreign adversary is doing whatever it can to just not comply. It's hoping the United States is going to blink first through our court system or through the Executive Branch getting cold feet about enforcing the law, but Congress set a deadline and I think it thought that deadline would have a forcing function.

Justice Alito:

Let me ask you a question about your effort to draw a distinction between ByteDance's speech and TikTok's speech. So suppose that the People's Republic of China funds a movie and there is an entity in the United States, a US corporation that thinks, "Wow, this is a great movie." And while the PRC would not have a First Amendment right to show it in the United States, would you say that the American company would not have a First Amendment right to do that because whatever expression there is in that movie, it's the PRC's expression, it's not their expression?

General Prelogar:

No. No, I wouldn't make that argument and I want to be really careful-

Justice Alito:

I thought that was the argument that was being made. No?

General Prelogar:

No. So our argument is that this is not a direct regulation of protected speech in the first place, or at most, it would warrant intermediate scrutiny because of the indirect effects that it might have on the American users or on the US subsidiary. We're not suggesting that if Congress sought to directly regulate and prohibit speech in the United States based on concerns about its content or viewpoint, that's somehow immune from First Amendment scrutiny just because it comes from a foreign source. Obviously, that kind of law is going to trigger strict scrutiny and I imagine it would be a different constitutional analysis because it's hard to imagine the same profound national security harms that would exist in that scenario as compared to what we have here.

Justice Alito:

Thank you.

Justice Jackson:

General, isn't the whole point of the divestiture requirement that the content on TikTok would be different if it was owned by a different company? I'm still struggling with your insistence that this is content neutral versus content based when we have that kind of circumstance.

General Prelogar:

The reason that I am continuing to try to hold the line on that is because there is nothing in the Act that would directly dictate any different mix of content on TikTok. The US subsidiary could use the same algorithm, show the same content, by the same users, in exactly the same order. It's not about trying to interfere with the US subsidiary's exercise of editorial judgment in any relevant sense. Instead, all Congress was doing was homing in on the problems of having a foreign adversary be able to interject itself and be able to harvest data or exercise....

Justice Jackson:

But your friends on the other side say that the motivation for doing that is because the foreign adversary might influence or change the content. So content matters, doesn't it?

General Prelogar:

Certainly, I think that content was relevant to Congress' concern about an adversary having control over the communications channel. I think not again because of any particular concern about viewpoints or subjects, but just that this would be-

Justice Jackson:

But isn't that relevance enough to trigger at least a heightened scrutiny from the standpoint of our legal tests?

General Prelogar:

I certainly understand that intuition and if the court thought that it were prudent to simply try to rule narrowly here and not dictate broader First Amendment principles, we have no problem with the court assuming that heightened scrutiny applies. We think the law easily satisfies it. We do think that intermediate scrutiny is a more appropriate framework for this kind of law that's not directly targeting protected speech. But in any event, there's a compelling national security interest here and the law isn't just narrowly tailored. It's precisely tailored. It's trying to fix the thing that's creating the problem, which is the PRC's involvement and the Chinese government's ability to exercise this control over the corporate entities.

Justice Kavanaugh:

How are we supposed to think about the two different rationales here and how they interact? The data collection rationale seems to me at least very strong. The covert content manipulation rationale, as the hypotheticals have illustrated, raises much more challenging questions for you about how far that goes, and about what happens if that stood alone, if you didn't have the data collection piece and only had the covert content manipulation piece. And then there's Mr. Fisher's point, and Mr. Francisco's, that Congress would not have enacted this just based on the data collection rationale alone. Just your understanding of how the two arguments fit together?

General Prelogar:

Sure. And let me walk through our defense of the data protection rationale and why we think it's a full justification for this law and the court could stop there, and then be responsive to their arguments that the interest in preventing covert manipulation somehow taints it. So just on data protection, I think that it should be beyond dispute that of course, our nation has an enormous interest in keeping the sensitive data out of the hands of our foreign adversary. And it should also be beyond dispute that our foreign adversary has an existing capability through its laws and through the way that these companies are integrated to get its hands on that data.

There is no question that Congress was sincerely motivated by that concern. There's a whole lead up to the statute here where the Executive Branch across two different presidential administrations was expressing concerns about the data problems. Congress was extensively briefed on those problems. It passed a companion data protection statute at the same time that was intended to prevent selling data to foreign adversary nations. The statute is shot through with protections that I think are key to this concern about closing off the vulnerability of access to the data. So that's a sincere justification for Congress's desire here to act. We think it's a compelling interest and it's narrowly tailored.

Then you get to the question of what to do about the fact that there's also this interest in covert content manipulation? And in the First Amendment context, this court in cases like Heffron has made clear that once you have a justification that satisfies the First Amendment, you don't need to go further and look at other justifications to decide whether they would independently satisfy First Amendment scrutiny. So I think it's not necessary for the court to go on and probe whether it thinks that covert content manipulation itself independently justifies the law.

Now, my friends say that's all fine and good, but they think covert content manipulation is just per se illegitimate. And I honestly don't understand how that argument could carry the day because just imagine if Congress passed a law that said, "The PRC can't covertly manipulate TikTok." Obviously, that law is not going to violate any constitutional principle. It's a laudable goal, I think, for our legislature to protect us from foreign adversary interference like that. And so there's nothing that's inherently impermissible about wanting to guard against that risk. Maybe you could say that it sweeps in too much protected speech in the way it's operationalized in the Act here, but there's certainly no fundamental taint or anything akin to racial discrimination to call into question whether Congress could seek to vindicate that as one of many interests.

So I guess to just bring it all together, what I would say to the court is they have basically acknowledged that data protection is a compelling interest, that was Congress' real interest. It provides a sufficient basis on its own to uphold this law. The court could say just that and affirm.

Justice Sotomayor:

I don't know how we do that unless we accept your argument that the post-divestiture provision that stops them from conferring on the algorithm is not a speech impediment, meaning it's very hard for me not to decide that question, that it is a speech impediment and one that on its face has to be analyzed separately from the data.

General Prelogar:

So Justice Sotomayor, let me begin by saying again that we do think that an interest in preventing any operational agreement between the US subsidiary and ByteDance, which is the relevant provision you're talking about, is justified by data protection alone, and that includes with respect to cooperation on a content recommendation algorithm, specifically because of the concern that it necessitates data flows between the companies. So I think that as a factual matter, that could justify Congress in enacting it. But to the extent that you think that the prohibition on coordinating with respect to an algorithm actually reflects some kind of impermissible content-based problem with the statute, the statute has a severability clause, and I certainly don't think that it would give the court a basis to invalidate this law or to stop it from operating with respect to all of the provisions that operate to protect data security. At most, it would suggest that that little piece of the law has to be severed on its own from the rest of how the statute operates.

Justice Sotomayor:

How does that affect whether we would apply... Because assuming it's data protection, then I would think that strict scrutiny wouldn't necessarily apply. I could understand applying intermediate scrutiny, but how do we do that with respect to this part, the algorithm issue? How do we get to intermediate scrutiny with respect to that?

General Prelogar:

The way you get to intermediate scrutiny there is to recognize that prohibiting foreign adversary control over the operations of the platform, including with respect to the fundamental backbone of the system, is not based on any protected speech and is not content-based in the relevant sense. I've been thinking of it as akin to something like a piece of software you might have on your phone that would allow the Chinese government to listen in on every American conversation. If Congress wanted to enact a law that patched up that vulnerability and said, "You can't use that piece of software or you can't coordinate with Chinese companies with respect to it," clearly, we would recognize that closing off that capability of China is a laudable and, in fact, compelling government interest. And I think when it comes to the risks that foreign adversary control poses here, it's similar in kind. It's simply trying to prevent access by the Chinese government to the TikTok system writ large, and that includes through the use of the algorithm.

Justice Sotomayor:

Thank you.

Justice Kavanaugh:

Could the president say that we're not going to enforce this law?

General Prelogar:

I think as a general matter, of course, the president has enforcement discretion.

Justice Kavanaugh:

And would that be binding, in other words, protect the regulated community so they could rely on that under due process principles going forward?

General Prelogar:

That raises a tricky question. So I think there would be a strong-

Justice Kavanaugh:

Then it's not going to be adequate. Right?

General Prelogar:

Well, I think there is a strong due process argument that the third party service providers could invoke if there were enforcement action based on a period of time when the president said the law wouldn't be enforced. The kind of canonical case-

Justice Kavanaugh:

They're not going to take that risk unless they have the assurance that a presidential statement of non-enforcement is in fact, something that can be fully relied on because the risk is too severe otherwise. Right?

General Prelogar:

I think that they might judge that based on this court's precedent in the due process space and principles of entrapment by estoppel, maybe they have a sufficient safeguard here to allow them to continue to operate. I would think even before a non-enforcement policy were announced, of course, the President-elect would want to review all of the updated national security information that has come in over the last four years that undergird Congress' judgment here. But the final thing I would say is that even if you think the third party providers are simply going to choose not to continue to provide these services because it's too much of a risk to take on, again, that's not anything permanent or irrevocable and that might be just what the PRC and ByteDance need to start taking seriously some of the public reporting about interest in acquiring the company.

Justice Alito:

At one point, Mr. Francisco suggested that what we might want to do, and what he would regard as certainly preferable to a decision affirming on the merits, is to issue an injunction pending, I guess, consideration of what we now regard as the cert petition that was filed here. What do you think of that suggestion?

General Prelogar:

So I think this court doesn't have any basis to enter a temporary injunction unless it thinks petitioners are likely to succeed on the merits of the First Amendment claim. And to be honest, I think that there is no argument to be made that you should find likely success. This is an act of Congress. This isn't some unilateral action by the Executive Branch; it actually was action in parallel between the Executive and Congress, where Congress took action to close up a loophole in some of our laws. The Executive had tried to force divestiture of TikTok under the Trump administration, but that had gotten tied up in litigation about those authorities. So Congress came in and provided additional authority based on a substantial record, including with respect to the data harm. And I don't see any basis for this court to displace the deadline that Congress set without finding that actually, there is a potential First Amendment problem here.

Justice Alito:

Do you think we have the authority to issue an administrative stay as we have done in other cases, or do you think that the January 20 deadline prohibits us from doing that?

General Prelogar:

I don't think this court lacks a formal basis to issue an administrative stay if it believed that that was necessary to assist in the court's own consideration of the case. And I would obviously defer to the court on whether it has sufficient time to resolve the case, but we are here ready to submit the case today, and I think it is in the interest of Congress' work and our national security to resolve the case and allow the statute to take effect.

Justice Alito:

Can I just test to see whether your recollection of what Mr. Francisco said about a warning is consistent with mine? I did not hear him say (he can address this in rebuttal) that it would be acceptable to his client if Congress had said there has to be a stark warning on every TikTok, such as a warning that Communist China is using TikTok to manipulate your thinking and to gather potential blackmail material. Did you hear him say that that would be okay?

General Prelogar:

I don't think he's made that concession, but even if he had, I don't think that would address the government's national security concerns. And one of the points here is that it's not just data privacy. So even if you could somehow put users on notice that the PRC could obtain their data and they choose to disregard that, it's not a data privacy interest. It's a national security interest. There's a distinct sovereign harm to the United States if our foreign adversary can collect this massive data set about 170 million Americans. And as Justice Kavanaugh touched on, there are a lot of teenagers using TikTok today who might ignore a warning like that and not really care, but they're going to grow up and they might become members of our military, they might become senior government officials. And for the Chinese government to have this vast trove of incredibly sensitive data about them, I think obviously exposes our nation as a whole to a risk of espionage and blackmail.

Justice Alito:

Thank you.

General Prelogar:

I did want to touch briefly on the questions about history and tradition here because my friends have said several times that the Communications Act of 1934, which we think is roughly analogous to the type of restriction that Congress was seeking to enact here, is justified entirely by concerns about scarcity, how you can't have sufficient bandwidth. And I of course, recognize that scarcity is what created the need for a licensing regime in the first place, but I think it's important to clarify the historical record here that in choosing to limit foreign control of radio stations, of broadcast stations, Congress specifically cited a concern about national security. That is written into the statute. National defense was one of the listed purposes of having that kind of restriction. And so I don't think my friends can succeed in being dismissive of that concern about history and tradition and what it shows about the national security judgments that undergird this law.

The one other factual point I wanted to make to be responsive to a few points that my friends have touched on relates to whether TikTok US has the ability to alter this algorithm, whether divestiture is feasible, how ByteDance has manipulated the platform in the past. With respect to the algorithm, I think we're simply talking past each other. We don't dispute that TikTok US might engage in some functions in the United States to customize the algorithm for a US audience. The thing we're worried about is happening long before that, over in China where ByteDance is developing the source code, creating the basic backbone and functioning of the system, and is then blasting out the algorithm for use by the various subsidiaries in their home country. So we're not seeking to regulate any activity that TikTok US is engaged in here. Instead, what Congress is doing is trying to close off the vulnerability of PRC access abroad.

With respect to the feasibility of divestiture, my friends have said it would've been impossible to do this within 270 days. At the outset, obviously, there's no inherent impediment to divesting a social media company. We just saw Elon Musk buy X or Twitter in about six months from offer to completion. And even with respect to this particular company, I think my friends are not well-positioned to complain about the timeline because they've been on notice since 2020 that unless they could satisfy the federal government's national security concerns, divestiture might be required. But in any event, I don't think that the court should fault Congress for trying to balance competing interests here in making sure that there was a period for compliance and trying to preserve access to the platform for Americans, while taking steps to safeguard against the risk to national security.

Finally, with respect to the question of whether ByteDance has taken action on the PRC's demands, there is evidence in the record that Congress consulted to demonstrate that outside of China, ByteDance has taken action to misappropriate data at the PRC's request; that included efforts to track dissidents and protesters in Hong Kong, and to track Uyghurs in China itself. We know that ByteDance has misappropriated US data with respect to the surveillance of US journalists, and there was evidence in the record reinforcing the conclusion that ByteDance has been asked by the PRC to undertake efforts to censor content and manipulate the platform at the behest of the Chinese government. So I don't think there is a factual basis to dispute the record that Congress had before it. If the court has no further questions.

Chief Justice Roberts:

Justice Thomas?

Justice Sotomayor:

I have a question. General, if I understood correctly, during the President-elect's first term, he issued an executive order requiring divestiture, correct?

General Prelogar:

That's right.

Justice Sotomayor:

And that was challenged in court and stayed as a result of him exceeding his executive power to do that, but this bill followed a bipartisan investigation, correct?

General Prelogar:

Yes, that's right.

Justice Sotomayor:

I am a little concerned by the suggestion that a President-elect or anyone else should not enforce the law when a law is in effect and has prohibited a certain action, and that a company would choose to ignore enforcement on any assurance other than a change in that law. But putting that aside, on the 19th, if it doesn't shut down, there is a violation of law, correct?

General Prelogar:

Yes.

Justice Sotomayor:

And whatever the new president does, doesn't change that reality for these companies.

General Prelogar:

That's right.

Justice Sotomayor:

How long is the statute of limitations in effect, assuming that they violated it that day and later continued to violate it? But how long does the statute of limitations exist for a civil violation of this sort?

General Prelogar:

It would be a five-year statute of limitations.

Justice Sotomayor:

Thank you.

Chief Justice Roberts:

Thank you, counsel. A rebuttal?

Mr. Francisco:

Thank you, Mr. Chief Justice. Four points, all of which go to why we think this law would fail whether you applied intermediate scrutiny or strict scrutiny. I'd like to begin with the least restrictive alternative, simply prohibiting TikTok Incorporated from disseminating any of the sensitive user data to anyone including ByteDance under the threat of massive penalties. That is definitely a less restrictive alternative. Now, my friend pointed to the NSA negotiations. Well, the sensitive user data that we're talking about and that were of concern in the NSA negotiations were not the type of technical data that she's talking about. The NSA did allow certain types of non-sensitive technical data to go back and forth, but that wasn't anybody's concern. And as we say on page 23 of our brief, they simply cut off the negotiations without ever raising those concerns. But to be clear, if that's a concern, sweep that into the ban too. Put that non-sensitive technical data into the ban too. We'll deal with that. It's a lot better than simply being forced to shut down. So that is most definitely a less restrictive alternative that would address data security.

We talked about the under-inclusiveness with respect to Temu and Shein, the two large e-commerce sites. Justice Kagan, you might've seen Temu during the Super Bowl. It was heavily advertised. It's one of the most popular e-commerce applications in the United States. It's got 70 million users. Justice Sotomayor, you were asking what they collect. This is from Joint Appendix 339 to 343, the US-China Economic and Security Review Commission report: Shein relies on tracking and analyzing user data, draws on customer data and search history with the assistance of artificial intelligence algorithms, and it requests that users share their data and activity from other apps, including social media. So they apparently go into your social media apps and suck up all of the information. Because they're e-commerce apps, they take names, addresses, and credit card information. If you look at the privacy policies on their website, they collect location data. It looks like they might even collect, at some level, GPS location data. So they collect massive amounts of data.

Point three, their mere covertness argument makes no sense for the reasons that the court explored. If mere covertness were the issue, a disclosure would make perfect sense. Yet they're not concerned about mere covertness. They're concerned, as my friend suggested, with getting Americans to argue with each other. Well, as far as I can tell, that's what news organizations do in this country every single day. That's what we call editorial content. That's what we call content itself, and so it's trained directly on the content. But even if you thought somehow that the mere covertness were the issue, that definitely could be addressed through a risk disclosure. So the data sharing ban, the risk disclosure, those are obvious, less restrictive alternatives, and had the government considered them and rejected them, we would be in a different position.

But if you look at this record, those are two less restrictive alternatives that the government did not address at all. Whether you apply strict scrutiny or intermediate scrutiny, that is fatal because under both standards, restricting speech has to be the last resort, not the first one. And when you fail to consider less restrictive alternatives, you fail under either standard.

My final substantive point is we absolutely think this court has the authority to enter an administrative stay. I didn't understand my friend to disagree with that. We think that given the enormity of this decision, given the complexity of this case, it would make perfect sense for this court to enter an administrative stay. I also think you could enter a preliminary injunction. Yes, likelihood of success is one standard, but you don't have to determine ultimate success. And as you do in other related contexts, like with respect to stays, you often make clear that you are not addressing the merits of the case. I think you could do that here.

The bottom line, Your Honor, is this case ultimately boils down to speech. What we're talking about is ideas, and my friends on the other side, when you cut through everything else, are ultimately worried that the ideas that appear on the TikTok platform could, in the future, somehow manipulate Americans, could somehow persuade them, could somehow get them to think something that they ought not be thinking. Well, that whole notion is at war with the First Amendment. If the First Amendment means anything, it means that the government cannot restrict speech in order to protect us from speech. That's precisely what this law does from beginning to end, whether you look at its text, whether you look at the government's justifications in its brief, where they talk about being worried about speech criticizing our leaders or undermining democracy; it's what you see in the House Report, which trains specifically on the dangers of misinformation, disinformation and propaganda; and it's what you see in this legislative record writ large, which is saturated with objections to TikTok's existing content. We ask that you reverse the court below. Thank you.

Chief Justice Roberts:

Thank you, counsel. The case is submitted.
