Transcript: TikTok Inc. v. Merrick Garland Oral Arguments in the US Court of Appeals for the DC Circuit
Gabby Miller / Sep 17, 2024

On September 16, 2024, the US Court of Appeals for the District of Columbia Circuit heard two hours of oral arguments in TikTok Inc. v. Merrick Garland. TikTok and a group of creators filed the initial complaint challenging the constitutionality of a law that would force TikTok’s China-based parent company, ByteDance, to sell the company or face a ban from app stores in the US. President Joe Biden signed the divestiture law, or the Protecting Americans From Foreign Adversary Controlled Applications Act (HR 7521), in April 2024, and absent an injunction or stay from the court, it will go into effect on January 19 – just one day before the new president is inaugurated.
The panel of judges hearing the case
- Chief Judge Sri Srinivasan (appointed by President Barack Obama)
- Judge Neomi Rao (appointed by President Donald Trump)
- Judge Douglas H. Ginsburg (appointed by President Ronald Reagan)
Attorneys arguing before the court
- Andrew J. Pincus, on behalf of TikTok
- Jeffrey L. Fisher, on behalf of TikTok creators
- Daniel Tenney, on behalf of the US government
Key arguments and case law
TikTok’s initial complaint argues that a “qualified divestiture” is neither commercially, legally, nor technically feasible, especially within the 270-day compelled timeline. It also accuses the US government of circumventing the First Amendment by invoking national security. During Monday’s oral argument, the plaintiffs’ counsel emphasized this argument, asserting that the law should be subject to strict scrutiny, a standard that the government cannot meet, since it is based on “the possibility of future Chinese control,” rather than any current threat. The plaintiffs further argued that even if TikTok were owned by a Chinese company, and its content moderation decisions were being made abroad, the law would still suppress US users’ right to free expression.
In contrast, the US government focused its argument on a data security rationale, emphasizing the sensitive data TikTok collects from its American users. The government claimed that the data could be “extremely valuable” to a foreign adversary seeking to influence Americans’ views or exploit them as an intelligence asset, and asserted that the law has “nothing to do with protected speech by American citizens.”
During the oral arguments, the judges and lawyers referenced a wide range of case law, including significant discussion of the US Supreme Court’s recent ruling in the NetChoice cases. The full list of referenced rulings includes:
- NetChoice opinion
- Murthy v. Missouri
- Sorrell v. IMS Health, Inc.
- News America Publishing, Inc. v. FCC
- Whitney v. California
- Holder v. Humanitarian Law Project
- Lamont v. Postmaster General
- Arkansas Writers’ Project, Inc. v. Ragland
- Minneapolis Star & Tribune Co. v. Minnesota Commissioner of Revenue
Below is a lightly edited transcript of the oral arguments. Please refer to the official audio when quoting.
Judge Srinivasan:
Good morning, counsel. Mr. Pincus, please proceed when you're ready.
Andrew Pincus:
Good morning, your honor. Thank you. And may it please the court. The law before this court is unprecedented, and its effect would be staggering. For the first time in history, Congress has targeted a specific U.S. speaker, banning its speech and the speech of 170 million Americans. The law is subject to strict scrutiny, and the government bears the burden of proving its constitutionality. Its arguments fail as a matter of law for two fundamental reasons.
First, the government's asserted interest in addressing what it calls content manipulation is facially illegitimate. Speech regulations cannot be justified on content or viewpoint grounds. And the gross underinclusiveness of the data security interest fatally undermines it as a standalone justification. Second, this law is just like the one invalidated by this court in News America. No compelling reason justifies Congress acting like an enforcement agency and specifically targeting petitioners. Congress excluded petitioners, and only petitioners, from the more protective general standard that the executive branch must apply to every other speaker it seeks to classify as a national security risk. This law imposes an extraordinary speech prohibition based on indeterminate future risks. Notwithstanding the obvious less restrictive alternatives, the government has not come anywhere near satisfying strict scrutiny.
Judge Srinivasan:
So one matter in the case that you didn't mention as far as I could detect in that, is the fact that the initial operative incidence of the law is predicated on the idea that the curation is occurring abroad. So it's a foreign entity abroad who's engaging in the curation that's causing the content manipulation that you highlighted.
How does that factor into the analysis from your perspective? Because I think the government's point is sure, you've got your point about content moderation, content manipulation, and that comes up in a lot of cases including NetChoice in the Supreme Court. But this case is different because it involves something that's happening abroad. And what we're worried about is the effect of something that happens abroad and when it's a foreign organization, they don't have a First Amendment right to object to a regulation of their curation.
Andrew Pincus:
Well, a couple of answers to that, your honor, if I can walk through them. I think the first step is that TikTok Inc. is a U.S. entity that engages in speech. It curates third-party content, just as in NetChoice, and it engages in its own speech. So the speech here that's being banned, we would say, or at the minimum burdened, is the speech of the U.S. speaker. I think the government tries to argue that because TikTok Inc. ultimately has a foreign owner, that that somehow affects whether TikTok Inc., the U.S. entity, has First Amendment rights. That can't possibly-
Judge Srinivasan:
I don't think they're arguing that. I think what they're saying is TikTok Inc. may well have First Amendment rights and does, but TikTok and TikTok Inc. can continue to curate to its heart's content. But what it can't do is do that while it's owned by China, because we're worried about what China does vis-a-vis TikTok Inc.
Andrew Pincus:
Okay. I want to get to the bottom line of your honor's question, but I want to correct the premise. It's not owned by China.
Judge Srinivasan:
Got it.
Andrew Pincus:
The owner of TikTok Inc. is ByteDance Limited. It's a Cayman Islands holding company that's owned by-
Judge Srinivasan:
Subject to Chinese control. They're subject to Chinese control.
Andrew Pincus:
No, they argue that, but I think the critical issue here is what they're saying is because there is the possibility of future Chinese control, right? They don't claim anything has happened yet. They claim there's future Chinese control, and therefore we can burden in a very significant way the speech of a U.S. entity and its users, which by definition is fully protected speech. So in order to say that they can do that, essentially take away the speaker rights of a U.S. entity, they surely have to meet strict scrutiny. So even if the court concludes that the foreign government manipulation is not subject to the First Amendment, I think there are questions about that. No court has held that. The most that courts in the United States have ever done before is to say that foreign speech has to be labeled, but I don't think the court has to-
Judge Srinivasan:
Isn't that what the Supreme Court said in USAID?
Andrew Pincus:
I don't think so, your honor. I think what USAID said is that speech outside the United States by non-U.S. entities is not protected. So there were two elements there, and they aren't here. We're talking about speech in the United States. But as I say, even if the-
Judge Srinivasan:
I'm just saying if the curation that's being worried about is the assertion of control by the Chinese government in China, then that fits within the USAID.
Andrew Pincus:
I don't think so, your honor, because it's still speech coming to the United States. I think it raises a complicated question that the courts really haven't addressed. But as I say, I don't think the court has to go that far, although we advocate that. What's clear here is that the government is saying, because of this indeterminate future risk, we can impose burdens on speech that, there's no dispute, is 100% protected by the First Amendment. That requires strict scrutiny.
Judge Rao:
Mr. Pincus, what is the best evidence that TikTok U.S. or TikTok Inc. is engaged in its own expressive activity, expressive activity that's not controlled by ByteDance?
Andrew Pincus:
Well, A) it's a U.S. entity. B) we have in the right-
Judge Rao:
It is a U.S. entity.
Andrew Pincus:
It is a U.S. entity, and Open Society essentially establishes a presumption that we respect the corporate form. The government is not arguing that there's a sham here. So I think that's the starting point, but I think the record also makes clear that the curation occurs in the United States. The record is clear.
Judge Rao:
The Presser Declaration.
Andrew Pincus:
Sorry?
Judge Rao:
Presser Declaration?
Andrew Pincus:
The Presser Declaration, exactly, your honor, at pages 799 to 800 and 812, makes clear that content moderation occurs in the United States, and there's no dispute that TikTok Inc.'s own posts in the United States are its own speech. And of course there's a lot of U.S. user speech that is on the platform. So I think all of those things together combine to say there's no showing here and no argument here that there's some kind of a sham with respect to the content moderation that's occurring now. The government's argument is that something might happen in the future.
Judge Rao:
Do you think there's not a sham? I mean, what about our case from 1988, Palestine Information Office? Are you familiar with that case?
Andrew Pincus:
I don't think I am, your honor, unfortunately. I apologize.
Judge Rao:
The government doesn't cite it, which I was a bit surprised about, but it's a case from 1988, Judge Mikva joined by Judges Starr and Silberman. And there, I think, this circuit essentially said that the Palestine Information Office, which was an entity in the United States, could be shut down by the State Department in part because of its affiliation as a foreign mission of the PLO, which is a designated terrorist organization. And our circuit seemed to suggest very strongly that the control or the relationship itself was part of the strong justification for what the government did.
Andrew Pincus:
Well-
Judge Rao:
If you're not familiar with that case, then-
Andrew Pincus:
I'm not familiar with that case, your Honor. Happy to address it as a supplemental brief. But I think even from your honor's recitation of the facts, the government isn't arguing here that there's control now either by China or by ByteDance Limited. I do think just stepping back from China and talking about your honor's question about whether foreign ownership by itself-
Judge Rao:
Justice Barrett suggested that this might make a difference in her separate opinion in NetChoice.
Andrew Pincus:
She did, your honor but-
Judge Rao:
One justice said.
Andrew Pincus:
It would really have quite far-reaching ramifications. As we talk about in our brief, there are lots of U.S. speakers, Politico, Business Insider. We talk about Reuters. We talk about a lot of them in our brief that are owned by foreign entities.
Judge Rao:
But not foreign adversaries.
Andrew Pincus:
I don't think that affects the First Amendment question that might affect scrutiny. So I think it's very important in this case to take things in stages. Are there protected rights that are burdened? Does that burden strict scrutiny? And then we can look at the justifications and see if they hold up. But I think at the first stage, are there First Amendment rights that are being burdened? Even the government doesn't argue that TikTok Inc. has no First Amendment rights.
They make this brief argument about foreign ownership making a difference. But as I say, mere foreign ownership can't possibly be a justification, because it would turn the First Amendment on its head. We have lots of publications that are owned by foreign entities. And to say that your foreign ownership casts your First Amendment rights into doubt, or that in a defamation case or a government regulation case it's open to explore the interactions between the foreign owner and the U.S. speaker to see precisely what speech is controlled and what's not, would really fundamentally change First Amendment analysis on a lot of issues. And it's what Open Society rejected. Open Society basically said, "We're going to presume that there is corporate separateness." It may be that in that case there was a showing of control that was satisfied, but here there's no showing, certainly, that ByteDance Limited, which as I say-
Judge Rao:
Well, it's interesting because in that case, the court defers to the fact that the government thought there was foreign control. But in any event, even if we assume that TikTok U.S. has First Amendment rights as a U.S. corporation, why wouldn't we apply intermediate scrutiny? Because the act itself arguably regulates conduct, which is foreign ownership, and also incidentally burdens the expressive activity. I know you won't accept that characterization, but why is that wrong?
Andrew Pincus:
I really don't, your honor, for a couple of reasons. I think what was clearly targeted here was the TikTok platform, a speaker, and I think when an individual speaker is targeted by a law, courts have said that that's enough to trigger strict scrutiny. I mean, News America stands for that proposition. Once the court concluded in News America that the law targeted only News America and wasn't going to apply to anyone else, it went right to heightened scrutiny.
Now, in that case, the heightened scrutiny was intermediate because it was the broadcast context. But I think the analogy fits here, and Citizens United gives the reason for that distinction. What it says is there's such a risk of content- and viewpoint-based discrimination when you target a specific speaker that we have to apply strict scrutiny. The second reason is that the justification here is viewpoint, and that by itself, Reed says, triggers strict scrutiny. But I want to go back to just one more answer to your prior question, which is you said in the Palestinian case, there had been a determination.
We don't really know what was determined here because this was Congress enacting a statute that has no findings. It doesn't say why Congress did what it did. It targeted TikTok. And the record here indicates that although the government says future covert manipulation by China is the risk, is the justification, the record is certainly not that clear, and-
Judge Rao:
We've never held that Congress is required to enact findings in order ... I mean, in some sense the finding is the fact that they passed a law under Article I, Section 7, designating TikTok for this treatment. I mean, there's no requirement that Congress needs to put its findings in a statute.
Andrew Pincus:
Totally agree, your honor. There's no general requirement, but I think-
Judge Rao:
Not the APA.
Andrew Pincus:
As the court says, it is not the APA. But that's part of the problem in this statute. Just to ... I guess two answers. The problem here is the statute doesn't say, and it's not a statute that sets up a general rule where you can reason from the general rule. Usually what happens in the enforcement context is Congress sets up a general rule. Facial challenges are hard because the general rule has some principles, and the applications occur in enforcement actions by the executive branch where there are specific findings and a specific basis to judge what happened. That didn't happen here. And the record, to the extent we want to go beyond the words of the statute, which as you say just say TikTok, it doesn't say why. Looking at the record, the record is suffused with much broader content justifications than the one the government…
Judge Srinivasan:
Can I ask this question? On the content manipulation rationale, there's another rationale too.
Andrew Pincus:
Yes.
Judge Srinivasan:
There's the data security one, and let's just assume for present purposes, I know you resist the assumption, and I can understand why. But just for hypothetical purposes, let's just assume the law was only undergirded by the data security rationale. So the content manipulation, the concerns about content manipulation just drop out. If the law was only undergirded by the data security rationale, do you think strict scrutiny applies?
Andrew Pincus:
I think it still does, because it's targeted at a specific speaker. If a law singled out the New York Times Company and said, "You're going to have to meet special workplace safety rules or special overtime rules," no one would say, "Oh, we are not going to apply strict scrutiny because there's a justification for workplace safety." There would be a significant argument about whether that ...
Judge Srinivasan:
And the specific speaker is TikTok U.S. But I think what the government would say about that, and we'll hear from the government, but I think what the government would say is, "We're not targeting TikTok U.S. qua TikTok U.S. We only care about TikTok U.S. to the extent that it's subject to Chinese control. And so TikTok U.S. can continue to be TikTok U.S. full bore as long as it's no longer subject to Chinese control. And the way that that happens is to have a divestiture. But apart from that, TikTok U.S. is totally fine with us."
Andrew Pincus:
But I think, your honor, the problem is, I mean, I guess we have two answers to that. One is divestiture is infeasible here for the reasons that Professor Milch indicates. And so this isn't just about divestiture. It's really about a ban. But even if, in some theoretical world, divestiture would be possible, there's still a burden on TikTok U.S. There are costs. Different speech is required. The statute says you can't bring in the foreign user content that is a critical part of TikTok. Your speech would have to be different. You can't use this recommendation engine. That would make your content moderation different. And the costs of divestiture by themselves are a burden on TikTok Inc., just the kind of burden that Minneapolis Star and other Supreme Court cases have said triggers scrutiny when you single out a speaker.
Judge Srinivasan:
So can I ask you this question then? If under your rationale, suppose the United States is at war with a country, and then there's a question about whether that foreign country can own a major media source in the U.S. while the war is going on. Is your submission that Congress can't bar the enemy's ownership of a major media source in the U.S.?
Andrew Pincus:
I think we would still be in the world of strict scrutiny. Maybe that would be a sufficient justification, but I think we would still have to look at those rationales and decide that they were sufficient. And when you're at war, probably they would be. But there are a couple of things to say here. That's certainly not the rationale that they're giving. And just to finish up my answer, I think in the divestiture context, we still have a burden on the U.S. speaker's rights. I just want to return to your data privacy question, because I also want to say it would be impermissible for the under-inclusiveness reason that we cite in our brief. The statute ... if you look at the broad statute, I mean, its singling out of TikTok is pretty under-inclusive by itself. But even if you want to consider the broader statute, there's an exclusion. It only applies to sites that host user content.
There's the business review exclusion that we claim ... We have a little dispute with the government about what that means. The plain language seems to say, if you're a company that has a business review app, then you're entirely out of the statute. But even if you read it the other way, that's certainly a shocking content-based distinction that undermines the data privacy interest. And even if you just read it as saying that having a business review app by itself is excluded, those exclusions exclude e-commerce sites. And as we say in our brief, there are very significant e-commerce sites based in China and other places that collect much more data than TikTok does. Very sensitive data. The record refers to one of them that was cited by a U.S. commission as a possible danger, and they're categorically excluded.
Judge Srinivasan:
So I just want to understand the implications of your position. And we're not at war, I'm not suggesting that we are, but just to understand how the way you view the case would play out. If we were at war, if the United States was at war with, say, China, and what the law did was to bar Chinese ownership of, say, ABC, because China wants to buy ABC, and Congress and the national security establishment are worried about the repercussions of that. And so it says ABC can continue to be ABC to its heart's content. The one thing it can't do is be subject to Chinese control at a time when we're at war with China, because we're worried that if China's in control of it, then it could engage in content manipulation of a type that's going to be problematic vis-à-vis U.S. interests. Your view is that strict scrutiny would apply to that and the government would have to-
Andrew Pincus:
Well, let me say your example is ABC. That's the television network and its licenses. That's a slightly different world. But let's just assume it's someone who's not being regulated because of their broadcast.
Judge Srinivasan:
Got it. Okay. Right, right. Take the broadcast, take the point, but a major media source. And then-
Andrew Pincus:
I think strict scrutiny would be the question.
Judge Srinivasan:
Under strict scrutiny, one of the points that you've made repeatedly and understandably, is that the concern here appears to be directed at something that could happen in the future, not something that's necessarily happening now. So let's just say that that's true in the war context too. There's no particular reason to know what's happening right now. It's a concern about the future. Would that necessarily mean that strict scrutiny's unsatisfied and so therefore ...
Andrew Pincus:
I think it would depend on the facts. It might depend on the facts regarding the ownership. I think there are a couple of questions embodied in your question. One is whether this is a sufficiently compelling interest. Even if it is, there's a less restrictive means question. And I think a critical point that we make, and that's true throughout the law, is that the government's solution to foreign propaganda in every other context has been disclosure. It has not been a ban. The Meese case talks about that and, in Footnote 15, has a very fulsome explanation of why: our view in America is that if speech is made clear, then Americans can decide. And the answer ...
Judge Srinivasan:
I mean, the disclosure idea might depend on the voluntary cooperation of the very control that you're worried about.
Andrew Pincus:
Well, I don't think so, your honor because I mean, I don't know what the disclosure would be. Obviously the government would have to show the predicate of the risk of Chinese control.
Judge Srinivasan:
And you'd have to know that the manipulation's going on, right? Because to ...
Andrew Pincus:
Well, the government, I mean, in a hypothetical world, maybe. I guess this is how I think the process would proceed. The first question would be, is this a sufficiently compelling interest? And there really isn't a precedent for a court finding that government regulation of protected speech, which is what's going on here because we still have a lot of protected speech, is a compelling interest in a general sense, sort of ever. So that would be a pretty shocking and big step. But assuming you could get over that step, then I think the next question would be, is the risk of control, since we're talking about the future, imminent? And I don't think imminent here ... that's the word that Holder and the Pentagon Papers case use ... necessarily means imminent in terms of happening tomorrow. I think it's imminent in the sense that the government is arguing here, which is that the Chinese government could do this at will. That's the risk.
China could do this at will, so we have to act now. And I think that has to be shown as a factual matter. And I think that would require, just working through this case, the court to delve into the factual record and see if that's true, given the protections that are in place and the less restrictive means protections that we argue could be in place. I mean, we don't want our content ... TikTok Inc. and ByteDance do not want the content to be controlled. That's why they've not only negotiated the national security agreement, it's why they've implemented it voluntarily, including a number of provisions that go to this issue. So the government would have to show imminence in terms of China being able to do it at will, notwithstanding the possibility of technological protection. And then the question would be disclosure.
And I think the government, if it met those tests, would then have to come forward with a possible disclosure. Zauderer sets the test. There's a compelled speech element, but that's what the government has done in other contexts where there's disclosure: movies, television, printed material. And I think NetChoice points the way, that for the government to say that disclosure won't work here requires some quite dramatic showing, because the court's general approach has been to say, "We treat these media the same." Now, maybe in some hypothetical world, the government could show that, but certainly there's nothing in this record. And I think the critical thing, again, we're looking at a statute-
Judge Rao:
Is your argument though then that if the government is concerned about covert influence by a foreign adversary, it has to ... disclosure is always the least restrictive means? I mean, that would seem to me quite a remarkable determination to make-
Andrew Pincus:
It certainly.
Judge Rao:
... for this court under Holder and other precedents.
Andrew Pincus:
Well, it certainly would have to show a reason why it isn't the least restrictive means. And remember, the actor here is Congress. And Congress didn't even, as far as we know, consider disclosure. And Sable and Playboy Enterprises say Congress has to consider less restrictive means. And so for that reason alone, there's a problem; this, of course, is invalid. Whether the government could, in a hypothetical case where there was actually a reasoned decision record, say, "We've looked at disclosure. Here are the possibilities. We've actually concluded it doesn't work."
Judge Rao:
I'm not sure. If the concern is data protection and covert influence by a foreign adversary, how would disclosure be ... I mean, under your view that the only reason for this law is propaganda, maybe disclosure addresses that. But the act is not even aimed at expressive activity directly. It's aimed at foreign ownership of a U.S. corporation.
Andrew Pincus:
Well, just to take your last point. First, as I said, your honor, we think it is aimed at expressive activity. The justification relates to expressive activity. It singles out a specific speaker, TikTok. And in the whole debate, the Justice Department went up to Congress and said-
Judge Rao:
At a minimum though, this act regulates both. Okay, say it regulates expressive activity because it directly targets TikTok U.S., but it is also regulating foreign ownership, which is a separate non-expressive interest of the government. So with those combinations, why shouldn't we apply the O'Brien framework, for instance?
Andrew Pincus:
Because this is not a statute that's generally regulating foreign ownership. The provision we're talking about regulates foreign ownership only of a particular speaker. O'Brien was upheld because the law there regulated a whole range of activity, the court said, and only incidentally might fall on speech. Here, the regulation falls directly, and the burden falls directly, on a speaker. Whether you think of it as divestiture or as a ban, the burden falls on the singled-out speaker. And that distinguishes this from O'Brien and Arcara and all those other cases.
Judge Srinivasan:
Do you think you lose under O'Brien?
Andrew Pincus:
Excuse me?
Judge Srinivasan:
I mean, you're resisting O'Brien because, of course, you'd rather have strict scrutiny, but would you have an argument under O'Brien that the law's-
Andrew Pincus:
I think we still have arguments under O'Brien. I think the same arguments we're making I think would be strong enough, but I don't think ... I think the government mentions these-
Judge Srinivasan:
Because one of the criteria under O'Brien is whether the law's related to the suppression of free expression.
Andrew Pincus:
That's why we get out of O'Brien, right? Because it's a law related to the suppression of expression. That's what Texas-
Judge Srinivasan:
That means that O'Brien's not satisfied. It doesn't mean you get out of O'Brien. That's what happens in NetChoice. I mean, that's one of the criteria under which a law survives O'Brien's scrutiny. And your argument would be, I would assume, that the law can't survive O'Brien's scrutiny because if it's related to the suppression of free expression, that it's invalid.
Andrew Pincus:
Yeah, I mean maybe. I think we're just talking about a doctrinal difference here. I think the law is out of O'Brien and subject to strict scrutiny because it's not incidental. If I can just go back to Judge Rao's-
Judge Ginsburg:
Before you do that.
Andrew Pincus:
Sure.
Judge Ginsburg:
You started off ... not started off, but earlier on you said that curation would occur in the United States under TikTok's NSA proposal. Correct? And the NSA proposal is more or less reflected in Project Texas. Is it not?
Andrew Pincus:
Project Texas has implemented some, but not all of the protections in the NSA.
Judge Ginsburg:
And the curation, the instrument of curation is the so-called recommendation engine. Is that correct?
Andrew Pincus:
Well, there are multiple forms of curation, your honor. Some is our content-
Judge Ginsburg:
Is that one of them?
Andrew Pincus:
That is one of them, but not the only one.
Judge Ginsburg:
So here's a passage a little later in the Presser Declaration: Project Texas contemplates that the source code supporting the TikTok platform, including the recommendation engine, will continue to be developed and maintained by ByteDance subsidiary employees, including in the United States and in China. So the curation is not entirely in the United States.
Andrew Pincus:
Well, I guess a couple of answers to that. First of all, the-
Judge Ginsburg:
One will do.
Andrew Pincus:
Okay, well, there's a review of those changes to the recommendation engine in the United States, but the recommendation engine is not the only form of curation. In the United States, there are also changes made to how the curation engine, the recommendation engine, works that are U.S.-specific. Those relate to the content, the community guidelines.
Judge Ginsburg:
Insofar as the changes are originating in China, they would be subject to review before being implemented in the U.S., review through the TTD USA.
Andrew Pincus:
They are not reviewed before they are implemented, but they are reviewed and subject to recall. But I think the critical point is those changes to the code are not the sole content moderation activities. The U.S. recommendation engine also takes account of determinations about what is acceptable content in the U.S. and how that content should be treated. Some content is out-
Judge Ginsburg:
But in order to apply what you just said, they have to look at what's coming down from Beijing and decide whether it comports with what's acceptable in the U.S.
Andrew Pincus:
Well, some of what they do is just done in the U.S. It's true that the source code ... But to your honor's point, I guess I just have to say there are many, many U.S. companies that use source code that's developed in China. The Weber Declaration talks about that in detail. So the mere fact-
Judge Ginsburg:
Do any of them involve apps that reach whatever it is, a million people a month?
Andrew Pincus:
I'm sorry?
Judge Ginsburg:
Do any of them involve apps that reach 10 million people a month?
Andrew Pincus:
Yes.
Judge Ginsburg:
In the U.S.
Andrew Pincus:
Many of them.
Judge Ginsburg:
Why is it that I could not find in any of your ... the briefs, indeed, in the declarations and so on, any reference to another company that would be subject to the second procedure provided in the statute, the alternative procedure that ends with a presidential determination?
Andrew Pincus:
Well, your honor, we didn't try to identify one, but certainly one of the reasons-
Judge Ginsburg:
You did assert that there were several, or at least implied that, by saying every other company would be subject to the second type of procedure. Is there such a company?
Andrew Pincus:
Well, I think we don't know what companies, your honor. I mean, I guess two answers.
Judge Ginsburg:
Well, surely ByteDance knows who their ...
Andrew Pincus:
Well, we don't know what companies the U.S. government is going to say are subject to foreign adversary control. I think that's one of the problems with this statute, is that the U.S. government could say, "Here is platform X. We think that they are subject to control by Russia. We're looking at their content. We think they're too susceptible to infiltration by Russia, and therefore they have to be moved to a new owner, even if the parent is a U.S. owner, because that risk is too great."
So we don't know what websites the government might argue about, but I think the other critical question, one of the reasons, is the exemption of e-commerce sites. We do talk about two Chinese ... two e-commerce sites that would certainly meet all of the other criteria in the law, but that are exempted because of the business review exception. Those sites, we don't know, but certainly those sites could well be susceptible to the government's action, but they've been excluded by Congress.
Judge Ginsburg:
With respect to your earlier colloquy, particularly with Judge Rao, I'm not sure if you did, but if you wouldn't mind, again tell me why this is any different, from a constitutional point of view, from the statute precluding foreign ownership of a broadcasting license?
Andrew Pincus:
Because the court has said that a lesser standard of scrutiny applies to broadcast licenses. That's been the justification for all of those decisions: the spectrum scarcity, and lesser scrutiny permits a greater degree of government control. So those decisions, because we [inaudible] applies here.
Judge Ginsburg:
Okay, what about the other dozen or 15 statutes that prohibit foreign control that don't have to do with spectrum, or the outdated idea of the spectrum?
Andrew Pincus:
Well, nuclear waste sites, the ones the government cites. A few of those don't implicate First Amendment interests at all. So the fact that the government says "no foreign ownership of a nuclear waste site," there's no First Amendment issue in requiring divestiture of it.
Judge Ginsburg:
So it all depends upon accepting your view that there is a First Amendment issue here, because notwithstanding the foreign control potential, we have a U.S. speaker.
Andrew Pincus:
I think that's right, Your Honor, and I think that's a fundamentally important-
Judge Srinivasan:
Can I just ask about the broadcast idea? So if we're in the land of allocating broadcast spectrum space, and the rationale for not allowing foreign ownership in that context is that we're worried about foreign ownership begetting foreign propaganda, would that, in your view, be something that needs to be justified by strict scrutiny? If that's the rationale. The rationale is we're not going to allocate this slice of spectrum to either a foreign entity or an entity that's subject to foreign control. That seems to be fine, because that's what the law does and it's been sustained, but the rationale for that is that we're worried that if that happens, it'll beget foreign propaganda.
Andrew Pincus:
I think if that's the justification, strict scrutiny might well apply. Again, it might be satisfied because of the scarcity rationale that also applies, but I do think once you get into viewpoint basis ... Even in Holder, the court said ... very targeted speech, a very targeted congressional restriction, the court said, "Strict scrutiny applies. This is a restriction of speech."
Judge Srinivasan:
Yeah, but the speech that was at issue in Holder wasn't foreign speech. It had implications abroad, to be sure, but what was going on was it was lessons that were being taught. There wasn't a question about the analysis being different because what was being targeted was foreign control, or even foreign adversary control, obviously.
Andrew Pincus:
Yeah. I just think, Your Honor, if the premise of a lesser standard is foreign control, then surely the foreign control has to be demonstrated by some strict scrutiny standard because you're then taking away the rights of the U.S. speaker. So if the premise of applying a lesser standard is foreign control, I think you're still in strict scrutiny because the consequence of that is totally taking away the First Amendment rights of a speaker, and the government really doesn't argue that.
And I really want to draw a distinction between foreign ownership and foreign control. I think foreign ownership, for the reasons I said in our colloquy before, would be a pretty shocking change here. Just to go back to Judge Rao's question for a minute about disclosure, I think disclosure has been the historic answer for covert content manipulation. That's what the statute at issue in Meese v. Keene talked about. Just identify the source, and then Americans can make the choice. That's exactly what happened. Again, I'm not saying it might not, in some hypothetical world, be possible in some situation to say that that doesn't work, but that's certainly been what happens, and we're not saying that covert-
Judge Rao:
How do you identify covert influence in code that they estimate ... They say it would take three years just to review the existing source code.
Andrew Pincus:
Well-
Judge Rao:
Much less any updates to the code. So how are you supposed to have disclosure, or verified disclosure, in that sort of circumstance?
Andrew Pincus:
Your Honor, it might be that the disclosure is just that the government says there's a risk of control. Maybe the disclosure doesn't have to be targeted? I don't know exactly what it could be, but again, we haven't had any exploration here of whether there is control on a record that this court can review, let alone whether disclosure could work. Maybe the government would come up with some arguments that it wouldn't work. I don't think they could.
But let me turn for a minute to News America, because I think that really gets at the root and explains the problem here. News America was a case where the court said, "This statute, even in the broadcast context, targets a specific speaker. We therefore are going to apply heightened scrutiny, and we don't see any reason why Congress exempted this individual speaker from the general rule about when you can get discretionary waivers from the FCC, and therefore we're going to invalidate a prohibition on that."
And this case seems to us to be on all fours with that. Here we have a specific US speaker targeted. It's been exempted from a general process that answers a lot of the conundrums that are before this court. What about less restrictive alternatives? What actually is the basis for alleging government control? Here's a record, here's a reasoned decision that can be reviewed, and then this court can review it. We'd have a lot of the same legal arguments. But part of the issue in this case, and we think it's a constitutional flaw and not just a problem, is that Congress didn't do any of the things that the First Amendment requires.
Judge Rao:
Mr. Pincus-
Andrew Pincus:
Oh…
Judge Rao:
Oh, I'm sorry. I understand, of course, you think strict scrutiny applies, but assume for a moment that O'Brien is the framework and intermediate scrutiny applies. What is your best argument that ByteDance, TikTok can win under that level of scrutiny?
Andrew Pincus:
Well, we still think under intermediate scrutiny there's a requirement to look at alternatives, and there's no indication that Congress looked at the alternatives here. There's no indication that disclosure-
Judge Rao:
Well, there are alternatives, but it doesn't have to be the least restrictive means.
Andrew Pincus:
But they have to be considered, and they weren't even considered. And I do think, going back to sort of a broader problem in this case, we have a real threshold question about-
Judge Rao:
So you're not challenging the government's interest as substantial under intermediate scrutiny, just the [inaudible]...
Andrew Pincus:
No, I am challenging the government's interest under intermediate scrutiny. What I was about to say is that the government, as I said, has plucked out this very targeted interest. But I think if you look at what Congress talked about, the problem here is that there was a lot of discussion about the imbalance of content on TikTok at times where the government concedes there's no foreign manipulation whatsoever. And I think figuring out what Congress's actual purpose was here ... that's the test that the Supreme Court has set up in First Amendment cases ... is very problematic, because we really don't know. The government is arguing that it's this very, very narrow interest here, but the record is suffused with comments by legislators both in the House and the Senate about the supposed ... imbalance about Palestinians and Hamas, all kinds of current events.
Now, Mr. Weber in his declaration explains why those allegations of imbalance are wrong, but they clearly motivated Congress in a significant way. It's another reason why the availability of the general standard and the real taint problems with the specific TikTok provision set up an alternative: if the government thinks that it can establish a record based on the argument that it's sort of culled together, let it put together that record, look at the less restrictive alternatives that have not been addressed, and also, frankly, consider the facts. Another sort of issue in this case is that to rule for the government-
Judge Rao:
I think you're arguing for us to remand without vacatur to Congress for more findings.
Andrew Pincus:
Well, I don't think-
Judge Rao:
It's a very, very strange framework. I know Congress doesn't legislate all the time, but here they did. They actually passed a law, and many of your arguments want us to treat them like they're an agency.
Andrew Pincus:
I don't-
Judge Rao:
It's a very strange framework for thinking about our first branch of government.
Andrew Pincus:
I think it's an unusual law, though, Your Honor. It's a pretty unusual law, an unprecedented law as far as we know, that specifically targets one speaker and bans generally. This isn't Kaspersky or one of these laws that talk about government procurement or the use of government funds. This is a law that broadly regulates, and targets that regulation at one speaker. That's pretty unusual, and I do think News America supplies the paradigm. Now, in News America, the court didn't say, "We're remanding to the FCC," but the functional effect of its decision was to say, "The FCC will apply the general standard, and then if there's a problem, we'll figure it out." So I'm not saying remand to Congress; I'm saying exactly what the News America court said.
Judge Ginsburg:
It's a rather blinkered view that the statute just singles out one company. It describes a category of companies, all of which are controlled by adversary powers, and subjects one company to an immediate necessity because it's engaged in two years of negotiation with that company, held innumerable hearings, meeting after meeting after meeting, in an attempt to reach an agreement on a national security arrangement, which failed. That's the only company that sits in that situation, that is so advanced in its negotiations and its relationships with the government that it's exhausted any further possibility of relief through the second procedure.
Andrew Pincus:
Well, respectfully, Your Honor, I guess I'd have two answers to that. One is-
Judge Ginsburg:
As usual, Mr. Pincus.
Andrew Pincus:
Sorry, I just want to give the court multiple reasons. The generally applicable standard is more protective of companies. It gives them a statement of reasons for this court to review. It has the business review exclusion. We can debate about what it means. Maybe it's broad and it says-
Judge Ginsburg:
Maybe so, but you're not making the claim to have that exclusion.
Andrew Pincus:
Well, it's not an option for us. If we were in that generally applicable standard, something that's certainly possible ... There's a lot of review content on TikTok, business, travel, product reviews. TikTok-
Judge Ginsburg:
Well, then you can come back, I suppose.
Andrew Pincus:
I don't think so, your Honor.
Judge Ginsburg:
Maybe not.
Andrew Pincus:
The statute is an absolute bar. Other companies could say-
Judge Ginsburg:
It's an absolute bar on the current arrangement of control.
Andrew Pincus:
Yes, but under the generally applicable standard, that arrangement of control won't be disturbed if the exclusion applies, so that's something that's not available to us.
Judge Ginsburg:
That's essentially your equal protection argument, correct?
Andrew Pincus:
No, I-
Judge Ginsburg:
Equal protection heightened with a sort of First Amendment flavor enhancer.
Andrew Pincus:
Exactly the argument that was in News America, Your Honor. What the court said is, "We are looking at the First Amendment, equal protection, with a little flavoring of bill of attainder."
Judge Ginsburg:
But talking about levels of scrutiny ... which I'm not sure we need to sort of waste all this time on, frankly, or use all this time on ... certainly there's no precedent, no case going either way, involving a designated adversary nation. Surely, that might have something to do with the level of scrutiny that a court should apply to a judgment by the Congress about a foreign power.
Andrew Pincus:
I think that the issue ... I think that might-
Judge Ginsburg:
It’s a matter of deference, let's say. Go ahead.
Andrew Pincus:
I think that might apply whether strict scrutiny is satisfied, but as I said before, the problem here is the predicate isn't just ... This is not claimed to be all the speech of the designated foreign adversary. Maybe that would be a different situation. Might not be, but we don't have to decide that. What's claimed here is there might be some influence on this fully protected U.S. speaker in the future, and therefore we can burden the fully protected speech now. So the predicate control is what the government has to establish, and I think our argument is it has to meet a really high standard to do that, because what it's doing is taking away the rights of an American speaker.
Judge Ginsburg:
So you're quibbling with whether there's actual Chinese potential control, that the company could not be directed to do something or refrain from doing something?
Andrew Pincus:
I think that is one of our arguments, yes, whether it's possible, and whether-
Judge Ginsburg:
Under Chinese law.
Andrew Pincus:
... the protections-
Judge Ginsburg:
That's your interpretation of Chinese law?
Andrew Pincus:
Well, I don't think the government claims Chinese law could do that. I think their claim ... I may be wrong ... is mostly related to the data privacy part of the equation, but I think that our ... The conclusion is-
Judge Ginsburg:
They've argued flat out that being subject to Chinese control, the company ... it's a misfortune, perhaps, for the company ... would have to and would certainly comply with a requirement with respect either to content manipulation or to hoovering up information.
Andrew Pincus:
I don't think the government has established that yet. We haven't seen what's in the confidential secret submissions, but I don't think they've established even as a matter of Chinese law. But even if they do, I think then the question is, do these two justifications apply, or is there some less restrictive means? And that hasn't been decided. It was a question ... Even in Holder, that was an issue.
Judge Srinivasan:
Let me make sure my colleagues don't have additional questions for you, because we still have to hear from the users, and we'll definitely give you some rebuttal time. We'll give you rebuttal time, Mr. Pincus. Thank you. Mr. Fisher.
Jeffrey Fisher:
Morning, and may it please the court. The creators' fundamental submission in this case is that, wholly independent of TikTok and the company's interests that are at play here, the act here directly implicates the First Amendment rights of American speakers to speak, associate, and listen to free expression in this country. Any other holding would prevent American writers from publishing in Politico or Al Jazeera, would prohibit American musical artists from posting their music on Swedish-owned Spotify, or would allow Congress to prevent American filmmakers from creating documentaries to be edited and aired on the BBC.
Our arguments on the compelling interest and narrow tailoring side of the case do parallel TikTok's to a great extent, but I do want to emphasize that the government's content manipulation rationale is wholly illegitimate and invalid and anathema to the First Amendment, and itself taints the entire act. If you could, as Judge Srinivasan hypothesized, isolate just the data privacy, that itself would also be not a compelling interest and not narrowly tailored. Taking a step back, it is truly striking-
Judge Ginsburg:
Excuse me, what is the creators' interest in that aspect of the case?
Jeffrey Fisher:
In the data privacy aspect of the case?
Judge Ginsburg:
Yeah.
Jeffrey Fisher:
Well, the creators' interest here is the First Amendment right to publish and coordinate with their publisher of choice, in the NetChoice sense, which here is TikTok. And so if you work all the way down through strict scrutiny to the government's purported interests, we think the data security interest is invalid for a couple of reasons, and I can walk through those.
Judge Ginsburg:
We're talking about the concern of the government with hoovering up all the information about American users, including your speakers, right?
Jeffrey Fisher:
Right, so we just think that is an insufficient justification to satisfy certainly strict scrutiny, and even if we were in a world of intermediate scrutiny, and for a couple of reasons. One is, as the company has elaborated in its briefing, the government's arguments themselves are overblown. Geolocational information is not gathered to the extent the government asserts. Contact lists are not given to the company unless the users opt into that. And in general, there's a real problem with the government's data security argument, particularly from my client's standpoint, because those are voluntary acts. These are opt-in procedures to share your data if you wish. So, that's the first problem.
The more dramatic problem, though, if I could really emphasize this, is that even if you could isolate data security, under the Arkansas Writers' Project and Minneapolis Star cases, that would have to be subjected to strict scrutiny because you have a law that is singling out speakers. It is singling out media and the press in a way that triggers strict scrutiny in those cases, and there's no way ... if I could just add one quick thing, and I want to answer your question ... there's no way that can satisfy strict scrutiny given all the exclusions Mr. Pincus has described with e-commerce and all the rest.
Judge Ginsburg:
So, they opt in for sharing your data. The user is asked, "Do you want to share your data?" That doesn't mean sharing it with ByteDance, right? The problem is TikTok is going to have the information. The question is, "Do you want us to be able to share it with others?" Is that correct?
Jeffrey Fisher:
Well, I think that yes, you understand you're sharing your data with TikTok, and it's public information that TikTok is ultimately owned by a company called ByteDance. And so it's wholly voluntary on the user's part, but it's really the under-inclusivity in a world of strict scrutiny, which has to be applied under Minneapolis Star and the Arkansas Writers' Project, that sinks the government's ship on the data security side.
Judge Rao:
Mr. Fisher?
Jeffrey Fisher:
It can't be ... Yeah.
Judge Rao:
Does your argument depend on a conclusion that divestiture is impossible?
Jeffrey Fisher:
No, it doesn't. So, I think it is. It seems like the record shows that it's impossible, and I'm not sure how much the government pushes against this, but even if it were possible, for two reasons, we would still-
Judge Rao:
So, what would be the creators' interest in TikTok U.S. being owned by ByteDance?
Jeffrey Fisher:
Well, our interest is in working with the publisher and editor of our choice, including the current ownership, which works very well for our creators. You couldn't tell an American writer they wouldn't have a First Amendment interest in working with Twitter owned by a particular individual, or if Fox News was required to divest from Rupert Murdoch.
Judge Rao:
Do you have a First Amendment interest in who owns TikTok?
Jeffrey Fisher:
Yes, that is who the publisher is, ultimately, you could say. And so the act directly singles out-
Judge Rao:
Doesn't that argument bolster the government's argument, though, that TikTok U.S. is controlled by a Chinese-headquartered company?
Jeffrey Fisher:
Well, I think that you have to walk through this step by step. So you've asked, "Do we still have a First Amendment interest in a particular owner or publisher?" Absolutely, yes. No case that I'm aware of has ever suggested that singling out a speaker or publisher does not implicate First Amendment rights.
Judge Rao:
But a lot of the argument depends on TikTok U.S. being a separate corporate entity, right?
Jeffrey Fisher:
I think a lot of the argument does depend on it, particularly from the company-
Judge Rao:
So it remains a separate entity; it just-
Jeffrey Fisher:
No-
Judge Rao:
... its ownership changes.
Jeffrey Fisher:
No, no, no. But I want to emphasize, even if I would spot the government all of that and say all the way down to control, which we don't think is in the record in this case, we would still have a First Amendment interest in working with whatever foreign-owned publisher we want. And all the hypotheticals I've just given you, from Politico to Al Jazeera to Oxford University Press, all the way down ... And I want to emphasize, Judge Rao, and this goes to Judge Ginsburg, you asked about foreign adversaries and foreign governments. There are absolutely Supreme Court cases, Lamont and Whitney first and foremost, that hold that even American speakers speaking in conjunction with foreign governments who are hostile to this country ... That is the holding of Lamont, which is-
Judge Srinivasan:
So in the broadcast context, then, would you say that there's a First Amendment interest in allowing broadcast spectrum to be subject to foreign control if there are users who would like to work with the foreign-controlled licensee?
Jeffrey Fisher:
I think the way you were describing it in your back and forth with Mr. Pincus got it right in the end, which is simply saying rules about foreign ownership simpliciter are okay as a matter of the First Amendment, but if the government were sorting between viewpoints, even according to who is more hostile to this country or their views of communist nations, again, that brings us right back to Lamont. It brings us right into Whitney. Remember, the speaker in Whitney was a member of the American Communist Party working in conjunction to espouse the Moscow principles laid down in the manifesto that Justice Brandeis describes, so-
Judge Srinivasan:
So then, if that's the relevant distinction, then it's not enough for the government to say, in the broadcast context, "We're just going to exclude foreign ownership, period." You'd say, "Well, you have to explain why. Why do you want to exclude foreign ownership?" It's not enough for you just to say you want to exclude foreign ownership and then win on that basis, because if the reason you want to exclude foreign ownership relates to a concern about the content implications of the foreign ownership, then that might not be permissible.
Jeffrey Fisher:
Well, look, let me just give a quick preface and then give you an answer. I don't want to get too far into broadcast, because ACLU v. Reno makes very clear that the world of the internet and the unlimited marketplace of ideas on the internet is very different from broadcast, so I think whatever you say here wouldn't bleed over to broadcast.
Judge Srinivasan:
I'm just trying to-
Jeffrey Fisher:
But on the broadcast side, to answer your question, I think that if the government came in and said, "We are worried about the viewpoint of the speaker, not just the foreign ownership," that would be a problem. It would be something that I don't think the US Supreme Court has ever said that is okay. And as I said, Whitney-
Judge Srinivasan:
Even though, as I understand it, that was the reason from the very beginning with the Communications Act. That was the reason that Section 310 excludes foreign ownership: a concern about foreign propaganda.
Jeffrey Fisher:
Well, I think that, again, you'd have to trace that back and ask whether that's still the justification today, if that were what happened in the past. You'd have to ask whether the government has other arguments in those sorts of cases. There'd be a lot to sort of work through in that case. But I think it's fair to stand here at the podium and say that neither the US Supreme Court nor this court has ever said that viewpoint-based restrictions among foreign speakers are a legitimate interest to pursue under the First Amendment.
In Meese against Keene, the Supreme Court stressed ... and actually, Solicitor General Fried, in that case, stressed to the Supreme Court ... that what saved the law in that case was exactly that foreign propaganda was just described by Congress in entirely content-neutral terms. And so the government didn't even make the argument in Meese against Keene or in Lamont that the government is making here, that a foreign speaker or a foreign government can be suppressed from ownership, or speaking, or curation, or any of these other First Amendment activities based on their viewpoint.
But remember, the reason I'm standing at the podium here is that this isn't just a case about a foreign speaker. If this were a case like Lamont, just about foreign speakers and Americans wanting to hear that, I think you'd already have a victory for the American listeners under Lamont. And certainly no Supreme Court case has ever held to the contrary, and there's no history and tradition in our country of banning U.S. speakers from hearing from foreign governments, even if they're hostile to our country, simply to express their ideas. But that's not even this case. This case is American speakers, like the creators and Based Politics, who want to speak to other Americans on an American platform and at the very most can be alleged to say they want to coordinate with a foreign publisher when they do so.
Judge Srinivasan:
Can I ask about Lamont? So if we look at Justice Barrett's concurrence in NetChoice, it clearly presumes, in her view, that foreign control changes the equation under the First Amendment. And if that's so, is Lamont always a trump card? Can Lamont just always kick in and say, "Even if there's an ability to deal with foreign control vis-a-vis the foreign speaker, because the foreign speaker doesn't have First Amendment rights, it turns out that that's going to be an illusory ability on the part of the government because you can always bring into play the US recipients, and their rights kick in"?
Jeffrey Fisher:
So, let me start with Justice Barrett's concurrence and explain how this all sorts out. So really all she said is, "Foreign ownership might change things under the Open Society case," which is true as far as it goes. It's a fair question to ask. But I think the way I distinguish Open Society on the one hand and Lamont and all the things I'm describing on the other is, Open Society is just about a foreign speaker speaking abroad. That's what that case is about, full stop.
And there is no First Amendment interest in foreign speakers speaking abroad, but once that speech is directed into the United States, and certainly once that speech is in concert with other Americans, and indeed propagated by other Americans, the speech on TikTok is not Chinese speech. It is American speech that at most is curated by a foreign company and, the government says, potentially by a foreign government as well, but it's American speech. You're way, way, way on the First Amendment protective side of the equation, and we even have a much stronger case than Lamont. So all Justice Barrett, I think, is saying in NetChoice is, "Oh, let's ask that question if and when it comes up, but when you have speech inside the United States, our history and tradition is we do not suppress that speech because we don't like the ideas." If I could just give one more example-
Judge Srinivasan:
Would that apply in wartime, too? Just to continue the hypothetical, I'm curious about your reaction to that. So if the rationale for this, for barring foreign ownership of a media establishment, is a concern about what that is going to produce vis-a-vis content, and we're in a time of conflict with a foreign adversary, would you say that the U.S. ... Protest here against a war is completely protected under the First Amendment, and so that viewpoint would be one that American listeners might well have an interest in hearing a lot about. Would you say, then, in that situation, because of clients like yours, the American recipients, that a bar on foreign adversary ownership of a media establishment during war is invalid?
Jeffrey Fisher:
Leaving the broadcast question aside-
Judge Srinivasan:
Yeah, aside from broadcast.
Jeffrey Fisher:
... definitely strict scrutiny would apply, and I can imagine particularized facts where strict scrutiny might be satisfied in the heat of war depending on what exactly the content of the speech is. But in a situation like this, where we're not at war and all we're talking about is so-called foreign propaganda, really, again, arranging American speech and manner-
Judge Srinivasan:
But the negative part of that is even in war, you wouldn't just accept that the government can prohibit foreign adversary control, the foreign ...
Jeffrey Fisher:
As I said, that might be the case, but I think you'd ... Let me give you a couple of historical examples to show why even that is a very hard question. George Washington, in his farewell address ... One of the central themes of that farewell address was, "Beware of foreign influence." Remember, France had supported Thomas Jefferson in the election, and George Washington told Americans, "Beware of foreign influence, and be careful about foreign influences, and be careful about making your own association with foreigners." Never did he suggest the answer was to suppress foreign speech.
Look at FARA, the statute passed in the run-up to World War II that was at issue in Meese against Keene. Congress there dealt with foreign agents in the U.S. spreading foreign propaganda in the run-up to war, and that law ... All it does is require disclosure. It does not prevent Americans from ... as the government, in the most robust form of its argument, would say creators like the ones on TikTok do. And then Lamont, again, at the height of the Cold War. Again and again in our history, we have this. And Judge Rao, you also asked about the PLO case?
Judge Rao:
Yeah, I'm interested ...
Jeffrey Fisher:
Yeah.
Judge Rao:
That case seems very similar, because there you had American citizens who associated with the Palestine Information Office, and our court said requiring that office to disband was only an incidental burden on their speech, because they could continue speaking about Palestinian causes in other fora or even form other groups. They just couldn't be a foreign mission of a terrorist organization.
Jeffrey Fisher:
Right, so I think you stressed the important part about that case, which is distinguishable from here, which is ... My understanding is the court there stressed, "We're not suppressing any speech here at all. We're just preventing them from having this mission in this office, but they can still speak."
Judge Rao:
But why is that not analogous to what's happening here, right? The Congress has decided that TikTok U.S. cannot be owned by, effectively, a Chinese corporation, and so that leaves creators that you represent free to continue speaking on TikTok U.S. if they divest or on other platforms, or create a new platform, or any number of other ways they could continue sharing their content.
Jeffrey Fisher:
Right, so remember, I think the facts would show that divestiture is impossible, but leaving that aside for the moment-
Judge Rao:
But you said that your argument doesn't depend on it.
Jeffrey Fisher:
It doesn't, so I just wanted to put a pin in that. So, there's a couple of big differences between what you're describing and this case. The first is that there are not interchangeable mediums for our clients to speak on, and I don't think the government could seriously dispute the declarations that show that TikTok is unique in terms of how it looks and feels and the audience that people are able to reach. So one of our clients reaches millions of people and has millions of followers on TikTok, has tried to work on other platforms, and reaches only 10,000, or even fewer than 100, on YouTube ... So, there's a whole different audience that you wouldn't have had in that case. You have whole different tools available, like CapCut and editing tools, so the nature of the speech is different. In the PLO case, I think you would've still been able to exercise all the same expression and reach the same audience-
Judge Rao:
But you wouldn't ... One of your arguments is that the creators want to work with their foreign owner-publisher of TikTok U.S. Arguably, in the Palestinian Information Office case, they wanted to work with the PLO and to represent the PLO in America. That was the whole mission of that office.
Jeffrey Fisher:
But I don't take that-
Judge Rao:
Maybe the interest in the Palestinian case is actually stronger.
Jeffrey Fisher:
So, I don't take that case to say that speakers going forward there would not be able to speak in conjunction with the PLO in the United States.
Judge Rao:
But they can't represent the PLO.
Jeffrey Fisher:
Well, I think ... Maybe I'm getting too far into the details, but in this case, our first and foremost argument is, "We're American speakers wanting to work with an American company, TikTok." If you cut through that and say, "Well, what about ByteDance?" for all the reasons I've said, we have a fundamental interest in being able to work with the publisher and editor of our choice, even if it's a foreign editor or foreign publisher. The implications of writing an opinion that accepts that argument from the government truly are staggering. As we note in our brief, Democracy in America is written by a French author sent by the French government. And if an American bookseller wanted to sell that, it would be quite surprising to have the government be able to answer that and say, "Congress can ban American bookstores from selling Democracy in America because it's written by a foreign author in conjunction with a foreign government." Blackstone's Commentaries could be banned.
Judge Rao:
We're not talking about banning Tocqueville in the United States. We're talking about a determination by the political branches that there's a foreign adversary that is potentially exercising covert influence in the United States. It's very different from selling a book.
Jeffrey Fisher:
I don't want to be histrionic, so let me walk through that. As I take the government's opening argument, and even the suggestion that maybe some members of the panel voiced about Justice Barrett's concurrence, the first argument is simply that because there is foreign ownership in the TikTok chain, that itself makes this not a First Amendment case or prevents First Amendment claims from being raised. So that's the answer I'm giving. That can't possibly-
Judge Rao:
Okay, so the First Amendment maybe covers this, right? It's implicated by this-
Jeffrey Fisher:
The First Amendment covers this. And then the question is, is the argument that the Supreme Court squarely addressed in NetChoice, that it is an impermissible government motive to regulate, to change the curation of speech on an internet platform, any different because you have a foreign owner? And I think the answer has got to be no. Certainly when you have American speakers, for all the reasons I've said, American speakers who are entitled to seek-
Judge Srinivasan:
Justice Barrett at least thought the answer to that question was yes, because you said, "Is the analysis any different because you have foreign ownership?" I thought the entire premise of her concurrence is that the analysis is different.
Jeffrey Fisher:
Well, I can't remember the exact words she used, but I think it's a question that you would ask. So I think at the end of the day, let's just say the conclusion is no different because under Lamont, under Whitney, under just basic history and tradition of this country, there's no example in law or judicial decisions of an American speaker being treated any differently because they want to associate with a foreign publisher or a foreign co-author or sell a foreign book as compared to an American one. And then I think the last point-
Judge Ginsburg:
I'm sorry. The statute doesn't apply to all foreign editors or publishers or what have you. It applies to foreign publishers or editors from four specific adversaries.
Jeffrey Fisher:
Right, so I think-
Judge Ginsburg:
So instead of saying foreign control, let's say adversary control. At every point, think of it as adversary control, and then go on.
Jeffrey Fisher:
So I think we're exactly at that same... We're in strict scrutiny. We're asking now whether foreign adversary relationships or control make any difference. And again, look at Lamont. Lamont is about the Communist Party. This case is allegedly about the Communist Party of China. Look at Whitney. The Speaker there was for the Communist Party-
Judge Ginsburg:
Look at the Congress-
Jeffrey Fisher:
... so the Court has never suggested-
Judge Ginsburg:
Look at the Congress making a decision that perhaps, unlike the Soviet Union at the time that the case arose, or Russia, that country and three others are now adversaries. Perhaps they were not then. Now, we don't know. The Congress didn't speak to it until just now.
Jeffrey Fisher:
Well, I think just as a simple matter of history, the Court could take judicial notice that the governmental actors in those cases were every bit as adverse to this country in those moments in history as now.
Judge Ginsburg:
Well, if you want us to say that the provision of the statute designating the four countries is irrelevant to this case, say so.
Jeffrey Fisher:
No, no, no. I'm not saying it's irrelevant, and I totally take the Court's question as being a very serious and important one. And different countries in the world's history have resolved this question very differently. But our history and tradition in this country is that yes, you might grant Congress more leeway in a time of war particularly, or maybe even with adversaries. But as a matter of strict scrutiny, the notion that a foreign adversary is going to spread its ideas about political issues and social issues, which is exactly what the government says in its brief, has never in our history been a basis for suppressing speech in this country, even of the foreign governments, let alone of American speakers speaking on their own terms to other Americans. And I think the last thing, Judge, you mentioned is-
Judge Rao:
Does this act suppress the speech of a foreign corporation? ByteDance would remain free to post or to speak or to do anything else in the United States. It doesn't prevent ByteDance from doing that. It doesn't suppress their speech.
Jeffrey Fisher:
No, it does. The Act by its terms forbids any app ByteDance would own, so certainly any covered application. And I think, Judge Rao, you also asked about O'Brien and the level of scrutiny, and I think that's part of my answer: let's just look at the text of the statute to ask whether or not strict scrutiny applies because it's content-based. And thrice over, the statute is content-based. One, it singles out TikTok and ByteDance for differential treatment from other owners or other publishers, so that's one reason. The next reason is because it singles out social media interaction: generating, sharing, viewing text, images, videos, real-time communications, and similar content. Content is in the statute. And thirdly, because it targets the content recommendation engine that TikTok uses and says that even a successor in interest cannot use that content recommendation engine, again, going right in the teeth of NetChoice when it comes to curating-
Judge Srinivasan:
So NetChoice, there's been the suggestion that delving into doctrine is too much law geekdom, but let me just do it for a second because I think it actually affects things in terms of the analysis. If we're not in strict scrutiny land and we're in intermediate scrutiny land, which is a lesser level of review, and let's just say we're doing that because this case involves anomalous circumstances, because it's a congressional determination of foreign adversary status and that tilts the equation, and a balancing of considerations suggests that you get a lower tier of scrutiny but not abandoning the First Amendment altogether. What's your answer there? Because part of that analysis is that the justification for the law is unrelated to the suppression of expression. Would your analysis be that, well, for the same reasons that we think the law is content-based and so strict scrutiny would apply, even if you foist intermediate scrutiny upon us, we would still say it fails intermediate scrutiny because the law's motivation is related to the suppression of expression?
Jeffrey Fisher:
That's exactly my answer, yes. And I think that under Mt. Healthy or Arlington Heights, that would be where actually you would stop, because once you have an impermissible motive behind the law, you don't even look at the data security.
Judge Srinivasan:
So that's what the NetChoice majority indicated was going on, what would go on there when they went on and talked about how you would analyze it. And here's my question about that. So it speaks in terms of related to the suppression of free expression, and one way to potentially understand that is as related to the suppression of protected free expression. And if you're talking about the foreign organization's interests ... You represent the users, I get that, the recipients. But from the perspective of the foreign speaker, if those are unprotected by the First Amendment, because under Agency for International Development they don't have a First Amendment claim to make, then could an argument be made that the law actually is not related to the suppression of free expression? Because what it's really trying to suppress is foreign ownership, and there's no First Amendment stake on that side.
Jeffrey Fisher:
I think I'd have two answers there on that particular one, and then I want to do the rest of intermediate scrutiny as well. But on that answer, I would say there are still two problems. One is, you still have, under the Sorrell concept, in practical operation, American speakers who are silenced by, or whose speech is certainly affected by, this law. So you can't get out of the First Amendment problem with that formalistic move. Even if you wanted to isolate, Judge Srinivasan, the foreign speaker, and make the argument you just made, I actually think then you'd still have an R.A.V.-type problem when it comes to the government discriminating among foreign speakers based on viewpoint. So remember, the R.A.V. principle is, even in a world where you're dealing with totally unprotected speech, if the government is choosing and selecting and suppressing some based on viewpoint but not others, Justice Scalia's opinion in that case says even then, strict scrutiny applies.
Judge Srinivasan:
No, but you can do that if the reason that you're choosing a subcategory is related to the reason that the category has a lesser status. And here the argument would be, well, the category is foreign speech. The subcategory is foreign adversary control over a really important medium. That's a related subcategory because what we're really concerned about is foreign-
Jeffrey Fisher:
I think the way I would define the subcategory or the interest is the way the government puts it in its brief, which is the concern that the Chinese government is going to influence political discourse and ideas in this country, which I think is an impermissible motive, at least if you're outside the broadcast media world and in the regular First Amendment world. And so if you want me to walk the rest of the way through intermediate scrutiny, I think that the content manipulation justification fails under intermediate scrutiny, and then you should be done. Arlington Heights says that if you have an impermissible motive behind a law ... And just imagine TikTok was banned, or imagine this divestiture provision was motivated, Congress said, by the fact that one particular religion over another is being favored on the platform, or too many people of one race or another are using the platform, but also data security. Once you had that impermissible motive, the case would be over.
But even on the data security side, even under intermediate scrutiny, you need to have some reasonable means-ends fit with the government's rationale, and you just don't have that here, because of the e-commerce sites that are left out, the other American technology companies that have Chinese subsidiaries that are left out. It's just woefully under-inclusive, which-
Judge Srinivasan:
As to that, under Holder, I think we extend a substantial degree of deference to Congress's and the government's assessment of how the alternatives would work out and whether they're sufficient. It may still not be enough in your view, but it's at least overlaid with a substantial degree of deference because of what the Supreme Court said in Holder.
Jeffrey Fisher:
Yes, you can give the government some deference, and yes, the government doesn't have to do everything, especially in an intermediate scrutiny world, but the government still has to come in and explain in reasonable terms why it has singled out one particular collector of data and excluded everybody else. And I just don't think these e-commerce sites that millions of Americans visit as well, and these many other platforms that would be susceptible to Chinese hackers or influence or whatever the facts may be, are meaningfully different from TikTok. And I think what the Supreme Court tells us when it comes to under-inclusive arguments is that that often is a signal that something else is at play. And that's what we think is going on here. And again, the government admits that it's the content manipulation rationale that also justifies ... They say justifies. Motivates, we say, this law, and that's where the problem is.
And then finally, in terms of intermediate scrutiny, again, we think there's just no way to get there under Minneapolis Star and Arkansas Writers' Project on the data security side. But if you could, there are still no alternative means available to my clients, for the reasons I've been describing that are laid out in the affidavits. The audience is completely different on other platforms. The tools and the feel of the medium of speech are completely different, and just the identity of the editor and publisher can't be singled out and taken away from us.
Judge Srinivasan:
Okay. Let me make sure my colleagues don't have additional questions for you now. We'll give you a little bit of time for rebuttal as well. Thank you, Mr. Fisher. From the government now, Mr. Tenny.
Daniel Tenny:
Thank you, Your Honor. May it please the Court, Daniel Tenny for the United States. There's been a lot of discussion of what the government's motivations for this statute were and what its justifications are, so I'd just like to start up front by making those clear, and there are two primary ones. First, TikTok is an application, and what it does is it gathers a lot of information from users of the application, both consumers of content and creators of content. And it uses that information to try to assess what sorts of videos and other content are going to be of interest to consumers and what will keep them looking at the app. They want to keep people's eyeballs on those screens so that they're continuing to consume the app. And that requires the collection of data, and that data is commercially useful to them. And in today's society, the collection of data is an important part of commerce and advertising, figuring out how to tailor to users' needs, how to target advertisements or other things to particular users.
The problem is that that same data is extremely valuable to a foreign adversary trying to compromise the security of the United States. Knowing what Americans' patterns are, who their contacts are, where they go, who they interact with, what sorts of content interest them, what sorts of content turn them off, would be quite valuable to a foreign adversary if it were trying to approach an American to try to have them be an intelligence asset, or if it were trying to figure out how to cater its messages to get messages supportive of Chinese national security rather than American. And so that data security rationale underlies the Act. That has nothing to do with protected speech by American citizens. That's a separate concern.
Now, the second rationale for the Act is covert content manipulation by ByteDance. And I say "by ByteDance" deliberately, because the point is that what is being targeted is a foreign company that controls this recommendation engine, meaning the algorithm that's used to determine what content is shown to Americans on the app. And I think there was a quotation read from petitioners' declarations before about how that continues to happen in China rather than in the United States. If you just read those declarations through, you'll find lots of them. There's really no dispute here that the recommendation engine is maintained, developed, written by ByteDance rather than by TikTok US, and that is what's being targeted. So when the petitioners say, as they repeatedly did this morning, that this is targeting expression, to the extent that it's targeting expression, which isn't the exclusive thing, but if you call the covert manipulation of content expression, which maybe, under NetChoice, if an American company were doing it, you would, but if you're going to call that expression, it is not expression by Americans in America. It is expression by Chinese engineers in China.
Judge Srinivasan:
Maybe we're going to the same place, but you've helpfully isolated what seems to me to be doing all the work from the government's perspective, because under NetChoice, if we were talking about a US company, that's heartland First Amendment-protected curation. That's just what NetChoice says. So everything from the government's perspective turns on the fact that ByteDance is subject to Chinese control, because if it were US control, that's NetChoice. That raises a serious First Amendment question. And then the way NetChoice did it, the rationale would be related to the suppression of free expression. And let me just add one thing. Take out the data security. I know that that's in the case, but just for purposes of this part of the analysis, just take that out and let's just focus on the content, what the government itself rightly characterizes as content manipulation. Once you put an interest in play that's called content manipulation, that sets off First Amendment alarm bells in the normal situation when you don't have foreign control in play.
Daniel Tenny:
I think I mostly agree with that statement, but I do have some caveats. It's not just a question of control or ownership. This is being done by ByteDance. I think we would have arguments and we might well win, and I don't mean to suggest otherwise. If the covert content manipulation that we were concerned about were done by TikTok US subject to the oversight of ByteDance, I think we would have strong arguments there. And again, I'm not trying to give away that case, but I'm just telling you that's not the case that you've got because what we're talking about-
Judge Srinivasan:
I'm giving you that.
Daniel Tenny:
... is that this is what's done by ByteDance outside the United States. And the other side makes a big deal about the fact that the code is deployed in the United States, in the sense that the source code is written in China and then it's posted. It is a little metaphysical to describe what it means for it to be "in the United States" if it's in the cloud, but the core point that we're making is the one that they've conceded, which is that this code is written in China, and the determinations about how it should be changed, how it should be altered, which they say they do a thousand times a day, they push up a new update, all of that is done in China. And I understood them to be weakly suggesting-
Judge Srinivasan:
I'm sorry, I'm giving you all that. I'm just assuming for these purposes that you're right on all that, that all the relevant stuff is going on in China. My only point is that if that relevant stuff was going on in the US, that's a big problem for the government under NetChoice.
Daniel Tenny:
If the government were targeting the curation of content in the United States by US actors, that would be NetChoice.
Judge Srinivasan:
What you're doing is you're targeting curation in the same way. It's just that the curation that's being targeted is happening abroad at the behest of a foreign adversary.
Daniel Tenny:
With the caveat that there's other rationales too.
Judge Srinivasan:
Yes, but we're focusing on this.
Daniel Tenny:
Yeah, that's absolutely right.
Judge Srinivasan:
Right. So if we do that, then what's your answer to the proposition that even if we credit you in every respect on that score, and we assume that the foreign curator doesn't have a First Amendment claim because they're a foreign curator engaging in curation abroad, and under Agency for International Development and other sources, that's not subject to First Amendment protection, what's the answer to the proposition that, well, you've still got US recipients? And the US recipients definitely have First Amendment interests in place. See cases like Lamont. And for those US recipients, the exercise of foreign curation affects the mix of things that they're getting, and they want to have access, by hypothesis, to the curation that's occurring abroad. And the fact that that's being denied subjects this to serious First Amendment scrutiny. What's the answer to that?
Daniel Tenny:
I think when you're talking about recipients rather than speakers, the first case I would turn to is Murthy from the Supreme Court just this past term, where that was the claim that was made there.
Judge Srinivasan:
Well, also, it's the entire complex of people who are in the second set of petitioners before us because it includes not just recipients who take in TikTok content, but also creators of content that's then disseminated on TikTok, who are speakers in a way. But it's just that they like doing it through a medium that's subject to the very control that you want to deny with the editorial control that's going on abroad, right?
Daniel Tenny:
And we've now moved away from arguing that the government's justification is related to First Amendment activity and we're already on incidental effects on other people's First Amendment rights. Because the government isn't targeting those people, isn't saying, "We don't want you to be able to post on this medium, we don't want you to be able to associate." That's just something that happened to them. And there, I think cases like Arcara and other cases... There are all sorts of circumstances-
Judge Srinivasan:
If you're invoking Arcara, then that means that the First Amendment just doesn't apply at all. So it's not subject to First Amendment scrutiny at all, because you are focusing on foreign control, and that means that the US users of TikTok, both US users who disseminate content via TikTok and who take in content, that their First Amendment interests, which are the kind of interests that were in play in Lamont, just don't matter, because the First Amendment just doesn't apply at all. That may well be the government's position. I've detected a little bit of equivocation on that in the briefing. Is your position that under Arcara the First Amendment just doesn't apply at all?
Daniel Tenny:
We don't think you need to go nearly that far to resolve this case.
Judge Srinivasan:
But would you take that position? I get that you think you'd win anyway, but I'm curious to know, would you take the position that actually the best answer is that the First Amendment just doesn't apply at all?
Daniel Tenny:
I guess it's helpful in thinking about these First Amendment questions to separate the various petitioners. And I think that the First Amendment, obviously, if you have a case like this where the effects on expression are significant, and we acknowledge that, it might sound like a striking proposition, but I'd like to just walk through the various petitioners. So the first petitioners are ByteDance and TikTok themselves, and we discussed that as foreign. Okay. And so then you have the content creator petitioners, and their speech may be affected. And we don't know yet what will happen if this law is upheld, whether ByteDance and TikTok and China will have a change of heart and come up with a way to sell the platform and continue to operate, or whether they want to forfeit whatever rights they would have to their extremely successful business.
But no matter which of those things happens, that really is an indirect effect of what's going on here. And you could say, well, it's a sufficiently significant incidental effect that we'll apply something akin to the O'Brien standard. But there's a pretty good argument that you would say, in the same way, if ByteDance got shut down because they engaged in tax fraud or because they violated the import laws and the result was that a very popular internet platform was shut down entirely, I don't think a content creator could come in and bring a First Amendment challenge and say-
Judge Srinivasan:
As to that, the argument I take it the other side would make is that that's markedly different, because if they violate the tax laws, the rationale for striking against that has nothing to do with content or expression. But here, at least with the content manipulation rationale, put data security to the side, but the content manipulation rationale, by its very terms, has everything to do with content. The concern is that the curation is going to result in content that the government fears the consequences of, because they think that there's going to be curation that's going to affect American consumers in a way that's going to be problematic for the US's interest, which may well be right. And we may well need to defer to that. And the question is, does the normal First Amendment calculus kick in, where, in that situation, what the system relies on is counterspeech or different ideas.
Daniel Tenny:
Right. I do think that the best way to think about it is to think about whether it's protected expression rather than just expression. The fact that it happens to be expression, but it's expression that's not protected by the First Amendment, seems like a pretty big distinction. And I think that for the same reason, the idea that counterspeech is the answer, in addition to some of the practical problems with counterspeech in this area, the reflexive reaction that counterspeech is the right answer, relies on cases in which you're talking about protected expression rather than unprotected expression. And I understand that the fact that this is unprotected expression that is justifying this in part, again, is something that complicates the analysis from these other cases. And that's part of why, even though we think we'd still readily prevail under really any First Amendment standard, and certainly under O'Brien ...
That's why I led my answer by saying we're not asking the Court necessarily to go that far, but it is the case that if there is a concern, a range of national security concerns, some of which have to do with unprotected expression, there's certainly a good argument that the First Amendment implications of that are materially lower. And maybe even if all you're targeting is unprotected expression, it'll go all the way down.
Judge Srinivasan:
So when you say unprotected expression, do you mean that it's expression that, because it's occurring abroad, cannot give rise to a successful First Amendment claim on the part of the person whose expression abroad is being restricted? And if that's true, if that's the government's understanding of the natural fallout of that, then I take it it could be the case that Congress could just pass a statute that says no foreign entity abroad can send speech into the United States.
Daniel Tenny:
I don't think so, for various reasons. The other side talks about the Lamont case a lot, and what the Supreme Court said in Murthy about First Amendment standing of recipients of speech is that it requires this sort of close connection. And Lamont was a case where people were getting mail addressed to them, and the government was asking them to raise their hand and say, "I want to receive the communist propaganda," and the Supreme Court said that wasn't okay. Obviously we're not here to quibble with Lamont, which is Supreme Court precedent, but it doesn't hurt our case here, because what the Supreme Court said in Murthy itself was, if your interest is just this broader "there's a lot of things that I want to read, and I'm a general consumer," they didn't even think they had standing in that case, much less a strong First Amendment claim. And so the point here is just, if the speaker itself doesn't have a claim, it's strange to say, well, the listener does.
Judge Rao:
Mr. Tenny, so maybe moving away from the creator petitioners to TikTok US. TikTok US is a US corporation. It's a wholly owned subsidiary of ByteDance, but it is a US corporation. What about the First Amendment interest there? Does the government recognize that TikTok US, as a separate corporate entity, has First Amendment rights of content moderation and all the other things that, for instance, the Presser declaration says occur at TikTok US?
Daniel Tenny:
Right. Yes, but those are incidental. TikTok US posts things on TikTok, they say, and they engage in some content moderation that occurs after the recommendation engine. The problem there is none of that is what's being targeted here. They're saying that this is singling out a speaker, but that's not what we're going after. And one way we know that is that-
Judge Rao:
But if TikTok US is engaged in expressive activity, which I think you just said that they are, at least of some sort, then the Act does single them out. It requires TikTok US to be divested of its foreign ownership.
Daniel Tenny:
It doesn't single them out because of their own First Amendment activity. It singles them out, but there's no indication in this record that Congress said, we don't like the things TikTok US is posting. We don't like the way that TikTok US makes decisions about-
Judge Rao:
Well, but TikTok US is an entity engaged in expressive activity.
Daniel Tenny:
Well, the-
Judge Rao:
So why is this case then not like Minneapolis Star?
Daniel Tenny:
It is a lot more like Arcara in that regard. It was a bookseller and they weren't going to be able to sell their books. And the Supreme Court said, well, I understand that the consequence of this is that you won't be able to sell your books, but the reasons that the government is going after you has nothing to do with your-
Judge Rao:
Arcara was books, but the statute was a generally applicable statute against prostitution or places hosting unsavory activities. But here, the act itself singles out TikTok. And TikTok is, as you've said, an entity engaged in expressive activity.
Daniel Tenny:
Obviously, whether it's the Congress that's doing the selecting or the executive branch, I'm not sure that would affect the First Amendment analysis. But just to answer your question as directly as I can, the relevant First Amendment question, at least in terms of talking about who's targeted, is whether there is a motivation for the statute or the Act that depends on the expressive activity of the regulated entity. And so here, the expressive activity in which TikTok US engages is not creating the recommendation engine, because that's done by ByteDance, not by TikTok US. And the data stream is not expressive activity either. And so we don't have a circumstance in which a US company has been targeted because of its expressive activity. That's just not this case.
Judge Rao:
Not because of its expressive activity, but it is an entity that engages in expressive activity, like a newspaper choosing editorials to run.
Daniel Tenny:
Right. Or like a bookstore, which is what we had in Arcara. I guess I'll make two points. One is, at most, we would get to O'Brien, and that's what Arcara was, a debate about O'Brien versus ... So we're not getting all the way to strict scrutiny no matter what. But even just to take the point a step further, as Arcara pointed out, and I don't think could be seriously disputed, there are government actions all the time that target entities that engage in First Amendment activity, and if the government actions are justified and motivated by things that do not target that expressive activity-
Judge Srinivasan:
But this one, I think that's-
Judge Rao:
So you might satisfy intermediate scrutiny, but there's still a question about whether the First Amendment applies, because one of your arguments is that it really doesn't apply at all. I mean, when you talk about the ByteDance and TikTok petitioners, your comment to the Chief Judge was, "Well, these are foreign companies," and my point is, there's also an entity, TikTok US, which is an American corporation, which has its own First Amendment rights as an American corporation under AID and other cases that ... I guess another question is, is the government making an argument that the separation between TikTok US and ByteDance is a sham, or that they're fully controlled by ByteDance in a way that makes the corporate form not something we should pay attention to?
Daniel Tenny:
Well we're not making that argument. I do think it's-
Judge Rao:
I think so.
Daniel Tenny:
The First Amendment, the application of the First Amendment, is determined not by entities but by activities. So I think, because all sorts of people could engage in First Amendment activity, if you put someone in prison because he robbed a bank and he happens to be a prominent speaker, nobody thinks that's a First Amendment claim. So I think it's useful to talk about the activities here rather than the entities, and the activity, the arguably expressive activity or the expressive activity, that we're talking about in this case that the United States government is targeting is the creation and maintenance of the recommendation engine and the content moderation that precedes it, and that is done by ByteDance. That's what the United States is concerned about. That's expressive activity, but it takes place outside the United States. I-
Judge Srinivasan:
So it seems like what gives arguable force to the other side's First Amendment argument is that it's not just that the government is targeting curation that occurs abroad, it's the reason the curation occurring abroad is being targeted, and the reason is a concern about the content consequences of that curation in the US. And when the reason itself, you call it content manipulation, when the reason itself is content manipulation ... Unlike in Arcara, which you brought up a number of times for totally understandable reasons, the reason there had nothing to do with the content; it was about prostitution.
It didn't have to do with the bookstore's selling of books. It just happened to be that the prostitution was taking place at a bookstore. Here it's not a happen-to-be situation. The whole reason for targeting the editorial curation that's occurring abroad is because of a concern about the speech consequences of that on US consumers.
Daniel Tenny:
I'd just like to be clear about this factually. The other side cites what I assume they think are their best examples of members of Congress talking about this, and I pulled these, and I would encourage the court to do the same, but I'll just read an example. Senator Warner, this is at page 566 of the appendix. It's cited at page 20 of the TikTok brief. I think they quoted the portion that said, "All the TikTok videos will be promoting that Taiwan ought to be part of China."
Judge Srinivasan:
I take that point, just to be clear, I'm not talking about statements of individual legislators. I'm talking about the government's own rationale as articulated by the government. I'm just looking at your brief on page 36, "Congress reasonably acted to prevent this sort of content manipulation by a hostile foreign power. A foreign power's secret manipulation of the content on social media platforms to influence the views of Americans for its own purpose."
And I'm just saying, when the rationale is one that's bound up in a concern about influencing the views of Americans, that's different from a concern about prostitution at a bookstore, because the concern here is about the curation decisions abroad influencing the views of Americans. Typically, when you see that kind of language, you'd say that's a First Amendment concern, because we're trying to stamp out something because it's influencing the views of Americans in a way that the government doesn't want Americans' views to be influenced, and usually that's the type of thing that generates a First Amendment concern.
Daniel Tenny:
I guess, to be clear, and I apologize for going back to the quote I was reading, but I hope you'll understand why, there's a difference between saying, "We don't like the way the content turns out on this platform when China does it this way." That would be saying, "Here's content that we like and don't like." That's different from saying, "We don't want China to be in charge of what this platform turns out." And so what Senator Warner, the full quote of what he actually said, again at 566 of the appendix, is, "They could switch the algorithm a little bit and suddenly all the TikTok videos will be promoting that Taiwan ought to be part of China." And the problem here, it's the covert content manipulation. It's-
Judge Srinivasan:
But suppose it was potentially two US owners. Just take foreign ownership out of play. And suppose that what happened is there's a social media platform that's owned by a company in the US, and the government has concerns because that company in the US is potentially manipulating information in a way that the government doesn't like. And so what the government says is, "All I'm talking about is ownership and control. I'm not talking about anything else. I just want to switch that ownership from company A to company B." There's no doubt that that would be a huge First Amendment concern, right?
Daniel Tenny:
Right, but that would be because the US company has First Amendment rights. That is NetChoice in some sense. The US company has First Amendment rights, and it can have its algorithm do what it wants. And there, maybe you could do disclosure, because maybe you could get, as a US company regulated in the United States, maybe there are less restrictive means. But to say, "We want this Chinese company to tell us" ... The whole point is that it's covert, and we don't have a way to stop it from being covert, because it's being done through incredibly complex computer code that no one, including Oracle, whom they want to hire, would understand, and because it's being done in a foreign nation where we can't pass a law saying, "ByteDance, make this or that disclosure about your internal operations," because they're off in China. And those are the fundamental points.
I do want, in a related vein, to circle back to the point about how they talk a lot about how all the source code is reviewed. I just want to say something about what that review entails, and this is again from their own declarations. If you look at the Simkins declaration, it's at 741 of the appendix, paragraph 61, it describes the purposes of the source code review, and it's basically finding viruses or malware. That's what they're doing. They're saying, "We want to make sure this source code doesn't have a problem with it of a computer programming sort." I can read you the quote if you want, but it really does say that.
It's more explicit in the reply brief, in the supplemental appendix at 842, paragraph 10: "The purpose of this source code review with respect to the recommendation engine is not necessarily to inspect how the recommendation algorithm makes decisions, which I understand is largely driven by content and user behavior, but rather to prevent a third party including petitioners from covertly manipulating the recommendation engine once it is deployed by TikTok USDS and Oracle in the secure Oracle cloud."
So they're going at, "Is there a hacker? Are they getting at it afterwards?" That's not what we're talking about. What we're talking about is when they build it, when they create it, when they decide how it's going to work in China, and nobody's looking at that and nobody can. It's farcical to suggest that with this two billion lines of code 40 times as big as the entire Windows operating system, changed 1,000 times every day, that somehow we're going to detect that they've changed it so that it favors a Chinese narrative as opposed to being a neutral expression of American ideas. And they say one of the things they like about it, the content creator petitioners say, "Well we think this is a sort of diverse and organic source of news." The problem is China could decide one day it doesn't want it to be that anymore and we would have no way of knowing that.
Users wouldn't be able to tell and that's fundamentally the problem and that is not solved. It doesn't even purport to be solved by the proposed National Security Agreement, Project Texas, or anything else. And the same on the data security side. There are just reams of information that are going back to China. If you look, and this is under seal because of business information so I won't tell you the particulars, but if you look at the supplemental appendix, and I think I've got the page number here, but maybe I don't. Oh yeah. Supplemental appendix pages 249 to 52. This is the sealed appendix that was filed with their opening brief. It's four pages in minuscule font of all the ways they're going to slice and dice the data that they're going to send back to China ostensibly for business purposes rather than anything nefarious.
But they're not sealing off the US application in any meaningful sense. And so we really return to, I mean, that's why the sort of claims that this is all about regulating US content ... No, this is about the fact that there is so much happening in China, outside the control of the United States, that it poses a grave national security risk.
Judge Srinivasan:
Can I just get back to content manipulation? I think it's an adjunct to what you were just talking about. If there's a law before us that says, "We're worried about foreign content manipulation," which is what we're worried about here, we're worried about something more precise here too, but let's just say it's a law that's worried about foreign content manipulation. And in order to address foreign content manipulation, because of a concern that the manipulation is going to "influence the views of Americans for their own purposes," meaning the purposes of the foreign country, we're going to require the divestiture of any company, any social media company, that's subject to foreign control. How would we analyze that as a First Amendment matter?
Daniel Tenny:
I mean, it's so tricky to talk about what that's regulating because of its breadth. And-
Judge Srinivasan:
I don't know if it's that tricky to understand what it's regulating. It's just saying that any company that's subject to 20% or more foreign control, anyone, not just China, but foreign control at all, has to divest, because we're worried about foreign content manipulation. How would we analyze that under the First Amendment? Or would it just not even be a First Amendment question?
Daniel Tenny:
I mean, I guess the description of foreign content manipulation, I'm not sure where that would've come from for something that broad. And obviously, to the extent that you could read such a statute as suggesting that some company operating in America, which is making all sorts of decisions for itself, we're labeling all of that foreign content manipulation, I'm not sure that's something you could do. So that seems like a very different situation. I mean, here there was a record both before Congress, and there's a record before this court, about precisely the covert content manipulation that's being targeted and evidence that it in fact occurs abroad, and that divestiture would solve it, and that to the extent that there are activities occurring in the United States, Congress was careful to provide a mechanism.
They say China won't let them. I'm not sure why that's supposed to make us feel better, but there's a mechanism to preserve what's going on in the United States to the extent consistent with that rationale. That seems quite different. I don't know that you could, even if you said it was foreign content manipulation, I don't know that you could-
Judge Srinivasan:
So the reason why that law might be potentially problematic, whereas this one isn't in your view is because of the record of the capacity of foreign content manipulation in fact to occur in these circumstances.
Daniel Tenny:
I think, broadly speaking, you could distinguish that both in terms of the application of any form of First Amendment scrutiny that might apply and in terms of what sort of scrutiny would apply. So taking those one at a time. In terms of the application of any First Amendment scrutiny, it wouldn't have anything remotely resembling the national security record that we have here in terms of a compelling interest or in terms of the tailoring. So obviously, if you were applying any form of First Amendment scrutiny, that would be markedly different. In terms of what level of First Amendment scrutiny to apply, here we have a very robust record demonstrating that what is being targeted is not protected expression.
It doesn't sound like, from your hypothetical, you would have anything of that sort. In that case, it would seem like what's being targeted is something much more general that would sweep in lots of expressive conduct by Americans, ostensibly because it's influenced in some way, not articulated, by foreign companies. It obviously is a lot broader, but it seems constitutionally significant that it's that much broader. So you could distinguish it on both grounds. And here, as I said, we have very strong arguments about the level of scrutiny that applies and also very strong arguments that any type of scrutiny, including strict scrutiny, given the national security concerns, would be satisfied.
Judge Ginsburg:
It would be very peculiar to have a statute that applied to all foreign ownership like this without that kind of a record. That would include the four other countries that make up the Five Eyes, with which we have the most intimate security arrangements. That would include Canada, the United Kingdom, New Zealand, and Australia, but also countries as to which we have other kinds of partnerships, defense relationships, NATO and so on. I mean, this is so clearly targeted not just at TikTok and ByteDance, but at China. And the record that you've compiled is all about China, not even the other three adversary nations. They're not relevant here.
Daniel Tenny:
Right, exactly and that is an important point that this is a very specific record about a very specific concern and-
Judge Srinivasan:
That goes to the result of applying First Amendment scrutiny, as opposed to whether the First Amendment kicks in, and in what way, in the first place. It may be that the record and all the particular concerns affect the result, but I was trying to understand the implications of the government's approach for the analytical method that we would use. And can I ask one last question, just about Lamont? Under Lamont, you have a situation in which the government was unable to do what it wanted to do there because of the First Amendment interest of the US listeners in the communist propaganda, as it was called. Could the response to that have been to adopt a law that just barred the propaganda from coming into the US in the first place?
Daniel Tenny:
I'm not sure how you would accomplish that. I mean, the line that was discussed in Murthy and that was tied into the listener standing cases that the Supreme Court has had over the years was more tethered to whether there's a particularized relationship and whether they have some First Amendment interest. I mean, you might be able to justify it, but in terms of someone having standing or a First Amendment claim, I mean, this case is just so similar to Murthy that I would just sort of urge the court to lean there. I mean, Murthy, people were saying the content curation decisions that are made by the platforms affect the content that I read on the internet on some particular topics and therefore I have standing to challenge what I regard as impermissible government influence on those content manipulation decisions and the Supreme Court said no.
And I mean I think that's what they're saying in this case. In Lamont it was something different. It was saying either I sent away and said, "Can you please send me this information?" Or somebody sent me information and I want to receive it and I have to go to the Postal Service and tell them that I want to receive it. That's just a totally different kind of a claim. And-
Judge Srinivasan:
I think the relevant difference there is that the affirmative act of saying that you want the information is what gave rise to the First Amendment injury there.
Daniel Tenny:
That's part of it. Another part, which the Supreme Court has emphasized in a number of cases, is the connection between the foreign entity and the US entity. They've required something pretty specific. I mean, in Murthy they described the prospect that any listener would have standing to challenge the inability of someone else to speak as startlingly broad and that that was all US. That entire case was in the United States. And I think combined with the idea from Agency for International Development about exporting your First Amendment right, it just would be startlingly broad if you could say, "Well I understand that the speaker has no First Amendment rights. I understand I have no particularized connection to the speaker. I'm just someone who wants to read what they say on the internet. But now suddenly, even though they have no First Amendment claim, I do."
And that's what the Supreme Court rejected in Murthy, in frankly what should be a harder context because it was all in America, and that's what the court should do here. Unless there are further questions.
Judge Srinivasan:
Thank you, Mr. Tenny.
Daniel Tenny:
Oh, thank you.
Judge Srinivasan:
Mr. Pincus, we'll give you three minutes for your rebuttal.
Andrew Pincus:
Thanks very much, your Honor. I just want to start with the question about what happens in the US, because the government is just flat wrong. The court should look at the Presser declaration, pages 812, 815, 817, and 829, and the Federal Declaration at 901. I'll say that again: 812, 815, 817, 829. They talk about how the recommendation engine itself is influenced in the US. It's trained in the US on US data. It's modified in the US based on US content moderation decisions. It clearly embodies not just Chinese speech, if that's an issue, but US speech by TikTok Inc. So the idea that there somehow is the ability to say, "Oh, this is just foreign," is just plain wrong.
Even if it were, I think that raises some questions. Could a US speaker who wanted to show a foreign movie be told no, without any strict scrutiny? That seems wrong as well. So then turning to your question, Chief Judge Srinivasan, you were asking about what this law targets. The government talks about targeting foreign government manipulation, but the question here, the First Amendment question, is what's burdened. And what's being burdened? The government admits there is no foreign content manipulation now; there may never be. What's being burdened is US speech by US users, and also, as a result of the US decisions, speech with respect to the content moderation engine. I want to talk about the alleged hoovering up of information, because it's just wrong.
The government talks about precise location data. That's GPS data, that's what that means, and the reason GPS data is important, as the Supreme Court said in Jones, is that it lets you create a dossier about people. GPS data is not collected. User contact data, the user contact list data, as Mr. Fisher says, is voluntary, but more than that, it's anonymized. The record makes clear it's anonymized, at record 847 to 848. The government says, "Anonymization doesn't work," but that's addressed there too; it's a standard technique that the US government uses all the time. The government tries to make something of the fact that some provisions in the NSA prevent some data from being sent to China. Some of that is e-commerce data; there's an e-commerce site, and the law requires the collection of that kind of data. The government, in the negotiations, didn't raise that.
If that's an issue, we'd certainly be prepared to talk about it. The other thing is, I think the government's argument here is premised on the idea that Congress somehow made the determinations that the government relies on. There's zero evidence of that. That's the problem here. Congress had a lot of reasons. The government read one quote from the record; there are numerous quotes about content currently on TikTok, non-manipulated content, that were the concerns of Congress. We just don't know what Congress did, and in the First Amendment context, we've got to be pretty careful about what those rationales are. It seems to me that, to invalidate the law here, the court can rely on existing precedent: News America, strict scrutiny, the failure to consider disclosure and other options. Disclosure is an option here. The disclosure doesn't necessarily, even if the government could make the required showing, have to target only the manipulated content; there could be a general disclosure that the government fears there's manipulated content on this platform.
I'm not saying that would be right, but there certainly has been no attempt to figure out whether disclosure can solve the problem here. And the problem is, if you go the other way and rule for the government, you have to do some pretty unprecedented things. You have to hold that a content-based justification for restricting general speech is a sufficient justification. That's something that Holder does not support; that was a very, very targeted statute. And then the court has to say it doesn't matter that Congress didn't consider less restrictive means, and the court has to resolve numerous factual disputes on a record that gives the court no basis to resolve them. And we think that's a very, very significant concern.
Judge Srinivasan:
Thank you, Mr. Pincus. Mr. Fisher, we'll give you two minutes for your rebuttal.
Jeffrey Fisher:
Thank you. I'd like to make four points about the First Amendment claim. First, the government cites Murthy as suggesting that my clients, as users of TikTok, might not even have standing, let alone a First Amendment claim. To use the government's own words, we have raised our hand and said, "Please give us information." We follow other users. We've joined various groups on the platform. So we're doing exactly what the mail recipients did in Lamont. And remember, that's not even the heart of our claim. The heart of our claim is as our own speakers, working with our editor and publisher of choice, and there's no suggestion we wouldn't have standing or anything less than the most severe First Amendment injury there.
Secondly, there was some conversation about whether we can think about this law as having only an indirect effect on the creators. And the answer is absolutely not. The law by its terms prohibits a certain publisher from publishing through its own content recommendation system online, and that is our publisher. The very speech that the Act singles out, the social-media-type speech in the quote that I read earlier, is our speech. And so the Act targets us directly. Even if it didn't-
Judge Ginsburg:
Let me ask you about that. Take the example of the case in which a bookstore couldn't meet code requirements and ended up having to be closed, and of course the readers were the collateral victims of that. If, however, the bookstore is validly closed for that reason, or about to be, and the law says, "That's fine, the bookstore has to meet code and it's not meeting it," you say then that the readers would have the opportunity to be considered in that final decision and to seek, in effect, an exception to the code because it would impinge upon the readers?
Jeffrey Fisher:
I think there'd be standing there for the injury, but I think that would be a losing claim under Arcara for the reasons Chief Judge Srinivasan has discussed with you.
Judge Ginsburg:
Fair enough.
Jeffrey Fisher:
But also just remember, even if it were not direct as I've described, Sorrell says you look at the practical operation, how the burden exists under the act, and our speech is being silenced, so we satisfy that. Judge Rao, you asked about the covert nature of the alleged manipulation and how a disclaimer might work. Let me say a couple things about that. One is, it just strikes us as an odd argument to say that an editor and publisher can be suppressed, or people can be banned from working with an editor and publisher, simply because they might consult with or even be controlled by other third parties. Publishers in this country every day speak to US government officials and any number of other third parties to make their publication decisions, and that's never disclosed to the public, let alone, sometimes, even to the authors.
So it just strikes us as a very odd argument to begin with. But even if it were an argument, you asked what a disclosure might look like. The government could issue its own warning, or maybe even, as the company suggested in its brief, something like a surgeon general's warning on the platform itself. From the creator standpoint and the user standpoint, that would be a whole lot better than shutting down the platform. If the government thought it was factually accurate and could justify a warning that says, "This might be influenced by Chinese government officials," that would be a lot different from shutting down the platform, and that would fully meet the government's covert-manipulation interest. And then finally, Judge Ginsburg, I want to return to your question: isn't this unprecedented, in a sense, foreign adversary content manipulation and the like?
And the answer is no. Our country throughout history has come up against this problem. Let me just isolate one example, which is the Whitney case. In that case, the Supreme Court actually accepted the government's argument, the State of California's argument, that it could suppress the speech of an American because it was spreading Communist propaganda in conjunction with the Soviet Union and Russian officials. And the court said it was allowed to do that because the speech was tending to incite crime, disturb the public peace, or endanger the foundations of organized government. Now, that majority opinion drew a separate opinion from Justice Brandeis that has carried the day in history, and that majority opinion has since been overruled by the Supreme Court itself. We urge the court not to go back down that road, for two reasons. One is, even the level of lawlessness and imminence described in that opinion is nowhere present in this case.
The most the government is here to say today is that China might someday influence the content on this platform that 170 million Americans use. We are miles away from even the assertion of the majority in Whitney, let alone Justice Brandeis's standard that carried the day and said, "Anything short of incitement to violence is protected speech in this country, even if done in conjunction with foreign actors." And so that's the principle I leave the court with: whether it's foreign actors, whether it's US citizens, or any combination of the two, what we're talking about in this case are ideas, ideas about politics and social governance and baking and sports and agriculture, and it is the tradition in our country to protect those ideas, and that is what this act unfortunately does not do.
Judge Srinivasan:
Thank you, counsel. Thank you to all counsel. We'll take this case under submission.
Clerk:
Stand please. This honorable court now stands adjourned until Thursday, September 19 at 9:30 A.M.