Eight Notable Themes from US Court of Appeals Oral Arguments In Challenge to TikTok Law

Tim Bernard / Sep 18, 2024

Composite. United States Attorney General Merrick Garland, left; TikTok CEO Shou Zi Chew, right.

The United States Court of Appeals for the DC Circuit held two hours of oral arguments on Monday morning before a panel of three judges. They are tasked with deciding whether or not to grant TikTok’s request for an injunction blocking the Protecting Americans from Foreign Adversary Controlled Applications Act from going into effect. The law, which was signed by the President on April 24, gives TikTok’s China-based parent company, ByteDance, up to nine months to divest TikTok or face a ban from app stores in the US.

The key question before the court is whether the law violates the First Amendment. The court is grappling with a number of questions, including what level of scrutiny should apply to the law, whether the government’s national security and privacy rationales justify the law, and how legal precedents involving foreign propaganda and influence apply in this situation, which involves a very popular social media platform with more than 170 million American users.

Andrew Pincus argued TikTok’s case, Jeffrey Fisher argued for a group of American TikTok creators, and Daniel Tenny defended the law on behalf of the Justice Department. The Appeals Court panel consisted of Chief Judge Sri Srinivasan (an Obama appointee), Judge Neomi Rao (a Trump appointee), and Judge Douglas Ginsburg (a Reagan appointee). A full transcript of the arguments is here.

While some predilections could be surmised from the questions posed by the judges—Judge Ginsburg, in particular, seemed skeptical of the petitioners’ case and had few questions for the government—both sides’ arguments were rigorously probed, and the decision is not easy to predict. If the Court finds in favor of the government, TikTok will have very limited time to appeal to the Supreme Court, which would be its last chance to avoid the Act’s penalties coming into force just four months from now.

Below are eight notable themes and moments from the arguments.

1. Disagreements over the facts

Although the arguments focused on the complicated intersecting speech rights of US content creators, US content consumers, and entities, including TikTok Inc. (a US company), ByteDance (a foreign company), and potentially the People’s Republic of China, the parties also disagreed about some of the core facts.

In particular, the government framed all content recommendation on TikTok as originating in China, the product of Chinese residents, and updated on a daily basis from China without any real possibility for review by TikTok, Inc., or by Oracle under the Project Texas agreement. If accurate, this would simplify the First Amendment analysis of the act in question, as the speech (i.e., the recommendations) would be clearly originating in China, not the US. TikTok’s lawyer vehemently rejected this characterization, insisting that the algorithm is also trained in the US using US data and that content moderation decisions made in the US factor into the recommendations that it makes. The parties also disagreed about the scope and nature of American user data that is transmitted to ByteDance and China.

Key quotes:

Daniel Tenny: What we're talking about is when they build it, when they create it, when they decide how it's going to work in China, and nobody's looking at that and nobody can. It's farcical to suggest that with this two billion lines of code 40 times as big as the entire Windows operating system, changed 1,000 times every day, that somehow we're going to detect that they've changed it so that it favors a Chinese narrative as opposed to being a neutral expression of American ideas. And they say one of the things they like about it, the content creator petitioners say, 'Well we think this is a sort of diverse and organic source of news.' The problem is China could decide one day it doesn't want it to be that anymore and we would have no way of knowing that.
Andrew Pincus: I just want to start about the question about what happens in the US because the government is just flat wrong. The court looks at the presser declaration, pages 812, 815, 817, 829, and the Federal Declaration 901. I'll say that again. 812, 817, 815, 829. They talk about how the recommendation engine itself is influenced in the US. It's trained in the US on US data. It's modified in the US based on US content moderation decisions. It clearly embodies not just Chinese speech, that's an issue, but US speech by TikTok Inc. So the idea that there somehow is the ability to say, ‘Oh, this is just foreign,’ is just plain wrong.

2. Remand to Congress?

At one point, Judge Rao interjected, “I feel like you’re arguing for us to remand without vacatur to Congress for more findings.” This type of decision directs agencies to clarify their reasoning (without vacating their action in the meantime), which isn’t how judicial review of legislation works. But, as Pincus, TikTok’s attorney, responded, this is an unusual law. It does not (only) set out a general rule and task the executive branch with putting it into effect. Rather, it specifies one particular target without a clear forum for examining whether the application of that law to that target conforms with its own principles and with other legal obligations—including determinations that rely on fact-finding, which, as the factual disputes described above suggest, could be significant.

Key quotes:

Judge Rao: I think you're arguing for us to remand without vacatur to Congress for more findings.

Andrew Pincus: Well, I don't think-

Judge Rao: It's a very, very strange framework. I know Congress doesn't legislate all the time, but here they did. They actually passed a law, and many of your arguments want us to treat them like they're an agency.

Andrew Pincus: I don't-

Judge Rao: It's a very strange framework for thinking about our first branch of government.

Andrew Pincus: I think it's an unusual law, though, Your Honor. It's a pretty unusual law, an unprecedented law as far as we know, that specifically targets one speaker and bans generally. This isn't Kaspersky or one of these laws that talk about government procurement or the use of government funds. This is a law that broadly regulates, and targets that regulation at one speaker. That's pretty unusual, and I do think News America supplies the paradigm. Now, in News America, the court didn't say, ‘We're remanding to the FCC,’ but the functional effect of its decision was to say, ‘The FCC will apply the general standard, and then if there's a problem, we'll figure it out.’ So I'm not saying remand to Congress; I'm saying exactly what the News America court said.

3. Equal Protection

While the main constitutional contention regarded freedom of expression, one exchange between Judge Ginsburg and the creators’ attorney raised a different issue: equal protection. TikTok/ByteDance was singled out for penalty, whereas, under the law’s general rule, no other companies are automatically targets; if they are identified by the executive branch, they have avenues to defend themselves in accordance with the definitions and exceptions laid out in that section of the law. Pincus repeatedly compared the current case to another that hinged on the intersection of the First Amendment and the equal protection clause, News America Publishing, Inc. v. FCC, in which a statutory provision was found to target just one individual (Rupert Murdoch), albeit not by name, and was eventually struck down.

Key quote:

Judge Ginsburg: It's a rather blinkered view that the statute just singles out one company. It describes a category of companies, all of which are controlled by adversary powers, and subjects one company to an immediate necessity because it's engaged in two years of negotiation with that company, held innumerable hearings, meeting after meeting after meeting, an attempt to reach an agreement on a national security arrangement, which failed. That's the only company that sits in that situation, that is so advanced in its negotiations and its relationships with the government that it's exhausted any further possibility of relief through the second procedure.
...
Judge Ginsburg: That's essentially your equal protection argument, correct?

Andrew Pincus: No, I-

Judge Ginsburg: Equal protection heightened with a sort of First Amendment flavor enhancer.

Andrew Pincus: Exactly the argument that was in News America, Your Honor. What the court said is, ‘We are looking at the First Amendment equal protection with a little flavoring of bill of attainder.’

4. Target, Burden, and Motivation

One of the key disagreements between the parties was whether the First Amendment argument should focus on the technical target of the law, i.e., ByteDance, or on the entities whose speech is burdened, a set that also includes TikTok Inc., creators, and consumers. The latter grouping has a much clearer and stronger right to free expression under the Constitution. Tenny argued that the government was motivated only by (1) ByteDance’s access to user data and (2) the risk of the Chinese authorities covertly manipulating the TikTok feed; all other burdens on speech are incidental, he stated. However, Pincus pointed out that members of Congress are on record describing their motivations with reference to the speech of Americans on current events, which suggests that the government’s true target was not, in fact, as limited as claimed.

Key quote:

Andrew Pincus: No, I am challenging the government's interest in scrutiny. What I was about to say is that the government, as I said, has plucked out this very targeted interest. But I think if you look at what Congress talked about, the problem here is that there was a lot of discussion about the imbalance of content on TikTok at times where the government concedes there's no foreign manipulation whatsoever. And I think figuring out what Congress's actual purpose was here ... that's the test that the Supreme Court has set up in First Amendment cases ... is very problematic, because we really don't know. The government is arguing that it's this very, very narrow interest here, but the record is suffused with comments by legislators both in the House and the Senate about the supposedly ... imbalance about Palestinians and Hamas, all kinds of current events.

Now, we have Mr. Weber in his declaration explains why those allegations of imbalance are wrong, but they clearly motivated Congress in a significant way. It's another reason why the availability of the general standard and the real tainted problems with the specific TikTok provision sets up an alternative where, if the government thinks that it can establish a record based on the argument that it's sort of culled together, let it put together that record, look at the less restrictive alternatives that have not been addressed, and also, frankly, consider the facts.

5. The NetChoice Decision

There were several references during the argument to the very recent NetChoice decision, which vacated and remanded previous rulings on the Florida and Texas laws that placed significant limits on content moderation. The Supreme Court found that curation (when it contains some element of content moderation) is the protected speech of the platform, thus giving some level of support to the petitioners. However, a short passage in the concurrence by Justice Barrett raised—though did not answer—the question of whether “foreign ownership and control” of curation would “trigger First Amendment scrutiny.” This provided an opening for questions to both lawyers challenging the law, though they had responses at the ready with factual and precedential bases.

Key quotes:

Judge Srinivasan: So NetChoice, there's been the suggestion that delving into doctrine is too much law geekdom, but let me just do it for a second because I think it actually affects things in terms of the analysis. If we're not in strict scrutiny land and we're in intermediate scrutiny land, which is a lesser level of review, and let's just say we're doing that because this case involves anomalous circumstances because it's congressional determination of foreign adversary status and that tilts the equation, and a balancing of considerations suggests that you get in a lower tier of scrutiny but not abandoning the First Amendment altogether. What's your answer there? Because part of that analysis is that the justification for the law is unrelated to the suppression of expression. Would your analysis be that, well, for the same reasons that we think the law is content-based and so strict scrutiny would apply, even if you foist intermediate scrutiny upon us, we would still say it fails intermediate scrutiny because the law's motivation is related to the suppression of expression?

Jeffrey Fisher: That's exactly my answer, yes. And I think that under Mt. Healthy or Arlington Heights, that would be where actually you would stop because once you have an impermissible motive behind the law, you don't even look at the data security.

[Earlier]

Jeffrey Fisher: So all Justice Barrett, I think, is saying in NetChoice is, ‘Oh, let's ask that question if and when it comes up, but when you have speech inside the United States, our history and tradition is we do not suppress that speech because we don't like the ideas.’

6. The Murthy Decision

The other big social media decision of the summer was also invoked. In Murthy v. Missouri, the Supreme Court ruled that a group of petitioners who had social media content removed, allegedly at the behest of the government, did not have standing to challenge the government’s actions. This was brought up by Tenny as indicating that listeners’ rights (the subject of Lamont v. Postmaster General, a key precedent for the petitioners) only apply when there is a “close” or “particularized” connection between the speaker and the listener. In his rebuttal, Fisher suggested that, by following other users and joining groups, users indicate that they wish to receive the speech of others (and he noted that, in any case, his clients are actually speakers, not just listeners).

Key quotes:

Daniel Tenny: The other side talks about the Lamont case a lot, and what the Supreme Court said in Murthy about First Amendment standing of recipients of speech is that it requires this sort of close connection. And Lamont was people were getting mail addressed to them and then the government was asking them to raise their hand and say, ‘I want to receive the communist propaganda,’ and the Supreme Court said that wasn't okay. Obviously we're not here to quibble with Lamont, which is Supreme Court precedent, but it doesn't hurt our case here because what the Supreme Court said in Murthy itself was, if your interest is just this broader, there's a lot of things that I want to read, and I'm a general consumer, they didn't even think they had standing in that case much less of a strong First Amendment claim. And so the point here is just if the speaker itself doesn't have a claim, it's strange to say, well, the listener does.

Jeffrey Fisher: First, the government cites Murthy as suggesting that my clients and BASED Politics as users of TikTok might not even have standing, let alone a First Amendment claim. To use the government's own words, we have raised our hand and said, ‘Please give us information.’ We follow other users. We've joined various groups on the platform. So we're doing exactly what the mail recipients did in Lamont. And remember, that's not even the heart of our claim. The heart of our claim is as our own speakers working with our editor and publisher of choice, and there's no suggestion we wouldn't have standing or anything less than the most severe First Amendment injury there. Secondly, there was some conversation about whether we can think about this law as having only an indirect effect on the creators. And the answer is absolutely not. The law by its terms prohibits a certain publisher from publishing under its own content recommendation system online, and that is our publisher. The very speech that the Act singles out, the social media type speech that it uses in the quote that I read earlier, is our speech.

7. Disclosure

If strict scrutiny is the appropriate standard for this case, the government would be required to show that it is using the least restrictive means of achieving its interest. For the stated concern that China could manipulate content via the ByteDance algorithm to promote content furthering its interests, the less restrictive solution initially proposed during the argument by Pincus was a disclosure requirement.

Both Tenny and Judge Srinivasan suggested that this would not be feasible as the danger that the Act is intended to eliminate would be covert behavior (and therefore would be hidden and not disclosed). Pincus responded that the disclosure could be a general one, noting the risk of manipulation. Fisher suggested that a public advisory from the US government, warning of the risks, might also suffice. While these arguments are cogent, it was striking to hear the attorney for TikTok proposing that his client be forced to disclose that the product bears a risk of covert manipulation by the Chinese government.

Key quotes:

Andrew Pincus: I think there are a couple of questions embodied in your question. One is whether this is a sufficient compelling interest. Even if it is, there's a less restrictive means question. And I think a critical point that we make, and that's true throughout the law, is the government's solution to foreign propaganda in every other context has been disclosure. It has not been a ban. The Meese case talks about that and, in Footnote 15, has a very fulsome explanation of why: our view in America is that if speech is made clear, then Americans can decide.

Judge Rao: So how are you supposed to have disclosure, or verified disclosure, in that sort of circumstance?

Andrew Pincus: Your Honor, it might be that the disclosure is just that the government says there's a risk of control. Maybe the disclosure doesn't have to be targeted?

Jeffrey Fisher: But even if it were an argument, you asked what a disclosure might look like, the government could issue its own warning, or maybe even, as the company suggested in its brief, something like a surgeon general warning on the platform itself. From the creator standpoint and the user standpoint, that would be a whole lot better than shutting down the platform. If the government thought it was factually accurate and could justify a warning that says, "This might be influenced by Chinese government officials," that would be a lot different than shutting down the platform, and that would fully meet the government's covert manipulation interest. And then finally, Judge Ginsburg, I want to return to your question. Isn't this unprecedented in a sense, foreign adversary, content manipulation, and the like? And the answer is no. Our country throughout history has come up against this problem.

8. Important Precedents

Finally, there were three precedents discussed multiple times that had echoes in the current case:

  • Lamont, mentioned above, was a case where a US citizen was in fact sent Communist propaganda from China.
  • Whitney v. California dealt with holding a leadership role in a domestic Communist party that supported the violent overthrow of the US government. Though Whitney’s conviction was upheld at the time, the decision was later overruled by Brandenburg v. Ohio, which established a new standard requiring “imminent lawless action.” Fisher proposed that even the standard under which Whitney was decided would not be met in this case, where there is only a risk of content feed manipulation at some point in the future.
  • Palestine Information Office v. Shultz was a case where the District Court upheld the government's action in shutting down the Palestine Information Office as it was found to be operating as a foreign mission of the Palestine Liberation Organization, then a recently designated terrorist organization. The court noted that the PIO was not prohibited from advocacy activities. In the TikTok case, several members of Congress seem to have been motivated to vote for the bill because of their concern that TikTok was spreading pro-Palestinian views.

Authors

Tim Bernard
Tim Bernard is a tech policy analyst and writer, specializing in trust & safety and content moderation. He completed an MBA at Cornell Tech and previously led the content moderation team at Seeking Alpha, as well as working in various capacities in the education sector. His prior academic work inclu...
