Transcript: TikTok, National Security, and the First Amendment

Justin Hendrix, Prithvi Iyer / Apr 22, 2024

Following its passage in the United States House of Representatives, on Tuesday the Senate is expected to consider legislative language that would force the Chinese firm ByteDance to divest TikTok or risk the app being banned. If the measure passes, it is expected to be signed into law by President Joe Biden, and it will almost certainly be challenged in court.

On April 15, the Berkman Klein Center for Internet & Society at Harvard University hosted a panel discussion titled “Dangerous Dancing?: TikTok, National Security, and the First Amendment.” Visiting fellow and Scott K. Ginsburg Professor of Law at Georgetown University Law Center Anupam Chander moderated the conversation, which featured:

  • Jennifer Huddleston: Senior fellow in technology policy at the Cato Institute
  • Ramya Krishnan: Staff attorney at the Knight First Amendment Institute
  • Jenna Leventoff: Senior policy counsel at the ACLU
  • Alan Rozenshtein: Associate professor of law at the University of Minnesota Law School

The discussion raised a number of key questions related to the forced divestiture, including:

1. Are the government’s concerns over national security sufficient to justify a ban?

"I think we have not seen any evidence of a real threat right now," said Leventoff. "It is all hypothetical and that hypothetical is not going to be enough to survive government scrutiny. But I think what's really clear to me about this is that the government wants to go after China. They perceive that as being politically popular."

“I fully concede that the nightmare scenario that is motivating supporters of this bill does appear to be a hypothetical, which is of course what you would expect,” said Rozenshtein. “If your concern is that this is kind of a ticking time bomb that China could use at a moment of high tension, you would expect China to wait to use that. But then you are accepting that that is hypothetical, and that is of course a weakness in the government's case. On the other hand, if you look at the component pieces of that concern, they are anything but hypothetical.”

2. What is the appropriate level of scrutiny that should be applied to a First Amendment challenge against the legislation?

“I think the appropriate standard is actually even more than strict scrutiny,” said Leventoff. “This is a prior restraint. This is stopping the speech of 170 million Americans before they can say anything. And in so many cases, that's worse than the way we traditionally deter speech, where the speech gets out and then you're punished later. In this case your view is never getting out there to begin with. It is the most speech-restrictive thing that you can possibly do.”

3. Does the fact that the bill requires divestiture rather than an outright ban matter to the First Amendment analysis?

“So clearly, a forced sale or divestiture, which is very concerning for many reasons that I'm guessing we'll have some time to go into, is from a speech point of view a very restrictive means,” said Huddleston. “A full-out ban would be a more restrictive means. But the question is what else exists on that spectrum before you get to something like a forced sale or divestiture? And I also think it matters how this forced sale or divestiture must occur, because we're not talking about a small company without many users, where it's a small transaction or something like that. We're talking about a very large transaction in a very short period of time that also has to clear some additional regulatory hurdles, not only because of the size of the transaction but because of what else is in the bill. It has to be proven that this satisfies, that this alleviates, concerns about foreign interest.”

4. Does the history of restrictions on foreign ownership of communications and media platforms in the US suggest the courts will regard the measure as constitutional?

“I think really good work has been done on this by Ganesh Sitaraman, a law professor at Vanderbilt,” said Rozenshtein. “He wrote a wonderful paper on foreign control of platforms in the Stanford Law Review quite recently. And he goes through this history in a lot of detail. And what's notable, I think, from that history is that US restrictions on foreign control of platforms are pervasive, not just in the communications industry, but also in banking and transportation. And then within the communications industry, they go back quite a long way, more than a hundred years, back to the original Radio Act of 1912 and then through the various communications revolutions of the 20th century: radio, telegraph, telephone, and so on.”

5. Even if it turns out that China is indeed using TikTok as a means to deliver propaganda, do TikTok users have a First Amendment right to choose to receive foreign propaganda?

“Generally speaking, the court has held that the suppression of speech is not a permissible response to the problem of disinformation, and that there are less restrictive means that the government ought to use before it goes there,” said Krishnan.

"So, I agree that the case stands for the proposition that a blanket ban on foreign propaganda would be unconstitutional," said Rozenshtein. "I just wouldn't push that proposition farther than I think the court meant it. This is not a blanket ban on Chinese propaganda."

6. Will the measure create a precedent for banning other foreign-owned media?

“What this bill does is it says that the President can unilaterally ban other apps that are partially owned by foreign adversaries,” said Leventoff. “And that is ripe for abuse. I imagine President Trump coming in, and if the president of a company says something that he doesn't like, he is going to ban that company in the US. Therefore, anytime a US user relies on a company, it could be gone pretty quickly because it has done something to anger the President. I think this just puts too much control in the hands of the President, with no one else involved.”

"I think that a real concern that a lot of us have about this ban on TikTok is that if passed will be a real gift to authoritarian regimes around the world that will use this as precedent to ban foreign media in their countries," said Krishnan.

What follows is a lightly edited transcript of the discussion.

Toni Gardner:

Well, welcome everyone to the Berkman Klein Center for Internet and Society and our Institute for Rebooting Social Media at Harvard University. My name is Toni Gardner, and I direct operations for our institutes. And today I have the honor of welcoming you all, and our folks on Zoom, and our esteemed panelists, and introducing our moderator and organizer for today's event, Professor Anupam Chander, who is a visiting scholar with the Institute for Rebooting Social Media and the Scott K. Ginsburg Professor of Law and Technology at Georgetown, a graduate of Harvard College and Yale Law School. He is the author of a number of books, including a new book on data sovereignty just out from Oxford University Press. He has been a visiting law professor at Yale, Chicago, and Stanford and is a member of the American Law Institute. Over to you, Anupam.

Anupam Chander:

Thank you very much, Toni; honored to introduce the panel, and before I do that, I just want to bring you up to speed on where we are with TikTok bans, past and present. So in 2020, Donald Trump famously and dramatically banned TikTok, but instead of Trump banning TikTok, by the following spring it was TikTok that had banned Trump, removing various videos showing his involvement with January 6th. So how did a Chinese-origin big tech company beat the President of the United States on his own terms? It was the US courts that stepped in to protect TikTok and TikTok’s users. In particular, the courts concluded that the President lacked statutory authority to ban this cross-border speech app. There are still a couple of spaces up here if anyone wants to come to the front. The government's efforts to convince the courts that national security required the ban to go into effect failed to convince two district judges, including a Trump appointee, who concluded that the government's concerns were conjectural and could possibly be met with other measures.

But TikTok’s troubles were far from over. It began negotiating earnestly with CFIUS, a committee in the United States government that reviews foreign investments that present possible national security concerns. In those negotiations, TikTok offered to place all of its US user data inside servers held in the US and controlled and managed by Oracle, a company whose leaders had coincidentally supported Donald Trump in the 2020 election. TikTok's algorithm would be monitored and vetted closely by Oracle and others, and the board of directors of the TikTok data security arm, the arm that held all the data and controlled the algorithm, would be approved by the US government, a remarkable landscape for a speech app in the United States. Those efforts continue to this day, and CFIUS hasn't yet ordered a divestment or accepted TikTok's Project Texas plan to mitigate those risks. Now, where Trump's ban failed for lack of statutory authorization, a bill in Congress today would fix at least that deficit.

That bill, proposed by Congressman Mike Gallagher (R-WI) and joined by Congressman Raja Krishnamoorthi (D-IL), would clearly direct ByteDance to either sell TikTok and its other US-facing apps or face an effective ban in the United States. The bill is concerned with both the use of TikTok for surveillance and for propaganda, but in conversations with the press, Congressman Gallagher has said that he's especially concerned with propaganda from the app. And in November of last year, Congressman Gallagher blamed TikTok for pushing Chinese propaganda and radicalizing Gen Z on the app, kind of an interesting interpretation of recent events. The bill passed the US House in seemingly record time and with overwhelming margins and is now before the US Senate for consideration. While the bill's proponents say that it's not a ban, only a compelled sale, they can't be sure that it won't lead to the shuttering of the app in the United States.

This is because, in 2020, China modified its export controls to add personalization algorithms, the algorithms that make recommendations based on personal information, to the technologies subject to export controls, thereby hinting that it would block the sale of TikTok should such a sale be compelled. Now, ByteDance may itself prefer to shut down the US market rather than create a future competitor at a fire-sale price, and this is not talked about much. ByteDance operates the TikTok app across the world; having already bifurcated its China-facing app, Douyin, it would now face the difficult prospect of bifurcating TikTok into multiple apps, with interoperability questions that would be potentially difficult.

Coming to the rescue is Trump's former Treasury Secretary Steve Mnuchin. Steve is assembling a group to buy TikTok, and he has a solution for the Chinese government's veto. He proposes to buy TikTok without the algorithm, and so his proposal is to recreate an algorithm in the United States from the ground up. Now that the TikTok bill has passed the House, I might also note that this is Congressman Gallagher's valedictory. Next week he will resign from the US House to join Palantir, a defense contractor for the United States government that interestingly received one of its first investments from the venture capital arm of the CIA, and I kid you not, you would not have expected those words, but that is the reality. So, our panel will consider what happens if a TikTok bill is passed. Once it is signed into law, both TikTok and its users will sue, arguing that the bill violates the First Amendment, and the bill interestingly places original jurisdiction over any challenges, which will immediately happen of course, in the DC Circuit Court of Appeals.

That is, the only possible appeal from that original court will be directly to the US Supreme Court. So this is very much an issue that could well be before the US Supreme Court in the months to come. The TikTok saga, as I wrote this, I realized, is kind of like a Hollywood drama, and its final scene may well lie at the US Supreme Court. So what we're trying to do with this particular conversation is focus on what happens in those courts, ask that particular question: what happens to the First Amendment challenge that will inevitably be filed if the TikTok bill goes through? We have some leading experts from across the nation joining us here to explain these questions and to think through how the court will reason through this issue.

First, to my immediate left is Jennifer Huddleston, a fellow at the Cato Institute. This happens to be a very nice opportunity to host Jennifer, but I'm also hoping that we can join in helping support Jennifer because on Monday, she'll be running the Boston Marathon, an incredible feat. This is going to be her 12th marathon or half marathon. And her time, or average time, is an eight-minute mile. So just shocking to me; I can't do one eight-minute mile, let alone 26. To her left is Ramya Krishnan, senior staff attorney at the Knight First Amendment Institute and a lecturer in law at Columbia Law School. And to my far left is Jenna Leventoff, a senior policy counsel at the ACLU, where she develops and advocates for policies relating to protecting free speech. And on screen, joining us from Minnesota, is a law professor at the University of Minnesota and a senior editor at Lawfare, Professor Alan Rozenshtein.

Alan is a graduate of this very law school and a former fellow at Berkman. So with that introduction, I'm going to go around and ask them questions, and there will be time for questions from the audience, and there will even be time for questions from the Zoom audience. So I want to encourage you to think about your questions, and we're doing this very much for lawyers. And so we are going to focus on the questions that lawyers ask in such a conversation. So I'll begin with a threshold question, and the threshold question that a court will ask is what is the standard of review? In the Montana case, the district court applied intermediate scrutiny and found the TikTok ban wanting. It said it didn't need to decide whether or not that was the appropriate level of scrutiny, because if the ban failed under intermediate scrutiny, it would surely fail under strict scrutiny as well. And so I want to ask Jenna first, what is the appropriate standard of review in this case?

Jenna Leventoff:

I think the appropriate standard is actually even more than strict scrutiny. This is a prior restraint. This is stopping the speech of 170 million Americans before they can say anything. And in so many cases, that's worse than the way we traditionally deter speech, where the speech gets out and then you're punished later. In this case your view is never getting out there to begin with. It is the most speech-restrictive thing that you can possibly do. And so when the Supreme Court has looked at prior restraints like this, where again you're just stopping speech before it starts, they say they're going to presumptively fail a constitutional analysis. The government has to go so far above what it normally goes for. So they have to show that the harm is immediate; they have to show that the harm is extremely serious. And then not only does it need to be a narrowly tailored solution, but it pretty much needs to be necessary.

It needs to be a necessary solution. Is this the only thing that you can do to actually solve the problem? And in this case, it's going to fail. This is not going to meet that analysis, because I think the government has yet to put forth any public evidence that there is a real harm, let alone an immediate one. But even if we got to that point, even if the government came out, and I know in Congress they're talking about doing a public briefing, and many members of Congress have gotten private briefings about what the potential harms are. If you ask a lot of those members, they'll say, I haven't heard anything that I find particularly convincing about an imminent harm; this seems theoretical. But some members do seem convinced that there is a real harm. So even if that comes out, the government is going to fail here, because these bans are far from the least restrictive thing.

What this is doing is shutting down the app, essentially. And there's really no way to do that that isn't speech-restrictive in and of itself. And there are so many other things that we could do to target any of these harms. We could pass a privacy bill right now. Let's say the concern is that China is accessing our data, right? Well, China can still access our data even if TikTok is banned; they can get data from a data broker, and they can hack into Facebook's systems like every other app and website that collects the same data. And so we're not doing very much here to actually solve the problem.

Anupam Chander:

Okay, so Alan, I'm going to come to you. Jenna says this is even more than strict scrutiny. It's obviously a prior restraint; it's shutting off access to an incredibly important speech platform for something like 170 million people in the United States. Before we come to the substantive question of applying whatever the standard of review is, what is the appropriate standard of review from your perspective?

Alan Z. Rozenshtein:

Sure. So, to be perfectly honest, I'm not sure. I feel like I can argue it many different ways, and a common problem in First Amendment jurisprudence is these sorts of endless arguments about what the appropriate standard of review is. So let me say two things. First, with respect to what the actual standard of review is, I think you can make arguments up and down the spectrum. We just heard the argument for a prior restraint. I think that's very plausible. On the other hand, I think there are lots of potentially analogous cases where we wouldn't apply that kind of standard. So for example, an FCC denial of a license that would keep someone out of the communications market: I don't think you would necessarily apply a prior restraint standard there. So again, I think it just depends on exactly how you characterize the issue.

And there's a lot of play in the doctrinal joints, as it were. You could potentially characterize it as viewpoint-based. If your concern is that the Chinese communist party is pushing a particular viewpoint, then you might characterize this ban or this law that way. And that would obviously be a high level of scrutiny. You could just characterize it as a content-based law, which is to say the content is Chinese propaganda if that's how you want to characterize it. And that would be strict scrutiny or you could characterize it the way that the Montana court did as a more neutral, something analogous to time, place, and manner, in which case you have intermediate scrutiny. And then of course, we haven't even talked about the national security implications, which I think tweak all of these tiers of scrutiny, at least how the courts analyze them. So I think this foundational question is very much open.

But the second thing I want to say is, and not to get sort of too legal realist five minutes into the conversation, I'm not sure that the fight over the tiers of scrutiny here will ultimately matter in the long term. I think these sorts of distinctions are quite important when lower courts are trying to slot a fact pattern into a well-established body of First Amendment law. But I think there are two reasons why that's probably not the case here. First, although this isn't sui generis, as I'm sure we'll talk about later on in the conversation, there's a long history here, and we can debate the specific parameters of that history of restrictions on foreign ownership of platforms, this sort of move is, I think, quite unusual. There's not a massive amount of case law here. In addition, as you pointed out, Anupam, this litigation will start in the DC Circuit, and then it will almost certainly be reviewed by the Supreme Court.

This is such an important issue that it's hard to imagine the Supreme Court, which I think to its credit in the last few years has shown a real willingness to engage with a whole host of internet-related platform issues that it generally did not in the past, not taking it on. Once you get to the Supreme Court, I don't think the tiers of scrutiny play any role whatsoever. At the Supreme Court, you have nine policymakers who are going to be balancing the various equities here as they see it. And so, while I think asking the tiers of scrutiny question is a good place to start, at the end of the day, I don't think that this is going to be determinative of how the doctrine ends up playing out in this litigation.

Anupam Chander:

I love the idea of the Supreme Court having nine policymakers balancing various interests as they judge this. And it reminds me of Justice Kagan, formerly Dean Kagan, saying that they are hardly the nine most expert people on the internet. So fascinating to imagine. Okay, so let's turn to one of these questions that has already been mentioned. Many of the defenders of the TikTok bill argue that it doesn't actually impose a ban. It simply requires ByteDance to find new owners, ones without ties to a nation that has been identified as a foreign adversary. But this has a little of that national security question teed up that Alan has suggested. Of course the bill would impose a ban if ByteDance doesn't divest. Does it matter for the First Amendment analysis that the bill isn't an immediate ban but rather an order to divest or, if you don't divest, be banned? And Jennifer, I'm going to come to you first.

Jennifer Huddleston:

So yes, it does matter, in part because it's going to go back to the question of whether there are less restrictive means to achieve this goal. So clearly, a forced sale or divestiture, which is very concerning for many reasons that I'm guessing we'll have some time to go into, is from a speech point of view a very restrictive means. A full-out ban would be a more restrictive means. But the question is what else exists on that spectrum before you get to something like a forced sale or divestiture? And I also think it matters how this forced sale or divestiture must occur, because we're not talking about a small company without many users, where it's a small transaction or something like that. We're talking about a very large transaction in a very short period of time that also has to clear some additional regulatory hurdles, not only because of the size of the transaction but because of what else is in the bill.

It has to be proven that this satisfies, that this alleviates, concerns about foreign interest. So, are there certain buyers that the government might potentially strike down? There are all sorts of other elements of how this divestiture must occur. This is not as simple as advocates of this bill sometimes make it out to be, as if TikTok could just go down to the corner and offer itself up for sale. These are complicated business transactions that only certain people will be able to participate in. Now, why this matters for a First Amendment analysis is that a lot of us can sit here and think, well, what else could be done if we do say that the government has a national security interest? Is this the least restrictive means for speech, or are there other steps that could be taken? We've seen some of these play out in courts at a state level as well with regard to, for example, banning TikTok from government devices or government networks.

The idea is that if there is a national security concern, it shouldn't be on government devices, it shouldn't be on government networks. That has generally withstood challenge, in the Texas courts I know, and in many cases, we haven't seen as much challenge to that kind of ban, which is much more narrowly tailored to a particular situation. On the other end, you'd have something like the Trump executive order, which is a much more flat-out ban, but you also have other things in between. You mentioned Project Texas. That would be an example of something that would perhaps be less restrictive. We could think of something where Congress, for example, could mandate a warning label that says this app is known to have ties to China. Again, there are many First Amendment concerns with such a proposal, but it's probably less restrictive than a forced divestiture. We can think of other steps that could have been taken. So I think it will matter when it comes to identifying whether this is the least restrictive means to achieve Congress's goal, though it is not always clear what even that national security concern is.

Anupam Chander:

By the way, the privacy bills that have been proposed in Congress do actually have a disclosure requirement for data transfers to China, including the latest bill that is in Congress, the bipartisan bill that was proposed last year. So there's an interesting alternative that's before Congress right now. Ramya, any thoughts on the posture, where this is a bill that says divest, and if you can't divest for some reason, then you're banned, as opposed to an outright ban? How does that change or affect the analysis, or does it?

Ramya Krishnan:

So I'm going to get a little bit realist, like Alan here. I'm not sure in practice how much distance there is between an order to divest or be banned and a flat-out ban, because we know that China would have to approve any deal, and it is on record that it will very likely firmly oppose any such deal. That's what its commerce ministry spokesperson said last year when the Committee on Foreign Investment in the United States (CFIUS) told TikTok to divest or be banned; China's spokesperson came out and said, well, actually, we're going to have a problem with this. And that's when they noted also that they would have to approve the export of ByteDance's algorithm, the algorithm that TikTok runs on. And so that's why analysts have said that it's very, very unlikely that a divestiture deal will be accomplished here. And so what we're staring down the barrel of is almost certainly a ban.

But the other thing that I would mention, as a matter of First Amendment doctrine, is that generally speaking, the government is not meant to be able to do indirectly what it can't do directly. And here the government would be using the threat of a ban in order to accomplish a divestiture. And I think that it should matter for the purposes of the analysis that there is this threat of a ban being held over a company in order to achieve divestiture. If I could just respond, though, to something that Jennifer said that's unrelated to that question, about how we should look at certain less restrictive alternatives, like, for example, the many state laws we've seen that have imposed a ban on state employees accessing TikTok on state-owned or operated devices. I'm a little bit self-interested and biased here, because I'm one of the litigators who litigated that Texas case, the case challenging the application of Texas's state employee ban to public university faculty engaged in teaching and research.

And so I just do want to highlight that at least the application of those kinds of bans to the public university context, I think, does raise serious First Amendment concerns. There are faculty that are engaged in the study of TikTok. Many of them focus on the very privacy and security risks that these states say they care about and have offered as a reason for passing these bans in the first place. Students' interest in learning about one of the most popular communications platforms is also implicated. And so I would just want to push back against the idea that those bans never raise First Amendment concerns; I think they do.

Jennifer Huddleston:

If I can just clarify really quickly, I am not saying that they do not raise First Amendment concerns. It is more that, when we're thinking about what less restrictive means are, we've already seen some of those less restrictive means play out. There certainly are First Amendment concerns with many of the things that I mentioned, for example, a warning label or even some of the data localization requirements. So there could be certain other concerns related to that, but I think it's important to see what a sizable step the divestment ban would be; I think we both agree that it is a significantly more restrictive means than what we've seen play out so far.

Anupam Chander:

So Alan, does it matter how the bill is styled, as an outright ban versus a divestiture order coupled with a ban if divestiture doesn't occur?

Alan Z. Rozenshtein:

I think it does. I agree with both Jennifer and Ramya here, sort of simultaneously, and so I'll try to explain why. So I agree with Jennifer that it does matter, right? There's just a difference between saying this thing is banned versus this thing might be banned, but it might also not be banned if there's a divestment. On the other hand, I think Ramya is correct that, based on everything we know about the geopolitics of this, if this law is passed and the President then identifies TikTok under the law, it's likely to lead to TikTok's ban. And I do think that defenders of the law have to be prepared for that. And so I think you do have to accept that possibility.

Nevertheless, though, I actually think the divestment option is clever for another reason, which is that if China refuses to allow ByteDance to divest, I think that actually strengthens the national security case for the law itself, because it shows how valuable the Chinese government perceives TikTok's role in the United States to be. So at the end of the day, I think Ramya is correct that if this is going to be defended, it's going to have to ultimately be defended as a ban. But I also think Jennifer is correct in that there's a lot of cleverness in adding the divestment option.

Anupam Chander:

Great. Great. Thank you guys. So I do think, by the way, there is something we haven't mentioned yet, just to add a little editorial commentary quickly: Twitter passed hands from its public shareholders and its prior leadership to Elon Musk, and that made a big difference; Twitter's content changed. So we should think of the divestiture order itself as having First Amendment implications even before we get to a ban. It's not just the ban, that you can't use this app, but that it has to be run by someone else. That's a pretty substantial First Amendment intrusion, in my personal view. Okay, I'll let anyone else respond to that if you want to, so I don't get the last word here.

Jennifer Huddleston:

The only other thing I would point out about this is we keep talking about this in the context of TikTok, and I think that's because TikTok is named in the bill that has the sort of ban-or-divestiture provision that you mentioned. But if you look at that legislation, it's actually broader than just TikTok. So we have to think not only about what this means for this current debate, but what this means more generally for apps that could be determined by the government to fall under this category. And what does that mean more broadly for the way we see not only government interaction in this market, but also government intervention into potential speech apps in the future?

Jenna Leventoff:

I want to just quickly clarify what you were saying, which is that this bill says that the President can unilaterally decide that if there is another app that is owned by a foreign adversary, he can ban it. There's no due process. The President just has to give notice to Congress and notice to the public, and that is it.

Anupam Chander:

I have to clarify: when the bill talks about foreign-controlled apps, it means an app with ownership of 20% or more that originates from a foreign country that is labeled an adversary. That doesn't mean that the government of that country owns part of that app. It means that there are, in this case, Chinese citizens who might have 20% ownership of the app. So, in other words, there's a wide swath of companies that might actually come within the scope of being foreign-controlled apps.

And as Jenna points out, the President can unilaterally declare those apps to be a threat to the United States, with very limited challenges available, very limited publication of what the rationale is, and very little scrutiny of the basis for that claim or for that designation. Okay, let me move on to a conversation between a congressman and PBS News. The congressman says we would never have allowed CBS to be owned by the Soviet Union in the 1960s. And indeed, of course, going back to the Radio Act and the Communications Act, and Alan already referenced this in his remarks, we've had restrictions on foreign investment in broadcasting. How would this history of restraints on foreign investment affect the analysis? Alan, I'm going to turn to you first.

Alan Z. Rozenshtein:

Sure. So I think it should affect it somewhat, but not too much. And what I mean by that is just because we've been doing something does not by itself make it constitutional. There are lots of things in American history that were done for a long time until the courts came in and said, that's not a constitutional thing to do. So, I don't want to overstate the importance of the history here. On the other hand, I don't think it means nothing. I think really good work has been done on this by Ganesh Sitaraman, a law professor at Vanderbilt. He wrote a wonderful paper on foreign control of platforms in the Stanford Law Review quite recently. And he goes through this history in a lot of detail. And what's notable, I think, from that history is that US restrictions on foreign control of platforms are pervasive, not just in the communications industry, but also in banking and transportation.

And then within the communications industry, they go back quite a long way, more than a hundred years, back to the original Radio Act of 1912 and then through the various communications revolutions of the 20th century: radio, telegraph, telephone, and so on. Now, while again I don't want to overstate the importance of that history, I think it's important for at least two reasons, or let me put it this way: I think it strengthens the case for this law in two ways. First, I do think that the way the constitutional provisions are interpreted by the political branches is an important thing for courts to take into account. It is a kind of political branch precedent that coexists in a certain way with judicial precedent. And I think courts should be appropriately cautious, not overly scared, but appropriately cautious, about interpreting the Constitution in a way that would not only strike down a law passed by the political branches but would have the effect of declaring potentially a hundred years of what I think were not particularly controversial restrictions unconstitutional.

And so that is something for the courts to think about: these are the considered judgments of the political branches over a hundred years. The second reason I think the history is important is because I do think that restrictions on foreign control or ownership of communications infrastructure in the United States are compatible with a robust communications industry and a robust public sphere in the United States. Now, I think you could respond to that by saying yes, but actually, it would've been a good thing, it would've been a better thing, if, had the Soviet Union wanted to buy CBS in the 1960s or seventies, we had allowed it, because that would've made for a better communicative sphere. I happen to disagree with that, but you could make that argument. But nevertheless, I think the fact that our communications system has been quite vibrant despite a history of foreign ownership restrictions further tells you something about this law, though again, I want to emphasize I don't view the history here as in any way clinching one way or the other as to the constitutionality of a bill like this.

Anupam Chander:

Great, thanks. Jennifer?

Jennifer Huddleston:

I think it's an interesting comparison because I also think it shows another element of this discussion that's often under-appreciated, and that is what this bill would signal for the regulation of the internet and technology more generally. Network television has been regulated much more heavily than the internet, to the points that Alan just made, because we saw spectrum as a scarce resource, we saw the airwaves as a scarce resource, so it did not have that full First Amendment protection. There were more restrictions; there were more regulations on certain elements of broadcast television, as well as on certain elements of broadband today.

But when it comes to apps, when it comes to the internet ecosystem, when it comes to the kinds of different platforms that we've had for speech online, we haven't seen those same restrictions. And part of what actually allowed the US to be a leader in the internet revolution was the fact that we didn't put many restrictions on the ability to come up with these creative ideas. We supported platforms and enabled more opportunities for user speech. And that's why the internet has been such a positive tool for user speech. And my concern, when we start to hear comparisons to broadcast television, is that what that's actually doing is opening the door not only in this particular case, but more generally, to placing much more heavy-handed regulation on the internet and particularly on online speech, which has been such a critical tool for so many people who in that broadcast era couldn't have their voices heard.

Ramya Krishnan:

Can I just jump in here?

Anupam Chander:

Yes.

Ramya Krishnan:

I mean, I do think that another relevant point of distinction is that generally, those other frameworks were ex ante; they were sector-wide. They generally required compliance with sector-wide regulatory standards. And a reason I think that matters is the intent behind these frameworks. I think it's been clear, and you made this point, from statements made by the bill's sponsors and supporters, that a big motivation for them is that they don't like how TikTok is currently being moderated, and they think that an American company would moderate the app differently. Some of them have made specific statements about concerns, not necessarily grounded in evidence, suggesting that the app is artificially amplifying pro-Palestinian content at the expense of pro-Israel content, and they would anticipate that an American company would make a different decision. And that kind of content-based purpose, that viewpoint-based purpose, is generally one that we consider anathema to the First Amendment, and I think it distinguishes this case and this bill from some of those other frameworks.

Anupam Chander:

Thank you very much, Ramya. Now let's get to the heart of the matter. Would this bill survive the challenge? Let's imagine that it's tested on intermediate scrutiny, and I hear Alan say the level of scrutiny doesn't matter in practice, but let's imagine that the form of the opinion will follow a particular standard of scrutiny. In any case, would a TikTok law pass intermediate scrutiny in that context? That is, does it advance important governmental interests unrelated to the suppression of speech, not burden substantially more speech than necessary for those interests, and leave ample alternative channels of communication? In this context, I think we have to keep in mind the possibility of Project Texas as one of these alternative approaches that might be relevant to that question. So I'm going to ask Alan to lead off here.

Alan Z. Rozenshtein:

Sure. So I think it would, though I don't pretend that this is any sort of obvious call. Let me distinguish between the two grounds on which this law is generally defended. The first is data privacy; the second, you can call it propaganda, or you could call it information control. I am not a fan of the data privacy rationale for this law, for reasons that I think Jen articulated very nicely, which is that the lack of any sort of data privacy protections at the federal level means that if the Chinese Communist Party wants data on US citizens, they will get that data whether or not ByteDance owns TikTok. And I will say, before I was a law professor, I worked in national security at the Department of Justice, and I'm the proud owner of multiple lifelong paid subscriptions to identity protection services courtesy of the federal government, because China so thoroughly stole mine and millions of my colleagues' data.

So, I don't find the data protection argument particularly compelling. I think it's a compelling interest, but I'm not convinced, given the real free speech stakes here, which I certainly don't deny, that that argument would work. So that's why I think the law is best defended, and I think that frankly folks in Congress should more explicitly defend it, and I think they are increasingly doing so, on the grounds of avoiding Chinese interference in the information space. Frankly, I do think that that is a compelling interest. And I do think that a divestment of TikTok is quite substantially related to that. TikTok is owned by ByteDance, which, although it is a private company, given everything we know about the way that the Chinese government operates, is ultimately under the control of the Chinese government, and in particular Xi Jinping. TikTok is an enormously important source, not just of fun cat videos, but of information and news for millions and millions of Americans, including young Americans.

Now again, that obviously raises profound speech issues, but I think it also very clearly shows the real geostrategic and national security implications. We can talk about Project Texas. I think that's an important issue, and I do think that it's something that unfortunately policymakers have not given enough discussion to. And I think that if Congress, if the Senate, takes this up, and there's some reporting that they might actually, despite a month of being very quiet on the issue, there needs to be a record explaining in more detail than was the case in the House why Project Texas or something like that would be insufficient. I think there are things you can complain about with Project Texas. There are even ways in which Project Texas, given how much actual US government involvement it would introduce into the day-to-day workings of TikTok, in some sense creates its own First Amendment concerns in a way that just having a TikTok or a TikTok competitor cleanly run by a US company, not by China or other foreign countries of concern, doesn't pose. So I don't want to concede that Project Texas is unambiguously the right alternative solution here, but I do think that at the end of the day, the concern over Chinese control of a profound piece of information infrastructure is certainly compelling, and I do think this is a reasonable way of dealing with that problem.

Anupam Chander:

Thanks. I'm going to come back to Ramya.

Ramya Krishnan:

Yeah, so I completely agree with Alan on the data privacy point. I think that the reliance on the data privacy rationale is very weak. Protecting Americans' privacy is an interest of the highest order, but the way that you protect that interest is by passing a comprehensive data privacy law, not a TikTok ban, which is frankly not just unnecessary but ineffective in actually achieving that interest. And the reason is the one that Alan mentioned: the Chinese government simply doesn't need TikTok in order to be able to purchase or access Americans' sensitive data. It can easily get that data from data brokers and data aggregators on the open market. That's a really big problem, and I really hope that Congress takes up that problem by passing a privacy law, but the TikTok ban isn't going to do very much there. On disinformation, again, I think that a ban on TikTok is going to be ineffective.

The truth is that foreign governments, China included, don't need to own or control platforms in order to be able to spread disinformation. Many foreign governments have run disinformation campaigns on a variety of platforms, including American-owned ones. Obviously, that was the case with the 2016 Russian campaign on Facebook. So I'm not sure that banning TikTok is really going to be effective in addressing that interest even if it were a permissible one. And I think that that's a real question, for the reason I mentioned before, which is that generally we don't like governments controlling the public's access to ideas, including ideas and information and media from abroad. But the other thing that I would mention is I guess I have some trouble with this assumption that foreign speech is uniquely manipulative. Domestic speech can be just as manipulative, just as pernicious, but we generally wouldn't accept restricting domestic speech on those grounds, because we would rightly see the potential for government abuse, the potential for the government to use that as a cover to suppress ideas that it doesn't like, and the prospect of distortion of one of the major channels of communication that Americans rely on.

I mean, I acknowledge that it's a weighty concern, but it's not one that's limited to TikTok. Again, it was not that long ago that people were talking about fears that a company like Facebook could swing an election, right? I mean, there was a study, I think back from 2010, an internal experiment run by Facebook along with researchers at a university, where they ran an experiment on 61 million people. They showed them variations of a clickable "I voted" button, and they concluded that they had gotten an additional 350,000, I think it was, Americans to the polls, which is a significant number and could be the margin of victory in the kind of close elections that we're having. So, this isn't an issue that is limited to TikTok. I think this problem flows from the centralization of power in a handful of for-profit companies, but I don't think we would accept the government imposing a divest-or-ban order or a flat-out ban on any of these companies simply because they control important channels of communication. I think that there are other, better policy responses that get at this underlying problem of the concentration of power over our public discourse in a handful of platforms.

Anupam Chander:

Jenna.

Jenna Leventoff:

I agree with most of what you said. I think ultimately where this bill fails intermediate scrutiny or strict scrutiny or whatever standard it is subjected to is that a ban is just not effective to solve any of the problems that this bill is purporting to solve. And we've talked at length about how these same problems are present on every other social media company. And so banning TikTok and TikTok alone is simply not effective.

Alan Z. Rozenshtein:

Can I say just something very quickly? So I agree with Ramya that this would not fully solve the problem, and I agree with Ramya that other platforms have this problem as well, but I do think it's important to distinguish between the scale of the problem on something like TikTok, which again is controlled by ByteDance, which can be controlled completely by the Chinese Communist Party if it wanted it to be, and a platform like Facebook or X or YouTube or whatever, which, while it has the potential to be a vector for disinformation, is not potentially under the control of a foreign government. And so I think just the scale of what you are potentially looking at is profoundly different. And I don't think that a law has to completely solve the problem at issue to pass constitutional muster. And so I do think it's important to keep the scale of what we're talking about with TikTok versus other platforms in mind.

Anupam Chander:

So let's go to the question about the national security argument. Many courts, when reviewing this question in the TikTok bans that we've seen, and the government has in all those cases submitted secret evidence that has not been made public to us, have repeatedly concluded that the government's claims were hypothetical and conjectural. And often, conjectural claims of harm are not enough to justify the free speech burden. And so I'm just wondering how this will play out in this case. You've got the government's claims of possibility. Alan just said the scale of possible manipulation of the information environment in the United States by China, through the TikTok app, should potentially justify this. And so I'll come back to Alan, but I think that's the question here: the claim is that it could possibly happen, and the government hasn't yet seemed to show that this is, in fact, occurring, even though, as I did mention, Congressman Gallagher does believe it's actually occurring today. Let's go to Jenna.

Jenna Leventoff:

Yeah, I think we have not seen any evidence of a real threat right now. It is all hypothetical, and that hypothetical is not going to be enough to survive government scrutiny. But I think what's really clear to me about this is that the government wants to go after China. They perceive that as being politically popular. They think that going after China is how they're going to win the election. TikTok, of all the apps, has the closest ties to China; therefore, we will ban TikTok, and that's how we'll win our election. I think that's going to backfire. I think half of the country uses TikTok, and I read an article recently that said it's actually the children of members of Congress who have been the best lobbyists for TikTok, because they go and they beg their parents not to ban this app they use for so many protected speech activities. But yeah, we just simply don't have evidence that any of these threats are real, let alone that they rise to that imminent and severe standard that we think it'll need to meet to pass a prior restraint analysis.

Anupam Chander:

It strikes me that I'd love to know how many people in the audience have TikTok on their phones. Who has TikTok on their phones? I'd say a distinct minority, maybe a third of you, have TikTok on your phones. So it's fascinating. So Alan, a hypothetical harm has been posed; is that enough to get through this substantial intrusion upon free expression?

Alan Z. Rozenshtein:

So, I think it depends on what you mean by hypothetical. I fully concede that the nightmare scenario that is motivating supporters of this bill does appear to be a hypothetical, which is of course what you would expect, right? If your concern is that this is kind of a ticking time bomb that China could use at a moment of high tension, you would expect China to wait to use that. But then you are accepting that that is hypothetical, and that is of course a weakness in the government's case. On the other hand, if you look at the component pieces of that concern, they are anything but hypothetical. So for example, we know that the Chinese government is extremely, extremely prickly, let's say for lack of a better term, in terms of trying to control the communications environment, not just within its own country, but outside it, whether this is Hollywood changing how it makes movies so that it can then play them in the Chinese market or the Houston Rockets getting banned from Chinese television after their general manager tweeted something nice about the Hong Kong protests.

I think what's definitely not speculative is, again, China's willingness to throw its weight around to change how other countries view it. The other thing that's not speculative is the Chinese government's willingness to, really in an extremely heavy-handed way, control its major private companies. So Jack Ma, the Chinese billionaire and head of Alibaba, which is a huge tech company, basically disappeared for a while after saying some not nice things about the Chinese government's control over the economy. During this disappearance, the Chinese government basically forced the sale of a bunch of Alibaba's assets. Jack Ma has now reappeared, and he seems to be happy with everything. So, those things are not speculative. And so the question is, given what's not speculative, what's the margin of additional speculation to get to the nightmare scenario? And again, I don't have an answer here, but I just want to emphasize that it's not an either-or, that this is or is not a speculative threat. There's going to be nuance there, and it's important.

Anupam Chander:

Coming back to the propaganda question, which seems to be the one that I think everyone agrees is the threat most likely to be of grave concern here: in the 1960s, we got a Supreme Court precedent, and that is the case of Lamont v. Postmaster General. There, the court ruled unanimously that regulations that restricted the receipt of information from China infringed the recipients' First Amendment rights. That is, Mr. Lamont had the right to receive Chinese communist propaganda, literally the Peking Review at the time. So, do TikTok users have a First Amendment right to receive foreign propaganda in the worst-case scenario as you've described it? I'm going to come back to Ramya to lead us off.

Ramya Krishnan:

Yeah, I mean I think the answer is obviously yes, based on the case of Lamont, which is a case from the height of the Cold War, I might add. So, in that case, you had a regulation that required Americans who wished to receive information that the government considered to be communist propaganda from abroad to send in an opt-in card to the post office saying, yes, please, I'd like to receive communist propaganda. And the court saw through this registration requirement for what it was, which was a very significant burden on the First Amendment interests of Americans to receive and engage with ideas from abroad. Even though the registration requirement fell short of a ban, the court understood that the requirement at issue would exert a very powerful chilling effect on Americans in this country and their right to receive those ideas.

And so it struck down the law. And I think, faithfully applying Lamont here, the TikTok ban is actually far more onerous. Obviously, there's the prospect of the ban; you wouldn't just be registering with the government to engage on TikTok, though obviously that would raise very, very serious First Amendment concerns. And the government doesn't argue that everything on TikTok is disinformation. Yes, it argues that China could one day hijack TikTok's algorithm to push disinformation. As Jenna said, that is an unsubstantiated claim, and if the government has evidence of that, it should share it with the public. And this brings me to what I think is the central point here, which is that, even leaving that to one side, generally speaking, the court has held that the suppression of speech is not a permissible response to the problem of disinformation, and that there are less restrictive means that the government ought to use before it goes there.

And Jennifer mentioned one of these before: disclosure. The Foreign Agents Registration Act requires agents of foreign powers to register, and that kind of requirement, which requires certain media to label themselves as propaganda, yes, it does raise First Amendment concerns, but it is a less restrictive alternative to a flat-out ban. And so, if the government has evidence that TikTok is being used in the way it says it could be, it should engage in its own kind of speech. Generally speaking, the court has said the best answer to bad speech is good speech; it's not enforced silence. And so it's not at all clear to me why you would throw away those basic principles simply because we are dealing with foreign speech. And I think Lamont stands for the proposition that we shouldn't do that.

Anupam Chander:

So I just want to pick up on one part of that, which is that in the sixties with Lamont, and then again in the eighties and nineties with the Berman Amendment in particular, you have this sense that the people who want to allow this speech are very confident in the American people. That is, we can receive foreign propaganda and manage it, and this is what freedom means: it includes the freedom to receive foreign propaganda. There is a kind of resilience that is suggested. Now, Ramya, you did mention, well, we can get people out to the polls with these kinds of algorithmic shifts, but that's also different; getting people to the polls is different than getting them to pull a particular lever for one candidate or another, or to suddenly become communist or anti-communist. There's a kind of sense now that manipulation is pretty easy to do and that there might not be other means to respond to that manipulation that we should consider. But I'm going to turn to Jennifer to pick up on this question.

Jennifer Huddleston:

So a lot of what Ramya said, I want to, in some ways, just say plus one to. I agree with a lot of what she said, but I also think it's important to reframe this conversation around the speech rights of TikTok's users. Oftentimes, when we hear this debate about banning TikTok, it's seen as a debate over banning a large social media company. And in fact, I would argue that one of the problems with our debate over technology and technology policy in general right now is that we're thinking about these as large companies and not thinking about the millions of users who have found opportunities to have their voice in this way. And that is particularly true when we're talking about banning a particular platform; users have chosen that platform for a reason. They have many choices, and they find that this is the one they like best, whether it's because of how they connect with an audience or because of the nature of how they consume content.

There can be any reason that an individual user chooses one platform over another, and many, if not most, users are using multiple platforms for their different speech needs. So I think we have to think about how this impacts user speech, particularly in this context. And traditionally, our response as Americans, as was discussed, has been that the answer to speech that we're concerned about, whether it's propaganda or disinformation or anything else, is to engage in more speech, to trust that our fellow Americans will ultimately land on the truth, and to have those conversations rather than ban the speech.

Alan Z. Rozenshtein:

Yeah, so I'd love to respond to the great points that both Ramya and Jennifer made. With respect to what Ramya was arguing: I agree that Lamont stands for the proposition that a blanket ban on foreign propaganda would be unconstitutional. I just wouldn't push that proposition farther than I think the court meant it. This is not a blanket ban on Chinese propaganda. This does not ban the Peking Review; it does not pose any obstacle to the Peking Review, which I think is called the Beijing Review these days. If there really were a bill that literally tried to ban Chinese propaganda, that would be blatantly unconstitutional, and I would very much oppose it. But I just don't think that's what this is. This is a bill that would potentially ban Chinese control of a communications platform. And I think that control is actually much more insidious than straight-up propaganda itself.

Because the whole point is, and I think wherever you fall on this debate, I think we all agree at this point that the power of social media to shape how we perceive the very truth itself is profound. And so when you think about the First Amendment value underlying Lamont, right, which is that we want more speech because that, in the marketplace of ideas, presumably will lead to more truth: some people might agree with the propaganda and other people might disagree, and that will sharpen their own understanding of what's right. If you take the concern of control over the medium itself, and the ways that social media algorithms and content moderation can manipulate perception without people even realizing it, I don't think that the principle of Lamont gets you anywhere near this case, though obviously it's relevant. Now, as to the point that Jennifer made about framing this as the speech interest of TikTok users, I completely agree with that, and I do think that's something that gets lost frequently.

And here I think you have to make a guess, frankly, about what would happen if TikTok were banned. I do think that a lot of people who are on TikTok would find that extremely disruptive. I think that for the content creators on TikTok who have created a lot of content and invested a lot in it, that would be a huge blow, and I don't want to minimize that. But I don't think that TikTok is irreplaceable. I think that the idea of short-form video content with algorithmic curation is well understood and very standardized. You have competitors; I think Instagram Reels is basically a competitor for TikTok. I don't see a realistic possibility that the sorts of affordances that TikTok provides would not be fairly quickly replicated, again with disruption, and I'm not denying that. But I do think, when you're thinking about how this would leave the speech interests of US social media users, that within a fairly short time you would have just as much social media content, including TikTok-style content, as before.

Anupam Chander:

Okay. So I want to turn now to the audience for your questions. I'll turn to this gentleman up front first. There is a microphone there.

Audience Question 1:

One of you mentioned that the law, although in some ways generic, also singles out TikTok explicitly in the text of the law. Does that itself raise a constitutional question, that it is not a generic act but one that singles out one corporation?

Anupam Chander:

So let's take a couple of questions.

Audience Question 2:

Thank you. I didn't think I would ever ask that question, but I have to. What is the difference between prohibiting us from screaming fire in a crowded theater, which I would think would produce a lot less harm to society, and basically allowing TikTok to produce something of a much bigger nature?

Anupam Chander:

Thank you. And let's take another question.

Audience Question 3:

Well, thank you. I actually have two questions. The first is, in 2020, after a violent clash between India and China, India immediately banned the app, and I am just wondering if the panel can compare India's and the US's respective bans on TikTok. That's the first question. The second question is whether there is speculation that, in the aftermath of a TikTok ban, the US might further target some other Chinese apps, like Xiaomi or, as Professor Jan mentioned, the Gen Z-style e-commerce platforms run here in the United States from China, which are among the most influential. If that were the case, what would be the possible rationale in the US? I just want to know our panelists' perspective.

Anupam Chander:

Great. We'll take another round, and let me begin with Alan. Alan, would you like to respond to any of those questions? The law singles out TikTok; this is much worse than fire in a crowded theater, if we can regulate that (there were questions about that, and Jeff is rolling over right now, but we should certainly go into it); India banned TikTok, so what does that teach us? And is this just the first app to roll? Are there other heads that will roll as well?

Alan Z. Rozenshtein:

Sure.

Anupam Chander:

You don't have to answer all of the questions.

Alan Z. Rozenshtein:

I won't, I won't. So Jeff is a friend of mine, and he would break off his friendship with me if I did not address the fire in a crowded theater point. I mean, the whole point of that issue is that sometimes you can yell fire in a crowded theater, and sometimes you can't. The whole question is how imminent the harm would be. And I mention this because that's, in a sense, a lot of what we've been talking about here. I tend to think that the magnitude of that harm could potentially be enormous if China wanted to use the control that it has over TikTok. But again, as we've talked about, that is somewhat speculative, more speculative, let's say, than what would happen if you literally yelled fire in a crowded theater and caused a stampede. And so the question here, and I have an opinion but I don't pretend to have an answer, is: is this threat too speculative?

I don't think it is, because I think when you disaggregate it, it's no longer particularly speculative. But to be honest, it is in fact the case that the nightmare scenario is still speculative. The last question asked whether this is the only communications platform that might be affected. No, right? I mean, this could potentially apply to a bunch of other Chinese-owned or Chinese-controlled apps. I do think that you have to evaluate the merits of each case somewhat separately. You have to ask questions like: what is the potential, whether for data privacy infringements or for misinformation, and that may be different for TikTok than for WeChat than for an e-commerce platform. You have to ask questions, to Jennifer's point about the rights of users, about what the alternatives are. Again, I tend to think that there are much richer alternatives to something like TikTok than perhaps to something like WeChat, which really is one of the main ways in which Chinese Americans communicate with, let's say, family members back in China. So I do think you'd have to analyze each case a little bit on its own. And so I don't want my argument in defense of this bill as applied to TikTok to necessarily be extended to every other possible Chinese-owned or Chinese-controlled platform.

Anupam Chander:

Thanks, Alan. Let me turn to Jenna.

Jenna Leventoff:

I think the question that I want to answer is about banning other apps. And like I said earlier, what this bill does is say that the President can unilaterally ban other apps that are partially owned by foreign adversaries, right? And that is ripe for abuse. I imagine President Trump coming in, and if the president of a company says something about him that he doesn't like, he is going to ban that company in the US. Therefore, anytime a US user relies on a company, it could be gone pretty quickly because it's done something to anger the President. I think this just puts too much control in the power of the President, with no one else involved. Again, there's no due process here. There's no way to fight this. There's notice, and that's it. And so I think that's a thing that we should be concerned about. Those are really broad new powers that we're bestowing, tucked at the end of a bill that's purportedly about TikTok.

Ramya Krishnan:

Yeah, I guess the first point I'd make in response to this mention of India is that a real concern a lot of us have about this ban on TikTok is that, if passed, it will be a real gift to authoritarian regimes around the world, which will use it as precedent to ban foreign media in their countries. Previously, the US has been rightly vocal in criticizing other countries when they have banned their citizens' access to foreign media and foreign social media, and I think it would no longer have the credibility to do that going forward. I mean, we've already seen it: obviously India has already banned TikTok, but very recently, I think Israel is planning to ban Al Jazeera on national security grounds. These examples raise the broader concern about vesting this sort of outsized national security discretion in the executive; even if you trust the current executive, imagine a future executive that you might not trust.

Jennifer Huddleston:

I also want to pick up on the two questions about India or other countries having banned this app, as well as what this might mean more broadly. In addition to what's already been said about the concern over what this means for countries that are looking for an opportunity to force divestments or bans on other media apps, including potentially American media apps in some scenarios (how would we feel about an authoritarian regime using this as an excuse to ban American apps?), I also think it's important to recognize that the First Amendment is largely unique. What we will not tolerate with regard to government intervention in speech is distinct from other countries' views of free speech and free expression. And that matters a lot. I think that's a good thing, but it also means that we can't just say country X did it and it didn't face legal scrutiny there, so why is it a problem in the US? Because we do have this standard of the First Amendment that requires different elements when it comes to government intervention in speech.

In addition to what's already been said about how this does vest broad power in the executive, I think it's also important to note that this isn't just a China bill, either; there are other countries named in the bill, and other countries could be added to the foreign adversary list. And so we not only have to think about what this means for some of the apps that were mentioned, but also about what it means more generally for the way the US may interact with other countries.

Anupam Chander:

So one of the apps that I watch, and that I think gets very little attention in the United States, is Telegram, which is a Russian-origin app and is very popular in certain parts of the United States. And I'm going to go further than what Jenna said. Jenna worried about Trump, but Telegram often skews to the far right, and you can easily imagine a President on the other side saying this is supercharging hate speech, et cetera, in the United States, is therefore a threat to national security in some way, and therefore should be banned. So I think there's that. And I certainly want to plus one the point about India's example: India's response came in the context of a literal battle in the Himalayas where Indian soldiers fell to their deaths. And so I think that was a totally unique environment, one where you might respond in different ways. India chose to respond with what the minister at the time called a digital strike against China, which is much better than a kinetic strike against China. So I want to say that was actually a moderate response in those kinds of circumstances. Okay. Now, I want to turn to Guzo and give him the floor to ask questions from the online audience.

Questions from the online audience:

Yes, there are many, many callers with questions here. I'm going to try to run through a few so you can have an idea of the commentary that is popping up online. First, in regards to the government doing its homework: any sense of why the Committee on Foreign Investment in the United States never ended up assigning public servants as board members of USDS, TikTok's US data security arm, as part of Project Texas? And in that sense, what is your view on an effort to actually make the propaganda and manipulation claims more evidence-based, to actually show that case? Another question in that regard is how this is different from the Committee on Foreign Investment forcing the Chinese company Kunlun's divestment of Grindr in 2020. Other questions go to the idea of the theater: can you explain a little bit better what the difference is, in terms of banning free speech, if you have other options to express your free speech online using other apps and other mechanisms as well? Another question is how Supreme Court cases like those on net neutrality and Chevron contribute to these arguments, and what the lines are between public safety and health and free expression in this case. Finally, yeah, I think this is a good picture of the entire thing.

Anupam Chander:

Thank you for summarizing five big questions. I'll run through those questions again very quickly. The first is that CFIUS never followed through on certain aspects of Project Texas, under which public servants might have been assigned to monitor USDS, the US data security arm of TikTok that controls the data and the algorithm, and maybe that might have mitigated some of those propaganda concerns that were described earlier. So why didn't that transpire, and would it have largely addressed, or at least mitigated, those concerns? In 2019, the US government ordered the divestiture by Kunlun of the dating app Grindr, and a year later, Grindr did in fact sell to another company. So maybe that's a precedent that serves here; how is that different from what's going on here?

Then there's the fire in a crowded theater question, and the point that there are other alternatives to express yourself than TikTok, so perhaps it's not such a huge burden after all. What does the net neutrality debate teach us about this question? And what about the public safety and public health arguments against TikTok? If you watched the Shou Chew hearings, many people said this is leading our kids to drugs and other ills, a kind of one-way ratchet in which the Chinese government is leading our kids, like the Pied Piper, to their doom. So I'll open up those many interesting questions to all of you. Let me begin with Jenna.

Jenna Leventoff:

Sure. I will tackle a couple of them quickly. One is that I don't think TikTok is that easily replaceable. I've talked to a lot of small business owners who say that they were never able to get their small business up and running with a different app. It didn't work the same way; they weren't able to reach the same audience. So for so many people, this is their actual livelihood that's at stake. And earlier Alan was saying, sure, this would be a hit for people; but when your income is hit, that is not something you can easily recover from, even if it's only a temporary hit. Having no income for months, for some people, is the difference between having a home and food on the table and not. So we really shouldn't underestimate the fact that TikTok is not so easily replaceable, especially for the people who are relying on it for their income.

The second question I want to tackle is the net neutrality question, because that's the other thing I've spent my whole week working on; it's also part of my portfolio. What net neutrality does is say that internet service providers cannot treat different internet traffic differently. They can't decide what is sped up and what is slowed down. They can't block some things. And I mean, that's the same thing here, right? The reason that net neutrality is happening is to protect the ability of users to go and do what they want to do online. And this is the same: right now, half of the country wants to use TikTok, and to ban TikTok really goes against the core of net neutrality, which is to make sure that users can access the information online that they want to access. The difference is that with net neutrality, it's internet service providers dictating what constituents can access; in this case, it's the government, which obviously has even bigger implications for the First Amendment. There was one other question that I wanted to answer.

Oh, harm to kids. So one thing I think, because I also work a lot on kids' online safety, is that the government is not talking about the actual benefits of social media to kids, for all of the harms that exist. Ultimately, social media has changed the game for kids in a lot of really positive ways. I know at the ACLU, we have a big focus on equity. We have a whole team that focuses on LGBTQ issues. One thing that you'll hear a lot is people who say, if the internet had existed when I was a kid, I wouldn't have felt like the only gay kid in the world, right? Kids who live in places where there aren't others like them have been able to find that information, find resources, explore themselves, find a sense of community, and not feel so alone. Kids who are bullied will often say the internet is what they turn to, because that's where they find friends; that's where they find people.

In so many cases, the internet is made out to be this horrible risk for kids, and these harms, I will not deny, exist; they absolutely exist, but there's a lot of good that happens as well. And I think policymakers jump to regulating the internet because that's easier. It is much easier for them to regulate the internet than to invest money in education and digital literacy, or to invest money in law enforcement to go after people who are selling drugs and abusing children. There are so many other things that the government can do that are bigger and harder, and that's why it's easier to say, let's hold Facebook, TikTok, and Instagram accountable here for every harm that's ever happened to kids.

Anupam Chander:

Thank you. Ramya.

Ramya Krishnan:

So I might try and take one of the questions that hasn't already been addressed. The first one is about CFIUS and potentially getting the US government more involved in auditing and having oversight over TikTok's algorithm as a way to protect against the possibility of Chinese co-optation and disinformation. I'm not sure why that proposal didn't quite work out, but I'm pretty glad it didn't, because I think the prospect of really close US government entanglement in reviewing, and having veto power over, not only the content moderation policymaking but also the algorithmic decisions of a social media company is a little bit concerning. Third-party oversight might ameliorate some of those issues, but having US public servants directly involved in that way, I think, raises some pretty significant free speech concerns. But this brings me to one of the other less restrictive alternatives that I think the government should turn to instead to address the risk of disinformation, not only by the Chinese government but by other foreign adversaries as well: requiring greater transparency of the platforms, and potentially imposing obligations on them to share data with independent researchers who study problems like the spread of disinformation on the platforms.

A point that renowned cybersecurity expert Bruce Schneier made in his affidavit in support of a lawsuit challenging the application of Texas's TikTok ban to public university faculty is that researchers, if they had access to this data, would pretty quickly realize if the Chinese government were involved in such a monumental effort to hijack TikTok's algorithm to push disinformation. So that could act as a significant bulwark against that kind of effort. Transparency would have a lot of socially valuable uses, I think, and that is one of them.

Anupam Chander:

Great. Yeah, so the possibility of researcher access to data that can help us determine whether or not propaganda is being pushed would seem to be one of the measures that might mitigate these risks. So I always wonder why that isn't one of the things rolled out as the first approach to these questions. Jennifer?

Jennifer Huddleston:

So, I want to turn to the CFIUS question a little bit and then also get to the kids question. With CFIUS, and particularly with the Grindr case, I think this also shows another unusual element of this bill, in that there was already a process in place for considering these questions. The CFIUS process has been ongoing and should be weighing the potential risks, considering what the alternatives are, and building a case for any steps that are needed with regard to national security concerns about this foreign investment. That's a little bit distinct from what we have here, which, as we talked about in other parts of this panel, are still very vague national security concerns. With the case of Grindr, there were some very specific elements with regard to the LGBTQ community that were discussed in that decision.

With this, it's much more amorphous what the concern actually is, as we've already said several times on this panel, which in some ways segues to this question about kids online. And I think we have to recognize that the TikTok debate, while it is its own debate, is also part of a broader debate. And like Jenna, I'm very concerned with some of what we're hearing in terms of age verification, age-appropriate design codes, and all sorts of calls to ban children from social media when there are so many beneficial uses, when there's not always clear evidence of the harm, let alone clear definitions of what harm we're trying to solve when we say we want to keep kids safe online. At the end of the day, when it comes to kids' online safety, I think it should be parents, not policymakers, making those decisions. There are going to be so many different options in so many different households that are going to fit so many different situations. We don't necessarily agree on what the problem is, and that makes this a bad place to try to have a one-size-fits-all call.

Anupam Chander:

Thank you, Alan. Well, I think we actually covered all of those wonderful questions. Thank you all. Please join me in thanking our panel. We will post this on YouTube. And thank you all very much.

Authors

Justin Hendrix
Justin Hendrix is CEO and Editor of Tech Policy Press, a new nonprofit media venture concerned with the intersection of technology and democracy. Previously, he was Executive Director of NYC Media Lab. He spent over a decade at The Economist in roles including Vice President, Business Development & ...
Prithvi Iyer
Prithvi Iyer is a Program Manager at Tech Policy Press. He completed a masters of Global Affairs from the University of Notre Dame where he also served as Assistant Director of the Peacetech and Polarization Lab. Prior to his graduate studies, he worked as a research assistant for the Observer Resea...
