US Power Play Over TikTok Did Nothing to Protect Americans
Alex Turvy, Rebecca Scharlach / Jan 30, 2026
TikTok experienced technical failures and widespread malfunctions following the acquisition and transfer of the Chinese app to an American consortium in January 2026. (Photo by Samuel Boivin/NurPhoto via AP)
In December 2024, we argued on Tech Policy Press that blanket platform restrictions create what they aim to prevent. Restrictions don't mitigate risks; they displace them, driving users to less regulated spaces, fragmenting oversight, and reducing transparency while claiming to increase security. The TikTok saga has now supplied an unusually clean empirical sequence: a deadline, a brief shutdown, a user displacement, a return, a year of non-enforcement, a settlement, and an immediate controversy over what the new arrangement would allow. Within days of the deal closing, lawmakers from both parties questioned whether a structure that leaves ByteDance's recommendation algorithm IP in Beijing under a licensing agreement actually satisfies the law, while users raised alarms about expanded data-sharing provisions in the updated privacy policy and alleged suppression of political content.
The paradox we identified didn't just hold. It evolved through four distinct stages that reveal how the system processes threats it cannot resolve. What began as displacement became reproduction; reproduction enabled domestication; domestication settled into deferral. Each stage absorbed the tensions of the one before. None resolved them.
1. Displacement
At approximately 10:30 PM ET on January 18, 2025, the eve of the ban's effective date, TikTok voluntarily went dark in the United States. The shutdown wasn't triggered by government action; the law penalized app stores and hosting providers, not TikTok directly. This fragmentation of enforcement through intermediaries meant that the "ban" manifested not as a single regulatory act but as a cascade of infrastructural compliance decisions by Apple, Google, and Oracle—each making independent calculations about legal exposure. Facing this uncertainty, TikTok preemptively cut service for its 170 million American users. Fourteen hours later, after President Donald Trump signaled he would delay enforcement, service resumed.
But the paradox had already begun materializing before the shutdown and before the restoration. In the week leading up to January 19, 2025, hundreds of thousands of American users migrated to RedNote (Xiaohongshu), a Chinese app operating entirely under Beijing's legal jurisdiction. Big content creators and average users alike said their farewells to TikTok on TikTok, posting farewell videos, archival montages, and half-ironic thank-yous to an algorithm they claimed had ‘known’ them. Some treated the moment as a digital funeral, others as a protest, framing their departure from TikTok as an act of defiance against US lawmakers rather than a genuine loss of faith in the platform. Almost immediately, those same creators reappeared on RedNote, announcing new handles and reposting old TikTok content. Reuters reported more than 700,000 new users in two days; analytics firms estimated daily active users peaked around 3.4 million in mid-January. The hashtag #TikTokRefugee went viral. RedNote hit number one on the US App Store.
The irony was lost on no one. RedNote operates without any of the structural separations TikTok had attempted through Project Texas—no US-based data subsidiary, no Oracle oversight, no CFIUS-approved governance board. All data resides on mainland Chinese servers. The same app serves both Chinese and American users under identical CCP-aligned content policies. As the Atlantic Council's Digital Forensic Research Lab observed, US efforts to curtail TikTok's reach have unintentionally driven users toward a platform even more explicitly tied to Chinese state interests.
But the contrast also illuminates something about what Project Texas was always offering. Project Texas was pitched in 2023 by TikTok CEO Shou Chew to address US government national security concerns, but was largely dismissed by US lawmakers. It promised regulatory accommodation: US-based servers, US-claimed oversight, US-sounding governance. As Stanford Professor Fred Turner argues in his recent anatomy of Silicon Valley's migration to the Lone Star State, the "Texan Ideology" represents a shift from the networked utopianism of 1990s tech culture to something older: extraction. Where the Californian Ideology promised that connection would set us free, the Texan variant turns the social world into a resource to be mined—data as the new oil, hosted in a state jurisdiction that asks few questions about what happens under the hood. Project Texas was named for a reason. What it didn't promise was any change to the underlying extraction.
Users didn't flee to a safer space. They fled, briefly, to a space less governed from the standpoint of US regulatory visibility and leverage, precisely because it was beyond Washington's reach. The ban, even in its partial implementation, produced the outcome it was designed to prevent.
But the displacement was temporary. By March, RedNote's US daily active users had fallen to roughly 800,000, according to Sensor Tower. Most users returned to TikTok—not because the national security concerns had been addressed, but because Trump had signaled non-enforcement. When faced with the choice between regulatory theater and no regulation at all, users chose the familiar. The brief migration offered a preview of what full displacement might look like. It also demonstrated something else: users’ openness to an unfamiliar, non-US, non-English-language TikTok alternative, so long as the platform remained accessible.
The system took note. After a year in limbo, what awaited returning users was not resolution but the paradox's second stage: a deal that would reproduce, in more elaborate institutional form, the same tensions they had momentarily escaped.
2. Reproduction
The RedNote migration revealed what users would tolerate. The deal revealed what the system would settle for.
On January 22, 2026, the TikTok USDS Joint Venture LLC officially closed, ending a year of legal limbo. The deal's architecture looks, at first glance, like a divestiture. Oracle, Silver Lake, and the UAE-based MGX each hold 15% as managing investors. ByteDance's stake was capped at 19.9%: just under the statutory threshold. The board is US-majority. American users' data will be hosted on Oracle's cloud infrastructure.
But what actually changed, and what didn't?
ByteDance retains ownership of the recommendation algorithm—meaning the US entity can operate it under license, but cannot own or independently transfer the core IP. The distinction matters: licensing is not ownership, and monitoring is not control. Oracle can audit what the algorithm does; it cannot unilaterally change how it works or prevent updates that originate from ByteDance's retained IP. The algorithm will be retrained on US user data, but the underlying intellectual property remains in Beijing. Meanwhile, approximately 30% of the new entity is held by affiliates of existing ByteDance investors rather than new American capital. E-commerce and advertising operations remain with a separate ByteDance-controlled entity entirely outside the joint venture's governance. The House Select Committee on the Chinese Communist Party has publicly questioned whether this structure meets the law's requirements. The concern is bipartisan; Senator Ed Markey (D-MA) was blunter: "The White House has provided virtually no details about this agreement, including whether TikTok's algorithm is truly free of Chinese influence. This lack of transparency reeks." ByteDance disclosed few details about the divestiture; Congress received no briefing before the deal closed.
Here a clarification is needed: governance is not one thing. If meaningful governance means data residency and audit trails, the deal represents progress. If it means severing influence over ranking logic and update pathways, the deal looks more like repackaging. The settlement solves for jurisdictional optics (US board, US hosting, US oversight) while leaving technical dependency plausibly continuous. That ambiguity is not a bug; it is the settlement's core architecture.
In our original piece, we argued that bans prioritize symbolic control over meaningful governance — projecting power rather than addressing underlying risks. The settlement suggests the same logic applies to the deals that replace them. Each party could plausibly claim it met its goals: the administration secured a "sale," ByteDance retained its core asset, and American users kept their app. The national security concerns that justified the law were not resolved. They were absorbed into a more complex institutional arrangement, one designed to be too opaque to easily evaluate and too entangled to easily unwind.
The deal's structure was reproduction. But what enabled a settlement this ambiguous to close? The answer lies in what happened during the year between the Supreme Court's ruling and the deal's consummation: twelve months in which the executive branch demonstrated, with increasing boldness, that enforcement was optional and accountability negotiable. Reproduction required a prior stage: one in which the threat itself was transformed from foreign to domestic, from external danger to internal management problem. The paradox's third act was not about the deal's architecture. It was about what happened to the fear.
3. Domestication
On January 17, 2025, the Supreme Court unanimously upheld the Protecting Americans from Foreign Adversary Controlled Applications Act. Two days later, the law took effect. For the next twelve months, TikTok operated in technical violation of a statute affirmed by the nation's highest court, sustained only by a series of executive orders directing the Department of Justice not to enforce it.
The law permitted one 90-day extension, conditioned on presidential certification to Congress that a divestiture was in progress with binding legal agreements in place. That certification never occurred. Instead, the Trump administration issued at least four executive orders between January and September of 2025: three directing DOJ non-enforcement for set periods, and a fourth doing the same while gesturing toward the developing deal’s terms. Harvard's Jack Goldsmith called it "maybe the broadest [assertion of executive power] I have ever seen any president or Justice Department make, ever, in any context—and that is saying something."
Congress, which had passed the law with overwhelming bipartisan majorities (House: 360-58; Senate: 79-18), did nothing.
What emerged was an ambiguity-preserving compromise. The statute remained on the books, technically violated but never enforced. The courts never ruled on whether the executive approach was lawful. The deal, when it finally closed, was certified by presidential determination rather than tested against the law's original terms. No one was held accountable to the framework Congress created. The year of non-enforcement didn't just delay resolution; it established the terms on which resolution would occur. By the time the deal closed, the original question (does this arrangement actually address the national security concerns that justified the law?) had been replaced by a different one: does this arrangement allow everyone, particularly Trump, to claim victory?
The threat had been domesticated. Not eliminated, not even addressed; simply absorbed into the ordinary machinery of executive discretion and political negotiation. The foreign danger that had justified unprecedented legislative action became a management problem, handled through the same tools of delay, ambiguity, and stakeholder accommodation that govern less urgent matters.
And then came the new terms of service — and with them, an immediate illustration of how thoroughly the threat had changed address.
Within days of the deal closing, TikTok updated its privacy policy to reflect the new corporate structure. Much of the language that alarmed users, including provisions authorizing collection of immigration status, gender identity, and medical diagnoses, was already present in the previous policy; TikTok had long reserved the right to infer this information from user-generated content, even content obscured by face or voice filters. What actually changed was narrower but significant: a new provision enabling precise GPS-level geolocation tracking, and expanded language permitting the platform to share user data with third parties for "customized ads and other sponsored content" — including advertising off the app. What reads as transparency to a regulator reads as a confession to a worried user (at least to those who actually read the terms of service). And the timing, days after a deal blessed by the Trump administration, made the fine print feel like a policy shift even where it wasn't.
The same weekend, reports emerged that users were struggling to post content critical of Immigration and Customs Enforcement; some alleged their videos received zero views, others that accounts had been flagged as "ineligible for recommendation." Celebrities including Billie Eilish publicly accused the platform of censorship. California Governor Gavin Newsom announced an investigation (Office of the Governor, Press Release, January 27, 2026). TikTok attributed the disruptions to a power outage at a data center.
Whether the glitches were technical or political may never be definitively established, but that uncertainty is itself the point. Users who once worried that Beijing might manipulate the algorithm now worried that Washington might. The fear of foreign influence had not been resolved. It had been domesticated, transformed from an external threat justifying emergency action into an internal anxiety to be managed, investigated, and debated within ordinary political channels. The danger didn't disappear. It simply started wearing American institutional clothing.
Domestication is not resolution. It is the stage at which a problem stops being treated as a crisis and starts being managed as a condition, absorbed into the background hum of how things work. And once a threat has been domesticated, the system's preference is not to revisit it but to defer: to construct arrangements that preserve optionality, avoid accountability, and allow the underlying tensions to persist indefinitely beneath layers of institutional complexity.
4. Deferral
The Supreme Court's ruling was narrower than often reported. The justices "assumed without deciding" whether the First Amendment even applied to the challenged provisions. They explicitly declined to endorse the government's argument about "covert content manipulation," with Justice Gorsuch observing in concurrence that "one man's 'covert content manipulation' is another's 'editorial discretion'." The Court upheld the law based solely on data-collection concerns, leaving the harder questions about algorithmic influence unresolved.
The deal doesn't resolve what the Court didn't address. If the core national security concern was that a foreign adversary could manipulate what Americans see, a licensing arrangement that leaves algorithm IP in Beijing doesn't obviously answer it. If the concern was data collection, the continuation of Oracle-hosted infrastructure may suffice — but that architecture predated the law. And if the concern was influence over American political discourse, the new ownership structure raises questions of its own: the deal closed under a presidential administration whose allies hold significant stakes, in a platform whose content moderation is now subject to domestic political pressure rather than foreign government directive. The threat did not vanish. It was rerouted.
For policymakers, the TikTok saga suggests that "ownership" may be the wrong frame for governing algorithmic platforms with foreign entanglements. The deal demonstrates that equity stakes can be restructured while technical dependencies persist. If Congress wants to address algorithmic influence, it will need to shift from ownership thresholds to dependency surfaces: update pathways, weight inheritance, training data provenance, audit reproducibility. A law focused on who holds shares cannot reach a platform whose core logic is licensed from abroad and updated through opaque channels. The question is not who owns the algorithm, but who can change what it does and whether anyone outside the company can verify the answer.
What does it mean when the system identifies a paradox and responds with institutionalized ambiguity? The TikTok saga suggests that platform governance, at least at the intersection of great-power competition, operates less through legal frameworks than through negotiated settlements that preserve optionality for all parties. Laws get passed, courts uphold them, and then executives decline to enforce them while deals get cut. The result is not resolution but deferral. This has led to arrangements that appear decisive while leaving underlying tensions intact, now buried under layers of corporate structure, licensing agreements, and privacy policies that few users will read.
The aftermath
In December 2024, we warned that the era of simplistic bans must end. A little over a year later, the ban ended — not through the legal process Congress designed, but through an ambiguity-preserving compromise that reproduced many of the concerns it claimed to address.
Much has changed since December 2024. Yet the platform resumed its role as a default layer of American cultural life even as its legal status remained unsettled. The US government retained rhetorical authority and some new procedural hooks, but no clear victory. A year on, the episode reads less as a successful US power play than as an accommodation, one that revealed how difficult it has become for the state to decisively govern technologies that are already socially indispensable, despite not being American-made.
The paradox holds. It has simply completed its first full cycle: displacement, reproduction, domestication, deferral. Users fled briefly to an unregulated space, then returned to a platform entering a year of legal limbo. The deal that emerged repackaged the original tensions under new corporate architecture. The year of non-enforcement transformed a foreign threat into a domestic management problem. And the settlement that closed the cycle deferred every hard question to a future that will inherit the same unresolved conflicts in more elaborate institutional form.
The architecture is now in place for the pattern to repeat with more complexity, less public attention, and the same unanswered question at its core: not whether American platforms are safe from foreign influence, or how to protect American citizens, but whether the governance systems meant to protect us can do anything more than absorb threats they cannot resolve, deferring them indefinitely while claiming decisive action.