TikTok USDS and Cognitive Warfare: Shifting the Threat Model from External to Internal
A thesis-based reassessment of TikTok’s strategic risk profile: how the USDS (U.S. Digital Security) deal alters the foreign control threat model, and which influence and governance questions remain unresolved.
On January 22 and 23, 2026, TikTok and ByteDance confirmed that the long-running U.S. divest-or-ban saga has ended in a new structure: TikTok USDS Joint Venture LLC, a majority American-owned vehicle intended to keep the app online for more than 200 million U.S. users while addressing national security concerns (Reuters). It is the clearest example so far of how a platform can become a bargaining chip in a great power rivalry, exactly the strategic context in which my master's thesis was written. In that thesis, I treated TikTok as a potential tool in a shifting U.S.-China power struggle: a platform whose value is not only commercial, but political and cognitive. The NYT frames the announcement as the end of a long legal saga, reminding policymakers of the persistent risks and the importance of ongoing oversight (NYT).
The deal, in hard terms
- Ownership: American and global investors hold 80.1%, ByteDance holds 19.9%.
- Anchor investors: Oracle, Silver Lake, and Abu Dhabi-based MGX each hold 15%.
- Leadership: Adam Presser is named CEO; Will Farrell is named chief security officer; Shou Chew is on the board.
- Mission statement: the venture will “secure U.S. user data, apps and algorithms” through privacy and cybersecurity measures, with limited public detail about implementation.
- Algorithm operations (important): Reuters reports the venture will retrain, test, and update the recommendation algorithm on U.S. user data, and that it will be secured in Oracle’s U.S. cloud (Reuters).
Why this fits my 2022 research and thesis
Recent cognitive-warfare scholarship frames influence operations less as persuasion alone and more as the systematic shaping of perception, trust, and decision-making at scale. NATO ACT-linked work characterises cognitive warfare as efforts to weaponise public opinion to influence policy and erode institutional resilience, often operating below the threshold of open conflict while exploiting contemporary information ecosystems. Conceptual analyses of the cognitive domain further emphasise that narratives do not remain in the information space: they enter the cognitive space through emotion, identity, and temporality, and strategic effects follow from how audiences interpret and act on content rather than from content accuracy alone. In parallel, sharp power literature highlights how authoritarian influence can exploit the openness of democratic systems by manipulating, distracting, and distorting rather than persuading in good faith (NATO ACT).
Against this backdrop, my update to the TikTok case is best read as a threat-model shift. Even if the USDS restructuring reduces concerns about foreign state leverage through ownership, governance, and security controls, the platform’s underlying affordances remain: scalable attention steering, memetic mobilisation, and algorithmic amplification. These same affordances that make platforms valuable for external influence can also be repurposed for domestic political advantage, agenda-setting, and legitimacy contests within an open society.
1) TikTok is a geopolitical instrument, not just an app
I argued that the platform sits inside a larger “sharp power” and political warfare environment, where influence efforts exploit openness in democracies. The NYT narrative of a “legal saga ensnared in politicking between two global superpowers” (this is NYT doing NYT, meaning: a protracted legal fight that became more political as the U.S. and China competed) matches that logic. This NYT article shows how the TikTok dispute became a proxy battleground: trade-war dynamics, lobbying pressure, and domestic politics shaping policy.
2) ByteDance’s dual-platform model matters: Douyin vs TikTok
A key thesis point was that ByteDance operates two versions with different governance logic: Douyin, a heavily regulated, domestically aligned system, and TikTok, the global product with different constraints and a less transparent influence environment. The USDS deal is, in effect, a U.S. attempt to force a third version: TikTok as a U.S.-governed system with enforced separation from Beijing-linked leverage.
3) The algorithm is the center of gravity in the cognitive domain
My thesis emphasized the “dark box” recommendation system and the way it can accelerate distraction, radicalization, and narrative shaping. Even without proof of a coordinated campaign, the platform’s incentives and design create a surface for cognitive vulnerability. This is why the most important sentence in the reporting is not the ownership split. It is this: Reuters states that the U.S. venture will retrain, test, and update the recommender on U.S. user data and secure it in Oracle’s U.S. cloud (Reuters).
If implemented as described, USDS can plausibly reduce the simplest “foreign access” pathways by hardening the perimeter around U.S. user data and tightening privileged access, primarily through Oracle-hosted infrastructure and formalized controls. This is precisely what Project Texas and USDS were initially presented as doing: localizing U.S. user data and monitoring cross-border flows through a U.S.-based subsidiary, with Oracle as a key security partner.
So, yes, it is fair to write that the deal aims to close the most obvious national security concern: direct Chinese operational control over U.S. TikTok infrastructure and data.
What the deal does not automatically eliminate
The New York Times article rightly flags uncertainty about how much the user experience will change and whether the security concerns are fully addressed. My thesis framework helps explain why.
First, security is not the same as sovereignty
Securing an algorithm in Oracle’s U.S. cloud and retraining it on U.S. data are meaningful measures (Reuters).
But “secure” does not necessarily mean “sovereign.” The critical question is who controls the long-term lifecycle: updates, tuning, enforcement logic, and the contractual dependency structure that makes those changes possible.
Reuters itself notes that key elements, including the venture's business relationships with ByteDance, have not been fully disclosed (Reuters).
That is where influence risks hide, because leverage is often exercised through dependencies, not headline ownership figures.
Then, cognitive warfare does not require state ownership
Yes, read that heading again. Even if Chinese leverage is reduced, the platform remains a high-throughput memetic environment. The system can be used by domestic actors, non-state networks, and influence entrepreneurs who understand what triggers reach and engagement.
What if…
- Greenland becomes meme diplomacy, normalising escalation through humor and virality?
- MAHA turns public health into branding, where identity signals outperform policy coherence?
- Anti-vaccine drift is institutionalised, priming distrust before the next outbreak?
- Deportation is packaged as viral entertainment, making coercive policy feel ordinary?
- ICE promotional videos adopt short-form aesthetics to recruit, motivate, and manufacture consent?
- Another “King Trump” AI meme of a jet dumping feces on protesters normalises humiliating political opponents as entertainment and deterrence?
The argument is analytically strongest when framed as a structural risk rather than a personalised accusation: any powerful political coalition may seek advantage within an influence-rich platform environment. The NYT excerpt itself gestures at this shift, noting concerns that ties between some investors and domestic politics could shape what gets “airing,” and quoting a worry about trading fear of foreign propaganda for domestic propaganda.
Youth harm and toxicity remain, regardless of who owns the cap table
My thesis documented the toxicity pathway: recommendation-driven escalation into harmful content, mis/disinformation spirals, and risky behavioral challenges. Those mechanisms do not disappear because the board changes. A friend of mine said, "garbage in, garbage out." So the post-USDS world still faces a significant policy problem, even if geopolitics cools down.
The real follow-up to the thesis: the threat model shifts, it does not vanish. If my 2023 thesis asked, “What if a foreign adversary could leverage TikTok in a power struggle?”, the 2026 update becomes:
- What if the platform’s influence mechanics are now primarily a domestic governance and accountability problem?
- What if AI-generated political content increases the volume, speed, and plausibility of persuasion and disinformation, independent of who owns the algorithm IP?
- What counts as meaningful oversight in the cognitive domain: audits, transparency centers, academic access, and enforceable guardrails?
Project Texas was always framed as monitored gateways and controlled cross-border flows, with limited necessary exceptions. USDS is the escalation of that concept into a legally meaningful ownership and governance structure. Whether it actually disarms “weaponisation” depends on what we can verify about control, dependency, and measurable outcomes.
What I will watch next
Operational dependency disclosure: What services, licensing, or technical support still flow between ByteDance and the U.S. venture? Reuters notes these relationships have not been fully disclosed.
Algorithm change control: Who has authority to push ranking changes, safety tuning, and policy enforcement logic? Reuters says the U.S. venture will retrain, test, and update on U.S. data. The next question is: who signs off on what “update” means.
Independent audit outputs: What do third-party audits actually publish, and what do they measure? Project Texas and USDS are built around the idea of monitored controls, but credibility comes from repeatable evidence.
Distribution drift: Do we see measurable shifts in political content reach, misinformation prevalence, or moderation patterns post-transition? This is where “domestic leverage” would leave fingerprints.
For You feed monitoring: I will continue tracking changes in my own TikTok “For You” feed over time to identify shifts in political content exposure, narrative prevalence, and recommendation patterns following the USDS transition.
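To make “distribution drift” and feed monitoring concrete, here is a minimal sketch of how hand-coded For You observations could be logged and compared before and after the transition. The schema, category labels, and all counts below are hypothetical illustrations, not data; the comparison uses a standard two-proportion z-test, which is one defensible choice among several.

```python
import math
from dataclasses import dataclass

@dataclass(frozen=True)
class FeedObservation:
    """One hand-coded video from a logged 'For You' session (hypothetical schema)."""
    session: str   # ISO date of the viewing session
    position: int  # rank of the video within that session
    topic: str     # coarse analyst-assigned category, e.g. "politics", "music"

def political_count(observations, political_topics=frozenset({"politics", "policy"})):
    """Return (political, total) counts across a batch of observations."""
    total = len(observations)
    political = sum(1 for o in observations if o.topic in political_topics)
    return political, total

def two_proportion_z(x1, n1, x2, n2):
    """z-statistic for a change in prevalence between two samples (H0: no change)."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)                      # pooled proportion under H0
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Hypothetical counts: political items per 1,000 sampled videos, pre vs post USDS
z = two_proportion_z(48, 1000, 31, 1000)
print(round(z, 2))  # ≈ 1.95 here, but labeling bias and sample design dominate
```

A single observer's feed is a biased sample of one personalised distribution, so results like this are at best suggestive; the point is that drift claims should rest on logged observations and an explicit test, not on impressions.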
Closing
The TikTok USDS deal is best understood as a geopolitical compromise that tries to turn a national security dilemma into a governance and compliance framework. It likely reduces the most straightforward foreign-control threat model. (Reuters)
But my thesis argument remains relevant because the deeper risk is cognitive and systemic: attention steering, memetic amplification, and the ability for any motivated actor to exploit a high-engagement recommendation engine.
In other words: the battlefield may have changed owners, but it is still a battlefield.
References
McCabe, D. and Lindner, E. (2026) ‘TikTok deal with Oracle, ByteDance, China and US creates a new U.S. TikTok’, The New York Times, January 22. Available at: https://www.nytimes.com/2026/01/22/technology/tiktok-deal-oracle-bytedance-china-us.html
Reuters (2026) ‘TikTok seals deal for new US joint venture to avoid American ban’, Reuters, January 23. (Accessed: January 23 2026).
The Washington Post (2026) ‘TikTok says U.S. spinoff is finalized’, The Washington Post, January 22. (Accessed: January 23 2026).
The Guardian (2026) ‘TikTok announces it has finalized deal to establish US entity, sidestepping ban’, The Guardian, January 22. (Accessed: January 23 2026).
Rifesser, B. (2023) An Interdisciplinary Analysis of the Weaponisation of TikTok (Master’s thesis). Available at: https://www.researchgate.net/publication/373824227_An_Interdisciplinary_Analysis_of_the_Weaponisation_of_TikTok