TikTok has entered a new era in the United States, but it’s hardly a less risky one.
Last week, the company disclosed the contours of a deal intended to allow the platform to continue operating in the United States, bringing it into compliance with a 2024 US law. The arrangement appears largely consistent with the framework reportedly negotiated between US and Chinese officials last fall. Under the proposed structure, a newly created entity called TikTok USDS Joint Venture would assume responsibility for data security and content moderation, with US investors—including the software company Oracle—holding majority control while ByteDance remains the largest single shareholder at 19.9 percent. TikTok’s existing US-based companies would retain control over the platform’s commercial operations, including advertising, e-commerce, and marketing. While the ownership of TikTok’s recommendation algorithm is not explicitly addressed in the latest announcement, a December memo from TikTok CEO Shou Zi Chew indicated that ByteDance would keep ownership of the algorithm’s intellectual property and license it to the joint venture for a fee.
The deal has been framed by some officials and commentators as a meaningful step toward addressing long-standing US concerns about People’s Republic of China (PRC) information manipulation, foreign influence, and data security. In practice, it does little to alter the underlying risks that animated the debate during the previous US administration.
On disinformation and influence operations, the deal is unlikely to be transformative. As we argued in a 2024 report examining TikTok’s national security implications, Beijing’s ability to conduct influence operations does not depend on ownership of a single platform. While the Chinese Communist Party (CCP) could theoretically attempt to shape content via TikTok’s recommendation algorithm, it already engages in influence campaigns across US-based social media platforms and will continue to do so even with TikTok’s structural reorganization. Restricting TikTok does not dismantle the broader information ecosystem in which foreign influence campaigns operate.
The data security case is even more revealing. The type of data generated by TikTok is not fundamentally different from that collected across the digital advertising ecosystem, which over the past decade has evolved into a system capable of extremely granular micro-targeting. Data brokers routinely aggregate information from mobile advertising identifiers, cookies, location data, and online activity to build detailed dossiers on individuals. Although these identifiers are often described as “anonymized,” it is widely understood that combining multiple datasets makes re-identification fairly straightforward.
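To make the re-identification point concrete, the sketch below is a minimal Python illustration, with invented column names and toy records, of how an "anonymized" advertising dataset can be joined to a public record set on a handful of quasi-identifiers; wherever the combination of ZIP code, birth date, and gender is unique, a name attaches to the supposedly anonymous profile.

```python
# Minimal sketch of linkage-based re-identification. All column names and
# records are hypothetical; the point is the join, not the data.
import pandas as pd

# Dataset 1: an advertising profile stripped of names ("anonymized") but
# keyed to a mobile advertising identifier and quasi-identifiers.
ad_profiles = pd.DataFrame({
    "mobile_ad_id": ["a1f3", "b7c9", "d2e8"],
    "zip": ["20301", "20301", "10001"],
    "birth_date": ["1985-03-02", "1990-07-19", "1985-03-02"],
    "gender": ["M", "F", "M"],
    "visited_locations": [["VA hospital", "gym"], ["shopping mall"], ["protest site"]],
})

# Dataset 2: a purchased or public record set (e.g., a voter file) that
# lists names alongside the same quasi-identifiers.
voter_file = pd.DataFrame({
    "name": ["J. Smith", "A. Jones", "K. Lee"],
    "zip": ["20301", "20301", "10001"],
    "birth_date": ["1985-03-02", "1990-07-19", "1985-03-02"],
    "gender": ["M", "F", "M"],
})

# Joining on the shared quasi-identifiers attaches a name to each
# "anonymous" profile wherever the combination is unique in both sets.
reidentified = ad_profiles.merge(voter_file, on=["zip", "birth_date", "gender"])
print(reidentified[["name", "mobile_ad_id", "visited_locations"]])
```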
This ecosystem enables the creation of highly specific audience segments—such as military personnel with financial vulnerabilities, politically active voters, or individuals likely to participate in protests—drawing on data that includes location histories, credit card transactions, employment records, social media activity, and government filings. Investigations by civil society organizations and journalists have repeatedly demonstrated how easy it is to access such data, often with minimal vetting, and how readily it could be exploited by foreign intelligence services or malign actors.
Importantly, access to this data is not confined to fringe actors. Major US technology platforms continue to earn significant revenue from foreign advertisers, including Chinese firms, even as they attempt to place guardrails on data flows. While companies such as Google have introduced measures to limit the sharing of certain identifiers with Chinese entities, advertising experts note that these restrictions are often porous. Once an ad is served, advertisers can still collect sensitive information, such as IP addresses and device characteristics, and real-time bidding systems offer no technical guarantee that data will not be misused after it is received.
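Real-time bidding makes this dynamic concrete. The sketch below is a hypothetical Python rendering of the kind of bid request an ad exchange broadcasts to every participating bidder before an auction is decided; the field names are loosely modeled on the OpenRTB convention, and the values are invented. Every bidder that receives the request sees the user's IP address, advertising identifier, device details, and location, whether or not it wins, and nothing in the protocol itself prevents a losing bidder from storing or reselling what it received.

```python
# Hypothetical bid request, loosely modeled on OpenRTB-style fields.
# All values are invented; the point is what every bidder gets to see.
import json

bid_request = {
    "id": "auction-8f2c",                          # one auction, sent to many bidders
    "app": {"bundle": "com.example.newsapp"},      # where the ad slot lives
    "device": {
        "ip": "203.0.113.42",                      # user's IP address
        "ifa": "a1f3c9d2-0000-4b6e-9f1a-7c2d5e8b3a10",      # mobile advertising identifier
        "ua": "Mozilla/5.0 (iPhone; CPU iPhone OS 17_0)",   # device characteristics
        "geo": {"lat": 38.8977, "lon": -77.0365},  # precise location
    },
    "user": {"id": "exchange-user-77d1"},          # exchange-assigned user ID
}

print(json.dumps(bid_request, indent=2))
```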
Compared to this sprawling and still inadequately regulated market, TikTok’s data practices are not uniquely dangerous. Focusing narrowly on this one app risks obscuring the far more consequential vulnerabilities embedded in the broader data economy.
Finally, it is worth underscoring how little ByteDance has conceded in the deal. If ByteDance has in fact licensed the algorithm, as subsequent reporting has indicated, the company has preserved control over its most valuable intellectual property. The principal concession, ceding majority ownership of the entity overseeing data security, imposes limited strategic costs.
In addition, depending on how the licensing agreement is structured, this arrangement could still leave room for PRC influence over the algorithm, though exerting it would likely be harder than if ByteDance retained full ownership. The licensed algorithm is a continuously trained system shaped by design choices, training data, model updates, and operational parameters. If ByteDance retains control over that core intellectual property, the PRC government could, in theory, exert some influence over how the system evolves, even if day-to-day content moderation and data security oversight are localized. Once further details of the licensing agreement are released, this risk will be better understood.
At the same time, it is important not to overstate what that influence could look like in practice. Rather than eliminating the risk of manipulation, this structure redistributes it among a different set of actors. Algorithmic manipulation is unlikely to take the form of overt, platform-wide promotion of pro-CCP content. Should manipulation occur, it would more likely involve subtle interventions that are difficult to attribute to PRC influence or to distinguish from the recommender system's ordinary behavior on US user data. This is especially true now, as the handover gets under way and the algorithm is retrained from scratch on US user data; in the short term, the app could surface highly variable content while the system learns what users want and curates their “For You” page accordingly.
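As a rough illustration of that cold-start dynamic, the toy Python sketch below (hypothetical content categories, simulated engagement) runs a trivially simple recommender with no interaction history: its early picks are effectively random, and they settle only as click data accumulates, which is one reason a freshly retrained system can surface volatile content at first.

```python
# Toy cold-start recommender. Categories, preferences, and numbers are
# all invented; the point is that choices are random until data accrues.
import random
from collections import defaultdict

categories = ["cooking", "politics", "sports", "music"]
clicks, impressions = defaultdict(int), defaultdict(int)

def recommend():
    # Score each category by observed click rate; with no history every
    # score is zero, so the pick is a coin flip among all categories.
    scores = {c: (clicks[c] / impressions[c] if impressions[c] else 0.0)
              for c in categories}
    best = max(scores.values())
    return random.choice([c for c, s in scores.items() if s == best])

# Simulate a user who actually prefers "music": early recommendations are
# scattered; they stabilize only once real engagement signals accumulate.
for step in range(200):
    shown = recommend()
    impressions[shown] += 1
    if shown == "music" and random.random() < 0.6:
        clicks[shown] += 1
    if step in (0, 20, 199):
        print(f"step {step}: showed {shown!r}")
```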