• In late September, a U.S. presidential executive order set the final terms for TikTok's American operations: its parent company, ByteDance, must divest, and control must pass to a U.S.-led investment group spearheaded by Oracle and Dell. With that, the long-running "Divest-or-Ban" standoff has finally been settled.

    Foreign-owned services are ubiquitous in the U.S., so why is the government taking such a hard line on TikTok in particular? The answer lies in its powerful user engagement and its opaque algorithmic "black box."

    In recent years, TikTok has consistently held a U.S. market share of over 25%. Its secret is an algorithm that precisely captures user interests from their behavior, then continually pushes highly relevant content. In this model, people receive an endless, personalized stream of information without needing to actively build social connections or follow accounts, creating an addictive and immersive experience.
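    To make that mechanism concrete, here is a minimal, self-contained sketch of interest-based ranking without a follow graph. It is not TikTok's actual algorithm; the topic vectors, update rule, and weights are all invented for illustration, only to show how watch behavior alone can steer what gets recommended next.

```python
# Illustrative sketch of behavior-driven ranking with no social graph.
# NOT TikTok's real system: the features, learning rate, and update rule
# are hypothetical, chosen only to show the general mechanism.
import numpy as np

def update_interest(interest: np.ndarray, video_topics: np.ndarray,
                    watch_ratio: float, lr: float = 0.1) -> np.ndarray:
    """Nudge the user's interest vector toward videos they watched longer."""
    signal = watch_ratio - 0.5          # positive if watched more than half, negative otherwise
    interest = interest + lr * signal * video_topics
    norm = np.linalg.norm(interest)
    return interest / norm if norm > 0 else interest

def rank_feed(interest: np.ndarray, candidates: dict[str, np.ndarray],
              k: int = 3) -> list[str]:
    """Score every candidate video by similarity to the interest vector."""
    scores = {vid: float(topics @ interest) for vid, topics in candidates.items()}
    return sorted(scores, key=scores.get, reverse=True)[:k]

# Toy example: a 4-dimensional topic space (e.g. sports, music, news, cooking).
user = np.array([0.25, 0.25, 0.25, 0.25])
user = update_interest(user, np.array([0.0, 1.0, 0.0, 0.0]), watch_ratio=0.9)  # finished a music clip
user = update_interest(user, np.array([0.0, 0.0, 1.0, 0.0]), watch_ratio=0.1)  # skipped a news clip

pool = {
    "music_b":   np.array([0.0, 0.9, 0.1, 0.0]),
    "news_b":    np.array([0.0, 0.0, 1.0, 0.0]),
    "cooking_a": np.array([0.0, 0.0, 0.0, 1.0]),
}
print(rank_feed(user, pool))   # the music clip rises to the top; no follows needed
```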

    However, this platform-monopolized "content curation power" also gives TikTok a formidable ability to shape public opinion and collective perception. In the long term, this could intensify social division and polarization; in the short term, it creates a significant risk of opinion manipulation. For instance, if tensions in the Taiwan Strait were to escalate, the platform could flood feeds with content suggesting "a majority of Taiwanese favor surrender," thereby steering public discourse.

    Concerned by this influence, U.S. lawmakers reached a rare bipartisan consensus that TikTok is effectively controlled by the Chinese Communist Party and therefore constitutes a national security risk, which ultimately drove this forced divestiture.

    Going forward, TikTok's U.S. data will be managed by Oracle. Updates and adjustments to its algorithm will be reviewed by a board of directors composed primarily of Americans, blocking foreign interference at its source.

    Notably, this controversy has also clarified an important legal principle: protecting free speech does not conflict with regulating the algorithmic amplification of that speech.

    First, anyone on TikTok, including accounts that praise the CCP, can still express themselves freely; that right remains protected. What the law targets is the amplification mechanism: the platform's algorithm *proactively* and *at scale* pushing specific, unsolicited content to users who have not subscribed to it.
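    A toy sketch can make that distinction concrete. The feed below keeps subscribed content untouched (the speech itself) and treats algorithmically injected recommendations as a separate, cappable layer (the amplification). The function names and the injection cap are hypothetical, chosen only to illustrate the two layers.

```python
# Hypothetical separation of "pull" (what the user subscribed to) from
# "push" (what the platform injects at scale). All names and the 30% cap
# are invented for illustration; this is not any platform's actual policy.
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    author: str

def build_feed(subscribed: list[Post], recommended: list[Post],
               feed_size: int = 10, max_injected_share: float = 0.3) -> list[Post]:
    """Fill the feed with subscribed posts first; injected posts are capped."""
    injected_budget = int(feed_size * max_injected_share)
    feed = subscribed[: feed_size - injected_budget]   # pull: content the user asked for
    feed += recommended[:injected_budget]              # push: the amplification layer
    return feed[:feed_size]

subs = [Post(f"s{i}", "followed_creator") for i in range(10)]
recs = [Post(f"r{i}", "unknown_creator") for i in range(10)]
feed = build_feed(subs, recs)
print(sum(p.author == "unknown_creator" for p in feed))  # at most 3 of 10 slots are injected
```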

    However, these platform power structures are not unshakable. The rise of generative AI gives users a new opportunity to reclaim their "content curation power," and platforms such as Bluesky and X are already exploring such applications. In the future, you will be able to give a platform a direct instruction, such as "show me more content like A and less like B," and it will instantly generate a new feed tailored to you.
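    Here is a minimal, self-contained sketch of that idea: the user, not the platform, supplies the ranking objective ("more like A, less like B"). A real system would use a language model or learned embeddings; the toy bag-of-words similarity below is only a stand-in, and all names and example data are invented.

```python
# User-directed feed steering: re-rank candidates by the user's own
# instruction instead of the platform's engagement objective.
# The embed() function is a deliberately crude stand-in for a real model.
from collections import Counter
import math

def embed(text: str) -> Counter:
    """Toy stand-in for a real embedding model: a bag-of-words vector."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def steer_feed(candidates: dict[str, str], more_like: str, less_like: str) -> list[str]:
    """Rank each candidate by similarity to 'more_like' minus similarity to 'less_like'."""
    like_vec, dislike_vec = embed(more_like), embed(less_like)
    def score(desc: str) -> float:
        v = embed(desc)
        return cosine(v, like_vec) - cosine(v, dislike_vec)
    return sorted(candidates, key=lambda cid: score(candidates[cid]), reverse=True)

videos = {
    "v1": "street food cooking tour in Tainan",
    "v2": "celebrity gossip and drama recap",
    "v3": "home cooking recipe for beef noodle soup",
}
print(steer_feed(videos, more_like="cooking and food", less_like="celebrity gossip"))
# -> the two cooking clips rank above the gossip clip
```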

    When content recommendation is no longer captive to a single, closed black box but can instead be driven by a model truly personalized to each user, the market will open up to more diverse options. At that point, the monopoly that short-form video holds on our attention could very well be broken.

  • (Interview and Compilation by Yu-Tang You. License: CC BY 4.0)