JRE 1736 · November 18, 2021

What China's Crackdown on Algorithms Means for the US

technology · politics · psychology · business · philosophy

Taken from JRE 1736 w/ Tristan Harris & Daniel Schmachtenberger:

Topics and Timestamps

  1. China's government crackdown on algorithms represents a fundamental shift in how authoritarian regimes control information and social behavior
  2. Tristan Harris and Daniel Schmachtenberger discuss how algorithmic systems are designed to manipulate human attention and behavior at scale
  3. The US tech industry operates under similar attention-capture mechanisms as China but without government oversight, creating a different but equally problematic control structure
  4. Algorithmic recommendation systems create filter bubbles and radicalization pipelines that fragment society and prevent constructive dialogue
  5. Both authoritarian and corporate control of algorithms pose existential risks to democracy and human autonomy
  6. The solution requires rethinking how we design incentive structures in technology rather than just regulating existing platforms

  • Harris explains why China's algorithm crackdown is significant (0:05:00)
  • Discussion of how algorithms create radicalization pipelines and filter bubbles (0:15:30)
  • Schmachtenberger breaks down the difference between corporate and state algorithmic control (0:28:45)
  • Conversation about redesigning incentive structures in tech instead of just regulation (0:42:00)
  • Discussion of algorithmic power as a threat to democracy and human autonomy (0:55:15)

The Show

Joe sits down with Tristan Harris and Daniel Schmachtenberger to break down what China's recent algorithmic crackdowns actually mean and why they matter for understanding tech power in America. This isn't just a foreign policy discussion; it's about recognizing that whether algorithms are controlled by the state or by Silicon Valley, the fundamental issue is the same: they're designed to capture and monetize human attention in ways that fragment reality and manipulate behavior.

Harris and Schmachtenberger make the case that China's government is essentially doing openly what US tech companies have been doing covertly for years. The difference is one of transparency. American platforms aren't controlled by explicit government mandate, but they operate under business models that require maximizing engagement at any cost. Recommended content that triggers outrage, fear, and tribal division keeps people scrolling. The algorithm doesn't care whether it's radicalizing someone or destroying their mental health; it only cares about watch time and clicks.

What makes China's move significant is that it signals recognition of algorithmic power as a tool of social control. The Chinese government looked at what these systems do and decided they couldn't allow them to operate without direct government supervision. In some ways, they're being more honest about the reality: algorithms are too powerful to be left to private interests alone. But their solution, state control, is obviously dystopian. The real problem is that we haven't seriously grappled with the fact that algorithmic control, whether corporate or governmental, is fundamentally incompatible with human autonomy and free society.

The conversation explores how recommendation algorithms create self-reinforcing information ecosystems where people only see content that confirms what they already believe. This isn't accidental. These systems are optimized to keep you in a state of engagement, which means showing you more of what already captured your attention. Over time, this pushes people toward increasingly extreme versions of their existing beliefs. Political polarization, the erosion of shared reality, the rise of conspiracy thinking: all of it is downstream of algorithmic recommendation systems designed to maximize engagement.
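That self-reinforcing loop lends itself to a toy sketch. Everything below is an illustrative assumption, not anything from the show: items live on a one-dimensional "viewpoint" axis, and a minimal recommender simply surfaces the items most similar to what the user already engaged with. Even this crude version narrows what the user sees over time:

```python
import random

def recommend(history, catalog, k=3):
    """Toy engagement-maximizing recommender: rank catalog items by
    similarity to the user's past engagement (distance on a 1-D
    "viewpoint" axis) and surface the k closest ones."""
    center = sum(history) / len(history)
    return sorted(catalog, key=lambda item: abs(item - center))[:k]

def simulate(steps=50, seed=0):
    """Feedback loop: the feed is built from past clicks, and the next
    click can only come from the feed."""
    rng = random.Random(seed)
    catalog = [rng.uniform(-1.0, 1.0) for _ in range(200)]  # item "viewpoints"
    history = [rng.choice(catalog)]  # user starts with one random click
    for _ in range(steps):
        feed = recommend(history, catalog)
        history.append(rng.choice(feed))  # user engages within the feed
    return history

history = simulate()
spread = lambda xs: max(xs) - min(xs)
# Late-stage clicks cluster in a narrow band of "viewpoints", even though
# the catalog spans roughly [-1, 1].
print(f"early spread: {spread(history[:10]):.2f}, "
      f"late spread: {spread(history[-10:]):.2f}")
```

The point of the sketch is that no step is malicious: each recommendation is locally reasonable ("more of what you liked"), but composing them into a loop collapses the user's exposure to a sliver of the catalog, which is the filter-bubble dynamic Harris and Schmachtenberger describe.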

Harris and Schmachtenberger argue that the fix isn't regulation that merely makes the same exploitative systems slightly more transparent; it's a fundamental redesign of the incentive structures. If platforms made money from helping people think clearly and build healthy relationships instead of from maximizing watch time, they'd look completely different. But that would require sacrificing the current business model, which means it won't happen voluntarily.

They also discuss how this ties into broader questions about complexity, coordination, and whether technological society can survive its own power. Algorithms are a clear example of a technology that gives enormous power to whoever controls it, and we've basically handed that power to a handful of corporations with profit motives and to authoritarian governments with control motives. Neither serves human flourishing. The conversation suggests we need to be thinking about these systems not just as products to regulate but as fundamental challenges to social order itself.

Best Quotes

The algorithm doesn't care if it's radicalizing someone, it only cares about engagement

From the JRE 1736 conversation with Tristan Harris & Daniel Schmachtenberger.

China is doing openly what tech companies in the US have been doing covertly for years

Joe Rogan

From the JRE 1736 conversation with Tristan Harris & Daniel Schmachtenberger.

Algorithmic systems are too powerful to be left to private interests or government control alone

From the JRE 1736 conversation with Tristan Harris & Daniel Schmachtenberger.

We've handed the most powerful tool for social influence to whoever can maximize engagement

Joe Rogan

From the JRE 1736 conversation with Tristan Harris & Daniel Schmachtenberger.

The real problem isn't the algorithm itself, it's the incentive structure that drives how it operates

What China's Crackdown on Algorithm's Means for the US

From the JRE 0 conversation with What China's Crackdown on Algorithm's Means for the US.