JRE 1736 · June 27, 2024
Tristan Harris & Daniel Schmachtenberger
Who Are Tristan Harris & Daniel Schmachtenberger?
Tristan Harris is a former Google design ethicist, co-founder and president of the Center for Humane Technology, and co-host of the Center for Humane Technology’s "Your Undivided Attention" podcast with Aza Raskin. Daniel Schmachtenberger is a founding member of The Consilience Project, aimed at improving public sensemaking and dialogue.
Topics and Timestamps
- Tristan Harris and Daniel Schmachtenberger discuss how tech platforms are designed to be addictive and capture human attention as a commodity
- The conversation explores how social media algorithms amplify divisive content because engagement, not truth, drives ad revenue
- They explain the concept of 'race to the bottom' where platforms compete to be more addictive rather than more beneficial to society
- Harris and Schmachtenberger break down how attention economy mechanics are reshaping human psychology and collective decision-making
- The discussion covers solutions including regulatory frameworks, technological redesigns, and shifts in how we measure business success
- They address the gap between what tech executives know about their products' harms and what the public understands about manipulation tactics
- 0:05:00 — Harris explains how Google's business model created incentives for addictive design
- 0:15:30 — Discussion of algorithmic amplification of divisive content over truth
- 0:28:45 — Schmachtenberger breaks down the race to the bottom dynamic in tech
- 0:42:00 — Exploration of how attention economy metrics became a proxy for human value
- 0:58:15 — Discussion of potential solutions and regulatory frameworks needed
The Show
Tristan Harris and Daniel Schmachtenberger sit down with Joe to unpack one of the most critical issues of our time: how technology companies have turned human attention into the most valuable commodity on the planet. Harris, coming from his experience as a design ethicist at Google, knows intimately how these systems work from the inside. He explains how engineers are tasked with creating products that maximize engagement above all else, which inevitably means making things as addictive and compelling as possible.
The core problem they identify is the attention economy. Every platform, from Facebook to TikTok to YouTube, makes money through advertising. More engagement equals more data, more data equals better ad targeting, better targeting equals higher ad rates. So the financial incentive is completely divorced from whether the content is true, beneficial, or healthy for users. As Harris puts it, the business model is fundamentally misaligned with human wellbeing. Schmachtenberger adds another layer by discussing how this creates a 'race to the bottom' dynamic where platforms can't afford to be the responsible player because their competitors will just eat their lunch by being more addictive.
They dig into the specific mechanisms of manipulation that have become standard practice. Push notifications engineered to feel urgent. Infinite scroll designed to prevent natural stopping points. Algorithmic feeds that learn what content keeps you clicking and serve more of it, regardless of quality. Likes and comments creating dopamine loops. Every interaction is measured, tested, and optimized to keep you coming back. Harris describes how Facebook's internal research showed their platform was making people unhappy, but they continued designing it the same way because the engagement metrics worked.
One of the most damaging effects they discuss is how these systems amplify divisive content. Rage and outrage are highly engaging emotions. Content that makes you angry, afraid, or indignant gets shared more, commented on more, and watched longer. So algorithms naturally promote polarizing content over nuanced, balanced takes. This is reshaping not just individual psychology but collective decision-making on a societal level. We're literally training our attention systems to become more reactive and tribal.
Schmachtenberger emphasizes that this isn't a conspiracy or a secret plot. The executives building these systems genuinely believe they're creating good products. But the business model itself is the conspiracy. It incentivizes behavior that, when scaled to billions of users, produces predictable harms. The tragedy is that many of the people building these tools understand the problems better than anyone, but they're locked into systems they can't escape without destroying the company's value.
They discuss potential solutions, including regulatory approaches, but emphasize that regulation alone won't work if the underlying business model remains unchanged. Real solutions would require fundamentally rethinking how we measure success in tech, moving away from pure engagement metrics and toward metrics that capture genuine human flourishing. Harris talks about the work of the Center for Humane Technology in pushing for both industry changes and policy shifts, but stresses that individuals also have power to understand and resist the manipulation.
The conversation touches on the generational impacts, particularly on young people whose developing brains are being shaped by these attention-capture systems. It's not just wasted time but fundamentally altered psychology, attention spans, and social capacities.
Best Quotes
“The business model is fundamentally misaligned with human wellbeing”
— Tristan Harris & Daniel Schmachtenberger
From the JRE 1736 conversation with Tristan Harris & Daniel Schmachtenberger.
“Engagement metrics have nothing to do with truth or what's good for people”
— Joe Rogan
“These systems aren't secretly designed to harm you. The harm is a predictable side effect of the business model itself”
— Tristan Harris & Daniel Schmachtenberger
“Rage is the most engaging emotion for algorithms, so we're literally training ourselves to be more reactive”
— Joe Rogan
“The people building these tools often understand the problems better than anyone, but they're locked into systems they can't escape”
— Tristan Harris & Daniel Schmachtenberger
Mentioned in This Episode
Books, supplements, gear, and other cool things that came up in conversation — not the podcast ads.
Center for Humane Technology
Organization co-founded by Tristan Harris focused on addressing technology's negative impacts on human wellbeing and society.
Your Undivided Attention Podcast
Podcast co-hosted by Tristan Harris and Aza Raskin exploring technology, attention, and human flourishing.
The Consilience Project
Organization founded by Daniel Schmachtenberger aimed at improving public sensemaking and dialogue.