JRE 1558 · October 30, 2020
Tristan Harris
Who is Tristan Harris?
Called the “closest thing Silicon Valley has to a conscience” by The Atlantic, Tristan Harris spent three years as a Google Design Ethicist developing a framework for how technology should “ethically” steer the thoughts and actions of billions of people from screens. He is now co-founder & president of the Center for Humane Technology, whose mission is to reverse ‘human downgrading’ and re-align technology with humanity. He also co-hosts the Center for Humane Technology’s Your Undivided Attention podcast with co-founder Aza Raskin.
Topics and Timestamps
- Tristan Harris explains how tech companies use persuasive design to manipulate user behavior and attention
- The concept of ‘human downgrading’ - how technology is deliberately engineered to exploit psychological vulnerabilities
- Silicon Valley's business model is fundamentally based on capturing and monetizing human attention through addiction mechanics
- Social media platforms use variable rewards and infinite scroll to create compulsive usage patterns similar to slot machines
- Harris discusses the Center for Humane Technology's mission to redesign technology with human wellbeing as the primary goal
- The ethical implications of having a small group of engineers in California shaping the thoughts and behaviors of billions worldwide
- 0:02:15 · Harris explains how he became a Design Ethicist at Google and what that actually means
- 0:12:40 · Discussion of how notification systems are specifically engineered to exploit psychological vulnerabilities
- 0:18:30 · Harris describes the slot machine mechanics of social media infinite scroll and variable rewards
- 0:34:20 · Joe and Tristan discuss mental health correlations with social media adoption in teenagers
- 0:48:15 · Harris explains the Center for Humane Technology and what systemic change would actually require
The Show
Tristan Harris sits down with Joe to break down one of the most important conversations in tech today: how the apps and platforms we use every day are designed to hijack our attention and alter our behavior. Harris spent three years at Google as a Design Ethicist, which means he was inside the machine, watching how these companies engineer addiction into their products.
The core argument is simple but disturbing. Tech companies don't make money from you being happy or informed. They make money when you're engaged, clicking, scrolling, and most importantly, staring at ads. So every feature, every notification, every red badge on your app is scientifically designed to exploit how human psychology actually works. Variable rewards, infinite scroll, the fear of missing out - these aren't accidents. They're weapons.
Harris talks about how Facebook, YouTube, Instagram, and TikTok operate like slot machines. You pull down to refresh and you don't know what you're going to get, which creates this compulsive checking behavior. Notification systems are engineered to interrupt you at the moments you're least resistant. Read receipts, typing indicators - seemingly small features that actually create anxiety and social pressure. It's all intentional.
The heaviest part of the conversation is the concept of ‘human downgrading.’ We're not becoming smarter or more connected in meaningful ways. We're being downgraded - our attention spans are shrinking, our ability to have deep conversations is declining, and our mental health metrics are getting worse, especially among young people. And this is happening because some of the smartest engineers on the planet are being paid enormous salaries to figure out how to keep us hooked.
Harris founded the Center for Humane Technology because he realized the problem isn't individual users lacking willpower. The problem is that we're all fighting algorithms designed by teams of PhDs with unlimited resources and data about how our brains work. You can't willpower your way out of that. The only solution is to redesign the technology itself to align with human flourishing instead of advertising dollars.
Joe and Tristan dig into the implications of having a relatively small group of people in Silicon Valley essentially controlling the cognitive environment for billions of humans. There's no democratic process. There's no regulation. There's just quarterly earnings reports driving feature decisions that rewire our dopamine systems. The scariest part? Most of these companies genuinely believe they're making the world better while their engagement metrics directly correlate with increases in teen depression, anxiety, and suicide rates.
Best Quotes
“These are not accidental features. They're the result of the smartest engineers at the world's wealthiest companies working to maximize engagement”
— Tristan Harris
From the JRE 1558 conversation with Tristan Harris.
“We're not just spending time on these platforms. We're being downgraded as human beings in terms of our ability to focus, think deeply, and have meaningful relationships”
— Joe Rogan
“The business model is: if you're not paying for the product, you are the product. Your attention is being sold to advertisers”
— Tristan Harris
“You can't blame individual users for lacking willpower when they're fighting algorithms designed by teams of PhDs with unlimited resources and data about human psychology”
— Joe Rogan
“We've outsourced the control of human attention to a handful of companies in Silicon Valley, and there's no democratic process, no regulation, nothing”
— Tristan Harris