AI transcript
0:00:04 I am thrilled to announce I’m launching a brand new show,
0:00:07 Bird’s Eye View, the definitive WNBA podcast.
0:00:10 Every week, we’ll dig into the WNBA stories
0:00:11 that actually matter with guest interviews,
0:00:14 candid takes, and in-depth analysis from around the league.
0:00:16 It’s a show I’ve wanted to make for a while,
0:00:18 and I’m so excited it’s finally happening.
0:00:20 Whether you’re new to the WNBA or a longtime fan,
0:00:21 pull up.
0:00:22 This show is for you.
0:00:24 Bird’s Eye View is coming May 16th.
0:00:25 Follow the show on YouTube
0:00:27 or wherever you listen to your podcasts.
0:00:33 Support for this show comes from ServiceNow,
0:00:36 a company that helps people do more fulfilling work,
0:00:38 the work they actually want to do.
0:00:40 You know what people don’t want to do?
0:00:41 Boring, busy work.
0:00:44 But ServiceNow says that with their AI agents
0:00:46 built into the ServiceNow platform,
0:00:49 you can automate millions of repetitive tasks
0:00:50 in every corner of a business.
0:00:54 IT, HR, customer service, and more.
0:00:57 And the company says that means your people
0:00:59 can focus on the work that they want to do.
0:01:02 That’s putting AI agents to work for people.
0:01:03 It’s your turn.
0:01:08 You can get started at servicenow.com/ai-agents.
0:01:16 We’re told from a young age to achieve.
0:01:17 Get good grades.
0:01:19 Get into a good school.
0:01:20 Get a good job.
0:01:22 Be ambitious about earning a high salary
0:01:24 or a high status position.
0:01:28 Some of us love this endless climb.
0:01:31 But lots of us, at least once in our lives,
0:01:33 find ourselves asking,
0:01:35 what’s the point of all this ambition?
0:01:37 The fat salary or the fancy title?
0:01:40 Aren’t those pretty meaningless measures of success?
0:01:46 One proposed solution is to stop being ambitious
0:01:48 and start being idealistic instead.
0:01:51 You hear this from a lot of influencers.
0:01:52 Follow your passion.
0:01:54 Small is beautiful.
0:01:57 The idea is that you should drop out of the capitalist rat race
0:01:58 and do what you love.
0:02:00 Yoga, maybe.
0:02:01 Or watercolor painting.
0:02:05 Even if it makes very little positive impact on the world.
0:02:09 But what if instead of trying to be less ambitious,
0:02:14 we try to be more ambitious about the things that really matter?
0:02:16 Like helping others.
0:02:19 In an era when there’s so much chaos, injustice,
0:02:22 and frankly, a feeling of widespread despair,
0:02:24 it’s worth asking.
0:02:27 What would the world look like if we start measuring our success,
0:02:29 not in terms of fame or fortune,
0:02:32 but in terms of how much good we do?
0:02:38 I’m Sigal Samuel, and this is The Gray Area.
0:02:45 Today’s guest is historian and author Rutger Bregman.
0:02:50 He’s probably best known for what he yelled at policymakers at Davos a few years ago.
0:02:52 Taxes, taxes, taxes.
0:02:55 He’s tried to get billionaires to pay their fair share in taxes,
0:02:59 and he’s also argued for other policies that could make life better for everyone,
0:03:02 like a universal basic income.
0:03:07 Now, he’s written a new book called Moral Ambition,
0:03:10 which urges us to stop wasting our talents on meaningless work
0:03:13 and start trying to do more good for the world.
0:03:17 He wants us to be both ambitious and idealistic,
0:03:21 to devote ourselves to solving the world’s biggest problems,
0:03:25 like malaria and pandemics and climate change.
0:03:29 I invited Rutger on the show because I find his message inspiring.
0:03:34 And, to be honest, I also have some questions about it.
0:03:37 I want to dedicate myself to work that feels meaningful,
0:03:41 but I’m not sure that work that helps the greatest number of people
0:03:43 is the only way to do that.
0:03:47 So in this conversation, we’ll explore all the different things
0:03:48 that can make our lives feel meaningful
0:03:52 and ask, are some objectively better than others?
0:03:58 Hey, Rutger, welcome to the show.
0:04:00 Thanks for having me. Good to see you.
0:04:02 Your book is called Moral Ambition.
0:04:05 Why should people be morally ambitious?
0:04:10 My whole career, I’ve been fascinated with the waste of talent
0:04:13 that is going on in modern economies.
0:04:17 There’s this one study from two Dutch economists
0:04:18 done a couple of years ago,
0:04:23 and they estimate that around 25% of all workers
0:04:27 think that their own job is socially meaningless,
0:04:30 or at least doubt the value of their job.
0:04:33 That is just insane to me.
0:04:35 I mean, this is five times the unemployment rate.
0:04:39 And we’re talking about people who have often excellent resumes,
0:04:41 you know, who went to very nice universities.
0:04:45 I’m going to Harvard tomorrow to speak to students there.
0:04:47 And, well, it’s an interesting case in point.
0:04:51 45% of Harvard graduates end up in consultancy or finance.
0:04:54 Not saying all of that is totally socially useless,
0:04:58 but I do wonder whether that is the best allocation of talent.
0:05:01 And as you know, we face some pretty big,
0:05:03 obvious problems out there,
0:05:05 whether it’s, you know, the threat of the next pandemic
0:05:07 that may be just around the corner.
0:05:10 Terrible diseases like malaria and tuberculosis
0:05:11 killing millions of people.
0:03:15 The problem of democracy breaking down.
0:05:17 I mean, the list goes on and on and on.
0:05:20 And so I’ve always been frustrated
0:05:23 by this enormous waste of talent.
0:05:27 Now, I’m not saying that morality should suck up everything.
0:05:29 I’m personally a pluralist.
0:05:32 I think that there are many things that are important in life,
0:05:34 you know, family, friends, music, art.
0:05:37 And you don’t want to let morality dominate everything.
0:05:40 But I think in a rich, well-rounded life,
0:05:42 it does play an important role.
0:05:44 And if we’re going to have a career anyway,
0:05:46 we might as well do a lot of good with it.
0:05:49 What about that question specifically about,
0:05:51 you know, someone comes to you and says,
0:05:52 I’m a third grade teacher.
0:05:54 I’m a social worker.
0:05:57 Am I not being morally ambitious enough?
0:06:00 So half of the country already works
0:06:02 in these so-called essential jobs.
0:06:04 We discovered that during the pandemic,
0:06:07 that, you know, when some people go on strike,
0:06:07 we’re in real trouble.
0:06:10 So my point here is that half of the country
0:06:12 doesn’t need a lecture from me
0:06:14 about being morally ambitious.
0:06:15 They’re already working in essential jobs.
0:06:18 I’m indeed more interested in preaching
0:06:20 to my own people,
0:06:24 to honestly quite a few of my friends.
0:06:26 We used to have big ideals and dreams
0:06:28 when we were still in university.
0:06:31 You know, we wrote these beautiful application essays
0:06:33 about how we were going to fix
0:06:35 tax avoidance and tax evasion,
0:06:37 how we were going to tackle global hunger
0:06:39 and work at the United Nations
0:06:40 and look at us.
0:06:41 What has happened?
0:06:43 It’s pretty sad, isn’t it?
0:06:45 Now we’re old and wrinkled and complacent.
0:06:47 Yeah, yeah, yeah.
0:06:50 Something has gone wrong, I would say.
0:06:53 So that doesn’t mean that I don’t think
0:06:54 anyone can be morally ambitious.
0:06:57 Rosa Parks was a seamstress.
0:06:59 Lech Wałęsa, you know,
0:07:01 the great social revolutionary in Poland.
0:07:04 He was an electrician.
0:07:06 So, I mean, history is littered with examples
0:07:08 of people who weren’t very privileged
0:07:10 and still did a lot of good.
0:07:13 But they don’t need a lecture from me, I think.
0:07:16 I’m mainly talking to people
0:07:18 who shouldn’t just check their privilege,
0:07:21 but also use that privilege
0:07:22 to make a massive difference.
0:07:26 What role does personal passion play in that?
0:07:27 You write in the book,
0:07:29 don’t start out by asking,
0:07:30 what’s my passion?
0:07:32 Ask instead, how can I contribute most?
0:07:34 And then choose the role that suits you best.
0:07:35 Don’t forget,
0:07:37 your talents are but a means to an end.
0:07:39 Yeah, I think follow your passion
0:07:41 is probably the worst career advice out there.
0:07:44 We, at the School for Moral Ambition,
0:07:45 an organization I co-founded,
0:07:48 we deeply believe in the Gandalf-Frodo model
0:07:49 of changing the world.
0:07:51 So I always like to say that
0:07:52 Frodo, you know,
0:07:54 he didn’t follow his passion.
0:07:55 Gandalf never asked him,
0:07:57 oh, what’s your passion, Frodo?
0:07:58 He said, look,
0:08:00 this really needs to be done.
0:08:01 This needs to be fixed.
0:08:02 You got to throw the ring into the mountain.
0:08:05 If Frodo would have followed his passion,
0:08:08 he would have probably, you know,
0:08:09 been a gardener,
0:08:10 having a life, you know,
0:08:11 full of second breakfasts,
0:08:13 pretty comfortable in the Shire,
0:08:15 and then the orcs would have turned up
0:08:16 and murdered everyone he ever loved.
0:08:19 So I think the point here is pretty simple.
0:08:22 Find yourself some wise old wizard,
0:08:23 a Gandalf.
0:08:27 Figure out what are some of the most pressing issues
0:08:28 that we face as a species
0:08:29 and ask yourself,
0:08:30 how can I make a difference?
0:08:32 And then you will find out
0:08:33 that you can become
0:08:34 very passionate about it.
0:08:37 It’s just don’t start
0:08:39 with looking at your navel
0:08:39 and thinking,
0:08:41 oh, what is it for me?
0:08:44 Just ask smart people out there
0:08:45 and become passionate
0:08:46 about what they say.
0:08:47 So you’re saying,
0:08:49 do the work first,
0:08:51 trust that the passion will come later?
0:08:52 Absolutely, yeah.
0:08:54 And I’ve got a couple of examples
0:08:55 of that in the book.
0:08:59 One school I’ve got a whole chapter on
0:09:01 is called Charity Entrepreneurship.
0:09:03 They’ve since rebranded
0:09:04 as Ambitious Impact,
0:09:06 but it’s a school that I like to describe
0:09:08 as the Hogwarts for do-gooders.
0:09:10 So they recruit
0:09:14 really driven entrepreneurial people
0:09:15 who want to start
0:09:17 a highly effective nonprofit.
0:09:19 And they continuously
0:09:21 research this question.
0:09:22 It’s called prioritization research,
0:09:23 thinking about,
0:09:23 yeah,
0:09:25 what are some of the most pressing issues
0:09:25 we face?
0:09:28 And then they find these founders
0:09:29 of these nonprofits,
0:09:31 and they basically match the founders
0:09:32 not only with each other
0:09:34 so that you have a co-founder,
0:09:36 but also with these tasks, right?
0:09:38 You basically get a mission.
0:09:40 And one of the most successful charities
0:09:41 they’ve launched
0:09:41 is called
0:09:43 the Lead Exposure Elimination Project.
0:09:44 I believe you guys
0:09:45 have also written about them.
0:09:45 That’s right.
0:09:47 One of the co-founders
0:09:47 is Lucia Coulter.
0:09:49 She used to be a doctor
0:09:49 at the NHS.
0:09:51 Loved her work.
0:09:52 But at the same time,
0:09:53 she was like,
0:09:55 can’t I do more good, right?
0:09:57 I’m currently working as a doctor
0:09:58 in a very rich country,
0:10:00 mostly treating patients
0:10:02 who are already relatively old.
0:10:03 It’s beautiful work,
0:10:04 but I want to do more good.
0:10:06 And you should talk to her now.
0:10:06 I mean,
0:10:08 she’s incredibly passionate
0:10:09 about the work she does.
0:10:10 OK, but so that’s a good example.
0:10:11 So it’s not that
0:10:13 she completely ditched
0:10:14 what she was already doing
0:10:16 and her existing passions, right?
0:10:17 She found a way
0:10:17 to take her passion
0:10:19 for health care
0:10:20 or for global health
0:10:21 and sort of
0:10:23 put it on a different scale,
0:10:24 but still using
0:10:26 her existing core passion
0:10:27 and skill set.
0:10:28 That’s a good point.
0:10:31 Maybe we got to be passionate
0:10:32 on a meta level,
0:10:32 you know,
0:10:34 about our higher level goals.
0:10:36 You can be really passionate
0:10:37 about making the world
0:10:37 a better place,
0:10:38 helping a lot of people,
0:10:39 improving,
0:10:40 global health,
0:10:41 something like that.
0:10:43 But it’s quite risky
0:10:44 if you get too attached
0:10:46 to a certain intervention
0:10:47 or something like that.
0:10:49 I think that’s a very sure way
0:10:51 of massively limiting your impact.
0:10:52 And you see it a lot, sadly.
0:10:54 I’ve been walking around
0:10:55 in the world of philanthropy
0:10:56 for the past two years
0:10:58 and it just drives me nuts
0:11:00 how many of these rich people
0:11:01 are all the time,
0:11:02 you know,
0:11:03 they’re gazing at their navel.
0:11:03 And like,
0:11:05 you don’t have to come up
0:11:06 with the answer yourself.
0:11:07 The research has already been done,
0:11:08 right?
0:11:12 Why do you have to be the one,
0:11:12 you know,
0:11:14 who needs to have this epiphany
0:11:14 about,
0:11:15 oh!
0:11:16 Right.
0:11:17 It’s the pandas
0:11:18 in this specific region
0:11:19 that really need our help.
0:11:22 There are already Gandalfs
0:11:22 and Dumbledores
0:11:24 working on it for you,
0:11:24 figuring it out.
0:11:25 Exactly, exactly.
0:11:27 And it takes a team
0:11:28 to make a big difference.
0:11:31 I think it can be
0:11:32 quite liberating as well
0:11:33 to not have to fight
0:11:34 your passion anymore.
0:11:35 I speak to quite a few
0:11:38 teenagers and people
0:11:39 in their 20s
0:11:41 about what they should do
0:11:41 with their career
0:11:43 and a lot of them
0:11:45 find a lot of relief
0:11:46 in this message
0:11:47 that they don’t have
0:11:48 to find their passion.
0:11:49 That there are other people
0:11:50 out there
0:11:51 who have a job
0:11:52 for them to do, right?
0:11:53 That they can just
0:11:53 sign up for it.
0:11:55 Interesting.
0:11:56 In your book,
0:11:59 there is one Venn diagram
0:12:00 that caught my eye.
0:12:01 It’s, you know,
0:12:02 these three circles.
0:12:04 The first is labeled sizable,
0:12:06 the second is solvable,
0:12:08 and the third is sorely overlooked.
0:12:09 And in the middle
0:12:10 where they all overlap,
0:12:12 it says moral ambition.
0:12:13 Explain that to me.
0:12:14 What does that mean?
0:12:15 Yeah, so this is
0:12:16 the triple S framework
0:12:17 of making the world
0:12:18 a wildly better place.
0:12:20 And it’s connected
0:12:22 to this simple point
0:12:23 that choosing the cause
0:12:25 you work on
0:12:26 is probably
0:12:27 the most important question
0:12:28 you’ve got to answer.
0:12:29 And so,
0:12:30 at the School for Moral Ambition,
0:12:32 we work with this framework
0:12:35 in selecting these causes.
0:12:37 Take something like
0:12:38 climate change, for example.
0:12:39 Climate change is obviously
0:12:41 a very sizable problem.
0:12:41 It’s very big.
0:12:44 Threatens a lot of people.
0:12:46 It’s also very solvable, right?
0:12:47 We know what we can do.
0:12:49 We’ve got a huge toolbox,
0:12:50 a lot of solutions out there
0:12:51 that are waiting
0:12:52 to be implemented.
0:12:54 And then the question is,
0:12:56 is it also sorely neglected?
0:12:57 And the good news here
0:12:59 is less and less so.
0:12:59 You could ask yourself,
0:13:01 what was the best time
0:13:02 to be a climate activist?
0:13:03 And the answer is not now.
0:13:05 30 years ago.
0:13:05 Exactly.
0:13:06 That was the moment.
0:13:08 So if you, again,
0:13:09 want to maximize your impact,
0:13:10 if you want to ask
0:13:11 the morally ambitious question,
0:13:12 then the question is,
0:13:13 okay,
0:13:15 what would the climate activists
0:13:16 of the 70s
0:13:17 have done today, right?
0:13:19 Or what is the problem
0:13:20 that’s currently
0:13:20 where climate change
0:13:22 was in the 1970s?
0:13:23 You see what I mean?
0:13:27 That is an entrepreneurial way
0:13:29 of looking at doing good.
0:13:31 You are really looking
0:13:31 for the gap in the market.
0:13:32 You could also do that
0:13:34 within a cost area,
0:13:34 by the way.
0:13:36 So if you look at climate change,
0:13:37 then you can think,
0:13:38 okay,
0:13:40 what is the part of the problem
0:13:40 that is currently
0:13:41 most neglected?
0:13:42 Okay,
0:13:43 so looking at the neglected
0:13:44 or sorely overlooked,
0:13:45 looking at the solvable
0:13:47 and looking at the sizable.
0:13:48 I do wonder about
0:13:50 the sizable part of that.
0:13:51 Does moral ambition
0:13:53 always have to be about scale?
0:13:55 Yeah,
0:13:55 I think so.
0:13:55 Yeah.
0:13:56 Yeah.
0:13:57 It’s about making
0:13:58 the biggest possible impact.
0:14:00 And if you can achieve
0:14:01 your goals
0:14:02 during your lifetime,
0:14:02 then you’re probably
0:14:04 not thinking big enough.
0:14:05 Look,
0:14:06 I’m not saying
0:14:06 that everyone
0:14:07 has to be morally ambitious
0:14:08 or something like that.
0:14:09 I’m not like
0:14:11 preaching with my finger
0:14:11 and saying,
0:14:11 oh,
0:14:12 if you don’t live
0:14:13 this kind of life,
0:14:14 you’re a bad person.
0:14:15 I am saying,
0:14:18 if you are ambitious anyway,
0:14:19 you know,
0:14:21 why not redirect that energy
0:14:22 to do a lot of good?
0:14:23 I think it will make your life
0:14:24 much more meaningful.
0:14:25 If you’re going to have
0:14:26 a burnout anyway,
0:14:27 you know,
0:14:27 you might as well
0:14:28 get that burnout
0:14:30 while you help
0:14:30 a lot of people,
0:14:31 right?
0:14:33 And the same is true
0:14:34 for some people
0:14:36 who are very idealistic
0:14:36 but not very ambitious.
0:14:37 Like,
0:14:38 wouldn’t it be nice
0:14:39 to actually achieve a lot?
0:14:40 I mean,
0:14:41 I personally come
0:14:42 from the political left
0:14:43 and,
0:14:45 yeah,
0:14:45 there’s this weird
0:14:46 leftist obsession
0:14:48 with being pure
0:14:48 and irrelevant,
0:14:50 right?
0:14:52 Calling out a lot of people,
0:14:53 winning the debate
0:14:54 in the group chat,
0:14:55 but not actually
0:14:55 making a difference
0:14:57 for the people you say
0:14:57 you care so much about.
0:14:58 I think that’s
0:14:59 what you call in the book
0:15:00 the noble loser,
0:15:01 right?
0:15:01 Yeah,
0:15:01 yeah,
0:15:02 yeah,
0:15:02 yeah.
0:15:04 But I guess
0:15:04 what I’m wondering is,
0:15:05 do you believe
0:15:06 that there is sort of
0:15:07 a moral imperative
0:15:09 to do the most good
0:15:10 you possibly can do
0:15:11 to have the most impact,
0:15:12 the most scale?
0:15:14 Well,
0:15:15 obviously at some point
0:15:17 you’ve done enough.
0:15:19 I talk about
0:15:20 Thomas Clarkson,
0:15:21 my favorite abolitionist.
0:15:23 He was
0:15:25 a British writer
0:15:25 and activist
0:15:28 and when he was 25
0:15:29 he had this epiphany
0:15:30 that slavery
0:15:31 was probably
0:15:31 the greatest moral
0:15:33 atrocity of his time
0:15:33 and he was like,
0:15:34 you know what,
0:15:34 maybe I can make
0:15:35 a difference.
0:15:36 Maybe I can
0:15:38 spend my life
0:15:39 fighting this
0:15:40 horrible institution
0:15:42 and that’s basically
0:15:42 what he did.
0:15:43 The first seven years
0:15:44 he traveled across
0:15:45 the United Kingdom
0:15:46 35,000 miles
0:15:47 spreading his abolitionist
0:15:48 propaganda everywhere
0:15:49 and then
0:15:50 he had a total
0:15:51 nervous breakdown.
0:15:53 Utter burnout.
0:15:53 He couldn’t walk
0:15:54 the stairs anymore.
0:15:55 He couldn’t speak.
0:15:56 He started sweating
0:15:57 profusely whenever
0:15:59 he wanted to say something
0:16:00 and I read that
0:16:00 in his memoirs
0:16:01 and I was like,
0:16:02 Thomas, Thomas, Thomas.
0:16:04 Remember your
0:16:05 breathing exercises.
0:16:05 You can take things
0:16:06 too far.
0:16:07 Now, the reason I say
0:16:08 that only at the end
0:16:08 of the book
0:16:10 is because, you know,
0:16:11 most of us first
0:16:12 deserve a kick in the butt.
0:16:13 So, yeah,
0:16:14 there are some
0:16:15 do-gooders out there.
0:16:17 I think they, you know,
0:16:18 take morality
0:16:19 a little bit too seriously.
0:16:20 As I said,
0:16:22 I’m personally a pluralist.
0:16:22 I’m a father
0:16:23 of two young children.
0:16:24 I think they’re
0:16:25 way more important
0:16:26 than, you know,
0:16:27 my career.
0:16:29 But I am
0:16:31 pretty ambitious, right?
0:16:32 I do want to make
0:16:33 a mark on this world
0:16:34 and I think there are
0:16:35 a lot of people out there.
0:16:36 We are all,
0:16:37 or most of us are,
0:16:38 scared of death.
0:16:40 And what do you want
0:16:41 to look back on
0:16:42 when you lie on your deathbed?
0:16:44 All the PowerPoints,
0:16:44 you know,
0:16:46 you hated to make
0:16:47 or all the reports
0:16:48 you wrote
0:16:48 that no one ever
0:16:49 wanted to read,
0:16:50 all the products
0:16:51 that you didn’t believe in
0:16:52 that you still spent
0:16:53 a lifetime selling?
0:16:54 Seems pretty sad to me.
0:16:56 I think this is touching
0:16:57 on something really honest,
0:16:58 which is that
0:16:59 I think a lot of
0:17:01 the desire
0:17:02 for this sort of
0:17:03 big impact
0:17:04 may actually come
0:17:05 from our fear
0:17:07 of our own mortality
0:17:08 and this desire
0:17:09 to leave a legacy
0:17:10 that will outlast us
0:17:11 so that we feel like
0:17:11 in some sense
0:17:12 it actually mattered
0:17:13 that we lived at all.
0:17:15 And I remember
0:17:17 dealing with this myself.
0:17:20 I’m a journalist now
0:17:20 but before that
0:17:21 I was a novelist
0:17:23 and I didn’t care
0:17:25 how many people
0:17:26 my work impacted, right?
0:17:27 It was for me
0:17:28 really not about scale.
0:17:29 My feeling was,
0:17:30 look, if my novel
0:17:31 deeply moves
0:17:32 just one reader
0:17:34 and helps them feel
0:17:35 less alone in the world,
0:17:36 helps them feel
0:17:36 more understood,
0:17:38 I will be happy.
0:17:40 So I guess
0:17:42 my question for you
0:17:42 as someone who has
0:17:43 personally struggled
0:17:44 with this issue of scale
0:17:45 is, you know,
0:17:46 are you telling me
0:17:47 I shouldn’t be happy
0:17:47 with that?
0:17:49 The title of chapter one
0:17:50 in your book
0:17:50 is literally
0:17:52 no, you’re not fine
0:17:52 just the way you are.
0:17:55 So I think
0:17:56 there is absolutely
0:17:57 a place for
0:17:59 as the French say
0:18:00 art pour l’art,
0:18:01 right?
0:18:03 It’s just music
0:18:04 or art
0:18:04 for the sake
0:18:05 of art itself.
0:18:07 I don’t want to,
0:18:07 you know,
0:18:09 let everything succumb
0:18:09 to kind of
0:18:12 utilitarian calculus.
0:18:14 I think
0:18:15 it’s better
0:18:16 to help a lot of people
0:18:17 than just a few people.
0:18:20 So, and as I said,
0:18:21 in any rich life,
0:18:22 morality does play
0:18:23 a big role.
0:18:25 I wouldn’t want
0:18:26 to live in a society
0:18:26 where everyone
0:18:27 is like Thomas Clarkson,
0:18:27 you know,
0:18:28 running around
0:18:29 on his horseback
0:18:31 doing morally
0:18:31 ambitious work.
0:18:33 But on the margins,
0:18:35 I think in the world
0:18:35 today,
0:18:36 we need a lot
0:18:37 more ambition.
0:18:38 We need much more
0:18:39 moral ambition
0:18:39 than we currently have.
0:18:41 Yeah, I mean,
0:18:41 I personally
0:18:42 would not want
0:18:42 to end up
0:18:42 in a world
0:18:43 where everyone
0:18:44 is so focused
0:18:45 on moral ambition
0:18:46 and scale
0:18:46 that we,
0:18:47 like,
0:18:47 that no one
0:18:48 ever writes a novel
0:18:49 because they worry
0:18:49 it won’t impact
0:18:50 enough people.
0:18:51 You know,
0:18:52 when I was reading
0:18:52 your book,
0:18:53 I kept thinking
0:18:54 of the philosopher
0:18:55 Susan Wolf,
0:18:57 who has this great
0:18:57 essay called
0:18:58 Moral Saints,
0:18:58 and I know you
0:18:59 mention it
0:19:00 in a footnote,
0:19:00 but I think her ideas
0:19:01 are very,
0:19:01 very important
0:19:02 in this context,
0:19:02 so I want
0:19:03 to talk about them.
0:19:05 Wolf,
0:19:05 in that essay
0:19:06 Moral Saints,
0:19:07 she says,
0:19:08 if the moral saint
0:19:10 is devoting all his time
0:19:10 to feeding the hungry
0:19:11 or healing the sick
0:19:12 or raising money
0:19:13 for Oxfam,
0:19:14 then necessarily
0:19:15 he is not reading
0:19:15 Victorian novels,
0:19:17 playing the oboe,
0:19:18 or improving his backhand.
0:19:19 A life in which
0:19:20 none of these possible
0:19:22 aspects of character
0:19:22 are developed
0:19:23 may seem to be
0:19:24 a life strangely barren.
0:19:27 Quite an elitist idea
0:19:28 of how to spend
0:19:29 your life,
0:19:29 by the way,
0:19:30 reading a novel
0:19:31 and improving
0:19:32 your backhand,
0:19:33 or maybe just
0:19:34 watching Netflix all day.
0:19:35 Fair, fair,
0:19:36 but you could
0:19:36 swap that out
0:19:38 with reading
0:19:39 your favorite book
0:19:42 and any hobby,
0:19:43 playing soccer,
0:19:44 whatever it might be.
0:19:45 But basically
0:19:46 what she’s saying
0:19:47 is if you try
0:19:47 to make all
0:19:48 of your actions
0:19:49 as morally good
0:19:49 as possible,
0:19:50 you kind of end up
0:19:51 living a life
0:19:52 that’s bereft
0:19:52 of hobbies
0:19:54 or relationships
0:19:55 or all the other
0:19:55 experiences
0:19:56 that make life meaningful.
0:19:58 Talk a little more
0:19:59 about how you square
0:19:59 that with your urge
0:20:00 to be morally ambitious.
0:20:02 There is some tension,
0:20:03 but I think
0:20:04 that tension
0:20:04 is mainly felt
0:20:05 by philosophers
0:20:06 for some reason
0:20:08 and not really
0:20:09 by me
0:20:10 or, I don’t know,
0:20:12 a lot of normies.
0:20:14 It’s just,
0:20:17 as I said,
0:20:17 for me,
0:20:18 it’s super obvious
0:20:19 that life is about
0:20:19 many things,
0:20:20 including improving
0:20:22 your backhand.
0:20:23 I’m not saying
0:20:24 that people aren’t
0:20:25 allowed to play
0:20:26 tennis anymore,
0:20:27 but we spend,
0:20:28 what is it,
0:20:30 2,000 work weeks
0:20:30 in our career,
0:20:32 10,000 working days,
0:20:33 80,000 hours.
0:20:34 That’s a lot of time
0:20:36 still left at the job.
0:20:37 And as I said,
0:20:38 25% of people
0:20:39 currently consider
0:20:40 their own job
0:20:41 socially meaningless.
0:20:42 And a lot of
0:20:43 our so-called
0:20:44 best and brightest
0:20:45 are stuck in those jobs.
0:20:46 So,
0:20:47 I don’t know.
0:20:49 We are living
0:20:49 in a world
0:20:50 where a huge amount
0:20:50 of people
0:20:51 have a career
0:20:52 that they consider
0:20:52 socially meaningless
0:20:53 and then they spend
0:20:54 the rest of their time
0:20:56 swiping TikTok.
0:20:58 That’s the reality,
0:20:59 right?
0:21:01 I really don’t think
0:21:03 that there’s a big danger
0:21:03 of, you know,
0:21:05 people reading my book
0:21:05 and, you know,
0:21:07 moving all the way
0:21:08 in the other direction.
0:21:09 And that’s a problem
0:21:09 I would honestly
0:21:10 like to have.
0:21:11 So,
0:21:11 you’re saying,
0:21:11 like,
0:21:12 we’re currently
0:21:14 very far away
0:21:14 from this problem
0:21:15 of, like,
0:21:15 everyone going
0:21:16 full tilt
0:21:17 on moral ambition
0:21:18 and ignoring
0:21:19 everything else in life.
0:21:20 There’s only one
0:21:21 community I know of
0:21:22 where this has
0:21:23 become a problem
0:21:24 and, as you know,
0:21:25 it’s the effective
0:21:26 altruism community.
0:21:28 In a way,
0:21:29 moral ambition
0:21:30 could be seen
0:21:31 as effective
0:21:32 altruism for normies.
0:21:34 Okay,
0:21:34 I definitely,
0:21:35 I definitely want
0:21:36 to get to that,
0:21:36 but I’m going
0:21:36 to put a pin
0:21:37 in that for a moment
0:21:39 because I just want
0:21:41 to take the flip side
0:21:41 of what you were
0:21:42 just saying.
0:21:42 You’re saying,
0:21:43 like,
0:21:43 okay,
0:21:45 I’m not really
0:21:46 concerned,
0:21:46 Seagal,
0:21:47 that we’re,
0:21:47 like,
0:21:48 edging into this world
0:21:48 where everyone
0:21:49 is so focused
0:21:50 on moral ambition.
0:21:53 But how
0:21:54 do you then
0:21:55 actually know
0:21:56 when it’s enough?
0:21:57 I think you used
0:21:57 the phrase earlier,
0:21:58 like,
0:21:58 at some point
0:21:59 it’s enough,
0:21:59 you know?
0:22:01 And I think,
0:22:01 you know,
0:22:03 you write in the epilogue
0:22:04 of the book,
0:22:05 morality plays a big role
0:22:06 in a rich and full life,
0:22:07 but it’s not everything.
0:22:08 And if your inner fire
0:22:09 burns bright,
0:22:10 no need to stoke it hotter.
0:22:12 But to me,
0:22:12 that is pretty,
0:22:12 like,
0:22:13 fuzzy sounding.
0:22:14 How can I know
0:22:15 what’s enough
0:22:17 and avoid pushing
0:22:17 so far
0:22:18 that moral ambition
0:22:20 does take over my life?
0:22:21 That does happen
0:22:21 to some people.
0:22:24 So how can I concretely know,
0:22:24 like,
0:22:24 Sigal,
0:22:25 you’ve done enough.
0:22:26 Chill.
0:22:27 Well,
0:22:28 it depends
0:22:30 on how far
0:22:30 you want to
0:22:31 push yourself.
0:22:33 Look,
0:22:34 there are no
0:22:35 easy answers here.
0:22:37 I think that at some point
0:22:38 when you really start
0:22:41 to suffer
0:22:42 from your moral ambition,
0:22:43 that’s not where
0:22:45 I would want you
0:22:46 to end up.
0:22:48 I think you should be fueled
0:22:49 for 80%
0:22:50 by enthusiasm
0:22:52 and for maybe 20%
0:22:53 by feelings of guilt
0:22:53 and shame.
0:22:55 So a little bit
0:22:56 of guilt and shame
0:22:56 in the mix,
0:22:57 that’s fine.
0:22:59 It’s actually how,
0:23:00 you know,
0:23:01 this journey started
0:23:01 for me.
0:23:02 You know,
0:23:03 I published
0:23:04 this previous book,
0:23:05 Humankind,
0:23:06 made quite a lot
0:23:07 of money on it,
0:23:07 honestly,
0:23:08 which I never
0:23:09 would have expected.
0:23:10 I always thought
0:23:11 that it would be
0:23:12 a broke history teacher
0:23:13 or something like that.
0:23:15 And yeah,
0:23:16 that gave me
0:23:17 a feeling of responsibility
0:23:18 like,
0:23:18 huh,
0:23:19 what does this mean?
0:23:20 I actually need
0:23:21 to do something.
0:23:23 And I also felt
0:23:23 a little bit ashamed
0:23:25 for spending a decade
0:23:26 in what I like to describe
0:23:28 as the awareness industry.
0:23:28 You know,
0:23:29 I’d been
0:23:32 saying a lot
0:23:32 about all the things
0:23:33 that need to happen
0:23:33 in the world.
0:23:34 A lot of people
0:23:34 would know me
0:23:35 for shouting
0:23:35 taxes,
0:23:36 taxes,
0:23:37 taxes at Davos,
0:23:37 right?
0:23:37 Yep.
0:23:39 And I was a bit
0:23:41 fed up with myself,
0:23:41 honestly,
0:23:43 for standing
0:23:44 on the sidelines.
0:23:45 To me,
0:23:46 what this is indicating
0:23:46 is like,
0:23:47 there’s some element
0:23:48 of subjectivity here,
0:23:48 right?
0:23:49 Like the question
0:23:50 of what percentage
0:23:51 of my life
0:23:52 should be focused
0:23:53 on moral ambition
0:23:53 and what should be
0:23:55 like playing the oboe
0:23:56 or like whatever,
0:23:57 making watercolor paintings.
0:23:58 To some degree,
0:23:59 you’re deciding
0:23:59 how much
0:24:00 you want to push yourself,
0:24:01 how much
0:24:02 you’re okay
0:24:03 with having
0:24:03 some suffering
0:24:04 in your life
0:24:04 to achieve
0:24:05 a greater goal,
0:24:06 how much you’re like…
0:24:08 Can I push back
0:24:08 a little bit?
0:24:09 Yeah, please.
0:24:10 I think the question
0:24:12 itself sort of presumes
0:24:13 that doing a lot
0:24:13 of good
0:24:14 or making a lot
0:24:15 of impact
0:24:16 is not going
0:24:17 to be a nice
0:24:18 experience or something
0:24:18 like that,
0:24:20 that pushing harder
0:24:22 will always involve
0:24:23 more sacrifices.
0:24:24 But if you talk
0:24:24 to a lot
0:24:25 of entrepreneurs,
0:24:26 they find a lot
0:24:27 of joy
0:24:28 in thinking big.
0:24:29 They find a lot
0:24:30 of joy
0:24:31 in climbing the ladder.
0:24:33 It’s what I always
0:24:34 experienced in my career.
0:24:35 I love becoming
0:24:36 a member
0:24:37 of a student society
0:24:38 in Utrecht
0:24:38 in the Netherlands
0:24:39 where I grew up
0:24:41 because I felt
0:24:42 so dumb
0:24:43 compared to all
0:24:43 these older students.
0:24:44 And I was like,
0:24:45 this is awesome.
0:24:45 I want to learn
0:24:46 about philosophy
0:24:47 and anthropology
0:24:48 and history.
0:24:49 And again,
0:24:49 when I started
0:24:49 my career
0:24:50 as a journalist
0:24:52 at the Volkskrant,
0:24:52 which is sort of
0:24:53 the Guardian
0:24:55 or, well,
0:24:55 I guess the New York Times
0:24:56 of the Netherlands,
0:24:57 I just love being
0:24:59 the youngest journalist
0:25:01 there and learning
0:25:02 from my older colleagues.
0:25:04 And when I started
0:25:06 as a writer,
0:25:07 I had these big dreams
0:25:08 about, you know,
0:25:09 I want to write a book
0:25:10 that will speak
0:25:11 to millions of people
0:25:12 about the big questions
0:25:12 of history,
0:25:13 like why have we
0:25:14 conquered the globe?
0:25:16 What makes humans special?
0:25:19 And then as I did that,
0:25:19 you know,
0:25:21 I was in my early 30s,
0:25:22 I was, yeah,
0:25:23 a bit bored
0:25:24 and looking for the next
0:25:24 ladder to climb.
0:25:27 So for me,
0:25:29 climbing a new ladder
0:25:30 has mostly been
0:25:31 about excitement
0:25:33 and enthusiasm.
0:25:49 Support for this show
0:25:50 comes from Shopify.
0:25:52 When you’re creating
0:25:53 your own business,
0:25:54 you have to wear
0:25:54 too many hats.
0:25:56 You have to be on top
0:25:56 of marketing
0:25:57 and sales
0:25:58 and outreach
0:25:59 and sales
0:26:00 and designs
0:26:01 and sales
0:26:02 and finances
0:26:03 and definitely
0:26:04 sales.
0:26:05 Finding the right tool
0:26:07 that simplifies everything
0:26:08 can be a game changer.
0:26:09 For millions of businesses,
0:26:10 that tool
0:26:11 is Shopify.
0:26:13 Shopify is a commerce
0:26:15 platform behind millions
0:26:15 of businesses
0:26:16 around the world
0:26:17 and,
0:26:18 according to the company,
0:26:20 10% of all e-commerce
0:26:21 in the U.S.
0:26:22 From household names
0:26:23 like Mattel
0:26:24 and Gymshark
0:26:25 to brands
0:26:26 just getting started,
0:26:27 they say they have
0:26:28 hundreds of ready-to-use
0:26:29 templates to help
0:26:30 design your brand style.
0:26:32 If you’re ready
0:26:32 to sell,
0:26:33 you’re ready
0:26:34 for Shopify.
0:26:35 You can turn
0:26:36 your big business
0:26:37 idea into reality
0:26:39 with Shopify
0:26:40 on your side.
0:26:41 You can sign up
0:26:41 for your $1
0:26:42 per month trial period
0:26:43 and start selling
0:26:44 today at
0:26:45 shopify.com
0:26:46 slash vox.
0:26:47 You can go to
0:26:48 shopify.com
0:26:49 slash vox.
0:26:50 That’s
0:26:51 shopify.com
0:26:52 slash vox.
0:26:59 Support for The Gray Area
0:27:00 comes from Bombas.
0:27:02 It’s time for spring cleaning
0:27:02 and you can start
0:27:04 with your sock drawer.
0:27:05 Bombas can help you
0:27:06 replace all your old
0:27:07 worn-down pairs.
0:27:08 Say you’re thinking
0:27:09 of getting into running
0:27:09 this summer.
0:27:11 Bombas engineers
0:27:11 blister-fighting,
0:27:13 sweat-wicking athletic socks
0:27:13 that can help you
0:27:14 go that extra mile.
0:27:16 Or if you have a spring
0:27:17 wedding coming up,
0:27:17 they make comfortable
0:27:18 dress socks too
0:27:20 for loafers, heels,
0:27:20 and all your other
0:27:21 fancy shoes.
0:27:23 I’m a big runner.
0:27:24 I talk about it all the time.
0:27:25 But the problem is that
0:27:27 I live on the Gulf Coast
0:27:28 and it’s basically
0:27:29 a sauna outside
0:27:30 for four months of the year,
0:27:31 maybe five.
0:27:32 I started wearing
0:27:34 Bombas athletic socks
0:27:34 for my runs
0:27:36 and they’ve held up
0:27:37 better than any other
0:27:38 socks I’ve ever tried.
0:27:39 They’re super durable,
0:27:40 comfortable,
0:27:42 and they really do
0:27:43 a great job
0:27:43 of absorbing
0:27:44 all that sweat.
0:27:45 And right now,
0:27:46 Bombas is going
0:27:46 international.
0:27:48 You can get
0:27:48 worldwide shipping
0:27:50 to over 200 countries.
0:27:51 You can go to
0:27:52 bombas.com
0:27:53 slash gray area
0:27:54 and use code
0:27:54 gray area
0:27:55 for 20% off
0:27:56 your first purchase.
0:27:57 That’s
0:27:59 B-O-M-B-A-S
0:27:59 dot com
0:28:00 slash gray area.
0:28:02 Code gray area
0:28:02 for 20% off
0:28:03 your first purchase.
0:28:05 Bombas dot com
0:28:05 slash gray area.
0:28:07 Code gray area.
0:28:12 Harvey Weinstein
0:28:13 is back in court
0:28:14 this week.
0:28:15 An appeals court
0:28:16 overturned his
0:28:16 2020 conviction
0:28:17 in New York
0:28:18 saying he hadn’t
0:28:19 gotten a fair trial
0:28:21 and so his accusers
0:28:22 must now testify again.
0:28:25 Weinstein has always
0:28:26 had very good lawyers,
0:28:27 but the court
0:28:28 of public opinion
0:28:29 was against him.
0:28:30 Until now,
0:28:31 it seems.
0:28:32 After looking over
0:28:32 this case,
0:28:32 I’ve concluded
0:28:33 that Harvey Weinstein
0:28:34 was wrongfully convicted
0:28:35 and was basically
0:28:35 just hung on
0:28:36 the Me Too thing.
0:28:37 The commentator
0:28:38 Candace Owens,
0:28:38 who has previously
0:28:39 defended Kanye
0:28:40 and Andrew Tate.
0:28:41 Andrew Tate
0:28:42 and his brother
0:28:43 were actually a response
0:28:45 to a misandrist culture.
0:28:46 Women that hated men.
0:28:47 Before Andrew Tate,
0:28:48 there was Lena Dunham.
0:28:49 Has taken up
0:28:50 Weinstein’s cause
0:28:51 and it seems to be
0:28:53 gaining her followers.
0:28:54 Coming up on Today Explained,
0:28:56 when Candace met Harvey.
0:29:26 Let’s talk about
0:29:27 the effective altruism
0:29:28 piece of this.
0:29:28 Some of our listeners
0:29:29 may have heard of it,
0:29:31 but for those who haven’t,
0:29:31 it’s a movement
0:29:32 that’s all about
0:29:33 using reason
0:29:34 and evidence
0:29:34 and data
0:29:35 to do as much
0:29:36 good as possible.
0:29:37 I will say
0:29:39 I’m not an effective altruist,
0:29:40 but I am a journalist
0:29:41 who has reported
0:29:42 a lot on EA
0:29:43 because I work
0:29:44 for Vox’s
0:29:45 Future Perfect section,
0:29:46 which was sort of
0:29:47 loosely inspired
0:29:48 by EA
0:29:50 in its early days.
0:29:52 So I am curious
0:29:53 where you stand on this.
0:29:54 You talk about
0:29:55 effective altruism
0:29:55 in the book
0:29:56 and you do echo
0:29:58 a lot of its core ideas,
0:29:59 like this idea
0:29:59 that you shouldn’t
0:30:00 just be trying
0:30:00 to do good,
0:30:01 you should try to do
0:30:03 the most good possible.
0:30:05 So is being morally ambitious
0:30:06 different from being
0:30:07 an effective altruist?
0:30:09 Yeah, so I wouldn’t say
0:30:10 the most good.
0:30:11 I was like,
0:30:12 you should do
0:30:12 a lot of good.
0:30:14 Okay, okay.
0:30:14 Which is different, right?
0:30:15 That’s not about
0:30:16 being perfect,
0:30:17 but just about
0:30:18 being ambitious.
0:30:20 So in the book,
0:30:21 I study a lot of movements
0:30:22 that I admire.
0:30:23 As you know,
0:30:24 I write extensively
0:30:25 about the abolitionists,
0:30:26 about the suffragettes,
0:30:28 about the civil rights
0:30:28 campaigners,
0:30:30 about extraordinary people
0:30:31 like Rosa Parks,
0:30:32 who was such a
0:30:33 strategic visionary.
0:30:33 A lot of people
0:30:34 remember her
0:30:35 as this,
0:30:36 you know,
0:30:37 quiet seamstress,
0:30:38 but she was actually
0:30:39 a highly experienced
0:30:39 activist,
0:30:43 and they really planned
0:30:45 this whole Montgomery bus boycott.
0:30:46 It didn’t just happen.
0:30:47 I talk about
0:30:48 the animal rights movement.
0:30:49 I talk about
0:30:50 Ralph Nader
0:30:52 and the extraordinary
0:30:53 Nader’s Raiders movement
0:30:54 in the 60s and the 70s,
0:30:55 when Ralph Nader
0:30:56 was able to recruit
0:30:58 a lot of really talented
0:31:00 young Ivy League graduates
0:31:00 and convince them
0:31:01 to not work
0:31:03 for boring law firms,
0:31:03 but instead
0:31:04 go to Washington
0:31:06 and influence legislation.
0:31:07 There’s one historian
0:31:08 who estimates
0:31:08 that they’ve influenced,
0:31:09 what is it,
0:31:11 25 pieces of federal legislation.
0:31:12 So anyway,
0:31:13 the book is a whole collection
0:31:14 of studies of movements
0:31:15 that I admire,
0:31:16 and indeed,
0:31:17 effective altruism
0:31:18 is also one of those
0:31:19 movements that I admire
0:31:19 quite a bit.
0:31:20 I think there’s a lot
0:31:21 we can learn from them,
0:31:22 and there are also
0:31:23 quite a few things
0:31:24 that I don’t really like
0:31:25 about them.
0:31:28 So the main thing
0:31:29 I think indeed
0:31:30 what I really like
0:31:31 about them
0:31:31 is their
0:31:33 moral seriousness.
0:31:35 As I said,
0:31:36 I come from the political left,
0:31:37 and if there’s one thing
0:31:39 that’s often quite annoying
0:31:40 about lefties
0:31:40 is that they
0:31:41 preach a lot,
0:31:42 but they
0:31:43 do little.
0:31:44 For example,
0:31:45 this simple thing
0:31:46 about donating
0:31:47 to charity,
0:31:49 I think it’s
0:31:49 pretty easy
0:31:50 to make the case
0:31:50 that
0:31:52 that is one of the most
0:31:53 effective things
0:31:54 you can do,
0:31:55 but then
0:31:56 very few
0:31:56 of my
0:31:57 progressive
0:31:58 leftist friends
0:31:58 donate
0:32:00 anything.
0:32:01 So I really
0:32:02 like that
0:32:03 moral seriousness
0:32:04 of EAs.
0:32:05 You know,
0:32:05 you go to conferences
0:32:07 and you will meet
0:32:07 quite a few people
0:32:08 who have donated
0:32:09 kidneys to
0:32:11 random strangers,
0:32:12 which is
0:32:13 pretty impressive.
0:32:14 I’m sorry to say
0:32:15 that I still have
0:32:16 both of my kidneys.
0:32:18 My condolences.
0:32:18 And I’m quite attached to them.
0:32:21 But yeah,
0:32:22 I admire the people
0:32:24 who really
0:32:24 practice what
0:32:25 they preach.
0:32:28 I guess the main
0:32:30 thing I dislike
0:32:31 is probably
0:32:32 what we already
0:32:33 talked about.
0:32:33 Like,
0:32:34 where does the
0:32:35 motivation come from?
0:32:38 One of the
0:32:39 founding fathers
0:32:40 of effective
0:32:40 altruism was
0:32:41 the philosopher
0:32:42 Peter Singer,
0:32:42 obviously,
0:32:43 also one of the
0:32:44 founding fathers
0:32:44 of the modern
0:32:45 animal rights
0:32:45 movement.
0:32:46 And everyone
0:32:47 knows him for
0:32:47 this,
0:32:49 you know,
0:32:50 that thought
0:32:51 experiment of
0:32:51 the child
0:32:52 drowning in the
0:32:53 shallow pond.
0:32:55 I’m pretty sure
0:32:55 that he must be
0:32:56 really fed up
0:32:58 with talking about
0:32:59 that thought
0:33:00 experiment because
0:33:01 like,
0:33:01 I am already
0:33:02 fed up talking
0:33:03 about it and
0:33:03 it’s not even
0:33:04 my thought
0:33:04 experiment.
0:33:05 Right.
0:33:05 So that’s the
0:33:06 thought experiment
0:33:07 where Peter Singer
0:33:08 says,
0:33:09 look,
0:33:09 if you are
0:33:10 walking to work
0:33:10 and you see
0:33:11 a little kid
0:33:12 drowning in a
0:33:12 shallow pond,
0:33:13 you know you
0:33:14 could save this
0:33:14 kid.
0:33:15 Your life will
0:33:15 be in no danger.
0:33:16 It’s shallow,
0:33:18 but you will
0:33:19 ruin your expensive
0:33:19 suit or you will
0:33:20 muddy your shoes.
0:33:21 Should you do it?
0:33:21 And it’s
0:33:22 supposed to
0:33:22 be like,
0:33:22 yes,
0:33:23 obviously you
0:33:24 should do it.
0:33:25 And well,
0:33:26 by comparison,
0:33:26 you know,
0:33:27 by analogy,
0:33:28 we have money.
0:33:29 It could easily
0:33:30 save the lives
0:33:30 of people in
0:33:31 developing countries.
0:33:33 So you should
0:33:34 donate it.
0:33:34 Yeah.
0:33:35 Thank you so much
0:33:36 for helping me
0:33:36 out with that one.
0:33:37 Anyway,
0:33:39 I never really
0:33:39 liked the thought
0:33:41 experiment because
0:33:41 it always felt
0:33:43 like a form of
0:33:44 moral blackmail to
0:33:44 me.
0:33:45 And now I’m
0:33:46 suddenly supposed
0:33:47 to see drowning
0:33:47 children everywhere
0:33:48 and like,
0:33:48 oh,
0:33:49 this microphone,
0:33:50 it was too
0:33:50 expensive.
0:33:51 Could have
0:33:51 donated that
0:33:52 to, I don’t
0:33:52 know,
0:33:53 a charity in
0:33:54 Malawi or,
0:33:55 you know,
0:33:55 I just had a
0:33:56 sandwich and,
0:33:57 you know,
0:33:59 the peanut butter
0:33:59 on it was also
0:34:00 too expensive.
0:34:01 It’s like a
0:34:02 totally inhuman
0:34:03 way of, I
0:34:03 don’t know,
0:34:04 looking at life.
0:34:05 It just doesn’t
0:34:05 resonate with me
0:34:06 at all.
0:34:07 But there are
0:34:07 quite a few
0:34:08 people who
0:34:10 instantly thought,
0:34:11 yes,
0:34:11 that is true.
0:33:12 They discovered,
0:34:12 hey,
0:34:13 wait a minute,
0:34:13 I’m not
0:34:13 alone.
0:34:15 Let’s build a
0:34:15 movement together.
0:34:17 And I really
0:34:17 like that.
0:34:18 For me,
0:34:19 the historical
0:34:21 comparison is
0:34:22 the Quakers,
0:34:23 the early
0:34:24 abolitionists,
0:34:25 who were very
0:34:26 weird as well.
0:34:27 It was like
0:34:28 this small
0:34:29 Protestant sect
0:34:30 of people who
0:34:31 deeply believed
0:34:31 in equality.
0:34:32 They were some
0:34:33 of the first
0:34:34 who allowed
0:34:35 women to
0:34:36 also preach
0:34:36 in their
0:34:37 meeting houses.
0:34:38 They would
0:34:39 never take an
0:34:40 oath because
0:34:40 they were like,
0:34:41 yeah,
0:34:41 we always
0:34:42 speak the
0:34:42 truth,
0:34:42 so why
0:34:42 would we
0:34:43 take an
0:34:43 oath?
0:34:44 Anyway,
0:34:44 they were
0:34:45 seen as
0:34:45 very weird
0:34:48 and quite
0:34:48 amazing as
0:34:49 well.
0:34:49 The
0:34:50 abolitionism
0:34:50 sort of
0:34:51 started as
0:34:52 a Quaker
0:34:52 startup.
0:34:53 So that’s
0:34:54 also how
0:34:54 I see
0:34:55 EA,
0:34:56 as very
0:34:56 weird,
0:34:57 but pretty
0:34:58 impressive.
0:35:01 And I
0:35:01 think a lot
0:35:02 of people in
0:35:02 there have
0:35:02 done a lot
0:35:03 of good
0:35:03 work,
0:35:04 even though
0:35:05 I’d never
0:35:06 joined the
0:35:06 church.
0:35:08 It’s not
0:35:08 for me.
0:35:09 And there are
0:35:10 some obvious
0:35:12 downsides to
0:35:13 the ideology
0:35:14 as well.
0:35:15 Let’s pick
0:35:15 up on that
0:35:16 weirdness bit,
0:35:16 right?
0:35:17 So in
0:35:17 your book,
0:35:18 you straight
0:35:19 up tell
0:35:20 readers,
0:35:21 join a
0:35:22 cult or
0:35:22 start your
0:35:22 own.
0:35:23 Regardless,
0:35:24 you can’t
0:35:24 be afraid to
0:35:25 come across
0:35:26 as weird if
0:35:26 you want to
0:35:26 make a
0:35:26 difference.
0:35:27 Every milestone
0:35:28 of civilization
0:35:29 was first seen
0:35:29 as the crazy
0:35:30 idea of some
0:35:31 subculture.
0:35:33 I’m curious
0:35:34 how you think
0:35:35 about the
0:35:36 downsides of
0:35:37 being in a
0:35:37 cult.
0:35:38 Cults don’t
0:35:39 have a
0:35:39 great
0:35:39 reputation,
0:35:40 do they?
0:35:42 So I
0:35:42 got to give
0:35:43 some credit
0:35:43 to Peter
0:35:44 Thiel here.
0:35:46 Maybe not
0:35:48 someone that
0:35:49 people naturally
0:35:50 associate with
0:35:50 me.
0:35:52 For those who
0:35:52 don’t know
0:35:53 him, he is
0:35:54 a venture
0:35:54 capitalist,
0:35:55 very much on
0:35:55 the right
0:35:57 wing side of
0:35:57 the political
0:35:58 spectrum.
0:35:59 He’s written
0:35:59 this fantastic
0:36:00 book called
0:36:01 Zero to One
0:36:02 about how to
0:36:03 build a
0:36:03 successful
0:36:03 startup.
0:36:05 And indeed,
0:36:06 one of his
0:36:07 pieces of advice is to
0:36:07 start a cult.
0:36:09 A cult is
0:36:10 a small
0:36:10 group of
0:36:11 thoughtful,
0:36:12 committed
0:36:13 citizens who
0:36:13 want to
0:36:14 change the
0:36:14 world.
0:36:15 And they
0:36:16 have some
0:36:18 shared beliefs
0:36:18 that make
0:36:18 them very
0:36:20 weird for
0:36:21 the rest of
0:36:21 society.
0:36:23 Now, as I
0:36:23 said, I
0:36:24 spent the
0:36:25 first decade
0:36:25 of my
0:36:25 career as
0:36:26 a journalist
0:36:28 and most
0:36:29 journalists
0:36:30 think that
0:36:30 they should
0:36:31 break out
0:36:31 of their
0:36:31 bubble,
0:36:33 that they
0:36:34 should meet
0:36:34 people on
0:36:34 the other
0:36:35 side of the
0:36:35 political
0:36:35 spectrum.
0:36:36 This is a
0:36:37 debate that
0:36:37 I used
0:36:38 to have
0:36:38 with my
0:36:38 colleagues.
0:36:39 They would
0:36:39 say, yeah,
0:36:40 we’ve got to
0:36:40 make sure
0:36:41 that the
0:36:41 plumbers read
0:36:42 our essays
0:36:43 as well.
0:36:44 And my
0:36:44 response was
0:36:45 always like,
0:36:45 you know,
0:36:46 I would love
0:36:47 for plumbers
0:36:47 to read my
0:36:48 essays, but
0:36:49 currently my
0:36:50 friends aren’t
0:36:50 reading them.
0:36:52 So maybe we
0:36:52 can start
0:36:53 there.
0:36:54 Right?
0:36:56 And this is
0:36:56 why I think
0:36:57 it sometimes
0:36:57 makes sense to
0:36:58 actually double
0:36:59 down on a
0:37:00 cult, because
0:37:01 in a cult,
0:37:02 you can be
0:37:02 radicalized,
0:37:03 and sometimes
0:37:04 that’s exactly
0:37:05 what’s
0:37:05 necessary.
0:37:06 To give you
0:37:06 one simple
0:37:07 example, in a
0:37:08 world that
0:37:08 doesn’t really
0:37:09 seem to care
0:37:09 about animals
0:37:10 all that much,
0:37:11 it’s easy to
0:37:12 become disillusioned.
0:37:14 But then once you
0:37:15 join a safe space
0:37:16 of ambitious
0:37:16 do-gooders, you
0:37:18 can suddenly get
0:37:18 this feeling like,
0:37:19 hey, I’m not the
0:37:20 only one, right?
0:37:21 There are other
0:37:21 people who deeply
0:37:22 care about animals
0:37:23 as well, and you
0:37:23 know what?
0:37:24 I can do much
0:37:25 more than I’m
0:37:26 currently doing.
0:37:26 So it can have a
0:37:27 radicalizing effect.
0:37:29 Now, I totally
0:37:29 acknowledge that
0:37:30 there are all
0:37:31 kinds of dangers
0:37:32 here.
0:37:34 Like, you can
0:37:34 become too
0:37:35 dogmatic, you
0:37:36 can be, you
0:37:37 know, quite
0:37:38 hostile to people
0:37:39 who don’t share
0:37:39 all your beliefs.
0:37:41 So I do see
0:37:42 all of that.
0:37:43 I just want to
0:37:44 recognize that if
0:37:45 you look at some
0:37:45 of these great
0:37:46 movements of
0:37:46 history, the
0:37:48 abolitionists, the
0:37:49 suffragettes, yeah,
0:37:50 they had cultish
0:37:51 aspects.
0:37:52 They were in a
0:37:53 way, yeah, a
0:37:54 little bit like a
0:37:54 cult.
0:37:56 I want to push
0:37:58 a little bit on
0:37:59 this question
0:38:00 about, you
0:38:00 know, cults and
0:38:01 dogmatism.
0:38:03 Obviously, a big
0:38:04 downside, as you
0:38:04 mentioned, is that
0:38:05 you can become
0:38:06 dogmatic, you can
0:38:06 become kind of
0:38:07 deaf to criticism
0:38:07 from the outside.
0:38:09 Do you have any
0:38:10 advice for people
0:38:11 on how to avoid
0:38:12 the downside?
0:38:14 Yeah, don’t let
0:38:14 it suck up your
0:38:15 whole life.
0:38:16 There’s this quote
0:38:18 from Flaubert, the
0:38:19 novelist, who once
0:38:20 said something like,
0:38:20 if you want to be
0:38:22 violent and original
0:38:23 in your work, you
0:38:24 need to be boring
0:38:25 in your private
0:38:25 life.
0:38:26 I’m paraphrasing
0:38:26 here.
0:38:27 But I’ve always
0:38:29 liked that quote.
0:38:30 I don’t know, it
0:38:31 gives you a certain
0:38:32 groundedness and
0:38:33 stability.
0:38:35 So maybe surround
0:38:36 yourself with
0:38:38 other types of
0:38:39 people and other
0:38:40 types of pursuits,
0:38:40 right?
0:38:41 Basically be a
0:38:42 pluralist.
0:38:44 Look, I don’t
0:38:45 know, honestly.
0:38:46 I don’t have the
0:38:48 perfect recipe here.
0:38:53 In general, it’s
0:38:54 super important to
0:38:55 surround yourself with
0:38:56 people who are
0:38:56 critical of your
0:38:57 work, who don’t
0:38:58 take you too
0:38:59 seriously, who
0:38:59 can also laugh
0:39:02 at you, who
0:39:03 have a good
0:39:03 sense of humor,
0:39:06 or who can just
0:39:06 see your
0:39:07 foolishness and
0:39:07 call it out and
0:39:08 still be a good
0:39:09 friend.
0:39:10 But this is
0:39:11 general life advice
0:39:12 for everyone.
0:39:13 Right, right.
0:39:15 Having a strong
0:39:16 dose of pluralism
0:39:17 can help
0:39:20 counteract a lot
0:39:21 of the potential
0:39:22 pitfalls with
0:39:23 these sorts of
0:39:24 ideological movements.
0:39:25 Yeah, absolutely.
0:39:25 At the same
0:39:26 time, you know, I
0:39:27 come from such a
0:39:27 different place,
0:39:28 you know.
0:39:30 I was mainly
0:39:31 frustrated with all
0:39:33 these people on
0:39:33 the left side of the
0:39:34 political spectrum
0:39:35 saying, oh, we
0:39:36 need systemic
0:39:37 change.
0:39:38 We need to
0:39:39 abolish capitalism,
0:39:40 overthrow the
0:39:42 patriarchy, and
0:39:43 write, you know,
0:39:44 a hundred more
0:39:45 monographs about it
0:39:46 in utterly
0:39:47 inaccessible
0:39:48 academic jargon.
0:39:49 And I was like,
0:39:50 come on, can we
0:39:51 actually do
0:39:51 something, right?
0:39:53 Can we actually
0:39:55 find some effective
0:39:55 way of actually
0:39:56 making a difference?
0:40:24 I think one
0:40:25 important question
0:40:26 is the question
0:40:27 of who
0:40:27 should we be
0:40:28 trying to
0:40:28 make a difference
0:40:29 for?
0:40:31 There is a very
0:40:31 interesting concept
0:40:32 that you mention
0:40:33 in the book,
0:40:34 which is humanity’s
0:40:35 expanding moral
0:40:36 circle.
0:40:37 What is that?
0:40:38 It’s, again, a
0:40:39 term from Peter
0:40:41 Singer, the
0:40:41 philosopher, who
0:40:42 makes the simple
0:40:43 case that throughout
0:40:45 history, our
0:40:46 moral circle has
0:40:46 expanded.
0:40:49 So, back in
0:40:49 the old days, we
0:40:50 mainly cared about
0:40:52 our own tribe and
0:40:53 members of our
0:40:53 tribe.
0:40:54 And then, you
0:40:55 know, we got the
0:40:56 big religions and
0:40:57 we started caring
0:40:58 about people who
0:40:59 believe the same
0:40:59 things.
0:41:00 And then we got
0:41:01 the nation states
0:41:02 and so on and so
0:41:02 on.
0:41:03 And he basically
0:41:04 says that moral
0:41:04 progress is all
0:41:05 about expanding the
0:41:07 moral circle and
0:41:08 to keep pushing
0:41:09 that expansion.
0:41:11 A couple of
0:41:11 years ago, I was
0:41:12 actually working on
0:41:13 a different book.
0:41:14 I wanted to write
0:41:14 the history of
0:41:15 moral circle
0:41:15 expansion.
0:41:17 Because it’s
0:41:18 really interesting
0:41:19 that a lot of
0:41:19 the first
0:41:21 abolitionists, they
0:41:22 already cared
0:41:23 deeply about animal
0:41:24 rights, which makes
0:41:24 a lot of sense
0:41:25 because once you
0:41:26 start expanding your
0:41:27 moral circle, once
0:41:27 you start opening
0:41:29 your heart to
0:41:30 people who first
0:41:30 weren’t included in
0:41:31 your moral circle,
0:41:32 then the question
0:41:32 is, like, why
0:41:33 stop at some
0:41:33 point?
0:41:34 And I was writing
0:41:35 about that, learning
0:41:36 about that, and I
0:41:37 was like, huh,
0:41:39 maybe I should
0:41:40 finish this book
0:41:41 when I’m 60 or
0:41:42 70 or something.
0:41:44 Maybe I should be
0:41:44 doing this stuff,
0:41:45 you know, not
0:41:46 just be writing
0:41:46 about it.
0:41:47 So for me, that
0:41:48 was incredibly
0:41:48 inspirational.
0:41:50 That’s funny.
0:41:50 Okay, so if the
0:41:51 moral circle is
0:41:52 like, okay, who’s
0:41:53 worthy of our
0:41:54 moral consideration,
0:41:54 who’s not, who’s
0:41:56 in, who’s out, you
0:41:58 kind of acknowledge
0:41:58 in the book, like,
0:41:59 maybe it’s not
0:42:00 obvious how to
0:42:01 tell, are we
0:42:02 including everyone
0:42:03 in the moral circle
0:42:03 that should be
0:42:04 included?
0:42:05 And you have a few
0:42:06 pointers that you
0:42:08 offer people on
0:42:08 how to check that
0:42:09 they’re including
0:42:09 everyone that should
0:42:10 be included.
0:42:11 Do you want to
0:42:12 give us a little
0:42:13 summary, a few
0:42:14 pointers?
0:42:15 I think that
0:42:16 there are some
0:42:17 classic signs
0:42:20 that can tell
0:42:20 us whether we’re
0:42:21 on the right
0:42:22 side of history.
0:42:23 This is one of
0:42:24 those fascinating
0:42:24 questions that we
0:42:25 can ask, right?
0:42:26 We can look back
0:42:27 on, say, the
0:42:29 Romans who threw
0:42:30 naked women before
0:42:31 the lions, but
0:42:32 still thought they
0:42:32 were super
0:42:33 civilized because
0:42:34 unlike the
0:42:35 barbarians, they
0:42:36 didn’t sacrifice
0:42:38 kids to the
0:42:39 gods, right?
0:42:40 And every
0:42:40 civilization
0:42:41 throughout history
0:42:41 has always
0:42:43 thought we
0:42:43 are the most
0:42:44 civilized.
0:42:45 And obviously
0:42:46 we think that
0:42:46 today as well,
0:42:47 like any
0:42:48 modern-day
0:42:49 liberal in
0:42:50 the US or
0:42:52 the West in
0:42:52 the 21st century
0:42:53 will be like,
0:42:53 yeah, there’s
0:42:54 still bad stuff
0:42:55 happening, but
0:42:57 basically we’ve
0:42:57 figured things
0:42:58 out.
0:43:00 And the
0:43:01 uncomfortable
0:43:01 truth is that
0:43:02 probably we are
0:43:04 still committed,
0:43:06 engaged in some
0:43:07 really terrible
0:43:08 moral atrocities.
0:43:09 I mean, that’s
0:43:10 highly likely if
0:43:10 you just look at
0:43:11 the historical
0:43:11 track record.
0:43:12 So the question
0:43:13 is, what will
0:43:14 the historians of
0:43:15 the future say
0:43:16 about us?
0:43:16 And then I’m
0:43:17 not just talking
0:43:17 about, oh,
0:43:18 yeah, the bad
0:43:19 MAGA people or
0:43:19 something like that.
0:43:20 No, no, no,
0:43:21 I’m talking to
0:43:22 you directly who’s
0:43:23 listening to this
0:43:24 podcast right now
0:43:24 and probably thinks
0:43:25 of his or himself
0:43:27 as a pretty
0:43:27 decent person.
0:43:29 Then the question
0:43:29 is, okay, what
0:43:30 is that?
0:43:30 A couple of
0:43:30 signs.
0:43:31 Well, one is
0:43:32 we’ve been
0:43:33 talking about it
0:43:34 for a long
0:43:34 time.
0:43:35 So the alarm
0:43:35 bells have been
0:43:36 ringing for a
0:43:37 long time.
0:43:37 That’s one
0:43:38 clear sign.
0:43:39 In the book,
0:43:39 I give the
0:43:40 example of the
0:43:40 way we treat
0:43:41 animals.
0:43:41 And it’s not
0:43:42 as if these
0:43:43 arguments are
0:43:44 new or anything.
0:43:44 You know, a lot
0:43:45 of smart people
0:43:46 have said this
0:43:46 for a long
0:43:47 time.
0:43:47 You know,
0:43:48 Jeremy Bentham
0:43:49 already in the
0:43:50 late 18th
0:43:51 century wrote
0:43:51 that, you
0:43:52 know, it’s not
0:43:52 about whether
0:43:53 these animals
0:43:54 can speak or
0:43:54 reason or do
0:43:55 mathematics.
0:43:56 No, it’s about
0:43:56 the simple
0:43:57 question, can
0:43:58 they suffer?
0:43:59 And we’ve got
0:43:59 an enormous
0:44:00 mountain of
0:44:01 evidence that
0:44:02 tells us, yeah,
0:44:03 they can probably
0:44:04 suffer really
0:44:04 badly.
0:44:06 So yeah, if
0:44:07 you eat meat
0:44:07 and dairy
0:44:08 today, then
0:44:10 yeah,
0:44:10 it’s quite
0:44:11 likely that you’re
0:44:11 involved in one
0:44:12 of those moral
0:44:12 atrocities.
0:44:14 I’ve got a few
0:44:15 other signs that
0:44:15 I talk about.
0:44:17 For example, we
0:44:18 rationalize these
0:44:19 kind of things by
0:44:20 saying that they’re
0:44:22 natural or normal
0:44:23 or necessary.
0:44:24 This is what
0:44:25 Melanie Joy, the
0:44:26 psychologist, calls
0:44:27 the three Ns.
0:44:28 And you look at
0:44:29 something like
0:44:31 slavery, and that’s
0:44:31 also what we did
0:44:32 back then, right?
0:44:32 We said it was
0:44:33 natural.
0:44:34 Like, throughout
0:44:35 history, every
0:44:35 civilization has
0:44:36 always practiced the
0:44:37 institution of
0:44:37 slavery.
0:44:39 Like, it’s just
0:44:40 what people do,
0:44:40 right?
0:44:41 What are you going
0:44:42 to do about it?
0:44:43 Or necessary, people
0:44:44 would say.
0:44:45 Yeah, it was just
0:44:47 essential for the
0:44:47 economy.
0:44:48 If we would
0:44:49 abolish slavery
0:44:49 today, you know,
0:44:50 the economy will
0:44:51 collapse and there
0:44:51 will be all kinds
0:44:52 of perverse
0:44:53 consequences.
0:44:54 So anyway, it’s
0:44:55 interesting to look
0:44:55 at those signs and
0:44:56 then think, okay,
0:44:57 what are some of the
0:44:58 worst things that may
0:44:58 be happening today?
0:45:00 There’s sort of a
0:45:01 pet peeve I have
0:45:02 about the way people
0:45:03 sometimes talk about
0:45:03 the expanding
0:45:04 moral circle.
0:45:06 People, I find,
0:45:07 typically talk about
0:45:09 it as if moral
0:45:09 progress or the
0:45:10 expansion of the
0:45:11 moral circle is
0:45:11 some sort of
0:45:12 linear process.
0:45:16 But to me, that
0:45:16 seems like a very
0:45:17 Eurocentric reading
0:45:19 of history because
0:45:20 there are other
0:45:21 cultures, right?
0:45:21 I’m thinking of the
0:45:23 Jains in India or
0:45:24 the Quechua people
0:45:25 in Latin America.
0:45:27 For them, you know,
0:45:28 the inclusion of all
0:45:29 animals and all
0:45:30 nature in the
0:45:31 moral circle has
0:45:32 been morally
0:45:33 obvious for a
0:45:34 long time and
0:45:35 that’s still not
0:45:36 obvious to
0:45:36 Americans.
0:45:38 I think that’s a
0:45:38 really good point
0:45:39 you’re making.
0:45:40 So historians call
0:45:41 this the Whig
0:45:43 view of history,
0:45:44 you know, named
0:45:45 after the Whigs,
0:45:47 the political
0:45:49 party in the
0:45:50 UK a few
0:45:51 centuries ago,
0:45:52 which indeed had
0:45:53 this Western
0:45:55 triumphalism baked
0:45:55 into it.
0:45:56 Like, we know
0:45:57 what’s right for
0:45:59 the world and we
0:45:59 will show the rest
0:46:00 of the world,
0:46:00 you know, how
0:46:01 to be good,
0:46:02 how to be moral.
0:46:04 And obviously,
0:46:06 the fight against
0:46:07 the slave trade and
0:46:08 slavery was essential
0:46:08 to that.
0:46:13 So, I have
0:46:13 complicated views on
0:46:14 this.
0:46:15 There are some
0:46:15 people who are
0:46:16 like, look, it’s
0:46:18 just total BS that,
0:46:19 you know, Britain was
0:46:20 so important in
0:46:21 abolishing the slave
0:46:22 trade because, you
0:46:23 know, it was mainly
0:46:24 the revolutions in
0:46:25 Haiti, you know, it
0:46:26 was enslaved people
0:46:27 themselves who did
0:46:27 it.
0:46:30 So, yeah, stop
0:46:30 with the colonialist
0:46:31 crap.
0:46:33 And I think that’s
0:46:34 just not true, to
0:46:34 be honest.
0:46:37 People who have
0:46:38 been suffering from
0:46:39 slavery and the
0:46:40 slave trade, you
0:46:40 know, they’ve always
0:46:42 revolted, obviously,
0:46:42 you know, from
0:46:43 Spartacus onwards.
0:46:46 One in ten slave
0:46:47 voyages saw a
0:46:48 revolt.
0:46:49 But the reality is
0:46:50 that this system was
0:46:51 so horrible, and
0:46:52 not just in the
0:46:53 West, in the
0:46:54 colonies in the
0:46:55 Caribbean, but in
0:46:55 many places around
0:46:57 the globe, that
0:46:58 yeah, abolitionism
0:47:00 was for a long
0:47:00 time unthinkable.
0:47:02 And it was really
0:47:03 a new idea that
0:47:05 originated among
0:47:07 Anglo-Saxon
0:47:09 Protestants, first
0:47:10 the Quakers, and
0:47:10 then also the
0:47:11 Evangelicals, this
0:47:13 new idea that you
0:47:13 could actually
0:47:15 abolish slavery as
0:47:16 an institution.
0:47:17 It was really a
0:47:17 small group of
0:47:18 people who had
0:47:19 this crazy idea.
0:47:21 And then because
0:47:21 they did it in
0:47:22 Britain, and they
0:47:23 were successful in
0:47:24 Britain, then that
0:47:25 country was able to
0:47:27 use its power on
0:47:28 the Seven Seas, the
0:47:30 Royal Navy, to
0:47:31 force a huge
0:47:31 amount of other
0:47:32 countries to also
0:47:33 stop slave
0:47:34 trading.
0:47:34 So the Netherlands,
0:47:36 where I’m from, we
0:47:37 didn’t abolish the
0:47:38 slave trade on our
0:47:38 own.
0:47:38 Like, we were
0:47:39 making a lot of
0:47:40 money and enjoying
0:47:41 it quite immensely.
0:47:43 But then, you know,
0:47:44 these moralistic
0:47:46 British people came
0:47:46 along and, okay,
0:47:47 okay, we will
0:47:48 abolish it.
0:47:49 And that happened
0:47:50 again and again.
0:47:51 The irony is,
0:47:52 obviously, that this
0:47:53 was, again, also an
0:47:54 excuse for more
0:47:55 colonialism, so
0:47:57 that, you know, some
0:47:58 new horrors grew out
0:47:59 of that, that under
0:47:59 the banner of
0:48:01 anti-slavery, a new
0:48:03 colonial era dawned
0:48:04 and the whole
0:48:05 scramble for Africa
0:48:05 happened.
0:48:07 So I really don’t
0:48:09 want to, you know,
0:48:09 suggest that there
0:48:10 is some natural
0:48:11 progress in history.
0:48:13 If the arc of
0:48:15 justice bends, or if
0:48:16 the arc of history
0:48:16 bends towards
0:48:18 justice, then it’s
0:48:20 because, like, people
0:48:20 do that.
0:48:21 And if we don’t
0:48:22 keep bending it, it
0:48:23 might easily snap
0:48:24 back.
0:48:25 And there’s really
0:48:27 no natural order
0:48:28 of things here.
0:48:29 And indeed, in some
0:48:31 ways, we’ve made,
0:48:32 what’s the opposite
0:48:32 of progress?
0:48:33 What’s the English
0:48:33 word?
0:48:34 Backsliding.
0:48:35 Yeah, we’ve been
0:48:36 backsliding.
0:48:37 And I think animals
0:48:38 are a great example.
0:48:40 Imagine a world where
0:48:40 the Industrial
0:48:41 Revolution would have
0:48:42 happened in India.
0:48:43 I mean, maybe we
0:48:43 wouldn’t have
0:48:45 ended up with these
0:48:46 horrible systems of
0:48:47 factory farming.
0:48:49 It could have been
0:48:51 so much better.
0:48:53 Yeah, when I think
0:48:55 about progress, I
0:48:56 mean, I think of it
0:48:58 as, first of all, like,
0:48:58 who gets to define
0:48:59 what’s progress?
0:49:01 I think that depends a
0:49:01 lot on who’s in power
0:49:02 and who’s defining it.
0:49:05 But I don’t see it as
0:49:06 a sort of straight line
0:49:07 linearly going up.
0:49:08 I very much see it as
0:49:09 a messy squiggle.
0:49:11 And it’s entirely
0:49:12 plausible to me that
0:49:15 in 100 years, we will
0:49:17 have expanded our
0:49:18 moral circle in some
0:49:19 ways and given more
0:49:19 rights to certain
0:49:20 human beings.
0:49:22 You know, for example,
0:49:24 that we’ve abolished
0:49:25 factory farming and we
0:49:26 are treating animals
0:49:28 great, even as we’re
0:49:30 now really repressing
0:49:31 certain classes of
0:49:31 human beings.
0:49:33 Does that prediction
0:49:35 sound plausible to you?
0:49:35 Oh, no, no.
0:49:36 I’m not making any
0:49:37 predictions here.
0:49:38 I think the future
0:49:39 could be much worse
0:49:39 than today.
0:49:42 For me, that’s one of
0:49:42 the main lessons of
0:49:43 history.
0:49:44 Things can change
0:49:46 quite radically, for
0:49:46 better or for worse.
0:49:47 I’m pretty sure
0:49:49 that when you would
0:49:50 have talked to, you
0:49:51 know, most Germans in
0:49:53 the 1920s, I mean,
0:49:53 they couldn’t have
0:49:54 imagined, like, the
0:49:55 terrible abyss that
0:49:56 was ahead of them.
0:49:58 If I look at the U.S.
0:50:00 today, I am really
0:50:01 pessimistic, to be
0:50:01 honest.
0:50:03 I think there’s a real
0:50:04 threat of democracy
0:50:06 breaking down, and I
0:50:07 think that things can
0:50:08 get much, much worse
0:50:10 quite soon, actually.
0:50:11 Mm-hmm.
0:50:13 Let’s talk about
0:50:14 what’s ahead for you
0:50:15 personally.
0:50:18 Maybe you have a little
0:50:19 more ability to predict
0:50:20 that, potentially.
0:50:21 It, you know, it
0:50:22 strikes me with your
0:50:24 book, like, you could
0:50:25 have been like, look,
0:50:26 I’m happy, I’m
0:50:27 content to just write a
0:50:27 book about moral
0:50:28 ambition, leave it at
0:50:29 that, you know.
0:50:31 But you did not just
0:50:31 leave it at that, you
0:50:33 also decided to co-found
0:50:34 something that you
0:50:34 mentioned earlier.
0:50:35 It’s called the School
0:50:36 for Moral Ambition.
0:50:38 What is that, and how
0:50:39 did that get started?
0:50:40 I was at a point in
0:50:42 my career where I
0:50:43 looked at what I
0:50:44 had, you know, a bit
0:50:44 of a platform.
0:50:46 I think I have the
0:50:48 ability to, you know,
0:50:50 write things that
0:50:51 perhaps some people
0:50:51 want to read.
0:50:54 But I also felt this
0:50:57 itch, right, and felt
0:50:58 a little bit fed up
0:50:58 with myself.
0:51:00 And I was hugely
0:51:02 inspired by, for
0:51:03 example, what Ralph
0:51:03 Nader did in the
0:51:05 60s and the 70s, that
0:51:06 he was able to build
0:51:08 this beacon, this
0:51:08 magnet for very
0:51:09 driven and talented
0:51:10 people to work on
0:51:11 some of the most
0:51:12 pressing issues.
0:51:15 Throughout history, I
0:51:16 think we’ve seen
0:51:17 movements that have
0:51:18 been successful at
0:51:19 redefining what it
0:51:20 means to be
0:51:20 successful.
0:51:21 That was one of the
0:51:23 epiphanies I had when
0:51:24 I studied the British
0:51:25 abolitionist movement,
0:51:26 is they were actually
0:51:27 part of a much bigger
0:51:29 societal shift that
0:51:30 was all about making
0:51:30 doing good more
0:51:31 fashionable.
0:51:33 So I guess that’s
0:51:34 what we are betting
0:51:34 on.
0:51:36 Again, we are trying
0:51:37 to build that
0:51:37 magnet.
0:51:38 We are trying to
0:51:39 redefine what it
0:51:40 means to be
0:51:41 successful.
0:51:42 So we do a couple
0:51:43 of things.
0:51:45 One is we organize
0:51:45 these so-called
0:51:46 moral ambition
0:51:46 circles.
0:51:47 They’re groups of
0:51:48 five to eight
0:51:49 people who want to
0:51:50 explore what a
0:51:50 morally ambitious
0:51:51 life could mean for
0:51:51 them.
0:51:54 This is all freely
0:51:55 accessible on our
0:51:56 website, moralambition.org.
0:51:57 And at the same
0:51:58 time, we organize
0:51:59 so-called moral
0:52:00 ambition fellowships.
0:52:03 And you could see
0:52:04 them as small SWAT
0:52:06 teams of extremely
0:52:08 talented, very driven
0:52:09 people who have
0:52:10 agreed to quit their
0:52:13 job, follow Gandalf,
0:52:15 and work on some of
0:52:16 the most important
0:52:17 global problems.
0:52:18 We got started in
0:52:18 Europe.
0:52:20 No, no, no, no, no,
0:52:20 no, no.
0:52:21 I’m not coming up with
0:52:22 the mission statements.
0:52:24 It’s actually our
0:52:25 researchers who are
0:52:25 our Gandalfs.
0:52:26 I’m more like the
0:52:27 Muppet, you know?
0:52:30 Like the mascot, you
0:52:31 know, in the silly
0:52:33 suit, right?
0:52:35 That’s me who walks
0:52:36 on the field before
0:52:37 the match gets
0:52:37 started.
0:52:38 That’s my job.
0:52:41 But, yeah, so we
0:52:42 asked our researchers
0:52:43 what are some of the
0:52:44 most important things
0:52:45 we can do in
0:52:45 Brussels.
0:52:46 And to my big
0:52:47 surprise, actually,
0:52:47 one of the things
0:52:48 they advised us is to
0:52:49 work on fighting big
0:52:50 tobacco.
0:52:51 It’s the single
0:52:52 largest preventable
0:52:53 cause of disease
0:52:54 still today.
0:52:55 Eight million
0:52:55 deaths every
0:52:56 year, and very
0:52:57 few people are
0:52:58 working on
0:52:59 countering it.
0:53:00 So we’ve been
0:53:02 recruiting corporate
0:53:02 lawyers,
0:53:03 marketeers.
0:53:04 Actually, we’ve got
0:53:05 someone in our
0:53:06 last cohort who
0:53:07 used to work for
0:53:09 Big Tobacco, and
0:53:10 now they’re applying
0:53:11 their skills and
0:53:12 their talents to
0:53:13 doing a lot of
0:53:13 good.
0:53:15 And, yeah, we
0:53:16 want to scale up
0:53:17 this machine.
0:53:18 Obviously, the point
0:53:19 is that it is very
0:53:20 hard to get into
0:53:21 one of our
0:53:22 fellowships because
0:53:22 we want to make
0:53:23 it more prestigious.
0:53:24 You went to
0:53:24 Harvard.
0:53:25 Okay, well,
0:53:25 that’s not
0:53:26 nearly enough.
0:53:27 That’s nice, but
0:53:30 we are, yeah,
0:53:32 it’s quite
0:53:32 extraordinary, I
0:53:33 think, the
0:53:34 groups that we
0:53:35 are now bringing
0:53:36 together.
0:53:38 I think because of
0:53:39 two reasons.
0:53:39 One, because we
0:53:40 want to make doing
0:53:41 good more
0:53:41 prestigious and
0:53:43 more fashionable.
0:53:44 The other thing is
0:53:45 that we genuinely
0:53:46 believe that if you’re
0:53:47 very selective,
0:53:48 some very
0:53:49 entrepreneurial people
0:53:50 can just do so
0:53:51 much.
0:53:51 Where is the
0:53:52 School for
0:53:53 Moral Ambition
0:53:54 getting all the
0:53:55 funding, getting
0:53:56 the money to be
0:53:56 able to pay
0:53:57 people to quit
0:53:57 their jobs?
0:53:59 Mostly from me
0:53:59 now.
0:54:02 Everything I earn
0:54:02 with the book is
0:54:03 going all into the
0:54:04 movement.
0:54:06 So that’s been
0:54:06 helpful.
0:54:07 And we’ve got a
0:54:09 group of entrepreneurs
0:54:09 supporting us as
0:54:10 well.
0:54:11 So these are
0:54:11 people who have
0:54:12 indeed built their
0:54:13 own companies and
0:54:14 who are looking
0:54:16 to climb, as
0:54:17 David Brooks would
0:54:18 say, their second
0:54:18 mountain.
0:54:19 You know, you
0:54:20 mentioned that
0:54:21 the School for
0:54:22 Moral Ambition is
0:54:24 highly sort of
0:54:25 competitive to get
0:54:25 in.
0:54:27 And most of the
0:54:28 listeners won’t end
0:54:29 up going to the
0:54:31 school, but I am
0:54:32 kind of interested to
0:54:32 hear that you’re
0:54:33 also promoting
0:54:34 these moral ambition
0:54:35 circles that people
0:54:35 can start with
0:54:36 their friends.
0:54:38 I personally am not
0:54:40 really sold on the
0:54:41 idea of maximizing,
0:54:42 like do the most
0:54:44 good possible as my
0:54:45 entire guiding
0:54:46 philosophy for life,
0:54:47 but I am attracted
0:54:49 to the idea of
0:54:50 trying to do more
0:54:51 good.
0:54:52 Exactly.
0:54:53 Right?
0:54:54 We’re totally on
0:54:54 the same page.
0:54:55 Yeah.
0:54:58 And I very much
0:54:59 think I could enjoy
0:55:00 kind of just sitting
0:55:01 with five or six
0:55:02 friends on a regular
0:55:03 basis and trying to
0:55:04 challenge each other
0:55:05 to be more
0:55:06 intentional about
0:55:07 whatever the values
0:55:08 are that we do
0:55:09 believe in, right?
0:55:09 Yeah.
0:55:11 So maybe one way
0:55:12 to say this,
0:55:13 Sigal, is that
0:55:15 when I talk to
0:55:15 some of my banker
0:55:17 friends, I’m not
0:55:18 inclined to talk
0:55:18 about all these
0:55:19 drowning children
0:55:20 in shallow
0:55:21 ponds, right?
0:55:23 I’m also not
0:55:24 inclined to talk
0:55:25 in a more leftist
0:55:25 way and say,
0:55:25 oh, you’re so
0:55:26 bad, you’re so
0:55:27 greedy.
0:55:30 What I’ve
0:55:31 discovered is
0:55:32 that it’s much
0:55:32 more effective to
0:55:33 say something
0:55:34 like, oh,
0:55:35 wow, you’re so
0:55:36 talented, you’re so
0:55:38 experienced, and
0:55:39 this is what you’re
0:55:39 doing?
0:55:40 Boring.
0:55:43 And that hurts
0:55:44 them much more
0:55:46 in my experience.
0:55:47 And it’s also
0:55:48 honestly what I
0:55:48 believe.
0:55:50 Yeah, people
0:55:50 really don’t like
0:55:51 to be boring.
0:55:54 I will say this
0:55:55 conversation has
0:55:55 been far from
0:55:56 boring.
0:55:57 I really enjoyed
0:55:58 chatting with you
0:55:59 and reading your
0:55:59 book.
0:56:00 It’s called
0:56:01 Moral Ambition.
0:56:03 Rutger, just
0:56:03 want to say thank
0:56:04 you so much for
0:56:05 being on our
0:56:05 show.
0:56:06 Thanks for
0:56:06 having me.
0:56:15 I hope you
0:56:15 enjoyed this
0:56:15 episode.
0:56:16 I know I
0:56:17 enjoyed wrestling
0:56:17 with all these
0:56:18 ideas.
0:56:19 And while I
0:56:20 don’t think I’ll
0:56:20 be enrolling at
0:56:21 the School for
0:56:21 Moral Ambition,
0:56:23 I will consider
0:56:23 setting up a
0:56:24 moral ambition
0:56:25 circle with my
0:56:25 friends.
0:56:26 But as always,
0:56:27 we want to know
0:56:28 what you think,
0:56:29 so drop us a
0:56:29 line at
0:56:31 thegrayarea@vox.com
0:56:33 or leave us a
0:56:33 message on our
0:56:34 new voicemail
0:56:35 line at
0:56:38 1-800-214-5749.
0:56:39 And once you’re
0:56:40 finished with that,
0:56:41 go ahead and rate
0:56:42 and review and
0:56:43 subscribe to the
0:56:43 podcast.
0:56:45 This episode was
0:56:46 produced by Beth
0:56:47 Morrissey, edited
0:56:48 by Jorge Just,
0:56:49 engineered by
0:56:50 Christian Ayala,
0:56:51 fact-checked by
0:56:52 Melissa Hirsch,
0:56:53 and Alex Overington
0:56:54 wrote our theme
0:56:54 music.
0:56:56 The episode was
0:56:56 hosted by me,
0:56:57 Sigal Samuel.
0:56:58 I’m a senior
0:56:59 reporter at
0:57:00 Vox’s Future
0:57:01 Perfect, where I
0:57:02 cover AI,
0:57:03 neuroscience, and a
0:57:03 whole lot more.
0:57:05 You can read
0:57:05 my writing at
0:57:06 vox.com
0:57:07 slash future
0:57:07 perfect.
0:57:09 Also, if you
0:57:10 want to learn
0:57:10 more about
0:57:11 effective altruism
0:57:11 and the
0:57:12 drowning child
0:57:13 thought experiment,
0:57:14 check out
0:57:15 Vox’s Good
0:57:16 Robot podcast
0:57:16 series.
0:57:17 I highly
0:57:17 recommend it.
0:57:18 We’ll drop a
0:57:19 link to that
0:57:19 in the show
0:57:19 notes.
0:57:22 New episodes of
0:57:22 The Gray Area
0:57:23 drop on Mondays.
0:57:24 Listen and
0:57:25 subscribe.
0:57:26 The show is
0:57:27 part of Vox.
0:57:28 Support Vox’s
0:57:29 journalism by
0:57:29 joining our
0:57:30 membership program
0:57:30 today.
0:57:31 Go to
0:57:32 vox.com
0:57:33 slash members
0:57:34 to sign up.
0:57:35 And if you
0:57:36 decide to sign
0:57:36 up because of
0:57:37 this show,
0:57:38 let us know.
We’re told from a young age to achieve. Get good grades. Get into a good school. Get a good job. Be ambitious about earning a high salary or a high-status position.
Some of us love this endless climb. But lots of us, at least once in our lives, find ourselves asking, “What’s the point of all this ambition?” Historian and author Rutger Bregman doesn’t think there is a point to that kind of ambition. Instead, he wants us to be morally ambitious, to measure the value of our achievements based on how much good we do, by how much we improve the world.
In this episode, Bregman speaks with guest host Sigal Samuel about how to know if you’re morally ambitious, the value of surrounding yourself with like-minded people, and how to make moral ambition fashionable.
Host: Sigal Samuel, Vox senior reporter
Guest: Rutger Bregman, historian, author of Moral Ambition, and co-founder of The School for Moral Ambition
Listen to The Gray Area ad-free by becoming a Vox Member: vox.com/members
Show Notes
Vox’s Good Robot series can be found here:
Episode 3 (discusses the “drowning child thought experiment” and effective altruism)