Author: The Gray Area with Sean Illing

  • The world according to Werner Herzog

    AI transcript
    0:00:04 Support for this show comes from Constant Contact.
    0:00:07 If you struggle just to get your customers to notice you,
    0:00:10 Constant Contact has what you need to grab their attention.
    0:00:15 Constant Contact’s award-winning marketing platform offers all the automation,
    0:00:20 integration, and reporting tools that get your marketing running seamlessly,
    0:00:23 all backed by their expert live customer support.
    0:00:27 It’s time to get going and growing with Constant Contact today.
    0:00:30 Ready? Set. Grow.
    0:00:34 Go to ConstantContact.ca and start your free trial today.
    0:00:38 Go to ConstantContact.ca for your free trial.
    0:00:41 ConstantContact.ca.
    0:00:50 When it comes to smartwater alkaline 9.5+ pH with antioxidant, there’s nothing to overthink.
    0:00:55 So, while you may be performing mental gymnastics over whether the post-work gym crowd is worth it,
    0:00:58 if you’ll be able to find a spot for your yoga mat,
    0:01:03 or if that spin instructor will make you late for dinner again,
    0:01:05 don’t overthink how you hydrate.
    0:01:07 Life’s full of choices.
    0:01:09 Smartwater alkaline is a simple one.
    0:01:17 What’s the role of the poet in our society?
    0:01:23 Do we look to poetry for deep truths about our world,
    0:01:26 or do we look to poetry for something different?
    0:01:30 And if poetry is just, or even mostly, about truth,
    0:01:35 then what distinguishes it from philosophy or science?
    0:01:42 These are very old questions.
    0:01:47 The kinds of questions you find Plato pondering more than two millennia ago.
    0:01:52 But they will always be worth asking, especially in this moment,
    0:01:56 when our relationship with truth feels as fluid as it’s ever been.
    0:02:02 I’m Sean Illing, and this is The Gray Area.
    0:02:16 Today’s guest, and I can’t believe I’m saying this, is the one and only,
    0:02:18 Werner Herzog.
    0:02:26 He’s a filmmaker, a poet, an author of the book “Every Man for Himself and God Against All.”
    0:02:32 Herzog is known for his films like “Grizzly Man” and “Fitzcarraldo,” among many others.
    0:02:36 But he thinks of himself as a poet and a writer more than he does a filmmaker,
    0:02:41 and you can certainly hear that side of him in his films.
    0:02:48 We only sound and look like badly pronounced and half-finished sentences
    0:02:54 out of a stupid suburban novel, a cheap novel.
    0:03:07 And we have to become humble in front of this overwhelming misery and overwhelming fornication,
    0:03:12 overwhelming growth and overwhelming lack of order.
    0:03:17 Even the stars up here in the sky look like a mess.
    0:03:25 I think it’s fair to say that Herzog is one of our greatest living filmmakers.
    0:03:28 He’s on my personal Mount Rushmore, for sure.
    0:03:34 What I’ve always loved about his work, and you can hear it a little bit in that clip,
    0:03:40 is that it has this dual quality of being both realistic and poetic at the same time.
    0:03:45 That is hard to pull off, and no one does it better than Herzog.
    0:03:50 Which is why I was delighted to see that he released a memoir.
    0:03:56 It’s not really an autobiography, it’s about his approach to life and what he’s after in his art.
    0:04:00 And he’s after something he calls ecstatic truth.
    0:04:03 I explore that with him in this conversation.
    0:04:08 We also talk about a few other things, like whether humanity is destroying itself,
    0:04:13 and why he wants to go to Mars, just for a few days.
    0:04:17 Hi, good morning. This is Werner Herzog.
    0:04:20 This is Sean Illing. Pleasure to meet you.
    0:04:27 I’ll try to be concise enough so that you don’t collect too much garbage.
    0:04:32 Trust me, my editors and producers are used to getting lots of garbage from me,
    0:04:35 and they’re experts at cleaning it up.
    0:04:39 A lot of people know you as a filmmaker, obviously.
    0:04:43 But you’re really a writer, you’ve always been a writer.
    0:04:47 And this memoir is great, and I’m not just saying that because you’re here.
    0:04:50 It really is, and it’s also quite distinctive.
    0:04:54 But it’s not really a biography, it’s something else.
    0:04:56 How would you describe this book?
    0:05:00 Well, some of it is, of course, furious storytelling,
    0:05:06 and much of it is origins of ideas, not so much the events.
    0:05:11 If you look for event after event, you shouldn’t read the book.
    0:05:17 For example, all of a sudden, interspersed there are five ballads of the little soldier.
    0:05:25 I was with an elite commando unit of mostly child soldiers, between 8 and 11,
    0:05:29 and some ballads pop up out of nowhere.
    0:05:33 But, of course, they’re an integral part of what I’m writing.
    0:05:41 It is probably very much the style as well because my experience in the real world
    0:05:46 was unique or different from what other filmmakers have gone through.
    0:05:49 It was different. I’m fairly certain about that.
    0:05:53 And because of that, my writing, my style is different.
    0:05:58 And I think you’re correct saying that I’m a filmmaker as well.
    0:06:04 At the moment, it seems to be more a distraction because for more than four decades
    0:06:08 I keep preaching to deaf ears, look at my writing.
    0:06:11 It will probably outlive my films.
    0:06:16 So it’s at the moment the focus of what I’m doing.
    0:06:20 It’s still weird to hear one of the greatest, I think, living filmmakers
    0:06:22 describe film as a distraction.
    0:06:24 At the moment, yes, it is here.
    0:06:29 You have always described in interviews the universe as a place of overwhelming chaos
    0:06:33 and you write a good bit about your childhood, which is also a bit chaotic.
    0:06:39 Was your worldview, was that worldview in particular shaped pretty early in your life?
    0:06:44 It is obvious when you look at the universe that it’s hostile out there, not made for us.
    0:06:49 We cannot survive easily in the cosmos anywhere else.
    0:06:51 We haven’t found a place yet.
    0:06:56 Mars is possibly reachable, but we shouldn’t settle there.
    0:07:02 It would be obscene to leave our planet behind and not keep it habitable
    0:07:08 and try to make a foreign planet habitable for us.
    0:07:14 So of course, and you look out into the universe, you don’t even need to have a telescope.
    0:07:18 You don’t need to be an astronomer to know it is chaotic.
    0:07:20 It is hostile.
    0:07:24 It is against life, not against all life.
    0:07:30 We can assume that there is life out there, some forms of life, maybe microbic life,
    0:07:39 little creatures, or as much life as there is in the snot in the nose of your toddler.
    0:07:41 Actually, it’s biological.
    0:07:46 So that may happen when we encounter the aliens out there.
    0:07:50 We can assume there’s life out there among the trillions of stars
    0:07:55 because we share the same physics with the cosmos.
    0:07:59 We share the same chemistry and we share the same history.
    0:08:05 So let’s assume there’s some forms of life not reachable for us right now.
    0:08:08 And we don’t need to reach it.
    0:08:14 I’ve always found that people really want to believe that there’s a certain order to the universe
    0:08:16 because it makes the world feel more coherent.
    0:08:19 And in that sense, maybe a little more hospitable.
    0:08:25 But I’m not sure anything is more obvious than the fact that the universe is totally indifferent to us.
    0:08:31 The harmony of the spheres, a very old idea, of course, there’s no harmony of spheres.
    0:08:35 It’s a figment of our fantasy, of our thinking.
    0:08:44 But it makes our existence more tolerable in a way believing that there’s some sort of harmony out there.
    0:08:52 Otherwise, the universe is completely and utterly indifferent vis-à-vis what’s going on here on our planet
    0:08:57 and what we are doing in our toils and our daily struggles.
    0:09:01 It’s monumentally indifferent and we have to face that.
    0:09:03 And it’s quite okay. Why not?
    0:09:08 And the second thing I wanted to say, my childhood was not chaotic.
    0:09:16 It was chaotic in the first 14 days of my life because I was born in the city of Munich.
    0:09:19 It was carpet bombed several times.
    0:09:28 When I was only two weeks old, everything around us where we lived was destroyed in ruins.
    0:09:35 So my mother fled, fled into the most remote part of the Bavarian mountains.
    0:09:41 And then from there on, after I was two weeks old, it was a wonderful childhood.
    0:09:46 It couldn’t have been better as a refugee or displaced by war.
    0:09:59 I grew up in a wonderful, really beautiful valley in the mountains and as a wild child almost in anarchy
    0:10:06 because there was an absence of fathers, no drill sergeant to tell us what to do and how to behave.
    0:10:10 So it was just really, really good.
    0:10:18 What do you think is behind that? Is it the simplicity of that, the sense of purpose and shared mission that comes with that kind of strife?
    0:10:21 Why is there such beauty in such hardship?
    0:10:25 It’s not simplicity because life was harsh.
    0:10:28 We all grew up in real poverty.
    0:10:31 In my case, we didn’t have running water.
    0:10:40 You had to go to the well with a bucket, hardly any electricity, not enough to eat.
    0:10:42 That was the only harsh thing.
    0:10:47 We didn’t have enough to eat for up to two and a half years and I was always hungry.
    0:10:52 And that’s why I mind when I see that people are throwing too much food away.
    0:11:02 I don’t like to see that. I don’t raise my voice, but it’s a kind of consumerism that I cannot tolerate.
    0:11:05 It’s people who do not have my experience.
    0:11:09 But otherwise, it was a wonderful time.
    0:11:11 You had to invent your own toys.
    0:11:15 You had to invent your own games.
    0:11:18 You had to fabricate your tools.
    0:11:25 You had to start learning by trial and error.
    0:11:27 You see, there was not much guidance.
    0:11:32 In fact, our mother didn’t educate us that much.
    0:11:35 We reeducated her as boys.
    0:11:39 There are only a few things that stick in my mind.
    0:11:44 She was a very principled woman, smoking all her life, a heavy smoker.
    0:11:51 And when my older brother and I were something like 18, 19, 20 or so,
    0:11:55 we had a motorcycle and it was a time of no helmets and so on.
    0:11:59 We had some minor injuries on a weekly basis.
    0:12:03 Sliding somehow into a ditch or whatever.
    0:12:09 And my mother said to us, “I do not want to be in a position to bury one of my sons.”
    0:12:14 And she said it once or twice and we didn’t pay attention.
    0:12:21 And one day she’s at the dinner table and she smokes and she stubs out her cigarette after two puffs.
    0:12:27 And she says, “Boys, I think you’re going to sell your motorcycle.
    0:12:30 It’s not healthy. It’s not good.”
    0:12:33 And this, by the way, was my last cigarette.
    0:12:40 She never, ever smoked a cigarette again and we sold our motorcycle within a week.
    0:12:44 So it’s that kind of education.
    0:12:49 And is it true that you didn’t even know that cinema existed until you were 11?
    0:12:54 I did not because there was hardly any electricity.
    0:12:56 There were no telephones.
    0:13:00 I made my first phone call when I was 17.
    0:13:05 Probably kids who are five years old or 10 years old cannot believe that.
    0:13:09 But until today I don’t even have a cell phone.
    0:13:13 Making a phone call is something strange and foreign for me.
    0:13:17 But of course there was no theater or no cinema.
    0:13:22 And a traveling projectionist came to this schoolhouse.
    0:13:26 It was one classroom for first through fourth grade.
    0:13:29 We were something like 25 kids.
    0:13:33 The older ones would teach us the alphabet and help the teacher.
    0:13:38 So school was also a very intense and beautiful experience.
    0:13:42 And a projectionist came and showed two films.
    0:13:46 The first time I ever learned that there was such a thing like cinema.
    0:13:50 And it didn’t impress me at all. It was just lousy, lousy stuff.
    0:13:52 When did you realize you were going to be a filmmaker?
    0:13:54 I think you used the word destiny at some point.
    0:13:56 You realized you were destined to make films.
    0:14:01 But that came at a time when there was a very intense moment
    0:14:05 or a few weeks of very intense insights.
    0:14:10 And I call it now, you better touch it with a pair of pliers
    0:14:12 because it sounds pathetic.
    0:14:16 I had some sort of insight or illumination
    0:14:20 or I became aware of my own destiny.
    0:14:25 And that was a time when I started a very dramatic religious phase.
    0:14:30 When I started to travel on foot and where I knew I was a poet
    0:14:32 and I had to be a poet.
    0:14:34 And it was some sort of duty.
    0:14:41 Destiny was meant for me to accept what was out there for me.
    0:14:44 Why do you think destiny exists in this universe?
    0:14:48 In a universe that does seem so indifferent.
    0:14:55 There are certain laws out in the universe that proceed
    0:15:01 and we are in this mill that grinds us. But calling it destiny,
    0:15:05 I don’t know, it would be pretentious.
    0:15:11 It’s more human thought, probably the universe functions in a different way.
    0:15:16 Nature functions in a different way than our interpretation of it.
    0:15:20 Things just happen and we’re storytelling creatures, right?
    0:15:26 No, no, we have something like free will
    0:15:31 which of course is determined by lots of borderlines
    0:15:35 and lots of obstacles and lots of restrictions.
    0:15:38 But yet we do have choices.
    0:15:45 And do we have a choice against the plowing on of destiny?
    0:15:47 I don’t know.
    0:15:51 But the way you talk about being a poet and being a filmmaker and being a writer
    0:15:55 it’s as though you didn’t really have a choice, it chose you.
    0:16:00 Yes, there was something out there that I had to accept.
    0:16:07 I understood my destiny and I keep saying touch this term only with a pair of pliers.
    0:16:09 It sounds pretentious.
    0:16:18 In fact, I understood my duties, my task out there, my destiny in a way.
    0:16:21 What do I have to do with my life?
    0:16:26 Why am I here? And there’s a sense of responsibility and duty in it.
    0:16:30 The part of the book where you write about truth
    0:16:35 and not having much interest in making, in your words, purely factual films
    0:16:39 was a joy to read for me for lots of reasons.
    0:16:44 What is it about factual filmmaking that you find too constraining
    0:16:46 or too narrow or too small?
    0:16:51 Let’s face it, all these films that are fact-based are legitimate.
    0:16:55 Many of them are journalism, a form of journalism
    0:16:57 and you better stick to the facts.
    0:17:01 You don’t invent, you do not put out fake news.
    0:17:04 And I adhere to it, but it depends on what I’m doing.
    0:17:11 I made a film, for example, with Mikhail Gorbachev, the last leader of the Soviet Union.
    0:17:15 And you do not invent, you do not stylize.
    0:17:19 It’s just a very clear task that you have in front of you.
    0:17:26 Otherwise, I try to depart from the mere facts, because facts alone do not illuminate you.
    0:17:31 The phone directory does not illuminate you, although everything is correct in there.
    0:17:34 But it doesn’t give you insight.
    0:17:39 It does not inspire anything in you.
    0:17:45 So I have done things where I always make it clear I’m inventing now
    0:17:51 or later I make it clear to the audience here there is invention.
    0:17:59 But I do in documentaries, for example, things that you would normally do only in feature films.
    0:18:06 Casting, rehearsing, repeating a scene or repeating some statement.
    0:18:10 When it’s way too long, I ask, “Please, can we do it again?”
    0:18:13 But concentrate on the essentials.
    0:18:16 So I do all these things.
    0:18:21 I do it in feature films as well, of course, much more inventive.
    0:18:23 I do it in literature.
    0:18:28 All my poetry is not really that much fact-based.
    0:18:33 I sometimes run into these sorts of questions as a journalist,
    0:18:37 thinking about not just the role of journalism, but also the limits of journalism,
    0:18:39 the limits of just telling the facts.
    0:18:43 The facts can tell us what happened, but they can’t tell us what it means.
    0:18:47 To do that requires something different, something more.
    0:18:50 As you were saying, if your films and books were just factual,
    0:18:52 it would just be journalism, wouldn’t it?
    0:18:55 And you’re not trying to be a journalist.
    0:19:00 Read the phone directory instead.
    0:19:03 But you say you’re after something called ecstatic truth.
    0:19:04 What is that?
    0:19:06 Well, I coined this term.
    0:19:07 It’s a lovely phrase.
    0:19:16 In a way, I try to find an expression of, or a confrontation with, or a search
    0:19:23 for something that is truthful in a way that forces us to step outside of ourselves.
    0:19:30 Ecstasy in ancient Greek means outside, standing outside of our existence.
    0:19:37 And it’s more an experience you would find with late medieval mystics, for example,
    0:19:41 although I don’t want to compare myself to them.
    0:19:48 It’s something which starts to invent and starts to dig into something deeper.
    0:19:54 I say truth now with great caution, because philosophy has no consensus on
    0:19:58 what truth is all about, nor do mathematicians know.
    0:20:01 Nor does the pope in Rome really know.
    0:20:04 So we have to be cautious with that.
    0:20:08 But in my opinion, truth is somewhere out there.
    0:20:09 We sense it.
    0:20:10 It’s a human thing.
    0:20:11 We sense it.
    0:20:12 We know it.
    0:20:14 We yearn for it.
    0:20:16 We want to find it.
    0:20:19 And it’s like a dim light somewhere.
    0:20:26 We know the direction and the quest to find it, approaching the voyage to it.
    0:20:28 That’s what is important.
    0:20:32 And that’s what I’m doing in my films, in my books.
    0:20:37 And it gives a certain meaning to my life, our lives.
    0:20:43 Does the way you think about truth and art and your responsibilities change at all
    0:20:48 in this twisted era of misinformation and fake news and all of that?
    0:20:51 Well, you have to become street smart.
    0:20:57 And in particular now you have to become smart with the media and with the internet
    0:20:59 and artificial intelligence.
    0:21:06 So when it comes to media, let’s say mainstream or even outside of mainstream media,
    0:21:09 the news, do not trust anyone.
    0:21:10 Not one.
    0:21:12 Do not trust anyone.
    0:21:19 But try to corroborate important information by going to parallel sources.
    0:21:27 When you read about, let’s say the Western interpretation about a big event,
    0:21:31 just why don’t you switch over to Al Jazeera, for example.
    0:21:33 All of a sudden it looks different.
    0:21:39 And from there you move to the internet and read the full speech of a politician
    0:21:43 or switch into Chinese sources.
    0:21:45 Or you just name it.
    0:21:46 Can be anything.
    0:21:48 But do not trust anything or anyone.
    0:21:51 Do not trust your emails anymore.
    0:21:55 You see, it could be written by artificial intelligence.
    0:22:00 Do not trust anything, but it does not mean we do have to hate the media.
    0:22:03 We do not have to hate the internet.
    0:22:06 We just have to learn to be cautious.
    0:22:15 And I would like to compare it to, let’s say, early human beings, prehistoric humans.
    0:22:16 Neolithic people.
    0:22:19 They were roaming the forests and the fields.
    0:22:22 And they would pick berries and they would find mushrooms.
    0:22:25 And they would know: don’t eat this mushroom.
    0:22:26 It must be poisonous.
    0:22:29 But there is an automatic sort of caution.
    0:22:32 Be careful with an unknown mushroom.
    0:22:34 Be careful with this or that.
    0:22:42 And I’m sure that Neolithic people, hunters and gatherers, did not hate nature.
    0:22:45 They just had the right attitude.
    0:22:49 Just be cautious and you roam around and you’ll find the right thing.
    0:22:52 You can love nature without romanticizing it.
    0:22:53 Exactly, yes.
    0:23:00 And you can love the internet and artificial intelligence without romanticizing it
    0:23:03 because it has phenomenal possibilities.
    0:23:08 It’s extraordinary, but at the same time, be vigilant.
    0:23:21 [Music]
    0:23:25 We’ll be back with more from Werner Herzog after the break.
    0:23:40 [Music]
    0:23:42 This is an ad for BetterHelp.
    0:23:43 Welcome to the world.
    0:23:46 Please read your personal owner’s manual thoroughly.
    0:23:50 In it, you’ll find simple instructions for how to interact with your fellow human beings
    0:23:53 and how to find happiness and peace of mind.
    0:23:55 Thank you and have a nice life.
    0:23:58 Unfortunately, life doesn’t come with an owner’s manual.
    0:24:00 That’s why there’s BetterHelp Online Therapy.
    0:24:04 Connect with a credentialed therapist by phone, video or online chat.
    0:24:07 Visit BetterHelp.com to learn more.
    0:24:09 That’s BetterHELP.com.
    0:24:13 Support for The Gray Area comes from Greenlight.
    0:24:18 The school year is already underway and you’ve probably wrapped up all your back to school shopping.
    0:24:24 Which means it’s time to kick back and pretend like you remember how to do algebra when your kid needs help with homework.
    0:24:28 But if you want your child to do more learning outside the classroom that will help later on,
    0:24:30 then you might want to try Greenlight.
    0:24:36 It can help teach your kids about money and not just the adding and subtracting, but how to manage it.
    0:24:39 Greenlight is a debit card and money app for families.
    0:24:41 Parents can keep an eye on kids spending and money habits,
    0:24:45 and kids learn how to save, invest and spend wisely.
    0:24:50 And with a Greenlight Infinity plan, you get even more financial literacy resources
    0:24:53 and teens can check in thanks to family location sharing.
    0:24:59 My kid’s a bit too young for this, but I’ve got a colleague here at Vox who uses it with his two boys and he loves it.
    0:25:04 You can join the millions of parents and kids who use Greenlight to navigate life together.
    0:25:09 You can sign up for Greenlight today and get your first month free when you go to greenlight.com/grayarea.
    0:25:13 That’s greenlight.com/grayarea to try Greenlight for free.
    0:25:16 Greenlight.com/grayarea
    0:25:22 Support for Where Should We Begin? comes from Bombas.
    0:25:27 It’s fall in New York, it’s gotten chilly, and we’re all evaluating our closets
    0:25:30 and taking out the sweaters and the woolen socks.
    0:25:32 That’s what I do with my Bombas.
    0:25:40 Bombas offers incredibly comfortable essentials like socks, underwear and buttery smooth t-shirts that you may want to wear every day.
    0:25:46 They just released some playful new colors for fall with plenty of soft socks for fireside reading
    0:25:50 and sweat-wicking ones that can keep up with your next autumn run.
    0:25:53 I just ordered some compression socks for myself.
    0:25:59 When I was on tour in the last few months, every time I took a plane, I wished I had some compression socks,
    0:26:01 so I can’t wait now to try these out.
    0:26:08 Plus, for every item you purchase, Bombas donates one to someone experiencing housing insecurity.
    0:26:14 So, ready to feel good and do good, you can head over to bombas.com/ester
    0:26:18 and use code Esther for 20% off your first purchase.
    0:26:25 That’s B-O-M-B-A-S.com/ester, code Esther at checkout.
    0:26:38 [Music]
    0:26:42 In a lot of ways, we’re kind of talking about the uses and the misuses of language,
    0:26:47 and I am fascinated by your fascination with the limits of language.
    0:26:49 Why does this interest you so much?
    0:26:56 Because as a poet, you have to discover the outer margins of your language.
    0:27:00 Where does it go? Where does language start to unravel?
    0:27:04 Where do images become unclear?
    0:27:10 I can give you an example when I traveled on foot from Munich to Paris in early winter
    0:27:15 and I wrote a diary, the book which is called “Of Walking in Ice”.
    0:27:22 I did this because my mentor, an old Jewish woman who had fled Nazi Germany,
    0:27:28 was dying at age 80 or so, and I said I will not allow her to die.
    0:27:30 I’m going to travel on foot now.
    0:27:36 I didn’t tell her and I came and I said to myself one million steps in defiance.
    0:27:42 It’s like a pilgrimage and when I arrive she will be out of hospital, which she actually was.
    0:27:47 At the end I walked non-stop 85 kilometres.
    0:27:52 That’s awfully long, and with snowstorms against you.
    0:27:56 A whole day, a whole night and almost another whole day.
    0:28:04 And I arrived and somehow the images or language came apart and I said a very odd thing to her.
    0:28:13 I said to her together we shall cook a fire and we shall stop fish.
    0:28:19 You see, you cook a meal but you do not cook a fire.
    0:28:27 Language became somehow not correct anymore; we stop traffic, but we do not stop fish.
    0:28:37 And she looked at me in a fleeting moment of understanding, and I said to her, please open the window, from these days on I can fly.
    0:28:39 So it was as laconic as that.
    0:28:44 But I noticed that my language was incorrect.
    0:28:54 The metaphors were incorrect and it’s important I think for poets to understand where are the outer limits
    0:28:58 of what you can pass on through language.
    0:29:00 You’re a poet at heart.
    0:29:01 I wish I was.
    0:29:02 I’ve tried to write poetry.
    0:29:08 I just, I don’t have it but you’re sort of speaking to the power of poetry, I think, right?
    0:29:11 That it lives at this border between language and meaning.
    0:29:18 That it uses language to express truths that we don’t have a language for exactly.
    0:29:21 That is correct but we have approximations.
    0:29:25 We have a quest out there and we pursue it.
    0:29:39 And I describe, for example, in another book and I think even in my memoirs that sometimes there’s a vortex of words in me that I can’t get out of my mind.
    0:29:48 It’s sometimes like you’re haunted by a melody, a silly melody and you can’t get it out of your mind for weeks and weeks.
    0:29:55 You drive in a car and it comes back to you and for me sometimes like a vortex of words.
    0:30:07 And all of a sudden at a moment where I name them and I write them down in a specific situation liberates me from this vortex.
    0:30:14 So it’s very, very odd how language sometimes is playing its crazy games with me.
    0:30:18 Do you think that certain truths can only be expressed in their native language?
    0:30:22 That certain thoughts can only be thought in the language that conceives them?
    0:30:37 Yes, yes, because there’s a deep world view always involved in language and this is one of the reasons why I have been most fascinated by the disappearance of languages.
    0:30:53 We’re too much looking at the disappearance of, let’s say, mammals like whales or like the panda bear or the snow leopard or whatever species of amphibians, frogs that are very endangered.
    0:31:02 And we overlook that some of the most precious things like languages, whole cultures disappear without a trace.
    0:31:11 We have about 7,000 languages left roughly and every 10 days or 12 days we are losing one.
    0:31:20 There are 14 or 15 languages right now where there’s only one single last speaker of that language left.
    0:31:35 And while we are talking here, one of those may die right now and with him or her the last traces of a whole culture, of a world view, of a language, of song will disappear.
    0:31:37 And I find this is catastrophic.
    0:31:46 It goes faster than any extinction of species, disappearance, extinction of cultures and languages.
    0:32:01 So of course it’s a deep thing for me and my wife has actually done an installation called Last Whispers, an oratorio that was composed of extinct languages.
    0:32:12 Meaning only existing in tape recordings and voices in songs of critically endangered languages.
    0:32:18 Meaning there’s only one single last speaker left or maybe two or three.
    0:32:24 I get the sense that you think poets are really the glue of civilization.
    0:32:29 I think you’re right even in the book at some point that only the poets can hold Germany together.
    0:32:40 Well, I traveled on foot along the borders of Germany, the whole circumference, to hold it together like a belt before the reunification.
    0:32:48 Politics had given up, or some part of politics, including the German Chancellor Willy Brandt, whom I liked.
    0:33:01 But he declared the book of German unity closed, which a German Chancellor should not declare in an official declaration at the Bundestag, the parliament.
    0:33:14 And I thought it’s only the poets in our culture that hold a country together, and I traveled on foot; where I grew up was right at the Austrian border, and then up and down the mountains.
    0:33:27 And along Austria, Switzerland, France, Belgium, Holland, Luxembourg, Denmark, I never completed this whole round around Germany because I fell ill and was in hospital for a week or so.
    0:33:35 And then all of a sudden the Berlin Wall fell and I knew this will lead to the unification which actually happened.
    0:33:42 And I love that quote from Albert Camus that the job of the writer is to keep civilization from destroying itself.
    0:33:44 Certainly I would include the poets.
    0:33:47 That’s a good one. I did not know he said that.
    0:34:02 I would like to quote the greatest of all German poets, late 1700s, early 1800s, Hölderlin, and he said what remains forever was always made by the poets.
    0:34:06 Yeah, I read a good bit of Heidegger when I was in college and he turned me on to Hölderlin.
    0:34:14 I wish I could read German, but I cannot and I get the sense that there’s no way to understand what he was saying if you can’t read it in the original German.
    0:34:20 Yeah, and well, Hölderlin, he became insane and his language unravels.
    0:34:32 He was the one actually who went to the very outer limits of my language German and it comes apart and unravels.
    0:34:37 And that’s very, very tragic and also very fascinating to see that.
    0:34:43 And Heidegger, well, how can I say, I’ve never cracked the Heidegger code.
    0:34:47 No one has. I’m not even sure he knew what he was saying at the time.
    0:34:53 Whatever, I’m not an expert. You’re much more into philosophy than I am.
    0:35:07 But I think he was a great philosopher, but up to a certain degree and what comes, what is beyond our comprehension is maybe dubious.
    0:35:20 I was rewatching some of your documentaries when I was preparing for this and it struck me again how well you’re able to let people show themselves even when you’re directing them.
    0:35:27 Is that a very deliberate thing for you paying attention to people in that way, seeing their true nature and pushing them to reveal it?
    0:35:40 Of course, I do not have to push them. In most cases there’s no push, but you have to have such a fascination, and radiate it, and awe and sympathy.
    0:35:48 They open up and you have to have it in you. If you make films like that, you have to have it in you.
    0:35:58 That’s a profession of a director and when you do a documentary, you have to be a director. You should not be the fly on the wall.
    0:36:09 The fly on the wall would be the surveillance camera in the bank and for 15 years it records nothing because no bank robbery ever happens.
    0:36:17 So wait another 15 years and still there wouldn’t be anything of significance, nothing worth recording.
    0:36:31 So I interfere, I shape, I’m the hornet out there that stings and that’s what I think is filmmaking. We are creators.
    0:36:43 I think your films have given me an appreciation for how much space there is for revelation in silence and often the only way to see someone is to just shut the hell up and get out of the way.
    0:36:49 Which seems simple enough, but a lot of people don’t do it, but you do and I think it’s to your credit.
    0:37:05 And many of them follow some sort of a journalistic approach. They come with a catalogue of questions. I never have a paper with a catalogue, I just listen and I start to follow leads and I dig very deep.
    0:37:10 I want to look deep into the heart of men and also of course women.
    0:37:19 So if you don’t have it in you as a director to see the heart of men, you should not be a director.
    0:37:31 Well, you can be a journalist and it’s all what you see on television, day in, day out, totally legitimate, but not my kind of filmmaking, not my kind of writing.
    0:37:41 We’ll be back with more from Werner Herzog after one more break.
    0:38:14 Support for the gray area comes from Shopify. Every great business starts with a great idea or a kind of “meh” idea that’s so bizarre it becomes weirdly successful like that singing big mouth bass that’s been installed in a million garages.
    0:38:23 But to make your business successful over time, you need more than an idea. You need a partner who can help you achieve sustainable growth, a partner like Shopify.
    0:38:29 Shopify is an all-in-one digital commerce platform that may help your business sell better than ever before.
    0:38:36 No matter where your customers spend their time, scrolling through your digital feed or strolling past your physical actual storefront,
    0:38:41 Shopify may help you convert those browsers into buyers and to sell more over time.
    0:38:49 There’s a reason companies like Allbirds turn to Shopify to sell more products to more customers, businesses that sell more sell with Shopify.
    0:38:58 Want to upgrade your business and get the same checkout Allbirds uses? You can sign up for your $1 per month trial period at Shopify.com/vox.
    0:39:05 You can go to Shopify.com/vox to upgrade your selling today. Shopify.com/vox.
    0:39:09 Support for the gray area comes from NetSuite.
    0:39:15 Imagine running a magical business where you could see next quarter’s trends before they happen.
    0:39:24 No more guessing if pumpkin spice kale smoothies will be the hit of the season or if it’s more of a pumpkin spice deep fried Oreos kind of vibe.
    0:39:30 You would just consult your crystal ball and then order enough supplies to make a healthy or very unhealthy profit.
    0:39:35 Sadly, that’s just not how things work in this corner of the wizardverse.
    0:39:40 That’s why smart businesses get themselves future ready with NetSuite by Oracle.
    0:39:49 NetSuite says they’re the go-to business management suite for almost 40,000 companies offering everything you need to stay on track, no matter what tomorrow brings.
    0:39:54 Get ahead of the curve and stay focused on what’s coming next with NetSuite by Oracle.
    0:39:59 And now I’ll finish this section of the ad by saying the word opportunity.
    0:40:06 Speaking of opportunity, you can download the CFO’s guide to AI and machine learning at netsuite.com/grayarea.
    0:40:15 The guide is free to you at netsuite.com/grayarea. That’s netsuite.com/grayarea.
    0:40:22 Support for the show comes from Into the Mix, a Ben and Jerry’s podcast about joy and justice produced with Vox Creative.
    0:40:30 In season three of this award-winning podcast, Into the Mix is covering stories on the ordinary people fighting for justice in their local communities.
    0:40:40 Starting with the fight against the workhouse, a penitentiary in St. Louis known for its abject conditions, mold, and pest infestations, and its embrace of the cash bail system.
    0:40:47 Host Ashley C. Ford interviews Inez Bordeaux, who spent a month in the workhouse when she couldn’t afford her $25,000 bail.
    0:41:06 Experiencing what I experienced and watching other women go through it and know that there were thousands before us and there were thousands after us who had experienced those same things, that’s where I was radicalized.
    0:41:10 Eventually, her charge was vacated, but the experience changed her.
    0:41:16 They’re starting a campaign to close the workhouse. Are you interested? And I was like, hell yeah. Hell yeah, I’m interested.
    0:41:24 You can hear how she and other advocates fought to shut down the workhouse, and won, on the first episode of this special three-part series, out now.
    0:41:27 Subscribe to Into the Mix, a Ben and Jerry’s podcast.
    0:41:50 I think the chapter in the book about the projects you wanted to make but weren’t able to for whatever reason, it might be my favorite, certainly one of my favorites.
    0:41:55 You wanted to make a film with Mike Tyson, you wanted to blow up an opera house in Sicily.
    0:41:59 Well, an abandoned opera house that was built, probably mafia.
    0:42:02 I should, yeah, to be clear, yes, there were no people in it.
    0:42:15 Exactly, mafia money. A small obscure town, Sciacca in southern Sicily, built it, I think mostly by laundering mafia money.
    0:42:23 And now it’s there. It has no administration, no opera ever was played in there, no technicians, no singer, no choir, nothing.
    0:42:27 But I couldn’t do it, so that was the end of the project.
    0:42:30 And of course, many other projects.
    0:42:42 One of the projects I’m actually trying to pursue now is a story about twins, young twin women who spoke in unison.
    0:42:50 It’s phenomenal; sometimes twins create a secretive language, and they exchange in this language.
    0:42:55 But there were two twins, twin sisters who spoke in unison.
    0:43:01 Even if you asked them a question that they could not expect, they would answer like a chorus.
    0:43:06 And the project is called Bucking Fastard.
    0:43:10 Not the fucking bastard, but Bucking Fastard.
    0:43:15 They made the same slip of the tongue in a court hearing.
    0:43:20 They were testifying together and they shouted across the courtroom.
    0:43:30 They were defendants; a truck driver tried to get a restraining order against them, and they yelled across the courtroom simultaneously.
    0:43:34 He’s lying. Don’t you hear? Every word is a lie.
    0:43:39 He’s lying under oath. The bucking fastard is lying.
    0:43:43 They make the same slip of the tongue at the same moment.
    0:43:49 How do you come upon these subjects, these stories? Do you just collide with them?
    0:43:53 No, no, I find them. I don’t know. Sometimes they find me.
    0:43:56 Is there some kind of gravitational pull that they just, they find you?
    0:44:00 There’s something, something out there. They find me, I find them.
    0:44:12 And I actually was in very close contact with them, let us say met them, took them out to a restaurant, for example, which was a big deal for them.
    0:44:16 Because they were very shy, they wouldn’t like to leave their apartment.
    0:44:24 So I do find them, and I stumble into them or they stumble into me as well.
    0:44:33 If there was one project, if you had to pick one project that you never quite got off the ground but wish you did, which one would it be? Is it the opera house?
    0:44:47 No, I am describing a dozen or so projects, and of course there were projects that I couldn’t do, for example, the conquest of Mexico, but seen from the perspective of the Aztecs.
    0:44:57 Aliens are landing. The ships are descending from clouds and they bring miraculous stags with them, I mean horses.
    0:45:05 And they create thunder from barrels, I mean cannons and things. So a totally alien invasion for them.
    0:45:26 And of course it would have been very, very expensive. You have to build pyramids and temples and recreate the capital city, Tenochtitlan, which is Mexico City today, canals like Venice today and thousands of extras and open battles and you just name it.
    0:45:44 It was just too expensive. It would have required a huge Hollywood budget, but Hollywood would only finance it and side with me if my last film made, let’s say, $400 million box office domestic.
    0:45:56 Then they would approach me and say, oh, let’s do that together. But I do not lose a sleepless night over this. It’s okay. And people said, ah, you have to pursue it. It’s so beautiful.
    0:46:10 And why don’t you try it again? Even in 20 years I’m not going to find the money either, because there are some iron laws of the industry, which thankfully I understand.
    0:46:20 And I said, no, I plow on. There are many other things. In the last 20 years since this undone project, I’ve made 27 films.
    0:46:29 Speaking of expensive, I read recently that you wanted to go to space and even applied with the Japanese company for an opportunity to do it.
    0:46:30 I did.
    0:46:33 They turned you down, which is outrageous, I should say.
    0:46:52 No, they didn’t turn me down. They still, they didn’t respond to my application, because there must have been thousands. It’s actually a Japanese billionaire who invites eight guests or something like that to fly out and around the moon.
    0:47:00 And I argued, you’ve got to have a poet along with you. I’d send a daily poem down to earth and a short movie.
    0:47:04 Is that why you wanted to go? Because I wanted to ask, why did you want to go in the first place?
    0:47:22 Because it’s a perspective that is completely new for a filmmaker for a poet. I would also go to Mars, but let’s face it, I applied against the vigorous objections of my wife, which I understand, but I applied anyway.
    0:47:27 And I was not turned down. I’d never got an answer. So I was not elected.
    0:47:30 Well, that’s equally outrageous, but I’ll let it go.
    0:47:38 No, no, come on, come on. The other eight people who are instead of me there will enjoy it tremendously.
    0:47:43 I’ve always been enamored with what astronauts call the overview effect.
    0:47:57 Almost every person who goes to space and looks back down on earth describes the same kind of transformative experience where they can really feel how special this place is against the backdrop of space.
    0:48:03 Do you suspect you’d feel the same way? Maybe it would give you a whole new appreciation for the chaos of the world.
    0:48:11 I do have a lot of appreciation for our world, sometimes even against my better judgment.
    0:48:17 But probably I would have a similar experience. I do not want to predict what would happen.
    0:48:30 But what I find very, very significant is one of these Voyager missions that has left our solar system, launched, I think, in the 70s, took photos back of planet Earth.
    0:48:37 The last photo, I think, that we have is the tiniest speck, like a star somewhere out there.
    0:48:43 And that’s our planet Earth. So how insignificant we are. That’s really stunning.
    0:48:49 But from our moon, planet Earth is very close. From Mars,
    0:48:58 it’s still fairly close. It’s not this really far-out view of what we are and where we are.
    0:49:03 But totally fascinating for me, I would instantly go.
    0:49:11 While we’re on the subject of Earth, our home, do you think humanity is destroying itself?
    0:49:18 That’s part of what is happening to us. But we have to face it biologically.
    0:49:27 We are very vulnerable, and we have something in us that is self-destructive. That’s not healthy.
    0:49:36 But I do not believe that we will have a permanent existence here on our planet.
    0:49:44 So the way dinosaurs disappeared, I’m pretty certain we will disappear as well.
    0:49:47 It doesn’t make me nervous, by the way.
    0:49:51 I’m not sure I’ve ever thought of you as a pessimist.
    0:49:56 But I do think of you as someone who’s very clear-eyed about the fragility of civilization, really.
    0:50:03 Do you even think in these terms, does the language of pessimism and optimism mean anything to you at all?
    0:50:05 Or is it just the wrong language?
    0:50:11 No, I avoid it here. It’s too primitive to categorize a person as an optimist or pessimist.
    0:50:18 I’m just looking at what’s out there. Who are we? How fragile is our own biology?
    0:50:26 How fragile are societies? For example, if the Internet disappears from one moment to the next.
    0:50:30 And it can if we have a massive, a real massive solar flare.
    0:50:40 Or if we have, let’s say, a war event that will destroy all the servers and routers, we would be without Internet.
    0:50:54 And it would be like New York City, and I’m sitting here in New York City, downtown, when the hurricane hit. All of a sudden, below 32nd Street,
    0:51:00 it was without electricity, without Internet, without cell phone coverage.
    0:51:16 And my wife, who was here at exactly that time, says, “Tens of thousands of people were dazed and confused and moving north in Manhattan Island just in search of a toilet, of a flushable toilet.
    0:51:26 Tens of thousands.” And all of a sudden, within days, we are thrown back into a situation like hunters and gatherers.
    0:51:38 And that doesn’t bode well for our species, because you can go up to Central Park and hunt the squirrels, but it will not feed you for long.
    0:51:45 Eight million inhabitants will not eat long and survive long on a few squirrels in the park.
    0:51:54 So it doesn’t look good. Good survivors would be, for example, tribal hunters and gatherers like the Inuit.
    0:52:03 They don’t need the Internet, or for the Amish, who are doing homestead farming without technology.
    0:52:10 They don’t have electricity, or many of the fundamentalist Amish don’t have electricity.
    0:52:17 They don’t have cars. They don’t have fridges. And they live very well. They would survive.
    0:52:24 I loved your film, Lo and Behold, as an exploration of the Internet and what it’s done, what it’s doing to us.
    0:52:33 And I think it makes the point far better than I can, and I’ve certainly tried, about how these sorts of technological revolutions aren’t really planned.
    0:52:40 And the people who give birth to these revolutions, these technologies, haven’t the faintest idea of what it will lead to.
    0:52:49 But it’s interesting, and I guess not surprising, that the digital world and social media and that kind of thing, it doesn’t really exist for you.
    0:52:54 You’re not on those things. Is that just a clear decision you made at some point to abstain?
    0:53:04 No. I would say long live the digital world. And I’m using it. I’m using it for filming. I use it for editing.
    0:53:16 I use it for communications. I do emails. My main tool of communication is email, but I do not need to be part of certain things that are out
    0:53:25 there in the Internet. I’m not on social networks. If you find me on Twitter or on Facebook, those are forgeries.
    0:53:35 And there are many forgeries of voice imitators out there. I have lots of doppelgangers, lots of duplicates. Let them be out there.
    0:53:45 I don’t mind, but if you listen to that, you know, it’s forgeries. And if you find me on Facebook, it’s a complete forgery then.
    0:53:51 You asked people at the end of Lo and Behold, actually. I’m just thinking of it, if they thought the Internet dreams of itself.
    0:54:00 And I love the question, and I didn’t quite understand it. I still don’t think I understand it. So I think I’m just going to ask you if you think the Internet dreams of itself.
    0:54:13 Well, that’s the deepest of all questions, I think, and not really fully answerable. And I have to admit, it is just a projection of a statement by a war theoretician.
    0:54:30 Napoleonic times, the Prussian war theoretician von Clausewitz. And apparently von Clausewitz once, in his study On War, which is still a revolutionary insight into warfare,
    0:54:42 said, “It seems that war sometimes dreams of itself.” It’s a stunning statement, and I extended it: does the Internet dream of itself?
    0:54:50 The strange thing now is that experts on von Clausewitz told me von Clausewitz never said that.
    0:54:59 Maybe I invented it and talked myself over decades so much into it that I believed it was von Clausewitz.
    0:55:09 So it’s very odd how our memory is shifting and shaping its own world, shaping its own quotations from books.
    0:55:15 But it’s the deepest of all questions. Does the Internet dream of itself?
    0:55:26 And you can extend it. Does artificial intelligence dream of itself? And that’s why it gets interesting.
    0:55:31 Yeah, it’s been a few years since I watched the film. I watched it again a few weeks ago, and the question just lingered with me.
    0:55:37 It sort of goes over my head, I think, but I sense the depth of it. If you put it to me, I wouldn’t have an answer.
    0:55:46 And there’s not a single scientist in the film who can answer it. They are puzzled. They are stunned that a filmmaker is asking this question.
    0:55:53 But I’m not so much a filmmaker in that case. I’m a poet who asks them, and they sense it.
    0:56:00 Do you think much about legacy? You talk about being a poet, and you talk about how you think your writing will survive longer than your films?
    0:56:02 I think so, yes.
    0:56:04 Why do you think that is?
    0:56:27 I can’t really give you a clear answer. There’s a gut understanding and feeling that this will last, and that my prose and my poetry have an intensity that is beyond the illumination or the intensity of the films.
    0:56:45 And people always are puzzled. How does he reconcile being a filmmaker and a poet? And I have a simple formula now that makes it very easily understood. Filmmaking is my voyage, but poetry, writing, is home.
    0:56:54 So is the filmmaking more experimental and exploratory for you, whereas the writing feels more settled and secure, if that makes any sense at all?
    0:56:59 Well, we shouldn’t try to analyze now this very simple dictum.
    0:57:01 I have a nasty habit of doing that sometimes.
    0:57:10 No, no, it’s self-explanatory. I couldn’t even explain any further. I’m glad that I have a simple formula.
    0:57:23 It’s a bit simple then. I’ve read your books and watched most of your films, if not all, and I guess I didn’t have a full appreciation for the diversity and the adventurousness of your life until I read the book.
    0:57:35 It really is a quite remarkable life, and the connection between your experiences, what you’ve actually done, the ways that you have collided with the world and other people is so essential to the work that you’ve done.
    0:57:50 It’s true, yes, and it’s puzzling. It’s puzzling because there was an intensity of life as if it had been five lives in a row and things that you normally do not do as a writer, as a filmmaker.
    0:58:01 And people immediately start to doubt, am I telling them wild stories? Did I really move a ship over a mountain? Yes, I did. It’s documented.
    0:58:08 Did I put a whole cast of actors under hypnosis? Yes, I did. It’s documented.
    0:58:22 Was I shot during a live interview for BBC? Yes, I was shot. I mean, it was not a very big wound that I had, but I was shot on tape. It’s caught on tape.
    0:58:33 And on and on, and for example, New York Times, the writer is completely puzzled. Is this all invented or so? But you see, no stone was left unturned.
    0:58:45 I gave the memoirs to my two brothers, who verified them. In some cases I had a different shade of experience, but that’s legitimate. That’s fine.
    0:59:00 Well, for example, did I do a stunt at the opera house in Bologna, where I wanted to have a stage worker falling from the skies and through the stage, and there was not money enough for a stuntman.
    0:59:09 So I tested it myself. It was immediately doubted, but there’s a series of photos, and I have them, where you see me flying through the air.
    0:59:18 And of course, at the bottom, there was a huge air cushion. The same thing that Hollywood uses for stuntmen.
    0:59:32 So things are documented and all the big things, of course, had dozens of witnesses or crew members or actors, extras. You just name it. You can’t make it up.
    0:59:39 Werner, I have admired your work for many, many years, and it was an absolute pleasure to chat with you.
    0:59:46 Once again, the book is called Every Man for Himself and God Against All. Thank you so much for coming in today.
    0:59:48 And thank you and greetings to Mississippi.
    0:59:50 Thank you. Come on down.
    1:00:02 [Music]
    1:00:09 This episode was produced by Katelyn Bogucki. Patrick Boyd engineered this episode with help from Chris Shirtleff.
    1:00:17 Alex Overington wrote our theme music. Serena Salen is our fact-checker, and A.M. Hall is the boss.
    1:00:24 As always, let us know what you think of this episode. Drop us a line at thegrayarea@vox.com.
    1:00:29 And please share it with your friends on all these socials.
    1:00:35 New episodes of The Gray Area drop on Mondays. Listen and subscribe.
    1:00:40 This show is part of Vox. Support Vox’s journalism by joining our membership program today.
    1:00:44 Go to vox.com/members to sign up.
    1:00:50 [Music]
    1:00:56 Support for the gray area comes from Mint Mobile. Sometimes a deal is too good to be true.
    1:01:04 You know the feeling. You find that great deal on a rental car only to realize the lock is broken and the windows won’t roll up.
    1:01:09 Well, Mint Mobile says with their deals there are no catches. What you see is what you get.
    1:01:15 When you purchase a new three-month plan with Mint Mobile, you’ll pay just $15 a month. That’s it.
    1:01:21 No hoops to jump through. No sneaky fine print that you can barely read. Just a great deal.
    1:01:29 All Mint Mobile plans come with high-speed data and unlimited talk and text delivered on the nation’s largest 5G network.
    1:01:32 You can even keep your phone, your contacts, and your number.
    1:01:41 You can get this new customer offer and a three-month premium wireless plan for just $15 a month by going to mintmobile.com/grayarea.
    1:01:50 That’s mintmobile.com/grayarea. You can cut your wireless bill to $15 a month at mintmobile.com/grayarea.
    1:01:58 $45 upfront payment required equivalent to $15 per month. New customers on first three-month plan only.
    1:02:04 Speed slower above 40 gigabytes on unlimited plan. Additional taxes, fees, and restrictions apply.
    1:02:06 See Mint Mobile for details.
    1:02:09 (upbeat music)

    Sean Illing speaks with one of his heroes: Werner Herzog.

    Herzog is a filmmaker, poet, and author of the memoir Every Man for Himself and God Against All. The two discuss “ecstatic truth,” a term invented by Herzog to capture what he’s really after in his work, why he’s interested in Mars, and whether he thinks humanity is destroying itself.

    Host: Sean Illing (@seanilling), host, The Gray Area

    Guest: Werner Herzog, author, Every Man for Himself and God Against All

    This episode was originally published in October of 2023.

    Support The Gray Area by becoming a Vox Member: https://www.vox.com/support-now

    Learn more about your ad choices. Visit podcastchoices.com/adchoices

  • Ta-Nehisi Coates on complexity, clarity, and truth.

    AI transcript
    0:00:06 Support for the show comes from Into The Mix, a Ben & Jerry’s podcast about joy and justice
    0:00:09 produced with Vox Creative.
    0:00:13 Along the Mississippi River between Baton Rouge and New Orleans, my old home, there’s
    0:00:19 a stretch of land nicknamed Cancer Alley because the cancer rate is more than seven times higher
    0:00:24 than the national average thanks to a high concentration of petrochemical plants.
    0:00:28 Here how one community is fighting against some of the top polluters in the country in
    0:00:30 their own backyard.
    0:00:35 Subscribe to Into The Mix, a Ben & Jerry’s podcast.
    0:00:38 Support for this show comes from Constant Contact.
    0:00:42 If you struggle just to get your customers to notice you, Constant Contact has what
    0:00:45 you need to grab their attention.
    0:00:50 Constant Contact’s award-winning marketing platform offers all the automation, integration
    0:00:55 and reporting tools that get your marketing running seamlessly, all backed by their expert
    0:00:57 live customer support.
    0:01:02 It’s time to get going and growing with Constant Contact today.
    0:01:03 Ready?
    0:01:04 Set.
    0:01:05 Grow.
    0:01:09 Go to ConstantContact.ca and start your free trial today.
    0:01:16 Go to ConstantContact.ca for your free trial, ConstantContact.ca.
    0:01:25 As you know, this show is called The Gray Area and when we created it in 2022, that name
    0:01:30 felt right because it represented my instincts as a host.
    0:01:44 I really do believe that the world is complicated, far too complicated for any of our ideologies
    0:01:46 to capture perfectly.
    0:01:54 And our desire for simplicity is understandable, but often it comes at the expense of truth.
    0:02:01 It’s not that I don’t have any principles or a point of view.
    0:02:06 I do, and if you listen to this show, you know that.
    0:02:13 But this temptation to oversimplify is something we try to resist on the show.
    0:02:17 And we’ll keep doing that because I think it’s essential.
    0:02:23 But when does the impulse to embrace ambiguity become its own pathology?
    0:02:29 The world is complex, sure, but sometimes we have to pass judgment.
    0:02:33 Sometimes we have to be willing to say that something is true and something is false,
    0:02:36 that something is right and something is wrong.
    0:02:40 So how do we know when things really are that clear?
    0:02:51 And how do we avoid the impulse to lie to ourselves when we know they’re not?
    0:03:06 I’m Sean Illing, and this is The Gray Area.
    0:03:10 Today’s guest is Ta-Nehisi Coates.
    0:03:15 He’s an author, essayist, and one of our most celebrated living writers.
    0:03:20 He’s just published a new book called The Message, which is a collection of three very
    0:03:22 personal essays.
    0:03:27 The book has garnered a lot of attention because whenever Coates publishes something
    0:03:30 new, it’s an event.
    0:03:34 But it’s also stirred up quite a bit of controversy because the longest essay in the book is about
    0:03:38 his trip to Palestine.
    0:03:43 If you know almost nothing about the conflict between Israel and Palestine, the one thing
    0:03:49 you’d probably be comfortable saying is that it’s complicated.
    0:03:54 This is an assertion Coates challenges directly in the book.
    0:04:00 For him, the moral arithmetic is simple, and Israel’s treatment of the Palestinian population
    0:04:13 is fundamentally wrong.
    0:04:17 Obviously lots of people take issue with this.
    0:04:22 And this is something Coates and I explore in our conversation.
    0:04:29 But this isn’t a debate show, and I didn’t invite him here for an argument.
    0:04:34 I invited him because I think he’s smart and sincere, and I don’t think he writes
    0:04:40 anything without having put a great deal of thought into it.
    0:04:44 So by having him on the show, I hope to have a discussion about the role of the writer
    0:04:50 and the intellectual and what it means to describe the world with moral clarity.
    0:04:57 Coates challenged me, and hopefully I challenged him in ways that felt generous and fruitful.
    0:05:02 It certainly was for me.
    0:05:05 Ta-Nehisi Coates, welcome to the show.
    0:05:07 Thanks for having me.
    0:05:08 It’s great to finally talk.
    0:05:10 I think this is the first time.
    0:05:13 I still remember reading your blog back in the day.
    0:05:14 Yes, really.
    0:05:16 It’s been quite a ride for you ever since.
    0:05:17 Congrats on all the success.
    0:05:18 Oh boy, oh boy.
    0:05:19 Yes it has.
    0:05:20 Thank you.
    0:05:28 You know, I have to say, what I’ve always appreciated about you, whether I agreed with
    0:05:36 whatever you happen to be saying or writing at the moment, is that you seem to be genuinely
    0:05:37 thinking out loud.
    0:05:41 You have a point of view, obviously.
    0:05:46 But you’ve always struck me as someone earnestly seeking the truth.
    0:05:49 And I cannot say that about a lot of people in our game.
    0:05:57 Do you think about your writing and your intellectual project, for lack of a better word, in that
    0:06:01 kind of open-ended way?
    0:06:02 I do.
    0:06:03 I’ll say two things.
    0:06:09 The first thing is, and I didn’t always notice, but I think the truth of writing in terms
    0:06:16 of its power is maybe a little bit earnest and maybe a little naive on my part.
    0:06:23 And your writing is at its strongest when you’re not lying.
    0:06:27 And when you’re not lying to yourself too, not just to your readers, but to yourself.
    0:06:32 There was a period, I think back in the early aughts or so, where there were a lot of confessional
    0:06:33 memoirs.
    0:06:36 People were really just, there was a lot of exhibitionism.
    0:06:38 So I don’t mean that.
    0:06:43 But it does have to be a real kind of vulnerability, a real searching.
    0:06:47 And sometimes that vulnerability can be expressed not even as memoir.
    0:06:51 Maybe it happens in the reporting and your willingness to push yourself in certain places
    0:06:53 that don’t feel comfortable.
    0:06:58 Maybe it does happen in terms of exploring something that is internal to you.
    0:07:02 Maybe it happens in your ability to revisit certain things that you thought were actually
    0:07:03 truth.
    0:07:11 But I do think a vulnerability, a willingness to have an open-ended, I don’t know, just
    0:07:13 tell the truth.
    0:07:15 I think that really, really, really matters.
    0:07:18 There’s a kind of strength you get from it.
    0:07:26 The only way I can express it is that my feet, when I’m writing, by the time I get to
    0:07:28 the end, feel firmly on the ground.
    0:07:35 And you don’t feel firmly on the ground because I know exactly that I’m right.
    0:07:40 But as silly as this might sound, I know that I’m not lying.
    0:07:41 I do know that.
    0:07:43 I’ve achieved the kind of, which is a process, right?
    0:07:44 Because we lie to ourselves too.
    0:07:45 You know what I mean?
    0:07:47 That actually is a process.
    0:07:49 So I think there’s great power in that.
    0:07:53 The second part of it is something that I’ve really had to learn to live with and get over.
    0:07:59 And that is that, unfortunately, people view your work in isolation.
    0:08:02 They don’t see it as a journey, as a stop.
    0:08:08 They see it as a complete thing, be it at articles, be it at books or whatever.
    0:08:14 And so there’s really not an expectation that there’ll be a growth period that is happening
    0:08:18 over the course of your work, even though you hope that’s what happens.
    0:08:20 You kind of get frozen in stone.
    0:08:27 But I’ve decided, and this is also kind of daily work too, that the public has its relationship
    0:08:28 with my work.
    0:08:29 And that’s fine.
    0:08:32 And they have the right to have that perception of my work.
    0:08:37 And my perception or my relationship with my work has to be separate.
    0:08:40 It’s just two different things, and I have to be okay with that.
    0:08:42 And I can’t ask them to have my relationship with it.
    0:08:44 Are you ever really okay with that?
    0:08:49 I mean, look, every writer wants to be read, and you’ve reached a level most don’t.
    0:08:52 But your work becomes this thing.
    0:08:56 It becomes this symbol in the hands of other people, and you almost lose control of it
    0:08:57 in a way.
    0:08:58 And that’s got to be a weird–
    0:08:59 It’s not great.
    0:09:00 It’s not great.
    0:09:03 It’s not a great feeling at all.
    0:09:05 But I do think you have to accept it.
    0:09:06 Yeah.
    0:09:10 One of the things I decided that I was going to do, and I did not do this for any other
    0:09:13 book, I was like, I really, I can’t read reviews this time.
    0:09:16 I can’t read articles about me.
    0:09:18 I can’t consume that.
    0:09:21 Not because I think that those articles are worthless, or because I think those reviews
    0:09:23 are worthless.
    0:09:29 But as I heard somebody say relatively recently, it really is none of my business.
    0:09:33 That is the relationship between readers, between critics, between discourse, and the
    0:09:34 work.
    0:09:37 And they have a right to that relationship.
    0:09:43 But I have a right to this kind of private relationship also, and I have to really safeguard
    0:09:44 that.
    0:09:52 I feel like every interesting writer, truly interesting writer, and you are certainly
    0:10:00 that, has some kind of anchoring worldview, even if it’s never quite explicit.
    0:10:06 And when I think about your work, and maybe it’s just me because I happen to share this
    0:10:11 sensibility, the word that comes to mind is tragic.
    0:10:19 Do you choose a different word or is that about right?
    0:10:24 This is kind of an illustration of the relationship question I was just talking about.
    0:10:27 I can’t really pick that.
    0:10:28 You know what I mean?
    0:10:32 I don’t have the ability, and this is why critics are important, and this is why people
    0:10:33 are writing about it is important.
    0:10:37 It’s very hard for me to see it in that way.
    0:10:40 I’m too close to it to name that.
    0:10:42 So that might be true.
    0:10:44 Your guess is as good as mine.
    0:10:45 Probably better.
    0:10:46 You know?
    0:10:47 Yeah.
    0:10:51 I mean, I’ve heard you describe yourself as a cold, hard materialist, man.
    0:10:58 It’s not that this shit’s not going to work out, but it’s probably not going to work
    0:10:59 out.
    0:11:00 It probably won’t.
    0:11:01 It probably won’t.
    0:11:03 But I’m a romantic too, you know?
    0:11:08 There’s that part of me too, and you hope that it does.
    0:11:09 You hope you’re wrong.
    0:11:10 You might be wrong.
    0:11:18 But part of the reason why I was asking is that I was wondering if you thought a certain
    0:11:25 relationship or certain tragic sensibility was necessary if we’re going to avoid the
    0:11:32 temptation to lie to ourselves, about ourselves, and the world, and other people, and the
    0:11:33 future.
    0:11:36 I just think you just got to be honest.
    0:11:40 I think you just got to be like, like you have the courage to be straight.
    0:11:41 Now, here’s the hard thing.
    0:11:49 It requires you to maintain a certain distance from political aims, even if you have sympathy
    0:11:51 for some of those aims.
    0:11:59 One of the things I lament if I have a critique is that there are a number of writers of my
    0:12:06 generation and maybe a little older some and a little younger others who I think have
    0:12:10 been consumed by mean politics.
    0:12:14 So not politics at large, but electoral politics.
    0:12:20 I guess what I’m trying to say is I don’t want to be an arm of the Democratic Party.
    0:12:27 I need to have, obviously, I have a politic to me, but I have to have the ability to try
    0:12:34 to explore that politics to know when it says something that is hopeful, know when it says
    0:12:40 something that I think is not so hopeful, and to be relatively straight about that, even
    0:12:46 when it may not necessarily advantage my political aims.
    0:12:48 I can give you a very specific example of this, actually.
    0:12:55 It is really clear to me that the world I want to live in, that I would be much closer
    0:13:00 to that world with Kamala Harris than Donald Trump, right?
    0:13:06 So there is a version of this where you say if I know that, and that’s my preferred political
    0:13:11 outcome, I will not say certain things until Kamala Harris is elected.
    0:13:16 I will not write certain things until because I’m trying to figure out, you know what I
    0:13:18 mean?
    0:13:21 And that’s where I think you start getting a little dangerous.
    0:13:27 That’s what I mean about the kind of contamination, and you want a little distance between the
    0:13:28 two things.
    0:13:33 But once you start making those calculations, you’re already compromised in a sense.
    0:13:34 You are.
    0:13:35 You are.
    0:13:40 But I do think there are a number of people who are effectively Senate aides when they’re
    0:13:42 actually writers or journalists.
    0:13:43 Yeah.
    0:13:45 A lot of money to be made in that game.
    0:13:46 That’s true.
    0:13:47 That’s true.
    0:13:48 Let’s get into the new book.
    0:13:55 I know you started out wanting to write a book about writing, and then it sort of shapeshifted
    0:14:05 into a deeper examination of how stories shape and illuminate and sometimes distort reality.
    0:14:11 How did this project evolve, and did you land in a different place than you intended?
    0:14:15 It’s probably less about writing than I wanted to.
    0:14:19 Like, I began with this Orwell quote where he’s basically saying, look, if there wasn’t
    0:14:24 so much important shit happening right now, I would be, you know, just writing beautiful
    0:14:25 things to amuse myself.
    0:14:28 I mean, he doesn’t say it like that, he says it much better than that, but that’s effectively
    0:14:30 what the quote says.
    0:14:36 And I really feel that, like I really, like I, like I love the beauty of language, and
    0:14:39 I find it very interesting how you conjure that beauty, what you do with it, what effects
    0:14:46 it has, how it makes you feel, you know, like all of that is really interesting to me.
    0:14:50 But the fact of the matter is, living in a time like we live in, maybe living at any time
    0:14:55 in American history period, or maybe in history period, you know, that doesn’t really feel
    0:14:59 appropriate because you have concern about your fellow human beings, and here you have
    0:15:04 this thing, and you notice this thing can make people see things that are normally maybe
    0:15:06 obscure to them.
    0:15:13 And so what the book ultimately became about was not simply writing, but how writing can
    0:15:16 clarify and how writing is sometimes used to obscure, in fact.
    0:15:21 And so you’ll see that there are places in there where the old mission of the book remains,
    0:15:22 you know what I mean?
    0:15:25 Where I kind of, you know, pull out and say, Hey, look at this, this person did this, look
    0:15:29 how they use language here, you know, the way I think about writing when I’m actually
    0:15:30 doing it.
    0:15:34 But the politics of the book became the larger thing.
    0:15:38 You obviously enjoy writing for the craft of writing, for the beauty of language, but
    0:15:43 then you also feel this pull, this obligation, because you give a shit about the world to
    0:15:47 speak about what’s happening, and defend whatever your values and priorities happen to be.
    0:15:50 And sometimes those things can come into conflict. How do you navigate that?
    0:15:56 I would say that they actually kind of neatly work together.
    0:16:00 This is, if I can, I know I just said that thing about not being in conversation, you
    0:16:05 know, and what I would say is, especially to young writers who might be listening right
    0:16:17 now, if you can write beautifully, clarifyingly, I actually think it brings you closer to using
    0:16:22 your craft to have the world that you want to see.
    0:16:26 I get people come up to me all the time and they, you know, so what are you doing?
    0:16:28 Like, how did you do this?
    0:16:30 Why did you get to write X, Y and Z?
    0:16:32 Why wouldn’t you say X, Y and Z?
    0:16:33 People listen.
    0:16:34 You know what I mean?
    0:16:35 Like, I don’t understand it.
    0:16:38 50 people before you said this.
    0:16:39 You know, why is that okay?
    0:16:44 And if I can just be an asshole for a moment, the asshole in me says, do you realize how
    0:16:46 much time I spend thinking about writing?
    0:16:53 Like, how much time I spend on every sentence, because, you know, it’s like a chef, right?
    0:16:56 Like you eat something and you’re not sure why it’s good.
    0:16:59 Like, you can’t say every little thing about why it’s good, but you know, like, when it’s
    0:17:00 really, really good, right?
    0:17:02 Like, you do know that.
    0:17:07 If you give a fuck about the writing, like you give a fuck about the cooking, people
    0:17:09 enjoy it more.
    0:17:10 You know what I mean?
    0:17:14 And that means, because they enjoy it more, they’re more apt to read it.
    0:17:20 So the thing you’re trying to get across is actually more likely to be consumed.
    0:17:23 If you give a fuck about how it sounds.
    0:17:27 If you give a fuck about how efficient it is, well, I’m really cursing a lot.
    0:17:28 You’re on the right show.
    0:17:32 This is actually how I am in class, by the way.
    0:17:34 I can’t help it, man.
    0:17:36 So you’re in luck, this is a safe space.
    0:17:37 Yeah.
    0:17:38 No, no.
    0:17:41 If you like, if you give a fuck about the words, man, if you give a fuck about the sentence,
    0:17:46 I mean, like one of my, now I’m on my rant, one of the most frustrating things in the world
    0:17:50 is like, you pick up your average, you know, op-ed page or you look at like, like the internet
    0:17:57 is awash in opinion and it’s awash in opinion of people who couldn’t give a rat’s ass about
    0:18:03 like their sentence structure and what they’re doing, it is like rife with fucking cliche.
    0:18:04 You know what I mean?
    0:18:07 Like, repeated notes that they clearly heard from somebody else.
    0:18:14 No attempt to like, think about like how they’re saying something that is original and new.
    0:18:18 Are they even reflecting the beautiful original thought that they had themselves?
    0:18:19 Have they found the language?
    0:18:21 Have they found the words?
    0:18:23 Does anybody read poetry anymore?
    0:18:27 Does anybody read novels for the language of it?
    0:18:31 Because if you can do that, I mean, it’s amazing.
    0:18:36 Like to be in this world where like, I know that people care about language.
    0:18:37 I know they do.
    0:18:40 You know, I can tell they just don’t know that they care.
    0:18:44 And I just wish more writers took more time.
    0:18:48 I wish we took more time because we’re in competition with so many other media at this
    0:18:49 point.
    0:18:50 Right?
    0:18:51 Like we go around asking, why is nobody reading?
    0:18:52 Why aren’t the kids reading?
    0:18:53 Motherfucker.
    0:18:54 Why aren’t you writing?
    0:18:57 Why aren’t you writing?
    0:18:58 You know what I mean?
    0:19:00 Like why aren’t you giving a fuck about writing?
    0:19:02 What are you doing complaining about the reader?
    0:19:03 That’s your responsibility.
    0:19:04 That’s your job.
    0:19:08 You know, and so that’s Professor Coates.
    0:19:09 That’s my rant.
    0:19:10 Profane and sorry.
    0:19:16 I want to follow you down this road so bad, but I also don’t want to float a thousand miles
    0:19:17 away from your book.
    0:19:19 Oh no, this is the book though.
    0:19:20 No, you’re not.
    0:19:21 You’re right on the book.
    0:19:25 The book is about because actually there’s a politics attached to this because if you
    0:19:30 really do care about the issues, right, like you are doing all you can early in the book
    0:19:34 I talk about how like, you know, I’m reading this article in Sports Illustrated, right,
    0:19:38 about this guy who got paralyzed on the field and I couldn’t put the shit down.
    0:19:41 I’m seven years old. It’s Darryl Stingley, right?
    0:19:45 And I’m reading this because Tony Dorsett, Dallas Cowboys running back is on the cover.
    0:19:48 And I can’t put this shit down, man.
    0:19:49 Why can’t I put it down?
    0:19:51 Like, what is what is holding me?
    0:19:52 What is the attraction?
    0:19:53 What is the gravity?
    0:19:59 What the gravity is, is this writer has worked with Darryl Stingley and done the work of trying
    0:20:04 to conjure a voice that is Darryl Stingley’s voice, right?
    0:20:09 Which means now there’s a kind of intimacy because the person is telling me about this
    0:20:14 horrible thing that’s happened to them. Stingley was paralyzed on a hit.
    0:20:19 And even though I don’t know it because I’m seven years old, I can’t put this thing down.
    0:20:23 And when I finally finish it and put it down, the story is lodged in my head.
    0:20:25 So much that I go ask my father about it.
    0:20:29 And my father sends me to other books and I go read those books and I’m upset because
    0:20:32 the answers aren’t in those books and there’s no internet, right?
    0:20:33 1983.
    0:20:34 You know what I’m saying?
    0:20:36 There’s no Google or anything.
    0:20:40 And I can’t let this thing go.
    0:20:42 What is that?
    0:20:43 That’s writing.
    0:20:46 And that’s what any writer, you know, really, really wants to do.
    0:20:50 And so like when the book is talking about politics, whether it’s book banning in South
    0:20:55 Carolina, whether it’s searching for your identity in Senegal, whether it’s watching
    0:21:01 other people war over their identity in the West Bank or in Palestine or in Israel.
    0:21:06 The thing that I am trying to do is hold you there, hold you there in the way that I was
    0:21:08 held when I was seven years old.
    0:21:12 And I’m trying to hold you there for political reasons because I care about this politics
    0:21:16 and I write like I care, write like this is the most important thing in the
    0:21:22 world and make you feel that, hopefully, while you read it.
    0:21:25 This is good, man.
    0:21:27 It’s storytelling, right?
    0:21:35 I mean, I’ve heard you say many times that politics is downstream from culture, which
    0:21:39 is to say what happens in a political world is a function of the stories.
    0:21:40 Yes.
    0:21:41 We tell each other.
    0:21:46 Yes, you and I are talking and there’s this larger national conversation we’ve been having
    0:21:53 about history and how we tell it and what we leave out and why it matters.
    0:21:58 And you have this line in the essay on South Carolina where you say that literature is
    0:21:59 anguish.
    0:22:04 And I guess it’s not that hard to understand why writing and talking about history is such
    0:22:05 a fight.
    0:22:11 We have this eternal struggle over narrative supremacy, whose story gets told, whose story
    0:22:15 gets marginalized, who are the heroes, who are the villains?
    0:22:21 And it feels like all of this shit, like this is what politics is, a high stakes storytelling
    0:22:23 competition.
    0:22:24 Do you see the world that way?
    0:22:28 Does it drive not only what you write, but how you write?
    0:22:29 Yeah, I do.
    0:22:32 I mean, I don’t want to be too reductive, but yes, it’s very important.
    0:22:37 Like, I was thinking the other day, right, like why is, you know, and I’m literally
    0:22:41 asking this as a question, I don’t mean this as like a critique.
    0:22:46 Why does Kamala Harris need you to know that she owns a gun?
    0:22:47 Why?
    0:22:49 Like what is going on?
    0:22:55 And if I want to answer that question, I would suggest it probably has something to
    0:22:57 do with the stories we tell about gender.
    0:23:01 It probably has something to do with stories we tell about race, though probably less so
    0:23:03 than gender.
    0:23:07 It probably has something to do with like Dirty Harry, probably has something to do
    0:23:11 with like cowboys and how we think about, you know, law and order. You break into my home,
    0:23:12 you’re going to get shot.
    0:23:14 Like, why are you saying it that way?
    0:23:15 What are you appealing to?
    0:23:19 I’m not saying, you know what I mean, like again, I’m asking this as a question of technique
    0:23:20 and form.
    0:23:24 And I guarantee you, like you start picking that apart.
    0:23:25 What is she trying to get to?
    0:23:26 You’re going to get the stories.
    0:23:31 You will get to questions of storytelling and tropes that she’s pulling on that are
    0:23:36 themselves derived and have been exemplified by other stories.
    0:23:39 But if it’s so obviously inauthentic, what’s the point?
    0:23:41 I don’t know that it’s obviously inauthentic.
    0:23:42 Yeah.
    0:23:46 I mean, I’m asking, I guess the way I said it kind of sounded like an assertion.
    0:23:51 I mean, it sounds inauthentic to us and maybe it is obviously inauthentic to a lot of people.
    0:23:53 I mean, I’m really cynical about this kind of shit.
    0:23:54 Right?
    0:23:55 For me, it’s really simple, right?
    0:23:59 You just had a bunch of political operative types do a bunch of focus groups and apparently,
    0:24:02 you know, these words, this language, these stories, this imagery polls well.
    0:24:03 Yes, but why?
    0:24:04 Yes, but why?
    0:24:05 But why, Sean?
    0:24:06 Why?
    0:24:07 Because I don’t know, middle-aged white guys in Pennsylvania are into it.
    0:24:08 I don’t know.
    0:24:09 But why are they into it?
    0:24:10 That’s what I’m saying.
    0:24:12 Like, you start like, what are they into?
    0:24:17 Like, if you keep asking the question, you will undoubtedly get to somebody’s commercial,
    0:24:18 somebody’s movie.
    0:24:19 Yeah.
    0:24:20 I would like to think somebody’s novel.
    0:24:22 The novel is probably underneath the movie.
    0:24:24 Somebody’s TV shows, something they saw.
    0:24:25 You know what I mean?
    0:24:27 That really ingrained this idea.
    0:24:28 Well, what’s your answer?
    0:24:29 What’s your answer to the why?
    0:24:30 I don’t know.
    0:24:31 I haven’t thought about it long enough.
    0:24:32 Come on, professor.
    0:24:33 I know you’ve been thinking about this.
    0:24:34 He just embarrassed me in front of the whole class.
    0:24:35 Now, you at least got to drop some knowledge.
    0:24:36 I know.
    0:24:37 If I was just in class, I would go around the room.
    0:24:39 Like, when we would talk about it and then, like, you know, we would go back and forth
    0:24:41 and we would arrive at some sort of answer.
    0:24:43 I don’t actually know.
    0:24:44 Sorry.
    0:24:45 It’s all right.
    0:25:01 Support for The Gray Area comes from Mint Mobile.
    0:25:04 Sometimes a deal is too good to be true.
    0:25:05 You know the feeling.
    0:25:09 You find that great deal on a rental car, only to realize the lock is broken and the
    0:25:11 windows won’t roll up.
    0:25:15 Well, Mint Mobile says with their deals, there are no catches.
    0:25:17 What you see is what you get.
    0:25:22 When you purchase a new three month plan with Mint Mobile, you’ll pay just $15 a month.
    0:25:23 That’s it.
    0:25:27 No hoops to jump through, no sneaky fine print that you can barely read.
    0:25:29 Just a great deal.
    0:25:33 All Mint Mobile plans come with high-speed data and unlimited talk and text delivered
    0:25:36 on the nation’s largest 5G network.
    0:25:40 You can even keep your phone, your contacts, and your number.
    0:25:44 You can get this new customer offer and a three month premium wireless plan for just
    0:25:49 $15 a month by going to mintmobile.com/grayarea.
    0:25:53 That’s mintmobile.com/grayarea.
    0:25:58 You can cut your wireless bill to $15 a month at mintmobile.com/grayarea.
    0:26:03 $45 upfront payment required equivalent to $15 per month.
    0:26:06 New customers on first three month plan only.
    0:26:11 Speed slower above 40 gigabytes on unlimited plan, additional taxes, fees, and restrictions
    0:26:12 apply.
    0:26:15 See Mint Mobile for details.
    0:26:17 Support for The Gray Area comes from Burrow.
    0:26:22 I’ve got three fun words for you that don’t really go together.
    0:26:23 Ultimate.
    0:26:24 Lounging.
    0:26:25 Experience.
    0:26:26 Ultimate.
    0:26:27 That’s intense.
    0:26:28 That’s outdoors.
    0:26:30 That’s facing the elements.
    0:26:31 Lounging.
    0:26:33 That’s relaxed.
    0:26:34 Chill.
    0:26:35 Put your feet up.
    0:26:36 An experience.
    0:26:40 Well, that word kind of goes with everything.
    0:26:44 But if you’re looking for a true ultimate lounging experience, you might want to check
    0:26:45 out Burrow.
    0:26:50 Burrow is a new kind of furniture company that takes design seriously and that extends
    0:26:51 to their outdoor collection.
    0:26:57 Burrow’s outdoor furniture is made for all seasons and built to withstand the elements,
    0:27:02 featuring rust-proof stainless steel hardware and quick-dry, stain-resistant foam cushions.
    0:27:07 Burrow prides themselves on their weather-ready teak, which is grade-A FSC-certified
    0:27:10 and offered at a great price point.
    0:27:13 And no matter what you choose, you always get free shipping on every order.
    0:27:19 You can check out Burrow’s seating options and all their incredible furniture at burrow.com/vox
    0:27:21 and get 15% off when you do.
    0:27:26 That’s burrow.com/vox for 15% off your Burrow purchase.
    0:27:30 Burrow.com/vox.
    0:27:33 Support for The Gray Area comes from Shopify.
    0:27:38 Every great business starts with a great idea or a kind of meh idea that’s so bizarre it
    0:27:43 becomes weirdly successful like that singing big mouth bass that’s been installed in a
    0:27:44 million garages.
    0:27:49 But to make your business successful over time, you need more than an idea.
    0:27:54 You need a partner who can help you achieve sustainable growth, a partner like Shopify.
    0:27:58 Shopify is an all in one digital commerce platform that may help your business sell better
    0:28:00 than ever before.
    0:28:04 No matter where your customers spend their time, scrolling through your digital feed
    0:28:09 or strolling past your physical, actual storefront, Shopify may help you convert those browsers
    0:28:12 into buyers and to sell more over time.
    0:28:17 There’s a reason companies like Allbirds turn to Shopify to sell more products to more customers.
    0:28:20 Businesses that sell more sell with Shopify.
    0:28:23 Want to upgrade your business and get the same checkout Allbirds uses?
    0:28:28 You can sign up for your $1 per month trial period at Shopify.com/vox.
    0:28:47 You can go to Shopify.com/vox to upgrade your selling today, Shopify.com/vox.
    0:28:50 What’s been the most surprising thing to you about the reaction?
    0:28:54 To this book so far, you knew you were going to take all kinds of shit because everyone
    0:28:58 who lunges into this discourse on Israel and Palestine takes shit.
    0:29:02 But has anything really surprised you?
    0:29:05 I’m surprised at the surprise.
    0:29:06 What do you mean?
    0:29:09 I can’t say. Going into the CBS interview, that was the first interview.
    0:29:10 Well, I guess there were a couple of taped ones.
    0:29:11 That was the first one.
    0:29:12 That was the first live one.
    0:29:13 Yeah, that was the first live one.
    0:29:16 Ta-Nehisi, I want to dive into the Israel-Palestine section of the book.
    0:29:18 It’s the largest section of the book.
    0:29:23 And I have to say, when I read the book, I imagine if I took your name out of it, took
    0:29:27 away the awards and the acclaim, took the cover off the book, the publishing house goes
    0:29:28 away.
    0:29:34 The content of that section would not be out of place in the backpack of an extremist.
    0:29:35 I was not surprised that it was raised.
    0:29:42 And I was not surprised by the aggression, tenacity, whatever you want to call it, with
    0:29:43 which it was raised.
    0:29:47 Or I should say, I knew that was going to happen eventually.
    0:29:48 I didn’t know it was going to happen there.
    0:29:51 So, I was surprised in the sense that, “Oh, it’s right now.”
    0:29:56 And it took me a minute to catch up with, “Oh, it actually really is right now.”
    0:29:59 But this is what it is.
    0:30:03 I mean, I’m surprised that people are like, “I can’t believe that happened.”
    0:30:04 It’s so funny, man.
    0:30:07 I’ll give you some.
    0:30:08 When we were…
    0:30:10 I’m about to embarrass some people.
    0:30:15 My great publicist, Greg Cooby, who is somewhere watching this right now, I’m about to embarrass him.
    0:30:16 In the green room.
    0:30:20 I said to him, “Who is this tremendous publicist that should get an award for all of this?”
    0:30:21 I said to him, “You know, he’s booking all of these shows.”
    0:30:23 I said, “Have they read the book?”
    0:30:25 Like he said, “Have they read the book?”
    0:30:26 “You sure?”
    0:30:27 And I told him.
    0:30:28 I told him, “They’ll read it.
    0:30:29 They’ll read it.”
    0:30:32 I’m like, “Okay.”
    0:30:33 Because this is going to…
    0:30:34 You know what I mean?
    0:30:41 I understand I am going to go into some arenas where you don’t usually say the state of Israel
    0:30:43 is practicing apartheid.
    0:30:48 That’s just not a thing that you usually hear people saying in places like that.
    0:30:51 And so, I am going to say that.
    0:30:54 And what’s going to come out of that, I have no idea.
    0:30:57 But I hope people understand that this is what’s happening.
    0:31:02 What is it that so particularly offends you about the existence of a Jewish state that
    0:31:07 is a Jewish safe place, and not any of the other states out there?
    0:31:09 There’s nothing that offends me about a Jewish state.
    0:31:13 I am offended by the idea of states built on ethnocracy, no matter where they are.
    0:31:17 I guess in my mind, I was like, “It’s no way that I say that.”
    0:31:19 And people say, “Well, that’s very interesting, Ta-Nehisi.
    0:31:21 What do you mean by that?”
    0:31:22 You know?
    0:31:25 And I knew that that would never, like that was not going to be the reaction.
    0:31:28 So, I was very clear on that.
    0:31:35 And then the interview went how it went, and I probably was a little surprised that people
    0:31:36 were surprised.
    0:31:40 I’m a little surprised at the fear around it, you know?
    0:31:41 Yeah.
    0:31:45 The deliberate choice to write about this, you know, the essay on Palestine, it’s the
    0:31:47 longest in the book.
    0:31:56 And you knew before that appearance on CBS that this is just an impossibly charged issue.
    0:31:58 Why wait into these waters?
    0:31:59 Why this conflict?
    0:32:00 Why not?
    0:32:02 I don’t think it’s impossibly charged.
    0:32:06 When I went over there, there are things that it’s actually hard to disentangle.
    0:32:12 Like, it’s really hard to understand, like, what is actually, you know, happening.
    0:32:13 I will give you an example.
    0:32:19 Like, I’ve written about this, to disentangle the force of race versus class on the lives
    0:32:22 of African Americans, and understand what is actually happening there.
    0:32:26 It’s actually quite difficult to see what is acting where, it doesn’t mean you can’t
    0:32:27 do it.
    0:32:28 You can, you know what I mean?
    0:32:30 And some, you know, great academics especially.
    0:32:33 You know, I had to do this for “The Case for Reparations,” you know, to really understand.
    0:32:34 That was hard.
    0:32:37 Like, people were, you know, running regression studies, looking at, I mean, it was actually
    0:32:41 quite, quite hard, you know what I mean, to understand that.
    0:32:52 This is so clear, like, it was so clear, and when I saw that, and maybe this is like naive,
    0:32:55 you know, like, maybe even, maybe you’re right, you know, maybe it is impossibly charged,
    0:33:03 but I was just like, oh, this is easy, like, not easy, like, easy to do, like, easy to
    0:33:06 write, but it’s like, the math is clear, like, there is, you know what I mean, like,
    0:33:06 this is so clearly what I, the word I used at the time when I, when I saw it was Jim
    0:33:12 Crow.
    0:33:13 Obviously, Jim Crow.
    0:33:18 You tell me, you got one set of roads for one group of people, another set of roads
    0:33:23 for another group of people, and the roads you have for the other group of people are
    0:33:25 impossibly longer.
    0:33:30 They take more time to get from point A to point B. Those roads have like checkpoints, and the
    0:33:34 checkpoints sometimes materialize, I don’t know, and this is all fact, like, whatever
    0:33:36 you think about it, like, maybe you think that’s the way it should be, but this is
    0:33:37 what it is, right?
    0:33:40 This, this is actually what it is, right?
    0:33:46 You’re telling me that one group of people has constant access to running water, and
    0:33:50 the other group of people don’t know when their water might be cut off.
    0:33:54 You’re telling me that, that other group of people, depending on where they live, if they’re
    0:34:01 in a particular area on the West Bank, it might be illegal for them even to collect rainwater.
    0:34:07 You’re telling me one group of people has access to a civil system of criminal justice
    0:34:11 so that when they get arrested, they know their rights, they’re told why they’re arrested,
    0:34:17 get a lawyer, et cetera, and you’re telling me the other group has no access to that, that they
    0:34:22 can be arrested, that no one needs to tell them why they’re being arrested?
    0:34:23 What is that?
    0:34:26 I’m glad we got here, you know, because- Like, what is that?
    0:34:35 I mean, that is just the uncontested thing of what it is, so to me, that’s okay.
    0:34:36 Yeah.
    0:34:39 I mean, your own show.
    0:34:44 It’s called The Gray Area for a reason, and I’m giving you black and white.
    0:34:45 Yeah.
    0:34:46 I love that.
    0:34:48 I mean, this is the shit, man.
    0:34:51 This is what we’re here for.
    0:35:01 It’s called that because I think life is messy and complicated, and the temptation to blot
    0:35:07 out complexity for the sake- Hold on now, hold on, Professor, just hold on.
    0:35:13 The tendency to blot out complexity for the sake of a more simple story is understandable,
    0:35:18 but I do think it can become dangerous in its own way, and I’m constantly attuned to
    0:35:19 that threat.
    0:35:23 I am attuned, actually, and I like that this is a reflex
    0:35:27 you challenge in the book, and you’re challenging here, because it really forced me to think
    0:35:30 about it as I was reading it, and I’m thinking about it now.
    0:35:34 This isn’t a debate show, and it’s not a certain CBS morning show.
    0:35:35 Right, right, right.
    0:35:37 I don’t give a shit about winning arguments or creating spectacle.
    0:35:42 I really want to understand what someone is thinking and what I can learn from them.
    0:35:45 But, Sean, it is complex.
    0:35:47 It’s just not complex in the way they say it is.
    0:35:49 Okay, so you’ve got to help me understand that.
    0:35:56 It is extremely complex, but not in the way they say. The complexity that they’re selling you
    0:35:57 is not the complexity.
    0:35:59 See, that’s what I want to iron out, right?
    0:36:02 So, when I was reading it in the book and listening to you when you compare Palestine
    0:36:07 to the Jim Crow South, my reaction while reading that is, “Yeah, these are both moral
    0:36:11 obscenities, but they’re different, and I do think it’s complicated.”
    0:36:12 Right.
    0:36:13 So tell me about that.
    0:36:14 Tell me about that.
    0:36:15 Why I think it’s complicated?
    0:36:16 Yeah.
    0:36:17 Why would you say it’s different?
    0:36:21 You know, like, first of all, do you think the Jim Crow South was uncomplicated?
    0:36:24 No, just complicated in a different way.
    0:36:25 Right?
    0:36:26 I mean, I can tell you why I think they’re different.
    0:36:27 Okay.
    0:36:28 Go ahead.
    0:36:32 I think it matters that many Palestinians still support the October 7th attacks.
    0:36:33 Right.
    0:36:34 Right.
    0:36:39 I think that black people in the Jim Crow South wanted to be treated as equal citizens
    0:36:40 in a fully democratic America.
    0:36:41 I think that matters.
    0:36:46 I don’t think it’s generally true that Palestinians want equal rights in a fully democratic Israel.
    0:36:50 And if they had that, they might vote to end its existence as a Jewish state.
    0:36:51 And you know what?
    0:36:57 If I was a Palestinian who was pulling my friends and my family out of the rubble, I’d
    0:36:58 probably vote the same way.
    0:37:05 I mean, personally, I hate the idea of a state based entirely in religious or ethnic identity,
    0:37:09 but I’m not Jewish and I don’t live in Israel, and I understand why this is a problem for
    0:37:10 them.
    0:37:11 Right?
    0:37:14 And I also think it matters that Jews are also indigenous to that land, have nowhere
    0:37:15 else to go.
    0:37:17 I think that complicates the picture in other ways.
    0:37:18 Right?
    0:37:19 That’s my feelings.
    0:37:23 Now you can go ahead and respond.
    0:37:27 So I am of the mind, and everybody does not have to agree with this, but I just want to
    0:37:29 clarify a real distinction.
    0:37:32 And then I want to go through the example you gave because I actually think it’s actually
    0:37:33 quite helpful.
    0:37:34 Yeah.
    0:37:39 I am of the mind that discrimination on the basis of race, ethnicity, religion is never
    0:37:41 acceptable.
    0:37:46 There is nothing in this world that will make separate and unequal right.
    0:37:49 And as far as I am concerned, and I will use this word, and we can debate this word if
    0:37:54 we need to, there is nothing that makes apartheid right, nothing.
    0:37:58 So that’s where the, like when we talk about, like that’s not complex for me.
    0:38:02 It’s like the death penalty is not really complex for me because you cannot guarantee
    0:38:05 to me that the state will not execute an innocent person.
    0:38:06 You just can’t.
    0:38:07 You can’t.
    0:38:10 I mean, I might not be for it even if you could, but among other reasons, like, so I’m
    0:38:15 against it period, like there aren’t exceptions to that.
    0:38:25 It’s hard as an African American for me to argue for exceptions for apartheid.
    0:38:26 And I will tell you why.
    0:38:33 See, the thing you have to do is not judge Jim Crow from right now or I would argue slavery
    0:38:34 from right now.
    0:38:37 You have to put yourself in the shoes of the people that were there in the debates at
    0:38:38 the time.
    0:38:42 And I assure you, they did not think it was simple.
    0:38:47 And this reasoning about complexity is old, actually. If you talked to Thomas Jefferson,
    0:38:48 right?
    0:38:52 He would have said, did say, you know what, I think this is a moral abomination.
    0:38:55 But we have the wolf by the ear.
    0:39:01 That’s how he described the practice of holding people and selling them for profit into slavery.
    0:39:03 What did he mean by we have the wolf by the ear?
    0:39:08 He means we need to let it go, but we dare not.
    0:39:12 He’s trying to get you to see, in his mind, the complexity of enslavement.
    0:39:19 If not for the civil war, if not for like a cataclysmic war that kills what, 800,000 Americans
    0:39:22 or something, certainly there’s no stroke of the pen abolition.
    0:39:23 Why?
    0:39:25 Because, they’d say, what are we going to do with them?
    0:39:26 Where are they going to go?
    0:39:27 You know what I mean?
    0:39:28 All of these, you know, sort of issues.
    0:39:33 In addition to this, in addition, what they would say is democracy is fit for a certain
    0:39:34 class of people.
    0:39:35 This is what they believe at the time.
    0:39:38 So you have to take it seriously, even if it sounds ridiculous to you right now.
    0:39:40 Democracy is fit for a certain class of people.
    0:39:42 These people have not been educated.
    0:39:46 They’re just a few generations out of the wilds of Africa where they worship some savage
    0:39:47 God.
    0:39:48 They’re barely Christian.
    0:39:50 Like this, these, this is the logic.
    0:39:54 If you moved into the 20th century, you’re getting closer to our lifetimes.
    0:39:57 People would have said, what about crime?
    0:39:59 Crime in these people’s neighborhoods, they would have cited the statistics and they would
    0:40:00 have been real.
    0:40:01 They would have been real.
    0:40:05 Crime in these Black quarters, these cities, is ridiculous.
    0:40:07 If we integrate, we will inherit that.
    0:40:09 We will now have to deal with that.
    0:40:14 So like to them segregation was also complicated.
    0:40:19 So I think in those instances, you have to just, like, this becomes right and wrong.
    0:40:25 You know, like, was crime not higher in Black neighborhoods? Yeah, it was, yeah, it was.
    0:40:29 Could they guarantee that if they freed all enslaved people and tried to, you know, bring them into
    0:40:31 American society, it would go well? Actually, it went remarkably smoothly.
    0:40:32 So that’s not a good example.
    0:40:36 It’s going to turn out those arguments were a cover for something else, as they might be
    0:40:38 here, by the way.
    0:40:43 But what I’m saying is in that time, it is not as if people were like, this is simple
    0:40:44 and clear.
    0:40:45 They were not.
    0:40:57 So I have to ask myself, do I believe that a demographic project, which is what Israel
    0:41:05 is and what they say out loud, does that goal accord with my humanistic values?
    0:41:10 Do I want my country supporting that?
    0:41:14 And I don’t think I do because the moment you say that, you must discriminate.
    0:41:15 You have to.
    0:41:16 Yeah.
    0:41:22 There is no world in which you have both a democracy and a Jewish state.
    0:41:26 You just can’t, and, again, I always have to be careful about this.
    0:41:27 It’s not the Jewishness of it.
    0:41:28 You understand?
    0:41:34 Like I believe this about ethnicity period across the board.
    0:41:35 I haven’t been to Palestine.
    0:41:38 Oh man, you should go.
    0:41:41 But I know it’s bad and I know what you saw there is wrong.
    0:41:46 And I don’t believe there is any such thing as a moral occupation because whatever the
    0:41:51 reasons for it, you cannot occupy a people without visiting cruelties upon them.
    0:41:53 It’s full stop, right?
    0:41:58 For me, the first question I go to, the main question, isn’t necessarily the badness
    0:42:01 of the situation, which is incontestable and egregious and obvious.
    0:42:06 It’s how the hell do we stop it?
    0:42:12 And for me, all these complications that I was mentioning earlier, that’s the stuff
    0:42:15 that has to be accounted for if there’s any hope of a way forward.
    0:42:19 But you’re not here to proffer some two-state solution or figure out a solution.
    0:42:23 And it goes back to that interview on CBS.
    0:42:28 There is no, I mean, I don’t actually have a solution, but I do, I do.
    0:42:36 We are sitting here asking ourselves why we don’t have a workable solution while we exclude
    0:42:38 one of the two significant parties.
    0:42:41 And I guess my politics would say the most significant party, because that’s just where
    0:42:46 I come from in terms of the oppressed, from the conversation.
    0:42:55 How can you decide what is going to be the solution when every night, when I cut on TV
    0:43:02 and I watch reports from the region, I can name only one person of Palestinian
    0:43:08 heritage who I regularly see articulate a solution or an idea.
    0:43:15 How do we get to a solution when our journals, our newspapers, our literature that dominates
    0:43:20 the conversation is not just devoid of Palestinian perspectives, but devoid of Palestinians
    0:43:21 themselves.
    0:43:27 We are not having a conversation about solutions because we’ve basically prevented a whole
    0:43:31 group of people from entering into the frame.
    0:43:33 And so it’s like, we’re kind of putting the cart before the horse.
    0:43:38 We’re frustrated that we don’t have a solution, but like, we’re not actually talking to somebody.
    0:43:39 You know what I mean?
    0:43:42 It’s like, you know, you go into the, sorry, I cook, so I have all of these like cooking
    0:43:43 metaphors.
    0:43:44 No, I love it.
    0:43:48 And you do your mac and cheese and, you know, it turns out terrible and you’re like, why
    0:43:49 did this turn out this way?
    0:43:50 Well, do you have a recipe?
    0:43:54 Like, do you actually, did you take the time to come up with, you know what I mean, the
    0:43:55 ingredients?
    0:43:56 Did you talk to anybody?
    0:43:59 Like, or did you just go and, you know, go pasta and milk and you know what I mean?
    0:44:00 And, you know, do you know anything about a roux?
    0:44:02 Do you know anything about that?
    0:44:03 You know what I mean?
    0:44:04 Do you know anything about, like, which cheese melts and which doesn’t?
    0:44:06 Have you had these conversations?
    0:44:11 This is on us, by the way, this is journalism’s great sin.
    0:44:14 And this is how, and I’m going to say something, like, you know, I’m about to say
    0:44:15 the extremist thing.
    0:44:17 It’s like what I call the CBS thing, right?
    0:44:19 This is our contribution to apartheid.
    0:44:23 Because we are the agents by which people are dehumanized.
    0:44:25 You know, and I want to make that very, very specific.
    0:44:30 When you exclude people from the conversation, when they don’t have a role in your journalism,
    0:44:34 when they don’t have a role in your film, when they don’t have a role in your TV, when
    0:44:40 they don’t have a role in your books, they cease to exist as people and become these
    0:44:46 kind of cartoon cutouts that other people make of them, and they become much more easy
    0:44:48 to kill.
    0:44:51 That’s on us.
    0:44:54 It’s extreme, but I believe it.
    0:44:55 Like I think it’s true.
    0:45:01 I don’t know, man, I do, I think our moral imagination needs to extend in both directions
    0:45:08 as far as possible, but I, the more I’ve listened to you and as I made my way through your book,
    0:45:10 I think I understand where you’re coming from.
    0:45:17 I understand writing this as a kind of corrective feeling like there was a lack of empathy for
    0:45:21 the Palestinian experience, because their story hasn’t been told enough, hasn’t been
    0:45:22 represented enough.
    0:45:23 I can understand that.
    0:45:24 I really can.
    0:45:30 And if I’m being honest, I mean, I think if I went there, like you, and saw the suffering
    0:45:35 firsthand, all of this would feel a whole lot less abstract to me, and it would hit
    0:45:36 differently.
    0:45:38 And I don’t know how that would change, how I think about it.
    0:45:41 So when are you going to go, Sean?
    0:45:43 I don’t know.
    0:45:44 You should go.
    0:45:45 I don’t know.
    0:45:46 I know it’s hard.
    0:45:50 And I look, I just, I’m putting you on the spot, but it was extremely hard.
    0:45:51 I’m going to fail this class, aren’t I?
    0:45:52 No, no, you’re not.
    0:45:53 No, no, no.
    0:45:54 Here’s, look, look, look.
    0:45:55 First of all, you are a journalist.
    0:45:56 That’s the first thing.
    0:45:57 Okay.
    0:45:58 That’s my first case towards you for going.
    0:46:02 The second case is this is being done in your name, man.
    0:46:03 And we’re going to pay for it.
    0:46:05 We’re going to pay for it one way or the other.
    0:46:07 We will pay for this.
    0:46:08 We will pay for this.
    0:46:13 I, God, I’m now, I think it’s your responsibility to go.
    0:46:14 I’m sorry.
    0:46:15 I really do believe that.
    0:46:22 I really, really do believe that because you are someone who is obviously curious, obviously
    0:46:23 want to, you know, know things.
    0:46:30 And the reason why I’m pushing you is because that kind of vague sense of injustice is exactly
    0:46:31 what I had.
    0:46:34 That is exactly how I felt, man.
    0:46:35 But can I push you a little bit on that?
    0:46:36 Sure.
    0:46:37 Go ahead.
    0:46:38 Here’s the thing.
    0:46:39 I have no doubt about that.
    0:46:48 But if I went to Israel and toured the villages that were plundered on October 7th, I’d feel
    0:46:50 the same kind of indignation and rage.
    0:46:51 You should though.
    0:46:52 So what do you do with that?
    0:46:54 But I don’t think those are contrary.
    0:46:55 Yeah.
    0:46:56 No, I don’t mean to say they’re contrary.
    0:47:02 I’m just saying I would still be left feeling the sense of hopelessness really at the, the
    0:47:06 tragedy of it all and the fact that it just seems to be.
    0:47:07 I think you would know more, though.
    0:47:09 I think, I think you would, no doubt about that.
    0:47:13 And I don’t think you can judge how you would feel until you go.
    0:47:15 And you know what?
    0:47:20 One of the reasons I haven’t opined on this issue very much is that I feel like I don’t
    0:47:21 know what the hell I’m talking about.
    0:47:23 That’s why you, that’s why you go.
    0:47:28 And I don’t want to be one of those assholes who opine on things that they don’t understand.
    0:47:33 But it doesn’t, it doesn’t, you know, obviate the responsibility for going.
    0:47:35 That’s why you go.
    0:47:36 That’s why you go, man.
    0:47:39 And I don’t know.
    0:47:43 I mean, I think you should. Now I’m being professor again, like, you just don’t know
    0:47:45 until you go through it.
    0:47:46 You don’t know what’s, what’s going to happen.
    0:47:53 I do want to speak to your instinct though that a horror at the desecration and destruction
    0:48:00 of human life on October 7th is somehow contrary or somehow stands even in conflict, I would
    0:48:04 argue, with the opposition to apartheid.
    0:48:07 And I’ve thought about this quite a bit, right?
    0:48:16 There is a long record of people who have causes that I would find sympathetic erupting
    0:48:19 in a kind of violence that I recoil from.
    0:48:20 Okay.
    0:48:22 I’m going to speak as I’ll just speak from the perspective of an African American and
    0:48:26 African American history and all my black people are about to cast me out for what I’m
    0:48:27 about to say.
    0:48:34 But I think it’s a good example, 1830s, Nat Turner is enslaved in Virginia, has no rights
    0:48:38 over his body, has no rights over his family; they have no rights over their bodies.
    0:48:40 Anything can be done to him at any moment.
    0:48:46 He has no control over himself and decides in that situation, he’s locked out of a political
    0:48:49 system, can’t vote his way out, can’t do anything.
    0:48:52 And it’s at that moment that the way out is violence.
    0:48:56 And the way out is not just violence, the way out is massacre.
    0:49:00 That is to say, not a violent rebellion in which we strategically target things, for
    0:49:06 instance, like John Brown targets the arsenal at Harpers Ferry, but actual slaughter, okay?
    0:49:13 That means every slave master, every slave mistress, every child must die, gathers together
    0:49:15 his army and they kill everybody they come across.
    0:49:23 They hack infants in the crib, kill women, men, etc.
    0:49:27 You know, I was raised with Nat Turner as a hero of resistance and it’s understandable
    0:49:30 why Nat Turner would be a hero of resistance.
    0:49:34 But the older you get, especially if you’re going to write, you’re going to write.
    0:49:35 You know what I’m saying?
    0:49:38 Not if you’re just going to sort of construct the mythology that all peoples construct.
    0:49:42 But if you’re going to write, you’re going to interrogate your own stories, you take
    0:49:51 a deeper look at that and you say, “Does the complete degradation of my life make it
    0:50:02 right for me to take the life of any member of that class that is responsible for that?”
    0:50:07 It never sat right with me, and even as a child it didn’t, you know?
    0:50:10 I heard you say your oppression won’t save you.
    0:50:11 It won’t.
    0:50:14 It won’t save you from these moral conflicts.
    0:50:21 It won’t save you from these like moral quandaries, but Sean, my feeling that that doesn’t sit
    0:50:24 right with me, my belief that somewhere on those plantations there was an enslaved black
    0:50:28 person that looked at that and said, “I can’t get with this.”
    0:50:32 It doesn’t make enslavement right.
    0:50:33 You understand?
    0:50:37 Recoiling in horror at that death does not somehow make it difficult to pass judgment
    0:50:40 on the system.
    0:50:43 I actually think they emanate from a, I hope they emanate.
    0:50:48 I think they emanate from a similar value, and that is the value of human life.
    0:50:54 I guess there’s this broader question about how much context do we need in order to pass
    0:50:57 moral judgment, and I’m not sure how answerable that question is.
    0:50:58 You’ve got to go.
    0:50:59 Because, what?
    0:51:00 Fair.
    0:51:01 Fair.
    0:51:02 Fair.
    0:51:03 Fair.
    0:51:05 This is what I thought.
    0:51:06 You sound like me.
    0:51:07 No, I mean…
    0:51:08 You sound like me.
    0:51:09 This is what I thought.
    0:51:13 Even for the trip, I was like, “Boy, this is going to be really complicated.”
    0:51:17 I thought the morality of it would be, and I think quite a bit of it is, and I want to say
    0:51:23 this, is there’s a reason why I began that chapter in Yad Vashem, and it is because the
    0:51:30 fact of existential violence and industrial genocide brought to the Jewish peoples of
    0:51:36 this world is a very, very real thing.
    0:51:43 It’s like, how do you confront that and reconcile that with Israel, because you want that group
    0:51:45 of people to be okay?
    0:51:49 You feel like maybe that group of people is entitled to certain things, I mean that in
    0:51:50 the best kind of way.
    0:51:54 They’re entitled to a kind of safety given what happened to them.
    0:51:59 You feel deep, deep sympathy, and so before I went, I was like, “Wow, this is going to
    0:52:04 be morally like dicey.”
    0:52:05 I think you should go.
    0:52:07 I’m not even saying you’re going to agree with me.
    0:52:13 I’m not saying you’re going to end up where I ended up, but I think you should go.
    0:52:23 Do you think both sides of this conflict can tell a story about it that makes them right
    0:52:24 and the other side wrong?
    0:52:29 And at this point, are there so many victims and perpetrators on both sides because the
    0:52:38 cycle of violence and retaliation stretches back so far that it’s a kind of, I’m searching
    0:52:40 for a word and I can’t find it.
    0:52:42 I don’t think it stretches back that far.
    0:52:43 It’s 1948.
    0:52:46 It’s not 900 years.
    0:52:47 I guess in historical time.
    0:52:50 I interviewed people that were alive for the book.
    0:52:53 I interviewed people that were very much alive in 1948.
    0:52:54 So I don’t even think it’s back that far.
    0:52:59 I think that we say things like that, no disrespect, but I think we say things like that to make
    0:53:00 it harder than it actually is.
    0:53:03 It’s a lifetime that is not even over yet.
    0:53:13 And what I would say is my opposition to apartheid, to segregation, to oppression does not emanate
    0:53:18 from a belief in the hyper morality of the oppressed or even the morality of the oppressed.
    0:53:23 See, the civil rights movement kind of fooled us with this because it was kind of a morality
    0:53:26 play and it was a very successful strategy.
    0:53:32 But whether Martin Luther King was nonviolent or not, segregation was wrong.
    0:53:37 Even when Malcolm X was yelling by any means necessary, like segregation was still wrong.
    0:53:39 It was still wrong.
    0:53:43 The system, so for me, it’s not even a matter of sides being right.
    0:53:49 The system that governs both sides is wrong.
    0:53:56 So for you personally, I remember once hearing you talk about the vulgarities of.
    0:54:01 Punditry, pundits are not in the truth seeking business.
    0:54:03 Pundits make pronouncements.
    0:54:06 That’s the whole stupid mindless game.
    0:54:07 But you’re not like that.
    0:54:08 You have never been like that.
    0:54:10 You’re not even on Twitter for God’s sake.
    0:54:12 Thank God.
    0:54:15 Are you on Twitter?
    0:54:16 I am.
    0:54:17 You gotta get off Twitter too, man.
    0:54:18 I know.
    0:54:20 Get off Twitter and you gotta take a plane.
    0:54:26 But one reason I retreated into podcasting is I don’t feel that pressure to pronounce
    0:54:31 in that way, and even doing it in a serious way for me felt futile, but I don’t have your
    0:54:33 stature and I don’t have your reach.
    0:54:34 So it’s different for you.
    0:54:35 I imagine.
    0:54:40 Do you think you can make a difference here or is that not even part of the calculus?
    0:54:43 Is it just I need to write what I saw period?
    0:54:49 I do need to write what I saw, even if what I saw is uncomfortable to say.
    0:54:52 I think this moment matters.
    0:54:56 I was talking to a buddy yesterday, a good friend, well, actually a colleague.
    0:55:02 Let me not overstate my relationship, but a very, very intelligent young writer and
    0:55:03 a sharp young writer.
    0:55:09 And we were actually sitting around the tables, a Muslim woman and another writer there.
    0:55:12 And we were all in sympathy in terms of our politics.
    0:55:16 And she’s kind of making a point that this thing that’s happening right now, this
    0:55:20 discourse, actually, it matters, it’s making a difference.
    0:55:27 And I was saying, I went out like I’m going to do some book tour and then I’m out of here,
    0:55:28 man.
    0:55:32 You know, I’m going back to my French studies, like I’m out, you know, and I’m not out because
    0:55:33 I’m scared to say what I want to say.
    0:55:35 I’m not out because of the heat.
    0:55:39 I am out because I just, it just feels unnatural.
    0:55:46 And part of why it feels unnatural is, A, I’m not Palestinian, but, B, it feels contrary
    0:55:51 to writing, which is always seeking, you know, always trying to learn, always trying to figure
    0:55:52 it out, always asking questions.
    0:55:57 And so like when you’re kind of making these pronouncements, as I admit, I am now.
    0:56:01 You wonder, am I actually betraying the craft?
    0:56:04 You know, should I have just written a book, put it out and, you know, done an Elena Ferrante
    0:56:09 or whatever, like, you know, like there’s always that voice in the back of your mind.
    0:56:20 When I was over there, man, what they said to me over and over again was just tell them
    0:56:23 what you saw.
    0:56:49 And this is probably a little impure, but I feel a debt to tell them what I saw.
    0:56:49 Support for The Gray Area comes from Blue Nile.
    0:56:57 Getting engaged can be a magical once in a lifetime experience, hopefully at least.
    0:57:02 But getting that engagement ring, magical isn’t exactly the right word.
    0:57:07 Too often customers end up dealing with shady shops selling overpriced and unethically sourced
    0:57:08 diamonds.
    0:57:10 Blue Nile is trying to fix that.
    0:57:15 Blue Nile offers engagement rings and other jewelry designed to make finding the perfect
    0:57:18 piece easy, stress-free and fun.
    0:57:23 On BlueNile.com, you can create your own brilliant piece for a price you won’t find at any traditional
    0:57:24 jeweler.
    0:57:28 And they’re committed to selling pieces that are ethically sourced and, thanks to their
    0:57:33 diamond price guarantee, you can rest assured that Blue Nile will match or beat a competitor’s
    0:57:35 price on any equivalent diamond.
    0:57:40 Plus, all of their jewelry comes backed by a no-questions-asked return policy.
    0:57:44 So if that ring isn’t absolutely perfect, they say you can return it for free and get
    0:57:46 a full refund.
    0:57:53 Right now you can get $50 off your purchase of $500 or more with Code Gray Area at BlueNile.com.
    0:58:00 That’s $50 off with Code Gray Area at BlueNile.com, BlueNile.com.
    0:58:05 Support for the show comes from Into the Mix, a Ben and Jerry’s podcast about joy and justice
    0:58:07 produced with Vox Creative.
    0:58:12 In Season 3 of this award-winning podcast, Into the Mix is covering stories on the ordinary
    0:58:16 people fighting for justice in their local communities.
    0:58:19 Starting with the fight against the workhouse, a penitentiary in St. Louis known for its
    0:58:25 abject conditions, mold, and pest infestations, and its embrace of the cash bail system.
    0:58:30 Host Ashley C. Ford interviews Inez Bordeaux, who spent a month in the workhouse when she
    0:58:33 couldn’t afford her $25,000 bail.
    0:58:38 Experiencing what I experienced and watching other women go through it and know that there
    0:58:49 were thousands before us, and there were thousands after us, who had experienced those same things.
    0:58:51 That’s where I was radicalized.
    0:58:55 Eventually her charge was vacated, but the experience changed her.
    0:58:58 They’re starting a campaign to close the workhouse, so are you interested?
    0:59:00 And I was like, hell yeah.
    0:59:01 Hell yeah, I’m interested.
    0:59:06 You can hear how she and other advocates fought to shut down the workhouse, and won on the
    0:59:09 first episode of the special three-part series Out Now.
    0:59:14 Subscribe to Into the Mix, a Ben and Jerry’s podcast.
    0:59:16 Support for the show comes from Oracle.
    0:59:21 AI might be the most important new computer technology ever.
    0:59:26 It’s storming every industry, and literally billions of dollars are being invested.
    0:59:27 So buckle up.
    0:59:31 The problem is that AI needs a lot of speed and processing power.
    0:59:35 So how do you compete without costs spiraling out of control?
    0:59:39 It’s time to upgrade to the next generation of the cloud, Oracle Cloud Infrastructure,
    0:59:40 or OCI.
    0:59:47 OCI is a single platform for your infrastructure, database, application development, and AI
    0:59:48 needs.
    0:59:53 OCI has four to eight times the bandwidth of other clouds, offers one consistent price
    0:59:57 instead of variable regional pricing, and of course nobody does data better than Oracle.
    1:00:02 So now you can train your AI models at twice the speed and less than half the cost of other
    1:00:03 clouds.
    1:00:09 If you want to do more and spend less, like Uber, 8×8, and Databricks Mosaic, take a free
    1:00:13 test drive of OCI at oracle.com/vox.
    1:00:15 That’s oracle.com/vox.
    1:00:17 Oracle.com/vox.
    1:00:39 There’s a line in your book that feels relevant that I’d like to read if you don’t mind.
    1:00:40 Sure.
    1:00:45 You say, “A belief in genius is a large part of what plagues us, and I have found that
    1:00:51 people widely praised for the power of intellect are as likely to illuminate as they are to
    1:00:52 confound.
    1:00:59 Genius may not help a writer whose job is, above all else, to clarify.”
    1:01:02 I thought a lot about that, and I thought about Orwell, who you quote in the opening
    1:01:06 pages of the book, and I heard you say that the opening letter of the book is a kind of
    1:01:12 homage to Orwell’s why I write.
    1:01:18 Lionel Trilling once wrote of Orwell, and we just did an episode on him, so it’s fresh
    1:01:19 in my mind.
    1:01:25 He said that if we ask what he stands for, what he was a figure of, it’s the virtue of
    1:01:29 not being a genius.
    1:01:34 It’s such a great line, and I come on the show every week, man, and I praise the virtues
    1:01:43 of doubt and uncertainty, and I believe in that, but refusing to describe things simply
    1:01:50 and clearly can become a kind of moral and intellectual crime.
    1:01:52 Orwell was right about that.
    1:01:58 You’re right about that too, and I still think sometimes things really are complicated
    1:02:03 and not so neat, and maybe the challenge of being a writer and really just a human being
    1:02:07 is being honest and wise enough to know the difference, and yeah, sometimes it is really,
    1:02:09 really hard, but you know what else?
    1:02:16 Sometimes withholding moral judgment can be its own kind of cowardice.
    1:02:21 Yeah, and I just want to take it back.
    1:02:25 When that day comes, when the Palestinians are back in the frame, when they are invited to
    1:02:30 tell their own stories, and they are invited to sit at the table, take their place
    1:02:31 at the table, however you want to put it,
    1:02:35 I have no doubt that what will come out of that will be quite complicated.
    1:02:38 South Africa is complicated.
    1:02:42 They defeated apartheid, but did they change the basic economic arrangements?
    1:02:47 My understanding is not as much as a lot of people would have wished, better than apartheid,
    1:02:48 but it’s not done.
    1:02:50 It is indeed quite complicated, right?
    1:02:56 The victory is indeed quite complicated, but the morality of apartheid is not.
    1:03:02 What is hard for me is, I’ve been on a couple shows now where I’ve had some debate about
    1:03:11 this with people, and they never challenge the fact of what’s going on.
    1:03:18 So when I say half the population is enshrined at the highest level of citizenship, and everyone
    1:03:24 else is something less, they don’t say, “Ta-Nehisi, that’s not true.”
    1:03:27 They say, “Yeah, and that’s not great.”
    1:03:31 When I go through that whole litany, and I don’t have to do it again, and I did earlier
    1:03:34 in the show, they say, “Yeah, and that’s not great.”
    1:03:40 And then, I don’t know, we just kind of get lost in this morass, I feel like.
    1:03:44 But perhaps this is just where I sit, man.
    1:03:49 When your parents grew up in Jim Crow, when they were born into Jim Crow, that is an immediate
    1:03:51 no go, immediate.
    1:03:54 I don’t know what comes after this, but that is wrong.
    1:03:55 That’s wrong.
    1:03:57 You know what I mean?
1:04:00 What is after that might be quite complicated and quite hard.
    1:04:05 But that is not the answer, at all.
    1:04:11 I’m sitting in a cave in the South Hebron Hills with a group of people, and they’re telling
    1:04:15 me about their fears of being evicted out of a cave, man.
    1:04:20 When I look at, “Hey, you know that’s complicated.”
    1:04:24 And I know for a while, it’s not.
    1:04:33 What to do is probably complicated, but if you begin from a basis of, “This is wrong.”
    1:04:40 And then the very difficult work of figuring it out, maybe you can proceed after that.
    1:04:46 Bringing this back to stories, the power and the danger, I suppose, is part of the problem.
1:04:54 Do you think that too many of us are just too deluded by convenient stories?
    1:04:55 Yes.
    1:04:57 Yes, I actually do think.
    1:04:58 I was thinking for a second.
    1:04:59 Yes, I do.
    1:05:03 Look, man, what does it say?
    1:05:05 And this is where I kind of went to.
    1:05:06 I thought about this.
    1:05:12 I didn’t get to spend enough time on this essay, but it’s like, you know, I go to Yad Vashem.
    1:05:14 Yad Vashem is harrowing.
    1:05:22 God, man, I walked in and there’s a brilliant art exhibit where they string together home
    1:05:28 movies during the time when the technology was relatively new in all the Jewish communities
    1:05:32 in Europe before the Holocaust.
    1:05:35 And it’s just like people going through there every day.
1:05:36 It’s not a special area, there’s no slogan.
    1:05:40 It’s just like people, and what you see is like the humanity, like there’s a deep, deep
    1:05:41 humanity.
    1:05:46 It is effective because it says this is what’s about to be snuffed out in the worst possible
    1:05:47 way.
    1:05:55 And you see it and then you go through it and they are experts, you know, really narrating
    1:06:01 the story of what happened and you sit with that horror.
    1:06:04 And I sat with that horror and then, you know, when I went there, it was, you know, by that
    1:06:10 point, I’d seen the occupation and everything and then I come back and I’m working on this
    1:06:17 essay and I find out that less than a mile from there, you know, there was a massacre
    1:06:21 perpetrated by the inheritors of that legacy.
    1:06:24 It’s like, how do you sit with that?
    1:06:29 And I called a Palestinian friend of mine and he said, you know, I’m not surprised they
    1:06:30 do this shit all the time.
    1:06:36 He said, they built a museum of tolerance on top of a Muslim graveyard.
    1:06:37 This is literally true.
1:06:42 Like, anybody who doubts it, just Google it: Jerusalem Museum of Tolerance Graveyard.
    1:06:43 They did it.
1:06:46 It was an organization from LA, by the way.
    1:06:50 And you think, what am I supposed to derive from that?
    1:06:51 Like how am I supposed to feel about that?
    1:06:59 Like how do I maintain my sympathy, how I felt in that moment, watching those worlds
    1:07:05 about to be destroyed with the fact of what the inheritors of that did?
    1:07:07 That is complicated.
    1:07:09 That is hard, I admit.
1:07:17 There isn’t a state, a nation, a movement, an institution of people that can exist without
    1:07:20 a story to justify itself.
    1:07:26 And to the extent that that’s true, maybe we’re just condemned to live with certain illusions,
    1:07:29 certain myths, certain blind spots.
    1:07:34 And maybe we can’t do otherwise because the truth is unbearable.
    1:07:36 We can do better than this, though.
    1:07:38 We can do better than this.
    1:07:39 Yeah.
    1:07:40 I think that’s right.
    1:07:41 Yeah.
1:07:46 You know, there was a poignant moment in the chapter on Senegal where you’re talking about
    1:07:50 your dad and, I don’t know, maybe you were a kid, I don’t remember, but he was telling
    1:07:53 you about a book he had just read on the 18th century.
    1:07:55 Maybe we don’t get back to Africa.
    1:07:56 Yeah.
1:07:59 It was about an 18th century rebellion in Guyana, and his sadness at how it ended.
    1:08:03 And it ended with the leaders of the revolt turning against each other and collaborating
    1:08:05 with the people who had enslaved them.
1:08:10 And the realization is that the story of some pure, uncorrupted people was just a
    1:08:14 myth, just a story that the people there were like the people everywhere else.
    1:08:20 And it’s a sobering thing to accept, but maybe accepting it is the beginning of some kind
    1:08:21 of wisdom, I hope.
    1:08:22 Yeah.
    1:08:23 I do.
    1:08:24 Maybe we don’t get back to Africa.
    1:08:25 Like, that hurts.
    1:08:32 Maybe like that utopia that we thought existed before we were brought over here and we’ve
    1:08:36 been trying to get back to, through the reconstruction of our history, through the reconstruction
    1:08:39 of our stories, our heroes, through reconstruction of our very names, right?
    1:08:42 Like my very name comes out of that.
1:08:44 Maybe it’s futile.
1:08:51 Like there is no glorious, unfettered utopia that we hail from that we need to
    1:08:52 restore.
    1:08:53 It’s a mess.
    1:08:56 It was always a mess.
    1:08:57 It’s going to be a mess.
    1:08:58 We were enslaved.
    1:09:02 We’re not like the heroes in some grand fable, you know what I mean?
    1:09:06 Where we were X, Y, Z, it was destroyed and now we will restore it.
    1:09:07 That’s not what it is.
    1:09:11 We’re just left with our own human frailty.
    1:09:14 We do not have the seed of divinity in us.
    1:09:17 We’re not special.
    1:09:23 To the extent that we are, it will be by what we do, not by who we are, and certainly not
    1:09:25 by what happened to us.
1:09:29 Because what do you take from it, right?
    1:09:31 Oh, you mean we just got enslaved?
    1:09:32 That’s it?
    1:09:33 That’s just what happened?
    1:09:36 Like some dude just sold me onto a ship.
    1:09:37 You know what I mean?
1:09:39 It didn’t mean anything? Like, that’s hard.
    1:09:41 And of course, that’s not it.
    1:09:42 You know what I mean?
    1:09:43 That’s not it.
    1:09:44 There’s a lot more, right?
    1:09:48 But the more is not about putting a crown on your head or gilding, you know, your history
    1:09:52 or your story.
1:09:56 We’re just Sisyphus and the rock, and either roll that motherfucker up the hill or get rolled
1:09:57 over by it.
    1:09:59 No, I think there’s progress, though, right?
    1:10:03 I think actually in that realization, there’s a kind of liberation.
    1:10:05 That’s sad at first.
    1:10:09 And then it’s like, you know what, actually, it’s kind of okay.
    1:10:10 It’s okay.
    1:10:11 I went through this.
1:10:12 I talk about this in Between the World and Me.
    1:10:19 I went through this in college, where you have to confront the fact that black people
    1:10:23 sold other black people into slavery.
    1:10:28 And that is hard to accept until you realize there was no such thing as black among those
    1:10:29 people.
    1:10:32 Like they didn’t, like these kind of frames that you’re putting on them, they didn’t actually
    1:10:33 have them.
1:10:37 You know, those weren’t ideas that had been developed yet, and you’re asking people to justify something
1:10:43 through frames they never had. There’s a kind of healing that can actually come out of that, you know?
    1:10:45 Yeah.
1:10:52 You’re probably asked all the time, surely by your students at Howard, for advice on how
    1:10:59 to be a writer and an intellectual in this world at this moment, what do you tell them
    1:11:00 besides stay the hell off Twitter?
    1:11:01 Yeah.
    1:11:03 Stay the hell off Twitter is a good one.
    1:11:07 When I was 18 years old and I came to Howard and I desperately wanted to be a writer,
    1:11:11 there was a poet by the name of Ethel Burt Miller, who was all the way up his office,
    1:11:14 all the way on the top floor of the library, and I would take my really bad poetry up
    1:11:19 to him and he would critique it and like I could never get anything through that was
    1:11:20 good, right?
    1:11:21 Or maybe like one out of 10.
    1:11:23 And it’s so frustrating.
    1:11:28 You know, you’re 18 man.
    1:11:29 You just need to live.
    1:11:31 You need to go join the Peace Corps or something.
    1:11:32 You need to live.
1:11:34 You don’t have a body of life experience.
    1:11:37 And what he was saying was you need to go walk the world.
1:11:40 Like you have to go out and see some things and experience some things so that you actually have
1:11:43 things to write about.
    1:11:48 And I think that’s so important for writers, period.
1:11:53 You know, there’s a critique in the book of those of us who just
    1:11:58 kind of sit in one place and read articles and read other people’s books and never go
    1:12:00 and walk the world for ourselves.
1:12:07 You’ll never have the experience of allowing these original sights, original at least to us, to filter
1:12:12 through our memory, you know, through our senses and our aesthetics, so that we can develop
1:12:14 our own words and our own language.
    1:12:16 I think writing is simple.
    1:12:22 You know, you write, read, revise, you know, walk.
    1:12:26 And that’s about it.
    1:12:28 Once again, the book is called The Message.
    1:12:32 Ta-Nehisi Coates, a privilege and a pleasure to have you on.
    1:12:33 Thank you for doing it.
    1:12:34 Thanks, Sean.
    1:12:34 Thank you so much.
    1:12:47 Alright, I hope you appreciated this episode.
    1:12:49 I definitely did.
    1:12:52 It was obviously a very difficult conversation.
    1:13:01 This is a topic I don’t feel like I understand very well, and I try not to weigh in on things
    1:13:04 I don’t understand very well.
    1:13:08 But it was important to talk about this, and I didn’t want to let not understanding
    1:13:15 it perfectly be a justification for ignoring it altogether.
    1:13:20 So I dove in, and I did my best, and I felt like I learned something.
    1:13:26 I appreciate what Ta-Nehisi is doing in this book, and I appreciate his openness and his
    1:13:28 honesty with me.
    1:13:32 And I don’t know.
    1:13:38 I’m going to think a little bit longer and harder about his suggestion to me that maybe
    1:13:43 I go over to Israel and Palestine and see what’s happening for myself.
    1:14:02 I can’t make any promises, but I am going to think about it.
    1:14:07 As always, we really want to know what you think of this episode.
1:14:12 Drop us a line at thegrayarea@vox.com, and once you’re finished with that, please do
    1:14:16 rate and review the pod, and subscribe to the show.
    1:14:20 This episode was produced by Beth Morrissey and Travis Larchek.
1:14:26 Today’s episode was engineered by Christian Ayala, fact-checked by Anouk Dussot, edited
1:14:31 by Jorge Just, and Alex Overington wrote our theme music.
    1:15:01 A special thanks to Chris Shirtliff, Matthew Heffron, and Rob Byers.
1:15:06 Support for this podcast comes from HubSpot. It’s an AI-powered customer platform that builds campaigns for you, tells you which
    1:15:11 leads are worth knowing, and makes writing blogs, creating videos, and posting on social
    1:15:12 a breeze.
    1:15:16 So now, it’s easier than ever to be a marketer.
    1:15:21 Get started at hubspot.com/marketers.
    1:15:23 Support for this podcast comes from Klaviyo.
    1:15:27 You know that feeling when your favorite brand really gets you?
    1:15:30 Deliver that feeling to your customers every time.
    1:15:36 Klaviyo turns your customer data into real-time connections across AI-powered email, SMS, and
    1:15:39 more, making every moment count.
    1:15:44 Over 100,000 brands trust Klaviyo’s unified data and marketing platform to build smarter
    1:15:51 digital relationships with their customers during Black Friday, Cyber Monday, and beyond.
    1:15:59 Make every moment count with Klaviyo.

    How important is complexity? At The Gray Area, we value understanding the details. We revel in complexity. But does our desire to understand that complexity sometimes over-complicate an issue?

    Journalist and bestselling author Ta-Nehisi Coates thinks so.

    This week on The Gray Area, Sean talks to Coates about his new book The Message, a collection of essays about storytelling, moral clarity, and the dangers of hiding behind complexity.

    The Message covers a lot of ground, but the largest section of the book — and the focus of this week’s conversation — is about Coates’s trip to the Middle East and the conflict between Israelis and Palestinians. Coates argues that the situation is not as complicated as most of us believe.

    Learn more about your ad choices. Visit podcastchoices.com/adchoices

  • Your mind needs chaos

    AI transcript
    0:00:02 Support for this show comes from Grammarly.
    0:00:05 88% of the work week is spent communicating,
    0:00:08 so it’s important to make sure your team does it well.
    0:00:10 Enter Grammarly.
    0:00:14 Grammarly’s AI can help teams communicate clearly the first time.
    0:00:17 It shows you how to make words resonate with your audience,
    0:00:19 helps with brainstorming,
    0:00:23 and lets you instantly create and revise drafts in just one click.
    0:00:29 Join the 70,000 teams and 30 million people who use Grammarly to move work forward.
    0:00:33 Go to grammarly.com/enterprise to learn more.
    0:00:37 Grammarly, Enterprise Ready AI.
    0:00:42 When you think of what makes us human,
    0:00:45 what marks us as living beings,
    0:00:49 would you say our powers of prediction?
    0:00:54 I probably wouldn’t have, at least until this conversation.
    0:00:57 It’s true that our ability to process information
    0:00:59 and use it to predict what’s going to happen
    0:01:04 helps us craft survival strategies and pursue our goals.
    0:01:09 But too much predictive power is usually the stuff of dystopian sci-fi stories,
    0:01:14 where being creative and unpredictable are the hallmarks of humanity,
    0:01:18 while the power of prediction is cast as the weapon of technology.
    0:01:22 And yet, one of the latest big theories in neuroscience
    0:01:27 says that we humans are fundamentally creatures of prediction,
    0:01:29 and creativity isn’t at odds with that,
    0:01:33 but that actually creativity and prediction can go hand in hand.
    0:01:36 That life itself is one big process
    0:01:40 of creatively optimizing prediction as a survival strategy
    0:01:44 in a universe that’s otherwise trending towards chaos.
    0:01:48 So, how should we think about the balance between
    0:01:51 what’s predictable and what surprises us?
    0:01:53 How can they work together?
    0:01:58 And what happens when you get too much of one and not enough of the other?
0:02:01 I’m Oshan Jarow, sitting in for Sean Illing,
    0:02:03 and this is the Gray Area.
    0:02:17 My guest today is Mark Miller.
    0:02:19 He’s a philosopher of cognition
    0:02:23 and a research fellow at the University of Toronto’s psychology department
    0:02:27 and Monash University’s Center for Consciousness and Contemplative Studies.
    0:02:31 He’s also the host of the Contemplative Science podcast.
    0:02:35 Miller’s work starts with this big idea known as predictive processing,
    0:02:41 which says that your brain and body are constantly taking in information,
    0:02:44 using it to build predictive models of the world,
    0:02:49 and that our conscious experience is shaped by these predictions.
    0:02:52 Predictive processing explains why we’re so quick to notice
    0:02:54 when something unusual happens,
    0:02:57 when a vinyl record playing a familiar song scratches,
    0:03:02 or you notice that the tree that’s always been outside your apartment is suddenly gone.
    0:03:07 These are prediction errors that your brain feeds on to update its model of the world.
    0:03:09 And according to Miller,
    0:03:13 using prediction errors to get better and better at doing this sort of thing
    0:03:15 is a pretty big deal.
    0:03:19 He’s argued that it could even be one of the keys to happiness.
    0:03:21 But when the brain gets too deep into prediction
    0:03:25 without a healthy dose of creativity and surprise,
    0:03:26 it can cause problems.
    0:03:30 Miller says that it’s healthy for us to be pushed to the edge
    0:03:32 of what he calls informational chaos,
    0:03:38 where our predictive models begin to break down and we encounter the unknown.
    0:03:42 So I invited Miller on the show to help unravel this paradox.
    0:03:46 What does it mean to be creatures that survive on prediction
    0:03:48 but need chaos in order to thrive?
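The basic loop described above, predict, compare against what arrives, update on the prediction error, can be sketched as a toy in Python. This is purely an illustration of the general idea from the intro, not a model from Miller's work; the signal, the learning rate, and all the numbers are invented for the example.

```python
# Toy sketch of predictive processing: an agent keeps a running estimate
# of a signal and changes that estimate only via prediction error.
# (Illustrative only; not drawn from any published model.)

def update(estimate: float, observation: float, learning_rate: float = 0.2) -> float:
    """Move the estimate toward the observation by a fraction of the error."""
    prediction_error = observation - estimate  # the "surprise" signal
    return estimate + learning_rate * prediction_error

# A world that is mostly stable (the tree outside your window)...
world = [10.0] * 20
# ...until something unpredicted happens (the tree is suddenly gone).
world += [0.0] * 20

estimate = 0.0
errors = []
for observation in world:
    errors.append(abs(observation - estimate))
    estimate = update(estimate, observation)

# Error spikes exactly where the world violates the model,
# then shrinks again as the model re-adapts.
print(errors[0], errors[19], errors[20])
```

The point of the sketch is the shape of the error curve: near zero while the model tracks the world, a spike at the change point, then decay again, which is the "prediction errors that your brain feeds on" dynamic in miniature.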
    0:03:54 Mark, welcome to the show.
    0:03:55 Hi.
    0:03:56 Thank you so much for being here.
    0:03:57 Thanks for having me.
    0:03:59 I mean, this is, I love Vox.
    0:04:03 I love what you guys do and I love the podcast and I was stoked to get an invite.
    0:04:03 So thanks.
    0:04:04 Wonderful.
    0:04:06 I’m excited to dig into your work.
    0:04:10 I think the foundational idea for a lot of your work
    0:04:14 is this big theory known as predictive processing.
    0:04:18 How would you describe that to someone without a neuroscience background?
    0:04:20 One thing you can do is you can say what it’s not.
    0:04:24 So, you know, for a few hundred years,
    0:04:28 we thought that perception works in one way and that is, you know,
    0:04:33 there’s light, there’s light and sound and things to feel out in the environment.
    0:04:33 Let’s take light.
    0:04:35 That’s a nice example.
    0:04:37 So light is out in the environment.
    0:04:38 It bounces off of objects.
    0:04:41 That light then hits our sensory apparatus like our eye.
0:05:47 And then the brain’s job, in the older model, is to take that information
    0:04:51 and then render it comprehensible as that information rolls up
    0:04:53 through the visual hierarchy.
    0:04:57 It’s getting more and more fleshed out until the end product is, you know,
0:04:59 the rich, world-revealing experience.
    0:05:00 The world.
    0:05:03 Exactly the world, the world that we have, right?
    0:05:03 And that’s fine.
    0:05:06 And I think most people think that that’s how it is and we feel pretty comfortable with that.
    0:05:11 But if this is right, then that’s wrong or really important parts of it are wrong.
    0:05:16 Rather than thinking of the brain as largely passive,
    0:05:20 like the brain is waiting around in that vision.
    0:05:24 It’s waiting for signals from the world and then it’s only working once it gets those signals.
    0:05:28 This framework takes that idea and literally flips it on its head.
    0:05:30 So rather than thinking about the brain waiting around for anything,
    0:05:34 no, no, no, what if we recast the brain as radically proactive?
    0:05:36 It’s not waiting around for anything.
    0:05:39 If this model is right, this framework is right.
    0:05:42 The brain is first and foremost a prediction engine.
    0:05:43 It’s an anticipatory engine.
    0:05:50 It’s using what it knows about the world and it’s seeking out to understand the world for itself
    0:05:56 so that it can create from the top down what it expects to be happening next.
    0:05:58 And then it only uses signals from the world.
    0:06:00 Those signals aren’t what you perceive.
    0:06:07 Those signals are now just used as tests to see how good your own top down modeling is.
0:06:11 So, if that didn’t make you feel a little bit funny, and sometimes when I say this it does,
0:06:16 then you didn’t quite catch it, because what it means is that you’re not seeing light from the world per se.
    0:06:20 You’re seeing your own best guess about what’s happening right now
    0:06:24 and the light from the world is just there to update your model.
0:06:28 So you are, in a way... I like how Anil Seth puts it, it’s a little bit provocative, but I like it.
0:06:32 Anil Seth, a neuroscientist of consciousness from Sussex University, says,
    0:06:37 “Then we might say something like perception is controlled hallucination.”
    0:06:40 It’s hallucination because it’s being generated from the top down,
    0:06:45 but it’s controlled hallucination because it’s not just that you’re having any experience,
    0:06:50 you’re hallucinating your brain’s best guess about what’s actually happening.
    0:06:52 So, of course, it’s controlled by real-world dynamics.
    0:06:56 So, just to try and understand how this actually works,
    0:06:59 right now I’m looking out my window and I see a particular scene.
    0:07:04 And naively, it seems to me like the light is coming in from the outside into my body,
    0:07:06 reaching my brain, and that’s what I’m seeing.
    0:07:10 What you’re telling me is actually what I’m seeing is the model being predicted by my brain.
    0:07:15 What happens, though, when the sensory stimuli, when the light actually does get passed through my body?
    0:07:19 Am I experiencing that at any point or when do we switch from experiencing our
    0:07:22 predictions of the world to raw sensory data?
    0:07:27 Right. Probably never. You don’t ever have access. Maybe Kant was right.
0:07:32 There’s just this noumenal realm where you just don’t have access to it.
    0:07:33 That’s just not what you’re built to do.
    0:07:36 And actually, you don’t need access to it.
    0:07:42 What you need is you need the driving signal from the world to be making sure that the models
    0:07:46 that you’re generating are elegant, sophisticated, tracking real-world dynamics
    0:07:50 in touch with real temporal stuff. That’s what you need most.
    0:07:52 This does get dizzying the more you think about it.
    0:07:53 Yeah, right.
    0:07:59 But it really is. This is a huge claim, right? My experience of the world is not a direct experience
    0:08:03 of objective reality. It is my brain’s best guess of the world outside of my skull.
    0:08:07 How early stage is predictive processing as a theory?
    0:08:13 Well, not that early. I don’t think it’s irresponsible to say that it’s the
    0:08:19 preeminent theory today in all sorts of communities, computational psychiatry,
    0:08:27 computational psychology, neuroscience. If it’s not the foremost theory, it’s adjacent.
    0:08:34 I guess it’s a mix. It’s younger than the other. It is the new kid on the block in a way,
    0:08:39 but it’s a very popular new kid and very exciting. That being said, of course,
    0:08:41 we’re not at the end of science.
    0:08:49 So you wrote a paper about how this predictive framework can explain a lot about what makes
    0:08:54 us humans happy, right, taking the predictive framework and turning it on these other big
    0:08:57 questions. So tell me about that. What is the predictive account of happiness?
    0:09:04 Yeah, gosh, that’s such a good question. Let me start by telling you what it’s not. For five or
0:09:10 six or seven years, I worked with people like Julian Kiverstein and Erik Rietveld and other
    0:09:14 really wonderful people. Were these neuroscientists or philosophers?
    0:09:23 Both. Philosophers of neuroscience and others on producing new models of various psychopathologies.
0:09:33 So we have work on addiction, depression, OCD, PTSD, dissociative disorders, anxiety.
    0:09:40 So there’s a big range of psychopathologies that people are applying this framework to better
    0:09:45 understand what is that pathology all about. So one of the things that we
    0:09:49 kept bumping into is that a huge number of these psychopathologies we’re looking at
    0:09:54 all had this one quality in them, which was like a kind of sticky bad belief network.
    0:10:00 So the system starts predicting something about itself or something about the world.
    0:10:04 When you say system, do you mean is this a human being?
    0:10:05 Yeah, sure. Yeah, right.
    0:10:07 Yeah, like a cognitive system?
    0:10:08 Yeah. Yeah, that’s right.
    0:10:12 Yeah, the human system, right. So the system that makes us up.
    0:10:16 So the system starts predicting for one reason or another that the world is some way.
    0:10:24 And then the trouble looks like when that prediction becomes strong enough and divergent
    0:10:30 enough from the way things actually are. So we call it has a sticky quality to it.
    0:10:36 Just think about depression. So you’ve installed the belief for whatever reason
    0:10:41 that you just can’t fit with the world that either it’s because you are not good enough
    0:10:45 or the world isn’t good enough. But for some reason, you can’t resolve this difference
    0:10:49 between the way that you want the world to be and the way the world actually is,
    0:10:51 either because of something on your side or something in the world side.
    0:10:56 And if you get that belief installed, one thing that marks depression
    0:11:00 is that that belief persists even if the conditions were to change,
    0:11:07 right? Even if you were to change the situation entirely, there’s a sticky quality to these
    0:11:14 pathologies. Maybe even a better one is PTSD. PTSD in a war zone, in a way, can be really
    0:11:20 adaptive to wake up often, to wake up ready for combat when you’re in a highly volatile state.
    0:11:26 That’s not a completely pathological state to be in. But when you shift from a really scary,
    0:11:34 uncertain situation like war to a peacetime experience and the system can’t let go of
    0:11:40 the structure that’s been embedded in it, then we start calling it pathological. And the sticky
    0:11:45 quality is the thing that’s really the problem there, is that there’s one sort of way of believing
    0:11:50 or predicting the world that won’t budge even though you get better evidence.
    0:11:54 So you’re saying that when we ask about happiness, we’re going to start by pointing
    0:11:59 to what it isn’t. That’s right. And you get problems that arise when the predicted model
    0:12:05 of the world that our brains and bodies are generating diverges from the world itself
    0:12:10 and sticks to its model as opposed to updating with the world. Good. You got it. You got it, right?
    0:12:15 So a divergent belief, a bad divergent belief, I mean, a divergent belief that causes harm,
    0:12:20 causes suffering, that then gets stuck, it’s resistant to change. And indeed,
    0:12:25 even sometimes looks like it protects itself. So I’ll give you an example. They did this great
    0:12:31 study on depression where they had people who were suffering major depression who
    0:12:36 self-reported being depressed all the time. So how often are you depressed? And they said,
    0:12:41 “I’m depressed all the time. I wake up depressed. I’m depressed all day. I go to sleep depressed.
    0:12:47 I’m always depressed.” Then they gave them a beeper and they beeped them randomly and had them
    0:12:51 write down what mood they were in, what they were experiencing, what they were thinking about
    0:12:58 at the time of the beeper. And what they found was that something like 9% to 11% of the time,
    0:13:03 they were feeling depressed. And the rest of the time, they were either in neutral or positive
    0:13:09 affective states. So what’s happening there? Because when you ask them what’s your experience
    0:13:15 like, it’s not like they were lying. It’s not like they were trying to deceive the investigator.
    0:13:21 What’s likely happening is they just don’t notice all of the other experiences because
    0:13:26 it doesn’t conform to the model they have of themselves in the world. The model here,
    0:13:32 the prediction is so strong, it’s drowning out the signal that should be helping it update.
    0:13:39 And that can happen for a number of reasons. So let me ask you then about swinging back to the
    0:13:43 positive dimension, happiness in particular. That’s a picture of depression and psychopathology
    0:13:49 and mental illness. So what does this predictive framework say about the feeling of happiness
    0:13:56 itself? Well, I’m going to say two things. There’s a difference between momentary, subjective happiness
0:14:02 and well-being. Eudaimonic well-being, like having a good life.
    0:14:06 Is that Aristotle’s thing? It is. Yeah, you’re right. Yeah, exactly. The ancients were on to it.
    0:14:11 And the ancients were on to also, you need to have, so just in case anybody doesn’t know what
    0:14:17 these are, the momentary, subjective well-being is like hedonic well-being. That’s just the feeling
    0:14:24 good stuff. Is that like pleasure? Yeah, right, right, exactly. And the overall well-being doesn’t
    0:14:28 look like it’s exactly identical with that because to have a really rich, meaningful,
    0:14:38 good life may mean you’re in pain quite a lot. Momentary, subjective well-being is a reflection,
    0:14:47 at least in part, of predicting better than expected. So we have this idea that valence,
    0:14:54 valence is that good or bad feeling that comes as part of your embodied system evaluating.
    0:15:00 It’s telling you, how’s it going? So when you feel good, that’s your body, and we’ve known this for
    0:15:04 a long time, that’s your body and nervous system and brain telling you, I’ve got it. Whatever’s
    0:15:08 happening right now, I’m on top of it. I’m predicting it for us, I’m predicting it well,
    0:15:14 I’m managing uncertainty really well. And when you feel bad, that’s an indicator. I don’t understand
    0:15:18 something here. When you feel good, you want to engage a little bit more with that. That keeps
    0:15:24 us doing things where we’re succeeding. When we feel bad, usually we pull away or we task switch
    0:15:29 because that’s an indicator that maybe something is a little suboptimal. In predictive parlance,
    0:15:34 we think it has to do with prediction. So we feel good when we’re predicting better than expected,
    0:15:39 we feel bad when we’re predicting worse than expected, and we use those good feelings or
    0:15:46 those bad feelings to hone how we’re predicting our environments. So that feeling of pleasure
    0:15:52 or valence is a signal that we’re on a good track. But at the same time, you mentioned this isn’t
    0:15:57 just about maximizing pleasure, there’s more to well-being. And you actually used substance
    0:16:03 addiction as a really nice example of showing why just maximizing these pleasure in these loops is
    0:16:08 not enough, it’s too narrow. So what does addiction show us about why pleasure alone is not enough
    0:16:15 to talk about happiness here? So if your brain is an optimal engine, optimal predictive engine,
    0:16:22 how is it that we keep finding ourselves in all of these suboptimal cul-de-sacs like addiction,
    0:16:28 depression, anxiety, because those don’t seem very optimal. And addiction is such a good example,
    0:16:34 it’s a good test case to see how that happens. In the case of opioids, for instance, the opioid
    0:16:44 signals to the brain directly that you have predicted better than expected sort of over
    0:16:50 all of your cares and concerns. The opioid signals the brain directly that whatever just happened,
    0:17:00 whatever behavioral package, whatever context was just on tap, you have just found an amazing
    0:17:04 opportunity, better than anything you’ve ever found before, wildly unexpected reductions in
    0:17:09 uncertainty. And that cashes out as that burst of pleasure, the pleasure is what’s signaling that
    0:17:15 to me. Massive pleasure, massive pleasure. The reason heroin feels as good as it does
    0:17:21 is because it’s signaling to the brain directly. You’ve got to remember this framework really
    0:17:26 exposes this. The predictive system, like your brain and nervous system, they don’t have access
    0:17:31 to the outside world per se. All they have are the signals at the edge that they’re making predictions
    0:17:37 over. So you feed it a signal using an opioid, you feed it a signal that just says, well, whatever
    0:17:43 just happened, you just hit the jackpot. And for a system like you and me and everyone else,
    0:17:49 that is basically an uncertainty managing system. It’s not surprising that people do heroin. It’s
    0:17:55 surprising not everybody does heroin. We have evolved to manage uncertainty. This chemical
    0:18:01 signals to us that uncertainty has been completely managed. And so of course, the drug seeking and
    0:18:08 taking behaviors that produced that signal are the ones that the system then puts the volume up on.
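The error-dynamics story here can be caricatured in a few lines of Python. This is a toy sketch, not a model from the guest's work: the function names, the reinforcement rule, and all the numbers are illustrative assumptions. Valence is positive when prediction error falls more than expected, and whatever behavior preceded a big positive surprise gets its "volume" turned up, which is exactly how a drug-like signal that fakes a jackpot can come to dominate.

```python
# Toy sketch of the "better than expected" story (illustrative only):
# valence = expected error minus actual error, and behaviors that
# precede a big positive surprise get their "volume" turned up.

def valence(expected_error: float, actual_error: float) -> float:
    """Positive when we predict better than expected, negative when worse."""
    return expected_error - actual_error

# Preference weights over behaviors; the system amplifies whatever
# produced a better-than-expected signal (a crude reinforcement rule).
weights = {"practice_skill": 1.0, "take_opioid": 1.0}

def reinforce(behavior: str, v: float, rate: float = 0.1) -> None:
    weights[behavior] += rate * v

# Ordinary success: a small genuine improvement, a small good feeling.
reinforce("practice_skill", valence(expected_error=0.5, actual_error=0.4))

# The opioid case: the chemical injects a jackpot-sized "improvement"
# that never actually happened out in the world.
reinforce("take_opioid", valence(expected_error=0.5, actual_error=-9.5))

# The spoofed behavior now dominates the system's priorities.
assert weights["take_opioid"] > weights["practice_skill"]
```

The point of the cartoon is only that the system never sees the world directly, just the error signal, so a signal that lies about the error is indistinguishable from a genuine win.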
    0:18:24 What is AI actually for? Everybody is talking about AI. Wall Street is obsessed with AI. AI will
    0:18:30 save us. AI will kill us. It’s just AI, AI, AI, everywhere you turn. But what does any of that
    0:18:35 actually mean, like in your life right now? That’s what we’re currently exploring on the
    0:18:41 Vergecast. We’re looking for places and products where AI might actually matter in your day-to-day
    0:18:46 life. And I don’t just mean for making your emails less mean, though I guess that’s good too.
    0:18:50 Lots of big new ideas about AI this month on The Vergecast, wherever you get podcasts.
    0:19:00 You’ve written a paper about horror movies and predictability. Can you tell me how you got
    0:19:06 started on that research and what you found there? The paper is called Surfing Uncertainty with Screams.
    0:19:13 It was done with some excellent people. And there were a few steps up to it, starting from the idea
    0:19:19 that we feel good not by getting rid of all error, not by vanquishing uncertainty,
    0:19:26 but that we feel good when we have the right kinds of uncertainty to reduce. We start there,
    0:19:34 and then we moved into developing a model of play. And they invited me onto that paper
    0:19:44 to think about playfulness. And play there was showcased as exciting and alluring and super fun
    0:19:51 because play so often creates these at-edge experiences. So we tie one leg up, we blindfold
    0:19:56 ourselves, we do everything we can to create a bunch of uncertainty that we then resolve. And
    0:20:01 that’s sort of the nature of lots of what we do in terms of play. So if you’ve already got sort of
    0:20:06 risky play on tap, then it’s sort of a hop, skip, and a jump to think about really risky things,
    0:20:12 like potentially going to horror theme parks or going to horror movies. And so we started digging
    0:20:18 there. But then when we’re investigating that, lo and behold, a number of other benefits started
    0:20:24 to be exposed. There are all sorts of bits of life that are really critical for us to understand,
    0:20:28 but that we get no exposure to because of the kind of cultures that we live in, like death
    0:20:34 or pain. You’re like, why do you, why do you rubberneck when you drive past a car accident?
    0:20:39 Even if you’re the best person in the world, why do you look? Why do you really look? Why
    0:20:43 when your friend comes to you and says, my partner died, no matter how compassionate and
    0:20:49 skillful of a person you are, you want to say, how, how exactly, how exactly did they die?
    0:20:53 Like before you even say, I’m so sorry, you’re like, wait, wait, wait, how old were they?
    0:20:57 How old were they and how did it happen? And we might feel ashamed that we have those little
    0:21:01 thoughts, but that’s just the generative model doing what it’s doing. It’s trying to figure out
    0:21:05 what are the, what are the variables in the world that I need to know about so that I’m predicting
    0:21:10 well moment to moment to moment. And actually horror movies turn out to be a treasure trove
    0:21:14 of this kind of information. We can see what is it like if I get chased? What is it like if
    0:21:19 somebody ended up in my house? What is it like if I was under extreme duress? That’s all model
    0:21:24 updating stuff. Is that the idea with horror movies? Is it just that exposes me to a form
    0:21:29 of uncertainty that ultimately helps me become a better predicting creature? That’s right,
    0:19:38 exactly. So horror is like the smaller-step, you know, cousin of those sorts of more extreme cases.
    0:21:46 Got it. So what horror does is it produces a safe kind of uncertainty for us to get involved in.
    0:21:51 It’s certain uncertainty in a way. It’s not volatility. It’s not actually being chased by
    0:21:55 somebody with a chainsaw. You get to go to a place where you know you’re safe, where most
    0:22:00 of you know that you’re safe, and you can still flirt with all of these sort of uncertainty
    0:22:05 generating and uncertainty minimizing dynamics, which we find thrilling because the evolutionary
    0:22:10 system sort of like turns on. It acts as if you’re being chased and then the rest of the system
    0:22:18 goes, “Hey, wait, we’re in the theater. It’s all good.” Right. So far, we’ve told this story
    0:22:22 that getting better and better at prediction produces these feelings of
    0:22:28 happiness, coupled with exposing ourselves to the right kind of uncertainty that can broaden the
    0:22:33 scope of our predictive powers. This conversation we’re having today, it’s part of a series we’re
    0:22:39 doing on creativity. And I think at this point, we’ve probably set up enough context for me to just
    0:22:45 ask you directly, how does creativity fit into this story? I think a starting point for thinking
    0:22:55 about creativity using this model is to start by maybe showing the puzzle. So we ran into the
    0:23:02 same puzzle thinking about horror. So why would a predictive system that looks like it’s trying to
    0:23:08 reduce uncertainty be attracted to situations and indeed make those situations where it’s bumping
    0:23:15 into uncertainty? Why do we build roller coasters? Why do we go to horror movies? When I give this
    0:23:20 lecture, or similar lectures in different spaces, and I ask people, raise your hand if
    0:23:30 you would want to be one of the first people to colonize Mars, which is an insane thing to want
    0:23:35 to do. I’m not raising my hand here. No, it’s massively uncertain. It’s like the most uncertain
    0:23:40 thing you could positively do. I have never asked that question and had no one put up their hand.
    0:23:45 It’s always 5, 6, 10 people put up their hands and you push them and they’re like, yeah, given the
    0:23:51 opportunity, I think I’d really take that chance. So there’s a puzzle there or there’s a seeming
    0:23:57 puzzle. Why would a system that looks like it’s trying to reduce uncertainty actually not only
    0:24:01 be attracted to uncertainty, but systematically create uncertainty in all of these different
    0:24:08 situations? And part of the answer I think we’ve exposed in these papers is that too much certainty
    0:24:13 is a problem for us, especially when that certainty drifts from the real world dynamics.
    0:24:20 So in order to protect our prediction engine, our brain and nervous system, from getting into
    0:24:24 what we’ve called the bad bootstrap, that is from getting very, very certain about something that’s
    0:24:29 wrong, because that’s really dangerous for the kind of animal that we are. It’s really dangerous.
    0:24:37 We are built to get it right. So in that kind of world and for that kind of system, it really
    0:24:45 behooves us to occasionally inject ourselves with enough uncertainty, with enough like humility,
    0:24:52 intellectual humility in a way, like be uncertain about your model enough that you can check to see
    0:24:57 whether or not you’ve been stuck in one of these bad bootstraps. And I think if you’re with me to
    0:25:04 there, then we have a wonderful first principles approach to thinking about the benefit of creativity
    0:25:10 and art, especially provocative art, especially art that like calls you to rethink who you are and how
    0:25:17 it is. Because as far as we’ve seen, and you know, the research just keeps pointing in this direction,
    0:25:24 anything that gets you out of your ordinary mode of interacting with the world so that you can check
    0:25:28 to see how good it is or how poor it is, is going to be a benefit for us. It’s going to protect us
    0:25:34 from those bad siloed opportunities. And I think art does that, right? You can go somewhere, see
    0:25:40 something grand, see something beautiful, see something ugly and horrible. And if you let yourself
    0:25:47 be impressed by it, it can be an opportunity for you to be jostled out of your ordinary way of
    0:25:51 seeing the world, which would let the system check to see whether or not it’s running optimal models
    0:26:00 or not. So it sounds like you’re likening creativity to this injection of the right kind of uncertainty
    0:26:05 into our experience of the world. And it’s really interesting. In the paper on horror movies,
    0:26:09 actually, you used a term that I think captures a lot of this. It’s a thread that seems to run
    0:26:14 through everything so far, the art, the creativity, the horror movies, meditation and psychedelics
    0:26:20 we’ll get to. You wrote that the brain evolved to seek out the edge of informational chaos,
    0:26:23 which is a place where our predictive models begin to break down.
    0:26:29 And in those uncertain zones, we actually have much to learn. It’s a very rich learning environment.
    0:26:34 And so it sounds to me like this edge of chaos actually explains at least one perspective on
    0:26:40 why art, why creativity, why play, why all these things benefit us, because that edge is a really
    0:26:45 healthy place to be. So I wanted to ask you about this framing of the edge of informational chaos
    0:26:50 and why that’s a place that our brains would want or benefit from.
    0:26:58 You already say it so beautifully. Where are we going to learn the most if you are a learning
    0:27:05 system? And this is amazing. We have right from the lab, we see that animals and us,
    0:27:11 we get rewarded not only when we get fed and watered and sexed, we get rewarded when we get
    0:27:16 better information. Isn’t that amazing to acknowledge? Like if you get better information,
    0:27:21 my system is treating it like I’ve been fed. That’s how important good information is for us.
    0:27:27 And in fact, in lots of situations, it’s more rewarding for us than the food itself. Because
    0:27:33 one bit of food is one thing, information about how to get food over time, that could be much,
    0:27:40 much more important, right? So where do we learn? Where do we learn the most if really what matters
    0:27:45 is that we’re learning? Well, we don’t learn where our predictive models are so refined
    0:27:49 that everything is just being done by rote. We’re definitely not learning much there.
    0:27:57 And we’re not learning the most way out in deep volatility, unexpected uncertainty environments.
    0:28:00 That’s like where you not only do you not know what’s going on, but you don’t know how to get to
    0:28:05 knowing what’s going on. That’s why we have culture shock. If we move somewhere else,
    0:28:09 sometimes some people can have these really disorienting, even
    0:28:14 hallucination-engendering experiences. Because not only do you not get it, you don’t know how to get it.
    0:28:17 You don’t know how to get to getting it. You’re not only
    0:28:20 uncertain about this, you’re uncertain about yourself trying to get a hold of this.
    0:28:26 That’s no good for us either. So where do we learn the most? We learn it at this Goldilocks zone,
    0:28:33 which is that healthy boundary between order and chaos, between what’s knowable
    0:28:37 and leverageable and that thing which is not known. And you said it so beautifully,
    0:28:43 right at that edge is where our predictive models necessarily break down. It is by its very nature
    0:28:49 the place that the model breaks down. And the hope there is that in breaking down,
    0:28:56 new, better models are possible. Every chance you get to be at that edge is a chance to be learning,
    0:29:02 breaking and making better models. And I love the research agenda that’s looking at all the
    0:29:06 benefit and all the ways that we can find that edge and leverage all the good stuff at that edge,
    0:29:12 including horror movies and provocative art. Well, this is really dangerous territory because
    0:29:19 it sounds to me like what you’re saying from the predictive perspective is when I settle in to watch
    0:29:24 my Netflix series that is perfectly predictable, where I know the template, I know the plot,
    0:29:28 how it’s going to unfold, but I just enjoy watching it kind of fill in the lines anyway.
    0:29:34 I’m not getting that uncertainty, whereas when I watch a really strange indie movie where things
    0:29:38 are happening that I don’t know why they’re happening, I can’t follow the plot, that I’m
    0:29:42 getting uncertainty out of that that’s going to benefit my predictive system. Is that kind of
    0:29:48 the case? Well, if you can’t catch the plot, I don’t know how much benefit there is because that
    0:29:52 sounds to me like it’s a little bit too far outside of your spectrum. Like if all you know is punk
    0:29:56 music and somebody takes you to a classical concert, there might not be a bunch of useful
    0:30:02 uncertainty here. That might just be aggravating uncertainty. I just don’t know what to do here.
    0:30:06 So that’s probably not going to be all that important for your system.
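The Goldilocks-zone idea from a few moments ago, that we learn little by rote, little in deep volatility, and most at the boundary, can be drawn as a crude inverted-U. This is a cartoon I am supplying for illustration, not a model from the research: the function and its inputs are made-up assumptions, with "novelty" standing in for how far a situation sits from what your model already handles.

```python
# Cartoon of the "Goldilocks zone": expected learning progress as a
# function of how novel/uncertain a situation is. Pure rote (novelty 0)
# teaches nothing; deep volatility (novelty 1) offers no foothold.
# The product below is just an illustrative inverted-U, not a fitted model.

def learning_progress(novelty: float) -> float:
    room_to_learn = novelty           # nothing new means nothing to learn
    graspability = 1.0 - novelty      # total chaos means no way in
    return room_to_learn * graspability

# Scan novelty levels from full order (0.0) to full chaos (1.0).
samples = [round(n * 0.1, 1) for n in range(11)]
best = max(samples, key=learning_progress)
print(best)  # the peak sits at the edge, between order and chaos
```

On this cartoon, the classical concert for the punk fan sits out near novelty 1.0 (aggravating, not instructive), the rewatched sitcom near 0.0, and Dostoevsky for the science-fiction reader somewhere near the middle.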
    0:30:13 What you would want is to be at your edge. So if you love reading and you’re into
    0:30:20 science fiction or something, and then you get a chance to get your hands on Dostoevsky,
    0:34:25 there might be something there. You know how to read, you know how to engage with literature.
    0:30:29 There’s an edge here that you don’t really understand. Pushing that edge is going to be
    0:30:34 valuable because it’s going to expose you to different species of information that might
    0:30:40 have the bang-on effect of improving your grip in lots of different scenes. But why is it
    0:30:46 that we’re attracted to really regular things? If what we’ve been saying here is I’m especially
    0:30:52 charged to find my edge and hang at my edge and where I’m improving my predictions, that feels
    0:30:57 super good. Why is it that I like, you know, sometimes we find ourselves just rewatching
    0:31:04 the same show over and over and over again? One of the answers looks like the degree to which
    0:31:11 you expect everything else in your life to be highly, highly uncertain is the degree to which
    0:31:17 doing something that’s really, really regular feels to the system as if it’s doing better than
    0:31:26 expected at managing uncertainty. So watching Friends for the 17th time can feel very rewarding
    0:31:32 insofar as you have expectations that everything other than watching Friends tonight
    0:31:38 is volatility city. My essay isn’t working right. My editing of this thing isn’t working right. I
    0:31:44 have this work coming up that I don’t know what to do about. My relationship is tanked. If you see
    0:31:51 uncertainty dynamics going all uphill from where you are, then just doing something
    0:31:56 super regular actually gets registered by the system as if you’re reducing error better than
    0:32:02 expected because the temporary reprieve of Friends is reducing error better than expected
    0:32:08 relative to the runaway error everywhere else. Yeah. I’m very happy you’ve provided justification
    0:32:13 for me to continue watching predictable shows. Hold on. If you want more, I’ll give you one more
    0:32:18 because you definitely should do that. One of the things that looks like it engenders depression is
    0:32:24 repeated failures, where the information you are getting back is that everything you try in
    0:32:29 order to improve your predictive grip on the scene is failing. You reach and slip and reach and slip
    0:32:33 and reach and slip and reach and slip and reach and slip. Eventually what the system does to
    0:32:38 manage that is it installs this deep level belief that, look, this is just the kind of place where
    0:32:43 you reach and slip. That’s it. You are a reach and slip thing. As soon as it has that prediction,
    0:32:48 then you go about trying to confirm that prediction. One of the ways you can protect yourself from
    0:32:56 that is giving yourself lots of wins. We know this: deep in COVID, Animal Crossing was a massively
    0:33:04 popular game because you get a cute, easy, regular, close to hand opportunity to have some wins.
    0:33:09 And actually, I think that’s totally protective. I’m a meditation teacher. I don’t know how
    0:33:15 avant-garde this is, but I’m quick to say you should watch Netflix and play video games when
    0:33:20 you don’t feel well. I don’t think that’s always a numbing process. I think avoidance technologies
    0:33:27 are real technologies and getting little wins when the world is especially vicious in terms of
    0:33:31 uncertainty, I think is a really great way to protect the predictive system from having one of
    0:33:37 those dumps where, oh, I just can’t do anything. And so, I better turn on sickness behaviors and
    0:33:44 back up. From this perspective, do you think there’s a difference between me setting up an easel
    0:33:50 and painting versus going to a museum and consuming and looking at a painting? How do you see those
    0:33:56 from the lens of uncertainty? I would say there might be a difference between taking painting as
    0:34:03 a craft, where what you have here is you have an opportunity to improve your painting skills when
    0:34:09 you sit at the easel. And so, you’re getting lots of little, potentially, you have the opportunity
    0:34:14 here to get lots of little bumps of doing better than expected as you’re increasing your skills.
    0:34:20 So, that’s nice. Every new painting is a little bit of uncertainty that you’re managing in a small
    0:34:25 way, but I think something else could be happening there too, especially if you think about it as
    0:34:31 like art therapy, where you’re not just trying to paint the scene, but you’re trying to paint
    0:34:37 something about yourself as you’re painting a scene. You’re trying to expose something
    0:34:43 about yourself while you’re engaging in this creative act. And why would you want to do that?
    0:34:48 Why would you want to take something hidden and put it somewhere public? What do you think?
    0:34:54 Well, I imagine it’s going to help me resolve some things that have been uncertain about
    0:34:59 something in my understanding of the world. Love that, right? So, if the first thing we are
    0:35:05 is informational machines, we’re epistemic machines. We’re trying to figure out how the world is.
    0:35:10 The most important part about the world, potentially, is figuring out ourselves, right? And there’s a
    0:35:15 bunch of things that are hidden to us. They’re just deep down in the subconscious. We don’t have
    0:35:19 access to them. The degree to which we don’t have access to them means we’re running over a model
    0:35:24 that’s not complete. And that’s dangerous, actually, for a predictive system like us. Every
    0:35:30 opportunity you can to get out stuff that’s hidden to better understand it, that’s good stuff. So,
    0:35:36 one, you’re going to start knowing yourself better. Two, if you put it out into a public sphere,
    0:35:42 you might invite people that you trust to come and talk about it, which is going to let you
    0:35:48 possibly optimize some of these things in yourself. If you can’t expose it, how do you work on it?
    0:35:55 And so, bringing that up and out into a public sphere where then you can have friends look at it
    0:36:00 and give suggestions relative to that is really, again, really valuable for a predictive system
    0:36:04 like us. You’re exposing part of your generative model and you’re exposing it in a way where you
    0:36:11 can have people talk about it and where then you can reimbibe it and potentially benefit from its
    0:36:17 exposure and its digestion. I think art can definitely do that. Expose something that you
    0:36:21 didn’t know about yourself in a way that can let you optimize over that thing for yourself.
    0:36:34 Support for this podcast comes from Shopify. Every business owner knows how valuable a great
    0:36:40 partner can be. Growth and expansion feels a lot more doable when you team up with someone who is
    0:36:45 tech savvy, loaded with cutting-edge ideas and really great at converting browsers into buyers.
    0:36:51 Finding that partner, though, is easier said than done until now. That is, thanks to Shopify.
    0:36:56 Shopify is an all-in-one digital commerce platform that wants to help your business
    0:37:01 sell better than ever before. When you partner up with Shopify, you may be able to convert
    0:37:07 more customers and end those abandoned shopping carts for good thanks to their shop pay feature.
    0:37:12 There’s a reason companies like Allbirds turn to Shopify to sell more products to more customers,
    0:37:18 whether they’re online in a brick-and-mortar shop or on social media. You can upgrade your business
    0:37:23 and get the same checkout Allbirds uses with Shopify. You can sign up for your $1 per month
    0:37:30 trial period at Shopify.com/VoxBusiness. Just go to Shopify.com/VoxBusiness to upgrade your
    0:37:42 selling today. Shopify.com/VoxBusiness. You’ve written about how this predictive view of the mind
    0:37:47 can explain why some digital technologies, particularly social media, can undermine or
    0:37:51 harm our mental health. I’m curious, given this framework we’ve talked about,
    0:37:56 how do you think about the impact of social media and this growing role of digital technologies
    0:37:59 on well-being? Yeah, I love that. What a great question.
    0:38:06 You know, this long-form podcast that you guys have is so good, because we can actually get
    0:38:10 through some territory, because I think we have enough on the table now to say something
    0:38:17 moderately sophisticated about that. Social media is so dangerous in its current form.
    0:38:21 I don’t mean it can’t be good or that it doesn’t have good qualities. I don’t want to go that far,
    0:38:28 but just think about it. If there’s a problem where you install models of the world that drift
    0:38:35 from reality, I mean, do I even have to say any more, or are we all on the same page?
    0:38:41 Social media is a lie factory. It’s made to deceive us about reality. That’s what it is,
    0:38:48 by its very nature, and this is how it’s being used. We’re all the time looking
    0:38:56 to improve our model and the design and the kind of media that people are benefiting from posting
    0:39:02 has almost, by its very nature, this quality of being both attractive and deceptive.
    0:39:09 And no wonder we’re increasingly uncertain and increasingly anxious when you are literally
    0:39:16 being fed models that don’t track reality. You are inundating your generative model
    0:39:22 with bad evidence. You are literally doing what might be the worst possible thing for this kind
    0:39:28 of system. You are just feeding it bad evidence about the world. Nobody’s home looks like that.
    0:39:37 Nobody’s kid is always happy. No couples are always blissful. This does not exist.
    0:39:43 This is just not realistic. And what we’re doing is we’re bending our generative model.
    0:39:49 We’re bending our generative model, say this is actually how it is. It is so upsetting and so
    0:39:54 dangerous for our kind of system because it does exactly the thing that we think is problematic.
    0:39:59 First of all, it’s creating a model of the world that is divergent from the real world.
    0:40:03 Two, you’re spending so much time with it that it’s pinning it. Even though you might be getting
    0:40:09 regular counter evidence from your world, you are spending more time there than you are garnering
    0:40:14 evidence from the world. Now, you have a sticky bad belief that is divergent from the real world
    0:40:21 model. We’re saying these technologies like social media are presently designed in a way that can
    0:40:27 hijack the brain’s predictive models in a way that freezes us into rigid patterns and habits of
    0:40:33 mind rather than helping us towards some more flexible ones you’ve talked about that get us up
    0:40:37 to the edge of informational chaos. But you’re saying that’s not something inherent to digital
    0:40:41 tech or social media. It’s something that could presumably be designed otherwise.
    0:40:46 Absolutely. I don’t think anybody did it on purpose. I’m a big optimist. I don’t think anybody
    0:40:51 was trying to do this. I think this is an emergent feature. It’s an emergent feature of a confluence
    0:40:58 of pressures, including making sure the people investing in you are happy and individual influencers
    0:41:02 are making a living doing this. I think it’s a confluence of problems, and yet it is a real problem.
    0:41:08 One other aspect of that that fits very succinctly here that I’m worried about and that I’ve written
    0:41:16 about is, according to the framework, if you have persistent error, you can resolve that error a
    0:41:21 couple of different ways. One way is you can update your model to better fit the world. You run into
    0:41:27 some new evidence, and you might just go, “Oh, well, that’s just a better way to believe.” Model
    0:41:33 gets updated. Or you can change the world to better fit the model. Let’s say you believe
    0:41:38 the earth is flat, and then you go to Thanksgiving dinner, and somebody in your family says,
    0:41:43 “That’s stupid. You should believe something else.” You can either be like, “Oh, maybe you’re
    0:41:48 right. That is good counter-evidence, and I’m going to update.” Or you can behave in the world
    0:41:52 in a way that gets you back to status quo. In that example, what you’re doing is you’re leaving,
    0:41:56 you’re cutting off your family, you’re getting out of Thanksgiving dinner, and you’re getting back
    0:42:02 to your echo chamber. You’re getting back to the filter bubble where you’re now going to be exposed
    0:42:11 to the evidence that aligns with your prediction. Conspiracy theory thinking falls so naturally
    0:42:17 from this kind of system, because this system, remember, if you’re putting yourself in a situation
    0:42:25 where you are constantly awash with bad evidence, it will inevitably adjust the generative model,
    0:42:28 which is just to say it will inevitably change the reality you live in.
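The two ways of resolving persistent error described here, updating the model to fit the evidence versus acting so you only meet evidence that fits the model, can be contrasted in a small simulation. This is my illustrative sketch, not anything from the guest's papers: the agent names, the evidence-counting update, and the probabilities are all made-up assumptions.

```python
import random

# Toy contrast between the two error-resolving strategies: update the
# model on all evidence, or filter the world (the echo-chamber move)
# so disconfirming evidence never lands. Numbers are illustrative.

def updated_belief(evidence: list, filter_bubble: bool) -> float:
    confirms, total = 1, 2  # mild prior that the belief is true
    for supports_belief in evidence:
        if filter_bubble and not supports_belief:
            continue  # leave the dinner table: disconfirmation never arrives
        confirms += supports_belief
        total += 1
    return confirms / total  # estimated probability the belief is true

random.seed(0)
# A world where the belief is mostly wrong: only ~20% of encounters support it.
world = [random.random() < 0.2 for _ in range(200)]

open_agent = updated_belief(world, filter_bubble=False)  # drifts toward ~0.2
echo_agent = updated_belief(world, filter_bubble=True)   # stays pinned near 1
print(open_agent, echo_agent)
```

Both agents see the same world; only the echo agent's sampling behavior differs, and that alone is enough to hold a badly wrong model in place.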
    0:42:33 And so where you’re getting your information from, the people you’re spending time with,
    0:42:38 the information that you’re exposing yourself to, that is all having a really direct and serious
    0:42:44 impact on your reality-generating mechanisms. I wanted to loop in your work on contemplative
    0:42:51 practices. We’ve talked about how art and creativity can bring us to that edge of chaos,
    0:42:57 but you’ve also said elsewhere that meditation can do a similar kind of thing, which is confusing
    0:43:00 at first, because meditation looks pretty different than watching a horror movie, for example.
    0:43:06 In meditation, I’m sitting there very quietly in what looks like the opposite of chaos.
    0:43:10 So how do you understand what meditation is doing in this predictive framework,
    0:43:14 and how does that relate to creativity and these beneficial kinds of uncertainty?
    0:43:23 So I think horror movies can help us get exposed to scary stuff. I think being exposed to scary
    0:43:29 stuff at our edge in a safe way helps us. It helps us get better at managing our own emotions.
    0:43:33 It helps us get better at managing uncertainty. I think that’s valuable for an uncertainty
    0:43:39 minimizing machine. Yes, it’s cool to hang out at our edge. How does that relate to meditation?
    0:43:43 So we get this idea, I think commonly now, especially in the West, meditation might be
    0:43:47 more about relaxation, maybe addressing- Stress relief and so on.
    0:43:51 Addressing stress or pain, but that’s not actually, that’s not the meat. That’s not the meat of that
    0:43:59 program. At the center of that program is a deep, profound and progressive investigation
    0:44:05 about the nature of who we are, how our own minds work. It is a deep investigation about the way
    0:44:08 that our emotional system is structured and the way that it works is ultimately a deep
    0:44:12 investigation of the nature of our own conscious experience. What are we experiencing? Why are
    0:44:18 we experiencing it? What does that have to do with the world? And then, how can we adjust
    0:44:24 progressively and skillfully the shape of who and what we are so that we fit the world the best,
    0:44:29 so that we are as close as possible to what’s real and true and so that we can be as serviceable as
    0:44:36 possible. But that’s really what it’s for. And ultimately, I think you can do everything that
    0:44:39 we’ve been talking about, including all the stuff that psychedelics does for the predictive system,
    0:44:42 all the stuff that horror and violent video games do for the predictive system.
    0:44:46 You can do it all contemplatively in a way that’s better for you, I think.
    0:44:53 Yeah. So, you’re saying one way to kind of try to find that thread that puts meditation and horror
    0:44:58 movies in kind of the same vein of practice. Is it thinking about meditation, and you mentioned
    0:45:04 psychedelics as well, as these modes of injecting uncertainty into our experience, and particularly
    0:45:09 about kind of provoking us out of our ordinary habits of how we experience the world? Is that
    0:45:14 kind of the common currency there? Absolutely. And you get that through these imaginative
    0:45:21 contemplative practices, but you also get it directly from the more standard, well-known
    0:45:29 attention and awareness program too. Now, whether you’re encountering useful uncertainty because
    0:45:36 you’re generating uncertainty-provoking images, like your own death, or you’re just looking closer and
    0:45:44 closer at your own experience, your own self-experience, and it’s increasingly reflected back to you
    0:45:50 that your old ideas of who and what you are might not stand. In both of those directions,
    0:45:55 you are on a steep learning curve about who you are and what matters here.
    0:46:02 Let me ask you this. After this whole story we’ve unpacked, there’s still a kind of tension
    0:46:10 that leaves me a little bit uncomfortable. It feels like we’re saying that creativity is just
    0:46:17 kind of an input or a means towards juicing the powers of prediction. And part of me pushes
    0:46:23 against that in that it almost feels reductive. Is creativity really just this evolutionary
    0:46:29 strategy that makes us better predictive creatures? Does that make creativity feel less
    0:46:35 intrinsically valuable? Because when I think about creativity, at least in part, it doesn’t
    0:46:40 just feel like a tool for survival that evolution has honed. Sometimes it feels like it is that
    0:46:45 which makes life worth living, that it has intrinsic value of its own, not as a tool for the
    0:46:50 predictive powers that be, my brain or the algorithms or whatever it is. So I’m curious if
    0:46:56 you feel this tension at all and how you think about creativity being framed in the service of
    0:47:04 prediction. So two things. One, even though we are excited by this new framework, I don’t think
    0:47:10 we need to be afraid of it being overly reductionistic. I mean, in a way, it’s radically reductionistic.
    0:47:14 We’re saying that everything that’s happening in the brain can be written on a t-shirt,
    0:47:23 basically. But the way that it actually gets implemented in super complex, beautiful systems
    0:47:31 like us, it shouldn’t make us feel like all of the wonderful human endeavors are simply explainable
    0:47:38 in a sort of overly simplified way. I don’t have any worry like that. I think if it turned out that
    0:47:45 life was operating over a simple principle of optimization, that’s the most beautiful thing
    0:47:52 I’ve ever heard, first of all, that all of life is about optimization. All of life is this resistance
    0:47:59 to entropy. That’s just what it is to be alive, is just your optimal resistance to entropy.
    0:48:05 As the universe expands and entropy is inevitable, life is that single force that’s defying,
    0:48:14 that’s defying that gradient. That’s so beautiful. When it comes to art, I want to even be careful
    0:48:19 to say that art is only about finding this critical edge. I think that’s one really interesting way
    0:48:22 of thinking about it. It’s one way that we’ve been thinking about it. If you consider movies and
    0:48:29 video games as forms of art also, another central reason that this kind of system might benefit
    0:48:34 from artistic expression that we didn’t cover, but that’s completely relevant for our discussion,
    0:48:41 is that art creates this wonderful opportunity for endless uncertainty and uncertainty management.
    0:48:49 Not very many things do that. As you progressively create dancing, painting, singing, whatever,
    0:48:54 the enthusiasm of that, literally being in the spirit of that creative endeavor,
    0:49:00 is you managing uncertainty in a new and remarkable way that’s never been done before.
    0:49:05 In all of existence through all time, nobody has ever encountered and resolved that uncertainty
    0:49:12 in particular. It should be endlessly rewarding, fascinating, and I think no wonder we find it
    0:49:20 so beautiful. It might be by its very nature, maybe the purest expression of uncertainty generation
    0:49:26 and management. Like you say, that would make it intrinsically valuable for an uncertainty
    0:49:32 minimizing system like us. I think that’s a great place to wrap up. Mark Miller, thank you so much
    0:49:42 for being here. This was a pleasure. This was the best interview I’ve ever had. You’re awesome.
    0:49:52 All right. I hope you enjoyed the episode. I definitely did. For me, optimization usually
    0:49:59 conjures the idea of a cold and calculating logic of efficiency, not what it ultimately means to be
    0:50:05 alive, supported by the creative injection of uncertainties into our experience of the world.
    0:50:10 But I thought that Mark made the case beautifully. As always, we want to know what you think.
    0:50:15 So drop us a line at thegrayarea@vox.com. And once you’re finished with that,
    0:50:22 go ahead and rate and review and subscribe to the podcast. This episode was produced
    0:50:29 by Beth Morrissey and hosted by me, Oshan Jarow. My day job is as a staff writer with Future Perfect
    0:50:34 at Vox, where I cover the latest ideas in the science and philosophy of consciousness,
    0:50:39 as well as political economy. You can read my stuff over at vox.com/futureperfect.
    0:50:48 Today’s episode was engineered by Erika Huang, fact-checked by Anook Dusso, edited by Jorge Just,
    0:50:54 and Alex Overington wrote our theme music. New episodes of the Gray Area drop on Mondays.
    0:50:59 Listen and subscribe. The show is part of Vox, and you can support Vox’s journalism by joining
    0:51:06 our membership program today. Go to vox.com/members to sign up. And if you decide to sign up because
    0:51:20 of this show, let us know.
    0:51:30 Your own weight loss journey is personal. Everyone’s diet is different. Everyone’s
    0:51:35 body is different. And according to Noom, there is no one-size-fits-all approach.
    0:51:40 Noom wants to help you stay focused on what’s important to you with their psychology
    0:51:45 and biology-based approach. This program helps you understand the science behind your eating
    0:51:50 choices and helps you build new habits for a healthier lifestyle. Stay focused on what’s
    0:51:57 important to you with Noom’s psychology and biology-based approach. Sign up for your free trial
    0:52:00 today at Noom.com.

    In part three of our series on creativity, guest host Oshan Jarow speaks with philosopher of neuroscience Mark Miller about how our minds actually work. They discuss the brain as a predictive engine that builds our conscious experience for us. We’re not seeing what we see. We’re predicting what we should see. Miller says that depression, opioid use, and our love of horror movies can all be explained by this theory. And that injecting beneficial kinds of uncertainty into our experiences — embracing chaos and creativity — ultimately makes us even better at prediction, which is one of the keys to happiness and well-being.

     This is the third conversation in our three-part series about creativity.

    Learn more about your ad choices. Visit podcastchoices.com/adchoices

  • Musician Laraaji on the origin of creativity

    AI transcript
    0:00:07 In 2018, Madison Smith told the county attorney she’d been raped by a classmate, but he told
    0:00:11 her he couldn’t charge him with rape.
    0:00:16 Then she found out Kansas is one of only six states where citizens can petition to convene
    0:00:18 their own grand jury.
    0:00:24 Having to tell over 300 strangers about what happened to me seemed so scary.
    0:00:26 I’m Phoebe Judge.
    0:00:27 This is Criminal.
    0:00:35 Listen to our episode, The Petition, wherever you get your podcasts.
    0:00:36 Hey, this is Sean.
    0:00:41 We’re running a special series this week on creativity, and thinking about this topic
    0:00:44 brought me back to one of my favorite episodes of The Gray Area.
    0:00:49 I spoke with pioneering musician, Laraaji, about a lot of things, but what lingered with
    0:00:54 me was his theory of creativity, which he describes as a kind of surrendering.
    0:00:59 For Laraaji, to create is to get out of your own way, to get out of your head and drop
    0:01:01 into the moment.
    0:01:05 Creativity understood in this way is really the art of spontaneity.
    0:01:08 It’s all about opening yourself up, and anyone can do it.
    0:01:11 It’s a beautiful idea, and I wanted to include it in this series.
    0:01:17 I hope you enjoy it as much as I did.
    0:01:24 There’s an old saying that writing about music is like dancing about architecture.
    0:01:27 It’s intended as a dig at music criticism.
    0:01:30 But beneath that, there’s a deeper truth there.
    0:01:33 Music is intangible, subjective.
    0:01:37 It’s universal, yet still deeply personal.
    0:01:43 And while, yes, science and math are involved in its creation, there is something undeniably
    0:01:45 mystical about it.
    0:01:51 And that mysticism is worth exploring.
    0:02:05 I’m Sean Illing, and this is The Gray Area.
    0:02:13 The music you’re hearing now is the 1985 song “I Am Sky” by today’s guest, Laraaji.
    0:02:18 The 80-year-old pioneer of so-called new-age music has been sitting in the lotus pose on
    0:02:24 the fringes of the music world for decades, and recently, he actually joined Andre 3000
    0:02:31 on stage for the first performance on his “New Blue Sun” tour in Brooklyn.
    0:02:36 When he was young, Laraaji experimented with acting, including a role in the landmark film
    0:02:38 “Putney Swope.”
    0:02:44 He also spent time in the 1960s stand-up comedy scene in New York.
    0:02:49 But after he became interested in spiritual communities and discovered the auto-harp,
    0:02:52 he devoted his life to music.
    0:02:59 So naturally, I was delighted that he could join us today to talk about music, meditation,
    0:03:04 spirituality, and laughter.
    0:03:10 Laraaji, welcome to the show.
    0:03:12 Thank you, Sean.
    0:03:13 I’m so glad you’re here.
    0:03:21 I have a lot of interesting people on this show, but you, sir, are truly one of a kind.
    0:03:24 So this is a treat, really.
    0:03:26 It’s unique to be here.
    0:03:28 It’s new-age communication.
    0:03:34 You know, I’m so intrigued by all the artistic interests you’ve had in your life.
    0:03:37 You’ve done stand-up comedy, you’ve done acting.
    0:03:41 Obviously, in the end, you gave yourself over to music.
    0:03:44 Why music, above all else?
    0:03:51 I think music has the most immersive impact, transport of impact on my life.
    0:03:55 Since I was a child, even though I didn’t verbalize it, I went with the flow.
    0:04:01 Music, whether it was for dancing or listening on the radio or within church, it was no
    0:04:03 contest about the winner.
    0:04:08 It was music and sound that could shift me instantaneously.
    0:04:16 And I liked that I could use music to please others or to set their feet moving or to inspire
    0:04:17 them to sing along.
    0:04:25 So I enjoyed the power of music, almost the undisputed power of music, to set inner settings
    0:04:29 in which alternative realities become clearer.
    0:04:35 You know, I’ve always been fascinated by stand-up comedy in particular.
    0:04:39 It’s the thing I do if I had the talent to do anything, and I don’t know, maybe it’s
    0:04:46 a strange question, but did the experience acting and doing comedy make you a better
    0:04:51 musician or was it just creatively a totally different thing?
    0:04:54 It’s the same thing, Sean.
    0:04:59 Wherever I choose to open and give expression to, I’m practicing the art of surrendering
    0:05:08 and spontaneity, and that carries over from music into humor and a lot of my laughter
    0:05:13 life is involved with spontaneous interaction, social interaction with friends, and that’s
    0:05:20 the same kind of spirit, free flow, inventive spirit that I depend upon in music and improvisation.
    0:05:23 I think that’s why I’m a lousy musician.
    0:05:26 I’m too in my own damn head.
    0:05:31 I can’t just open up and let it go and just, you know, be in the flow or however you would
    0:05:34 put it, surrender, I guess, is the way you put it.
    0:05:35 Yeah.
    0:05:38 I said, “People who have trouble surrendering,” I said, “observe your body language when
    0:05:39 you have your next orgasm.”
    0:05:40 I don’t see how.
    0:05:46 I don’t think anybody wants to see that.
    0:05:51 But look at your breath, look at your body language, look how focused you are into surrendering
    0:05:54 to this energetic expression.
    0:05:59 And I see some of that expression carried over into the way people sing pop music or
    0:06:00 rock music.
    0:06:03 They’re into the most orgasmic, passionate level of release.
    0:06:11 And do you think of yourself as primarily an improvisational musician for those reasons?
    0:06:13 That question is a really good question.
    0:06:14 I’m tempted to say yes.
    0:06:19 I depend more on improvisation than I do on set scores.
    0:06:25 I find that improvisation is aligned with what I call my spiritual belief, that every
    0:06:29 moment is new and to trust that what I need in this moment is here.
    0:06:35 I’ve been listening to a lot of your music, preparing for our chat, and so much of it
    0:06:39 sounds otherworldly to me, like in the best sense possible.
    0:06:47 And I feel this way when I listen to great musicians who just seem like they are convening
    0:06:53 with some kind of creative energy or creative force that I just can’t touch, but at least
    0:06:56 I can vibe with it.
    0:07:02 That’s what fuels artists’ enthusiasm to have people like you to serve.
    0:07:10 So you are part of the reason I like doing professional music and doing shows, that I
    0:07:15 feel that I’m able to articulate, maybe express for you what you would do if you were in my
    0:07:16 shoes.
    0:07:17 Humans always talk about that.
    0:07:19 You know, they feel like they’re not writing or playing music.
    0:07:20 It’s more like they’re conduits.
    0:07:21 I mean, is that?
    0:07:22 Yes.
    0:07:25 Is that what it’s like for you on stage?
    0:07:30 Yes.
    0:07:38 And it’s a magical and mystical and transported place because you’re witnessing somehow beyond
    0:07:44 linear time flow, but you’re in the midst of local time, but you’re also witnessing
    0:07:47 an unbroken constant present time.
    0:07:50 It’s speaking through me and it’s speaking as me.
    0:07:53 It’s like it’s my total presence at that time.
    0:07:54 I’m sound.
    0:07:55 I’m space.
    0:07:56 I’m timelessness.
    0:08:06 And I’m witnessing in the midst of this going on.
    0:08:09 Music that happens surprises me at times.
    0:08:11 It’s like music I can’t dream up.
    0:08:18 Part of my art is knowing when to get out of the way, how to set up a musical flow or
    0:08:36 a musical event and then to step to the side of it and let it speak through.
    0:08:42 Your instrument of choice is the auto harp, which is not exactly conventional.
    0:08:44 What is it about the auto harp?
    0:08:49 Well, how you open the question, my instrument of choice, well, it’s the instrument that
    0:08:54 was chosen for me, for me.
    0:08:55 Fair enough.
    0:09:02 I would not have chosen the instrument except for a mystical communication event in a pawn
    0:09:05 shop in Queens, New York, 1974.
    0:09:07 Tell me more about that.
    0:09:08 Yes.
    0:09:15 It was at a time that I was married, child, and we were living with my in-laws.
    0:09:20 And I was playing jazz rock piano with a group called Winds of Change.
    0:09:29 And on one particular day, I felt like my finances were low and I had a good Yamaha
    0:09:33 steel string guitar that I wasn’t using, although I loved it.
    0:09:38 I decided to take it into town, South Ozone Park Queens, and pawn it.
    0:09:45 And as I was going into the store, I noticed in the window, right-hand side, an auto harp.
    0:09:50 And I remember thinking to myself, there’s that chunky looking instrument I used to see
    0:09:52 in the village when I did stand-up comedy.
    0:09:59 So I go into the pawn shop, I’m ready to exchange my guitar and a Martin fiberglass case
    0:10:00 for $175.
    0:10:05 And the clerk offered me $25.
    0:10:08 And I said, whoa, that’s not going to work.
    0:10:14 And just about then, the clerk and I were the only one in the store, so it was very quiet.
    0:10:19 And there was this moment of, am I going to really settle for $25?
    0:10:26 And this very clear, distinctive directive came through, says, don’t take money, swap
    0:10:30 it for the instrument in the window.
    0:10:34 And so here was something trying to help me make a choice.
    0:10:36 And I thought it was way out.
    0:10:38 How was this voice appearing?
    0:10:43 And I heard it so clear, there was so much love and so much wisdom that I just had to
    0:10:45 see where this was going to go.
    0:10:58 So I left that pawn shop with $5 on the auto harp.
    0:11:06 So when you said instrument of choice, something in that, it’s also my life of choice, something
    0:11:11 has impacted me in such a way as to make choices that are more aligned with, I would say, a
    0:11:22 higher intelligence.
    0:11:27 You also sing, not always, but the singing, it feels part of the music.
    0:11:33 There’s little distinction between the instruments you’re playing and what’s coming from your
    0:11:34 voice.
    0:11:35 Do you think of it that way?
    0:11:38 Your voice is just another instrument, not something separate?
    0:11:39 Yes.
    0:11:48 I do like doing everything at the same time, spontaneous, unified flow, create a flow with
    0:11:56 several instruments at the same time, using the voice without calling the mental process
    0:12:12 into linear thinking.
    0:12:19 Using the voice as an emotional, expressional instrument is what I’ve been exploring, especially
    0:12:25 with meditation or deep contemplation of contacting altered planes of conscious present
    0:12:26 time.
    0:12:35 So to talk about it is to take the mind out of it.
    0:12:40 Then there’s sounds of passion, passionate immersion.
    0:12:48 The voice can be used to express witnessing inside of an awe-inspiring perception, just
    0:12:55 to be in the passionate emotional moment and to let it speak through and use the body,
    0:12:56 not just the voice.
    0:13:03 So the whole body becomes the voice and the breath and the movement and become a conduit.
    0:13:12 And so invented or improvisational language can be the evidence of a person or a practitioner
    0:13:19 in total immersion, total submission, getting involved with a total perception that’s beyond
    0:13:29 linear description.
    0:13:36 I also consider that you can use gibberish, whether it’s called gibberish or glossolalia or
    0:13:46 speaking in tongues, to relax the mind from its conditioning into gathering linear information.
    0:13:50 So the mind is given or the brain is given a vacation.
    0:13:58 And in that vacation place, it might be freed up to have an alternative space-time experience.
    0:14:03 And that might be the message the artist wants to convey that there is an alternative way
    0:14:05 of being conscious here and now.
    0:14:13 I’ve heard you talk about music as a tool for total presence, like a way, I think the
    0:14:16 way you put it is, it’s a way of dropping into the now.
    0:14:17 And I like that.
    0:14:21 Why do you think music has that kind of effect on us?
    0:14:29 I think generated or channeled by the right musician or artist, the artist is in a state
    0:14:37 of contemplation or meditation or a suspended time awareness.
    0:14:42 And the languaging that occurs with their instrument, their interaction with their instrument
    0:14:49 and with their voice can convey this repurposing of the human instrument, repurposing it from
    0:14:57 a conveyor of local human-based emotion to a conduit of exalted emotion.
    0:15:04 Direct perception inside this timeless present moment is always available.
    0:15:11 Certain sounds, drones can do that, music that’s very spontaneous, that can pull the
    0:15:18 mind out of linear thought, could allow the perceiver, the listener to subtly, directly
    0:15:24 notice the reality of eternal time and the infinite space.
    0:15:32 Sound works primarily as a suggestion, through suggestion, and it can point to the invisible.
    0:15:39 And sound can suggest the flowing of energy, the flowing of blood, the flowing of breath.
    0:15:47 It can suggest the integration of seemingly separate and discordant aspects of anything.
    0:15:54 It can provide a model of an all-pervading unity and harmony, in the case of a harp where
    0:16:00 all 36 strings are vibrating at the same time and producing this synergetic tonal event.
    0:16:06 So as to say that sound can, through suggestion, it can point to the invisible, it can point
    0:16:15 to the transcendent, it can direct the emotional body out of heaviness so that lightness, a
    0:16:19 more ethereal resonance, can be directly witnessed.
    0:16:22 That is so damn interesting to me, you know.
    0:16:27 And you know, there’s that laugh, we’re going to talk more about that briefly, but you talk
    0:16:32 about talking and how that kind of gets us stuck in linear time in our heads, and you
    0:16:39 know, once we start using words, we’re already in the world of ideas and abstractions, but
    0:16:41 music is more primary than that, right?
    0:16:46 It touches something in us that existed before we invented words.
    0:16:50 It’s primal, I guess, in that way.
    0:17:07 Yes, I agree with that, Sean.
    0:17:15 Music might be able to say more than what speech can say in the case of getting an audience
    0:17:22 to drop into deep relaxation without using words, but using sound, or to get a group
    0:17:30 of people roused up in a noble pursuit of an ideal vision.
    0:17:37 My general mode of operation is to prepare before performance or recording through just
    0:17:44 dropping into a refined sense of the meditative field, do some yoga postures, some breathing
    0:17:51 exercises, some positive affirmations, and then to sculpt this field or point to this
    0:18:00 transcendental field and letting it transmit itself into a sound reposition through me.
    0:18:06 And this happens, I tend to call it sound bath, celestial sound bath, though it’s for
    0:18:11 immersing the immersion experience, and once again here, we’re away from the words and
    0:18:16 we’re into the pure, impacting force of sound.
    0:18:21 You do sing, though, and you do have lyrics on occasion, and one of your earliest recordings
    0:18:23 is called “All of a Sudden”.
    0:18:24 Yes.
    0:18:53 And, you know, “All of a Sudden” is this refrain about the spiritual awakening, and
    0:18:57 is that how you experienced your musical or spiritual epiphany?
    0:19:02 As in, you know, it was just sudden like that, does a song correspond with the time when
    0:19:03 you felt that shift?
    0:19:04 Yes, very much.
    0:19:12 I was visiting Florida at the time on a tour of doing inspirational music and songs.
    0:19:17 I was interfacing with several spiritual communities that practiced meditation and I would join
    0:19:18 them.
    0:19:24 And what I observed in meditation experience is that I may have forgotten why I’m meditating
    0:19:28 until, boom, all of a sudden I realized while I’m meditating, this is to contact a different
    0:19:30 version of present time.
    0:19:35 All of a sudden it’s a different sky, it’s a different reason, it’s a different world.
    0:19:41 I discovered that earlier in long hours of sitting meditation that a different version
    0:19:45 of the universe slips into view, something that’s always been here.
    0:19:50 So actually when I say “All of a Sudden” it’s really “All of a Sudden” I’m ready to have
    0:19:51 this experience.
    0:19:56 All of a sudden it’s a different game, it’s a different place, it’s a different state
    0:19:57 of mind.
    0:20:15 All of a sudden it’s a different world, it’s a different rate of vibration, clearly.
    0:20:21 It’s a shift in perception, it’s a shift in the way that I am gathering information.
    0:20:27 And it’s a beautiful experience too because the tendency is to search along the plane
    0:20:33 of the linear that I’ll get there one day, I’ll get there tomorrow, I’ll find it in somebody
    0:20:38 else’s yoga session or in somebody else’s religious manual.
    0:20:42 But the preparation for yoga is to be ready now.
    0:20:49 Learn how to relax and stay relaxed when there is a divine epiphany or divine intervention.
    0:20:55 How not to block it, not to over intellectualize it with words.
    0:21:01 But how to be ready for this sudden emergence or the sudden revelation or the sudden opening
    0:21:03 of the doors of perception.
    0:21:09 And then being ready means how not to freak out.
    0:21:14 And that’s usually called a bad trip, a bad psychedelic trip.
    0:21:17 When all of a sudden is too much, all of a sudden I don’t have a body.
    0:21:18 What is this?
    0:21:19 All of a sudden…
    0:21:20 Oh yeah.
    0:21:21 I’ve been there.
    0:21:22 It’s rough.
    0:21:29 You just gotta hold on for dear life until you come out the other side.
    0:21:33 Do you actually find a meaningful distinction between music and meditation or is it all
    0:21:36 just different manifestations of the same practice?
    0:21:42 Well, that is a super dandy question because my ultimate answer is that they’re one and
    0:21:49 the same, meaning that in the moment of deepest meditation, I consider meditation to be the
    0:21:53 highest romance and that romance is the highest meditation.
    0:22:02 My experience of a very high, if not the highest, romantic meditation is and was during listening
    0:22:08 to a cosmic sound current going on where I am, pervading all that I am and pointing
    0:22:11 to a self that is beyond the body.
    0:22:17 So this meditation is simultaneous with music that couldn’t be separated.
    0:22:25 So to answer your question, yes, in the deepest and fullest experience that I call meditation,
    0:22:30 there is a musical event when we can debate what is music.
    0:22:34 It doesn’t have to be the top 40 Grammy Award-winning hit.
    0:22:43 It can be the movement of energy and consciousness in such a way that has balance, form, aesthetic
    0:22:54 quality and has equilibrium and mystically it has a very clear mathematical character.
    0:22:57 Why can’t I hear the cosmic sound current, Laraaji?
    0:23:01 When I sit on the cushion and meditate, I find myself just sitting there thinking about
    0:23:02 meditating.
    0:23:03 Really?
    0:23:05 Which seems to not be the idea.
    0:23:11 My answer is that you are aware of the inner sound current and that you are not aware of
    0:23:14 the you that is aware.
    0:23:15 That’s my answer.
    0:23:23 You are aware right now, but there is a you that you’re involved with that is not allowing
    0:23:28 the you that is aware to be your dominant present-time experience.
    0:23:33 So once again, I’m saying everything everywhere is permeated by a cosmic music.
    0:23:36 So to answer your question, why don’t you hear it?
    0:23:41 My refraining of the question is why aren’t you aware of yourself hearing it?
    0:23:42 I don’t know.
    0:23:45 I don’t know.
    0:23:47 I don’t know.
    0:23:53 What is the most glorious, awe-lifting sound listening experience you can remember?
    0:23:57 The sound of my infant child laughing.
    0:24:01 Most beautiful thing I’ve ever heard.
    0:24:02 Yes.
    0:24:12 No doubt that’s beauty, that’s music.
    0:24:17 When we get back from the break, we talk about Laraaji’s development as an artist and a chance
    0:24:21 collaboration with one of the world’s most revered producers.
    0:24:35 Stay with us.
    0:24:38 Your name, Laraaji, where did that come from?
    0:24:46 Laraaji as a name came out of an association with a spiritual community here in Harlem,
    0:24:53 which centered around a bookstore on the corner of Lenox Avenue and 125th Street.
    0:25:01 I would offer my music at that time by sitting outside of the store or in the vestibule,
    0:25:07 and one of these occasions, two of the spiritual community members approached me and said,
    0:25:12 “You know, we’ve done some research and we’ve taken your name, Edward Larry Gordon.
    0:25:21 Edward Larry Gordon, Larry Gordon, Larry G, and we’ve morphed it into a name that includes
    0:25:24 reference to the sun god, Ra.
    0:25:28 These spiritual community members didn’t know that I was already looking for a name
    0:25:32 and I intuitively suspected it would be three syllables and have something to do with the
    0:25:33 sun.
    0:25:35 So when they approached me, they said, “We have a name for you.
    0:25:36 We want to suggest it.”
    0:25:40 I said, “Well, well, I have a little concern because if I didn’t like the name, I would
    0:25:41 embarrass somebody.”
    0:25:47 I said, “Let’s meet in Central Park tomorrow and you can reveal the name to me.”
    0:25:52 We get to Central Park, we found a place, and they revealed the name to me as Laraaji,
    0:26:01 which is a gentle transition from Larry Gordon to Laraaji, but I was very impressed with this
    0:26:07 synchronicity and I accepted the name there in Central Park.
    0:26:09 My friends took it very easily.
    0:26:14 My biological family members thought it was interesting.
    0:26:18 They made a sincere attempt to use the name.
    0:26:20 My mother was very polite.
    0:26:28 She would make an attempt to use the name, but she reminded me that whatever you call
    0:26:34 yourself, “I’m your mother.”
    0:26:38 Her favorite words were, “Take care of yourself and you’re taking care of me.”
    0:26:42 Was this around the time, as you mentioned earlier, you were busking, you were performing
    0:26:46 music in the streets of New York, and that led to a fortuitous collision with the very
    0:26:50 famous musician and producer Brian Eno.
    0:26:51 How did that come about?
    0:26:52 Yes.
    0:26:58 It was playing music in the parks and the plazas of New York City and Brooklyn.
    0:27:04 What I was doing was earning money while testing the idea of channeling or performing music
    0:27:11 in altered states to see what could I bring any meaningful uplifting experience to New
    0:27:13 Yorkers, random public audience.
    0:27:18 It turned out to be so that the Museum of Natural History, Central Park, and one of
    0:27:25 my favorite more constant places was the northeast corner of Washington Square Park.
    0:27:31 One evening, I was performing with my eyes closed, as I usually do, cross-legged, sitting
    0:27:39 on a carpet in one of the cobblestone circles there, and that’s where Brian Eno left me
    0:27:47 a message in my zither case, introducing himself and an idea he was offering to listen
    0:27:52 to a project he was working on, and he thought I would be interested.
    0:27:56 So I call him up, and the next day I go to visit him, and we have this talk.
    0:28:02 I’m still not clear who he is, I just know he’s a producer, and he worked with Fripp and Eno.
    0:28:11 But his energy was that he was an avenue to getting my music into a high-end recording
    0:28:12 studio.
    0:28:18 This was also the time during my spiritual practice to practice scientific praying.
    0:28:23 What that is is affirmations, whatever you want, you act like you’ve got it, you think
    0:28:29 like you’ve got it, and you develop a sense of emotional presence as if you’ve got it.
    0:28:34 And since you don’t know what it is that you want, you don’t particularly give it a name,
    0:28:37 you just use generally “right,” the word “right.”
    0:28:43 So I remember praying for the right producer, and the right producer coming into my creative
    0:28:49 life and the right producer finding it very inspiring to work with me.
    0:28:55 So it turns out that the right producer was Brian Eno, and I never knew Brian Eno.
    0:29:00 I didn’t know enough to know that that would be the right producer.
    0:29:05 So there’s that meeting of Brian Eno and the Day of Radiance album.
    0:29:28 It’s so good, my God.
    0:29:34 I just felt an automatic shift of my attitude of being in a studio, of shifting to a very
    0:29:42 high professional attitude, and a feeling that I was connected to a very classical inner
    0:29:46 conduit that would come out as beautiful new music.
    0:29:51 None of that was pre-arranged or written out or scored.
    0:29:58 It was all in the moment after doing my usual preparation of centering and getting into
    0:30:10 a flow state.
    0:30:14 When did laughter become such an important thing for you?
    0:30:20 When I was young, using humor shifted the energies of the bullies in my neighborhood.
    0:30:27 I would be so afraid of their presence until I could use humor. And in church, we’d
    0:30:32 use humor when the service would get boring, and because it wasn’t the right place to use
    0:30:38 it, we’d use it to get some of our other peers to laugh in the middle of a serious sermon.
    0:30:46 But I noticed the power of laughter to alter, to break the sense of rigidity and separation.
    0:30:54 I began writing scripts in high school and doing situation comedy for talent shows because
    0:30:59 I enjoyed seeing people lose it to laughter.
    0:31:05 The family I grew up in, the uncles, the aunts, the cousins, all were laughter friendly.
    0:31:08 So laughter was always on the menu.
    0:31:16 I can’t remember even a funeral where laughter was outlawed.
    0:31:20 You really do see it as a transformative force, don’t you?
    0:31:25 Well, after doing stand-up comedy and deciding to let stand-up comedy go for a while to
    0:31:32 just focus on music, it was a book by Rajneesh, Osho Rajneesh, that helped me realize that
    0:31:40 I could access the laughter experience without doing comedy and that I could guide other
    0:31:49 people into the laughter zone and enjoy the deliciousness of laughter without using humor
    0:31:56 and without the sacrifice of something, of human standards or human character.
    0:32:01 And now through laughter play shops, I call them, we use laughter to get people into the
    0:32:06 play zone and to get them into contact with their inner child and to get them into deep
    0:32:16 relaxation and I really enjoy laughter now because it can come up out of people without
    0:32:18 it having to be nervous laughter.
    0:32:23 The entire body can get involved, the entire breath can be open and it’s getting sweeter
    0:32:29 and more delicious every time I do one of these.
    0:32:32 You said it gets us to the play zone.
    0:32:37 You really mean laughter is a way to transcend the thinking mind, just to get out of that?
    0:32:38 Yes.
    0:32:43 Someone put it into words. Rajneesh pointed out that when you’re laughing, really involved
    0:32:49 with laughter, whoever is laughing is not thinking, they’re not involved
    0:32:51 in the thought process, in linear thought.
    0:32:58 That may be so if you’re into pure, open laughter, if it’s nervous laughter where you’re mindful
    0:33:03 of a threatening situation, that would be a different situation.
    0:33:09 But real full-bodied, cathartic laughter, you’re releasing faster than you can think.
    0:33:14 So there’s no thought process, processing what it is that’s been released.
    0:33:20 It’s just yummy, open, nurturing release.
    0:33:24 Is that cathartic full-body laughter?
    0:33:28 Is that an expression of bliss for you when it comes out?
    0:33:34 Laughing openly can be bliss, but what I’m talking about is conscious bliss or mindful
    0:33:35 bliss.
    0:33:41 I’ve laughed for 70 minutes once, and the result was akin to breathwork.
    0:33:45 And breathwork, if you’ve ever done it, can take you into bliss.
    0:33:55 So on that level, mindful laughter and intentional laughter can bring us to the bliss zone very
    0:34:04 easily.
    0:34:08 When we get back from the break, Laraaji tells us what he’s learned in his 80 years
    0:34:10 on this planet.
    0:34:23 Stay with us.
    0:34:28 You’re obviously a musician, but also a spiritually serious person.
    0:34:35 (laughs) Ha, ha, ha!
    0:34:36 I love it.
    0:34:39 But that spirituality is so central to your life.
    0:34:44 You’ve been a professional musician for decades, performing all over the world, you’re entangled
    0:34:47 with the business and the commercial side of music.
    0:34:52 I guess I just wonder how you navigate that element of being a professional musician and
    0:34:55 being a spiritual person at the same time.
    0:35:05 Well, I did many years ago get it that unless I integrate my spiritual nature, I would never
    0:35:10 be totally happy, content, or experience resolution, because I can’t get it from the
    0:35:11 physical world.
    0:35:17 I’m not hating the physical world, but things in the physical world are temporary.
    0:35:23 And then constantly, we’re reminded things come, they stay, and then they leave.
    0:35:28 And some things are just too beautiful for us to accept that they’re ever going to leave.
    0:35:37 And I grew to understand that behind the world that is changing, there is this spiritual field
    0:35:44 that if I learn how to embrace it constantly, even while I’m embracing my outer wealth,
    0:35:50 that when the outer wealth shifts, I’m not bent out of shape because I’m still connected
    0:35:54 to this inner spiritual platform that doesn’t get bent out of shape when the outer world
    0:35:55 shifts.
    0:36:03 So for me, staying constant and staying with my spiritual practice allows me to be more
    0:36:09 playful and less fearful of the physical world and less fearful of change and less fearful
    0:36:10 of losing.
    0:36:17 And so I find that the spiritual side helps me to be more present, more experimental,
    0:36:21 and more risk-taking with my music for expression.
    0:36:25 Was there ever an opportunity you had that you couldn’t take or wouldn’t take because
    0:36:29 it would have compromised you musically or spiritually?
    0:36:38 There was one situation that I was hesitating to take, and a friend reminded me that I could
    0:36:47 sublimate my spiritual message or spiritual energy and do the engagement, so I did it.
    0:36:57 And I also was reminded spiritually that rather than curse the darkness, light a candle.
    0:37:06 That shifted me from being gung-ho about resisting wrong assignments: now, if I feel I can still
    0:37:13 shine my light or let light shine or let joy prevail during that assignment or that gig
    0:37:16 or project, I will tend to take it.
    0:37:21 Any gig that I would not take would be one I thought was not healthy, either for
    0:37:27 pollution reasons or because the environmental setting is physically unsafe.
    0:37:33 But for now, I’ve been guided to see that every opportunity is an opportunity to represent
    0:37:36 the all-pervadingness of one spirit.
    0:37:37 What do you mean by that?
    0:37:45 I mean that right now, one all-pervading spirit, maybe you could call it bliss, love, or light,
    0:37:50 is everywhere, not only in the universe, it’s creating the universe.
    0:37:52 That all of spirit is everywhere.
    0:37:58 I can perform anywhere, knowing that all of spirit is there, and I can allow spirit’s
    0:38:01 presence to receive reflection or representation in that place.
    0:38:09 I think I’ve also heard you say that you think our core spiritual problem is our misidentification
    0:38:10 with our bodies.
    0:38:11 What does that mean?
    0:38:12 I’m not going to ever do this.
    0:38:14 I wouldn’t think of doing this to you, Sean.
    0:38:15 To what?
    0:38:17 What are you going to do to me?
    0:38:23 I would amputate your leg, your feet, you’re still there, your torso, you’re still there,
    0:38:26 your arms, your elbows, you’re still there.
    0:38:27 That’s tough.
    0:38:28 You’re just a head, and you’re still there.
    0:38:32 Now your ears and nose go, you’re still there; your lips and tongue go, you’re still
    0:38:33 there.
    0:38:38 Suddenly, your head disappears, but you’re still there, and you’re saying to yourself,
    0:38:40 “Wait a minute.
    0:38:42 I thought I was that body.
    0:38:43 Look, I’m timeless.
    0:38:44 I’m invisible.
    0:38:45 I’m weightless.
    0:38:48 What do I do with this?”
    0:38:57 I believe that identification is with the physical body, which is born, that lives, that dies,
    0:39:03 and we get attached to it, and we get sentimental with it, and we try to enjoy its five senses,
    0:39:09 and we forget or we don’t access the joy that we can have, more expansive joy we can have
    0:39:14 through the infinite self that is always here.
    0:39:21 You, perhaps your buddies, have had an epiphany through the use of certain ceremonies where
    0:39:28 you’re suddenly in another sense of present time and space, a different sense of expansiveness,
    0:39:36 a different sense of how time is unfolding, slower or not at all, and that to have this
    0:39:44 experience is to be taking advantage of a different form of body.
    0:39:51 The deepest sense of happiness and joy, I feel, comes from having an intimate communing
    0:40:00 experience with my eternal present time self, the spiritual presence which is always here,
    0:40:02 always everywhere.
    0:40:08 It just needs to be totally present to dig it and to catch it and to wear it and to behold
    0:40:10 it.
    0:40:18 In all these years, playing music, experimenting, performing, composing, creating, what do you
    0:40:24 know about music now that you didn’t know when you started?
    0:40:33 Yes, because it’s such an international audience now, and it’s taught me that there is a universal
    0:40:44 receptivity to emotional, sensual, ambient soundscapes.
    0:40:51 There is an automatic acceptance of, a receptivity to, beautiful music and to beauty being expressed
    0:40:58 in music, to timelessness, to spiritual voicing through music.
    0:41:03 That there is a receptive audience, it’s taught me that there is an audience here, that there
    0:41:10 is an inner witness waiting to hear itself reflected in our music.
    0:41:16 You’re 80 years old, you’ve been making music for over 40 years, you’ve lived such an interesting
    0:41:20 life as an artist and a contemplative.
    0:41:23 As you sit here now today, what is your spiritual mission?
    0:41:32 What gets you out of bed every day?
    0:41:37 What gets me out of the bed is mentally I’ll go through what I have to do the moment I
    0:41:43 get out of bed, and I’ll visualize myself standing up, either electric toothbrush on
    0:41:48 my teeth or preparing tea or doing some yoga exercise.
    0:41:53 Usually what gets me up is a sense of a daily agenda, which is different every day, something
    0:41:58 that I’m going to do the day that I’m going to really enjoy, whether it’s music, performance
    0:42:04 or designing new tuning, or getting to know new pieces of equipment, or sitting for an
    0:42:10 extra period of time in meditation either in lotus position in my house or I’m going
    0:42:16 for a walk in Central Park or Riverside Park, and sitting on a bench in the sun and getting
    0:42:18 into meditation.
    0:42:25 What keeps me enthusiastically involved in life and passionately involved with life
    0:42:32 is the sensation of an eternal non-human intelligence that’s generating this thing
    0:42:38 called creation, and it’s allowing me to participate in it and to co-witness and to co-collaborate
    0:42:47 with it, and that in the midst of this, it is remaining invisible, remaining infinite,
    0:42:52 and I’m feeling it through my connection with it. And so it’s not so much what I’m getting
    0:42:58 out of bed for, but what I’m getting out of bed as: I’m getting out of bed as this sense
    0:43:06 of conscious improvisational collaboration within the divine, alternating intelligence.
    0:43:13 But when I’m doing tours and I’m put in a nice, beautiful hotel, I’ll get out of bed
    0:43:16 for the breakfast.
    0:43:28 Well, what can I say, you are one of one, and it was lovely getting to know you a little
    0:43:30 bit here.
    0:43:31 Thank you, Sean.
    0:43:47 I appreciate your calm, cool, collected style.
    0:44:16 You know, I’m not sure I would have returned to this conversation if we weren’t doing
    0:44:24 this series on creativity, but I’m so glad I did because listening to it with creativity
    0:44:31 in mind allowed me to take so much more from it, things that I missed the first time.
    0:44:37 But, of course, I’d love to hear your thoughts on the episode, so drop us a line at the grey
    0:44:43 area at fox.com, and when you’re done with that, make sure you rate and review the pod.
    0:44:44 Thank you.
    0:44:45 .
    0:44:52 Thank you.
    0:44:54 (gentle music)
    0:45:04 [BLANK_AUDIO]

    Sean revisits his interview with musician Laraaji, a pioneer of new age music who has recorded more than 50 albums since he was discovered busking in a park by Brian Eno. Laraaji and Sean discuss inspiration, flow states, and what moves us to create.

    This is the second conversation in our three-part, three-shows-in-three-days series about creativity.

    Learn more about your ad choices. Visit podcastchoices.com/adchoices

  • Is AI creative?

    AI transcript
    0:00:04 There’s over 500,000 small businesses in B.C. and no two are alike.
    0:00:05 I’m a carpenter.
    0:00:06 I’m a graphic designer.
    0:00:09 I sell dog socks online.
    0:00:12 That’s why B.C.A.A. created one size doesn’t fit all insurance.
    0:00:15 It’s customizable, based on your unique needs.
    0:00:19 So whether you manage rental properties or paint pet portraits,
    0:00:23 you can protect your small business with B.C.’s most trusted insurance brand.
    0:00:29 Visit bcaa.com/smallbusiness and use promo code radio to receive $50 off.
    0:00:32 Conditions apply.
    0:00:38 What is the relationship between creativity and intelligence?
    0:00:43 That’s a fundamental, perhaps unanswerable question.
    0:00:46 Is it also an obsolete one?
    0:00:47 The question now seems to be,
    0:00:53 what is the relationship between creativity and artificial intelligence?
    0:00:57 Creativity feels innately human.
    0:00:59 But what if it’s not?
    0:01:02 How are we to know?
    0:01:07 Philosophers, artists, and scientists are already debating whether the art
    0:01:11 and the writing generated by Midjourney and ChatGPT
    0:01:15 are examples of machines being creative.
    0:01:21 But should the focus be on the output, the art that’s generated,
    0:01:24 or the input, the inspiration,
    0:01:29 and all the work and toiling that goes into making it?
    0:01:34 And what about the other, smaller ways in which we use our creativity?
    0:01:38 Like in a prank on a friend or in a note to a loved one.
    0:01:46 Does the value of those communications change if AI creates them?
    0:01:48 I’m Sean Illing, and this is The Gray Area.
    0:02:05 Today’s guest is writer and essayist Meghan O’Gieblyn.
    0:02:09 She’s the author of the book “God, Human, Animal, Machine:
    0:02:13 Technology, Metaphor, and the Search for Meaning.”
    0:02:15 She’s also a previous guest of The Gray Area,
    0:02:17 and if you enjoy this conversation,
    0:02:24 and of course you will, I’ll add a link to our last one in the show notes.
    0:02:29 Meghan is terrific, and she’s been thinking about the human relationship
    0:02:32 with technology for a long time.
    0:02:37 And her book made a really strong case that the more our existence
    0:02:40 intertwines with the tools we create,
    0:02:47 the more those tools shape our understanding of who and what we are.
    0:02:51 So as we kick off this series about creativity,
    0:02:54 I could think of no better person to discuss how machines are changing
    0:02:59 our understanding of creativity and forcing us to reflect
    0:03:02 on what it really means to be creative.
    0:03:12 Meghan O’Gieblyn, welcome back to the show.
    0:03:13 Thanks so much for having me.
    0:03:15 So we’ve talked before.
    0:03:18 You know what, I’m just going to go ahead and say we’re friends.
    0:03:19 I hope that’s okay.
    0:03:21 Yes.
    0:03:27 So your work spans a pretty wide range of themes and questions connected
    0:03:31 to the relationship between humans and computers.
    0:03:37 What I wanted to talk with you about today is this relationship,
    0:03:43 or how to look at that relationship through the lens of creativity.
    0:03:50 And you once asked a computer scientist what he thought creativity meant.
    0:03:53 And he told you, well, that’s easy.
    0:03:57 It’s just randomness.
    0:04:01 What do you make of that view of creativity?
    0:04:04 How would you correct or add to it?
    0:04:09 I mean, there’s a way in which it seemed at first like a convincing answer, right?
    0:04:15 And I think that there is something, a relationship between creativity
    0:04:19 and randomness in the sense that it’s something that is non-deterministic.
    0:04:24 It’s something that surprises you, surprises the person looking at the art.
    0:04:26 It surprises the artist often.
    0:04:31 And, you know, I think it makes a lot of sense, especially if you’re thinking,
    0:04:35 you know, it’s no coincidence that a computer scientist came up with this definition.
    0:04:40 Because if you’re thinking about creativity or what we call creativity
    0:04:43 in large language models, for example — if you’ve
    0:04:47 ever sort of played around with the temperature settings of an LLM,
    0:04:52 You can basically turn up the temperature and turn up the amount of randomness
    0:04:54 in the output that you get.
    0:04:57 So, you know, if you ask ChatGPT, for example,
    0:05:00 to give you a list of animals at a low temperature,
    0:05:04 it’ll say something very basic like a dog, a cat, a horse or something.
    0:05:07 And if you turn up the temperature, it’ll give you more unusual responses,
    0:05:10 more statistically unlikely responses like an ant eater.
    0:05:14 Or if you turn it way up, it’ll make up an animal like a whizzledy woo
    0:05:17 or some sort of Seussian creature that doesn’t exist.
    0:05:22 So, there is some element of randomness there.
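
    The temperature mechanism described here — rescaling a model’s raw token scores before sampling — can be sketched in a few lines of Python. This is a generic softmax-temperature sketch, not any particular model’s actual implementation; the vocabulary, logits, and function name are all made up for illustration.

    ```python
    import math
    import random

    def sample_with_temperature(logits, temperature, rng):
        """Draw one token index from raw scores (logits).

        Low temperature sharpens the distribution toward the most likely
        token; high temperature flattens it, so statistically unlikely
        tokens (the 'anteater' answers) show up more often.
        """
        scaled = [score / temperature for score in logits]
        peak = max(scaled)  # subtract the max for numerical stability
        exps = [math.exp(s - peak) for s in scaled]
        total = sum(exps)
        probs = [e / total for e in exps]
        # weighted random draw over the resulting distribution
        r = rng.random()
        cumulative = 0.0
        for i, p in enumerate(probs):
            cumulative += p
            if r < cumulative:
                return i
        return len(probs) - 1

    # A toy "animal" vocabulary with invented scores.
    vocab = ["dog", "cat", "horse", "anteater", "whizzledy woo"]
    logits = [4.0, 3.5, 3.0, 1.0, -2.0]

    # At temperature 0.1 the common animals dominate nearly every draw;
    # at temperature 3.0 the made-up creature appears far more often.
    for t in (0.1, 3.0):
        rng = random.Random(42)
        draws = [vocab[sample_with_temperature(logits, t, rng)]
                 for _ in range(1000)]
        print(t, draws.count("dog"), draws.count("whizzledy woo"))
    ```

    Dividing the logits by a small temperature exaggerates the gaps between scores before the softmax, which is why the low-temperature run almost always answers “dog.”
    
    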
    0:05:25 I’m inclined to think that it’s not, I mean, obviously it’s not just randomness
    0:05:30 because we also appreciate order and meaning in creativity.
    0:05:35 And I think, you know, I’ve noticed this sort of folk theory,
    0:05:40 I’ll call it that, of creativity that crops up in a lot of conversations
    0:05:47 about human creativity, which I tend to call like the modular theory of creativity.
    0:05:52 And it’s basically this idea that all works of art can be broken down
    0:05:55 into these little modules or building blocks.
    0:05:59 And that creativity is really just the ability to recombine, you know,
    0:06:04 take two things that have never been put together before and combine them
    0:06:06 in a new way and create something new.
    0:06:11 And, you know, I think that’s like a great explanation of how a lot of gen AI works.
    0:06:17 You know, you can ask a chatbot to write a poem about Elon Musk and the style of Dr. Seuss.
    0:06:21 And yeah, those two things have never been put together before.
    0:06:23 And it seems very creative.
    0:06:28 My intuition, I guess, as a human is that our form of creativity
    0:06:35 is a lot more complex than that, that it really has to do with filtering
    0:06:39 everything you’ve ever experienced as a human artist, right?
    0:06:42 All of your influences, which are unique to each person.
    0:06:46 Everybody has sort of a unique data set that they’re working with.
    0:06:49 And filtering that through your lived experience in the world.
    0:06:54 For me, the things that I appreciate in art have a lot to do with vision,
    0:06:57 with point of view, with the sense that you’re seeing something that’s been,
    0:07:01 you know, filtered through an autobiography, through a life story.
    0:07:07 And I think it’s really difficult to talk about how that’s happening,
    0:07:10 you know, in AI models.
    0:07:15 Yeah, I mean, we have these large language models,
    0:05:20 things like ChatGPT and Midjourney or pick your favorite poison.
    0:07:26 And they produce language, but they do it without anything that I’d call
    0:07:31 consciousness. And consciousness is something that’s notoriously hard
    0:07:38 to define, but let’s just define it as the sensation of being an agent in the world.
    0:07:42 LLMs don’t have that.
    0:07:48 But is there any way in which you could call what they’re doing creative?
    0:07:51 Or do we need some other word for it?
    0:07:57 I think the difficult thing is that, you know, creativity is a concept that is,
    0:08:01 I think, like all human concepts, like intrinsically anthropocentric,
    0:08:07 that we created the term creativity to describe what we do as humans.
    0:08:13 And we have this bad habit as humans of changing the definition of words
    0:08:17 to sort of suit our opinion of ourselves, especially when, you know,
    0:08:24 machine’s turn out to be able to do tasks that we previously thought were limited
    0:08:29 to humans. I’m thinking about, you know, we saw this with chess, for example,
    0:08:34 that was for a long time considered the height of human intelligence was being
    0:08:42 able to play chess. And, you know, the moment that Deep Blue beat the human
    0:08:46 champion of chess, the New York Times interviewed a bunch of philosophers
    0:08:48 and computer scientists who were at that event.
    0:08:52 And I think it was Douglas Hofstadter who said, oh, my God,
    0:08:54 I thought that chess required thought.
    0:08:55 Now I know that it doesn’t.
    0:09:01 And it’s hard not to sense that something similar is happening with creativity,
    0:09:03 that a lot of the elements that we didn’t understand about it.
    0:09:07 I think it was easy to see it as somewhat mystical, you know,
    0:09:11 we talk about inspiration, which has this sort of like almost metaphysical
    0:09:14 or divine undertones to it.
    0:09:19 And now that we see a lot of that work done by automated processes,
    0:09:22 it becomes more difficult to say what creativity really is.
    0:09:26 And I think there’s already an effort, and I sense it myself too,
    0:09:32 this like effort to sort of cordon off this more special island of human
    0:09:36 exceptionalism and say, no, what I’m doing is actually different.
    0:09:41 And for me, consciousness and intent, it’s really hard to talk about those
    0:09:43 things apart from creativity.
    0:09:47 But I also doubt, you know, I think that there’s definitely like a little bit
    0:09:51 of defensiveness on my part in defending those qualities because they are
    0:09:53 something that machines don’t have.
    0:09:56 Well, I think that’s what I like about you.
    0:09:59 That we’re both on team human that way.
    0:10:04 You know, I mean, even it’s such a slippery distinction, computation
    0:10:06 and thought, what’s the difference?
    0:10:09 You know, and you run into the same problem when you’re trying to think
    0:10:13 about art and creativity and what is and isn’t art.
    0:10:17 You made a comment about two modernist writers that you admire,
    0:10:19 James Joyce and Virginia Woolf, right?
    0:10:29 And what made them genuinely creative artists was that they created a form
    0:10:31 of consciousness that felt new.
    0:10:35 And they were able to do that because they were people experiencing a new
    0:10:41 and different world and express what it was like to live in this new world.
    0:10:49 And that’s not something a machine programmed to just recombine everything
    0:10:53 humans have already written can do, right?
    0:10:56 I mean, that seems to be a line here.
    0:11:00 It’s something that current machines can’t do because they’re disembodied.
    0:11:01 Yeah.
    0:11:04 I’m very cautious against saying a machine will never do something,
    0:11:07 especially given all the advances we’ve seen in recent years.
    0:11:12 But certainly, I think the thing that I value and things like Woolf or Joyce
    0:11:15 or, you know, someone like Borges, who’s doing really interesting
    0:11:20 experimental work in the 1950s that felt totally sui generis,
    0:11:25 is that it is something to do with capturing a way of being in the world
    0:11:27 that hasn’t been experienced before, you know?
    0:11:31 And I think it’s easy to see this in modernism and postmodernism
    0:11:34 because history was changing so much during those periods.
    0:11:36 And the human experience was changing, right?
    0:11:41 And you had to have been a person embodied in a political and a cultural context
    0:11:46 and, you know, living that reality in order to capture what that felt like.
    0:11:48 So it’s an interesting thought experiment.
    0:11:53 You know, if we move into sort of embodied cognition, some kind of AI
    0:11:56 that had a body and, you know, was walking around in the world,
    0:12:01 and had sort of complete sensory access to the world in the same way we do.
    0:12:04 Would it be able to capture that also in ways that feel new?
    0:12:05 Possibly.
    0:12:09 But that’s not the models that we have right now.
    0:12:15 You could ask ChatGPT to produce a hundred novels in the style of Hemingway or whatever.
    0:12:17 And I guess it would do it.
    0:12:21 But what creative value does that have?
    0:12:26 Like, certainly the human prompting the AI isn’t an artist.
    0:12:30 But, you know, is the thing ChatGPT spits out a piece of art?
    0:12:32 Or is it something else?
    0:12:38 I think it’s something else, but I also have a hard time explaining why.
    0:12:46 I think that a lot of generative AI operates in this very kind of top-down approach to creativity,
    0:12:50 which is that, you know, you have an idea, you have an inspiration of some kind.
    0:12:55 You have a story you want to tell, and you put all of that into the model as a prompt,
    0:12:59 and it does the grunt work, just basically enacts your will.
    0:13:05 When I think about, like, the truly most creative moments in my experience writing,
    0:13:08 it often happens the other way around from the bottom up.
    0:13:12 Like, I often start writing something and I don’t know anything about what it’s going to be.
    0:13:14 I don’t know the form.
    0:13:15 I don’t know the style.
    0:13:18 I don’t know, you know, what I’m going to argue if it’s an essay.
    0:13:23 I just have, like, a sentence in my head or an image.
    0:13:29 And I start with that, and all of those sort of larger features,
    0:13:31 the things that you’re supposed to put into the prompt,
    0:13:37 kind of grow out of that experience of making really small particular choices,
    0:13:41 almost like those are emergent features of the creative process.
    0:13:44 And I teach writing.
    0:13:49 And the thing I often tell my students is you really have to fight,
    0:13:53 especially during the early stages of a process to not know too much,
    0:13:57 because you’re logical, like the left side of your brain or whatever,
    0:14:03 is always going to be trying to get ahead of the process
    0:14:09 or sort of impose something familiar or something known onto what you’re doing.
    0:14:15 And it’s going to be less interesting than if you work in a more associative way,
    0:14:18 because then your unconscious is entering into the picture.
    0:14:27 So to me, like the idea of using generative AI to enact a concept,
    0:14:33 it’s really almost a backwards way of thinking about how I think about art.
    0:14:34 I think when humans interact with those models,
    0:14:40 you’re dealing with something that’s basically competing with your own unconscious, right?
    0:14:43 Which is where those unexpected connections come from.
    0:14:53 [MUSIC]
    0:14:57 Do you think a machine or an AI could ever really communicate
    0:15:00 in the way we understand that phenomenon?
    0:15:04 I don’t even think a machine can think for the reasons we’ve already explained,
    0:15:07 but they certainly process information.
    0:15:10 But are they capable of communication
    0:15:14 in the way that humans are to each other?
    0:15:18 The thing that I value about human communication,
    0:15:20 and I’ll include art in that.
    0:15:28 I read a lot of memoir and first-person writing because I want access to another mind.
    0:15:35 There’s things that you can say in an essay or a book
    0:15:42 that you can’t say just in normal social conversations just because the form permits you to.
    0:15:47 I love reading because I love seeing the way that other people see the world.
    0:15:52 Obviously, there’s people who are able to do that in a way that’s very artistic,
    0:15:58 that has beautiful syntax and images, and that’s part of communication, obviously.
    0:16:01 But really the most important thing to me that we often take for granted
    0:16:06 is this background knowledge that what I’m reading on the page has come from another mind
    0:16:10 that had the desire to communicate something, right?
    0:16:18 And so, when people ask, “Oh, do you think that an AI could create
    0:16:22 the next best American novel, the great American novel,”
    0:16:27 we’re talking a lot in those hypotheticals about technical skill.
    0:16:34 And to me, I think even if it was on the sentence level or even on the level of concepts and ideas,
    0:16:42 something that we would consider virtuosic, if it became a sort of example of human genius,
    0:16:46 just the fact that it came from a machine I think changes the way that we experience it.
    0:16:50 I think that when I’m reading something online, for example,
    0:16:56 and I start to suspect that it was generated by AI, it changes the way I’m reading,
    0:16:59 there’s always that larger context of how we experience things,
    0:17:02 and intent and consciousness is a big part of it.
    0:17:05 You know, it’s a remarkable thing if you think about just how recently
    0:17:11 we took it for granted that any text that you encounter was composed by a human being, right?
    0:17:15 Even if it was a ghost writer or an administrative assistant or something,
    0:17:20 even if it didn’t come from the person that it purported to come from,
    0:17:22 there was a human consciousness behind it.
    0:17:26 And that has been completely, you know, language has been detached from thought,
    0:17:30 from human thought, just in the past few years.
    0:17:36 And I can’t help but think that’s going to fundamentally change how we think about language
    0:17:39 and how much we value language.
    0:17:40 For better or worse?
    0:17:42 For worse, absolutely.
    0:17:43 Yeah.
    0:17:48 This possibility that the ease with which we can produce language,
    0:17:53 I mean, or we could talk about images too, but language is a little bit more immediate to me,
    0:18:01 that that will actually devalue language the way that a currency becomes devalued through inflation,
    0:18:06 that if we become so used to reading it, our expectations will lower.
    0:18:08 There’s a lot of, I think, hypothetical questions about,
    0:18:14 “Oh, can, you know, will AI ever produce something that’s recognized as great art?”
    0:18:19 My response is always like, it doesn’t have to, to totally upend the industry.
    0:18:23 It’s like, it’s good at producing things that are good enough.
    0:18:23 Yeah.
    0:18:29 And I notice this, especially like when I see visual art that’s been created by AI,
    0:18:33 I’m much more impressed because it’s a field I don’t know as much about.
    0:18:36 And then I’ll talk to friends of mine who are artists, you know, who will say like,
    0:18:38 “Oh, that’s actually, you know, not that impressive to me.
    0:18:41 It seems kind of generic or derivative.”
    0:18:48 And, you know, I think about the type of work that, not even future, but just current,
    0:18:50 AI models are able to produce.
    0:18:55 They could very easily, I think, soon dominate the bestseller list, you know.
    0:18:58 And to some people who are really devoted to the craft of writing, you know,
    0:19:03 it might seem derivative or familiar, but that’s still not going to,
    0:19:07 I mean, it could still change the industry in really profound ways.
    0:19:13 There is something about the intentionality behind artistic creations
    0:19:16 that really matters to us.
    0:19:22 And, you know, it’s not like when I consume a piece of art, I’m asking myself,
    0:19:24 you know, how long did it take to make this?
    0:19:29 But I know subconsciously there was a lot of thought and energy put into it,
    0:19:35 that there was a creator with experiences and feelings that I can relate to
    0:19:38 who’s communicating something in a way
    0:19:43 they couldn’t if they weren’t a fellow human being sharing this common human experience.
    0:19:44 And that matters, you know?
    0:19:52 It’s a feature, not a bug, as our beloved tech bros like to say.
    0:19:58 I think that effort that we have to put into making things is part of what gives it meaning,
    0:20:00 both for the person who’s producing it, right?
    0:20:03 Like the actual sacrifices and the difficulty of making something
    0:20:08 is what makes it feel really satisfying when you finally get it right.
    0:20:10 And it’s also, yeah, for the person experiencing it, right?
    0:20:16 I think about this a lot, even with things that we might not consider, you know, works of genius.
    0:20:21 But things like ways in which everyday people were creative for many years, you know,
    0:20:27 like I think about my grandfather used to write occasional poetry.
    0:20:32 So he would make up very simple, kind of funny poems for different occasions,
    0:20:35 for birthdays or anniversaries that he wrote himself.
    0:20:38 And he didn’t have a college education, but he was creative.
    0:21:41 And the poems are very creative
    0:21:43 in many ways. They had simple rhyme schemes.
    0:20:47 They were personalized for the person or for the occasion.
    0:20:52 And, you know, that was a way for him to express his creativity.
    0:20:57 And that’s precisely the kind of thing that an LLM could do very well, right?
    0:21:02 Write a simple poem, you know, put in the prompt what you sort of want it to be about,
    0:21:04 come up with a rhyme scheme.
    0:21:11 And I think about, like, what is the effect of somebody today listening to something like that,
    0:21:16 you know, sort of personalized poem and not knowing if it was actually created by the person
    0:21:20 or if it was just produced through a prompt.
    0:21:24 I think that really does change how you experience something like that.
    0:21:30 Do you think that maybe AI will make just radically new kinds of art possible?
    0:21:34 Maybe we can’t imagine what that will be.
    0:21:40 But maybe it’ll be awesome and I’ll look dumb in retrospect for saying it would be terrible.
    0:21:49 Yeah, I mean, any of us who are daring to speak about this topic right now really are putting ourselves out there
    0:21:53 for risking looking stupid in two years or five years down the road.
    0:21:59 But I will say it is true that AI is often called an alien form of intelligence
    0:22:03 in the sense that it reasons very differently than we do.
    0:22:09 It doesn’t intuitively understand what’s relevant in a data set the way that we do
    0:22:13 because we’ve evolved together to sort of value the same things.
    0:22:19 So, you know, you see this in something like the famous case of AlphaGo
    0:22:25 where this algorithm once beat the human champion of Go, the Chinese board game,
    0:22:29 by making a move that basically no human would ever make.
    0:22:32 That’s how it was described by a lot of former Go champions,
    0:22:36 that it was a completely unhuman move.
    0:22:42 And I try to think about what that would look like in art, you know,
    0:22:47 because if you think about like if art and creativity is always this sort of tension between
    0:22:52 novelty, something that is new or innovative,
    0:22:57 versus the lineage of a tradition in the form that you’re working in.
    0:23:03 Like, is there a space in which something could be sort of an alien move
    0:23:07 but still strike us as meaningful, I guess, is the question?
    0:23:12 I don’t know. Yeah, I would grant that it’s entirely possible that
    0:23:19 in the future AI will create art that’s maybe more beautiful and profound
    0:23:22 than anything we could create or even imagine.
    0:23:30 But being the product of an alien intelligence,
    0:23:33 what could it possibly mean to us?
    0:23:37 You know, I always think about that line from
    0:23:43 Wittgenstein, the philosopher, he said, if a lion could speak,
    0:23:47 we wouldn’t be able to understand what it says.
    0:23:51 And we wouldn’t be able to understand because we don’t inhabit the world of lions.
    0:23:53 We don’t know what it’s like to be a lion.
    0:23:56 We don’t share a way of life with lions.
    0:23:59 So how could we possibly understand what they’re saying?
    0:24:03 And I think there’s, I think that’s true of this too.
    0:24:11 I really don’t think we fully appreciate how different a truly
    0:24:14 inorganic intelligence must be from us.
    0:24:17 You mentioned the word embodied earlier.
    0:24:23 I mean, our embodiedness is so essential to what and who we are,
    0:24:27 that shared experience, that shared vulnerability.
    0:24:32 It’s the whole basis of mutual understanding and even ethics, really.
    0:24:38 And meaning is this thing we create together as humans sharing a common way of life.
    0:24:42 And I have to believe that we’re going to lose so much of that
    0:24:49 in a world where we’re mostly consuming products created by machines.
    0:24:55 Maybe that as much as anything is what scares the shit out of me.
    0:24:59 Yeah. I mean, I’ve felt it myself.
    0:25:03 I feel lately, and maybe this is just being a writer too,
    0:25:06 but that I live a lot in my head and I live a lot in, you know,
    0:25:08 I think being a writer is in some sense,
    0:25:11 you’re always living in this virtual world of language
    0:25:15 that is sort of adjacent to the real world.
    0:25:18 But I think when you’re also spending 10 hours a day
    0:25:22 in front of a computer screen and interacting, you know,
    0:25:25 in your everyday life and work and everything with other people virtually,
    0:25:29 you definitely become detached in a very strange way
    0:25:31 and a very subtle way from your body.
    0:25:35 And, you know, I think about the role of the body.
    0:25:38 The body is so closely connected to what we call the unconscious.
    0:25:41 And I don’t mean that in any sort of like, you know,
    0:25:45 Jungian psychoanalytic sense, just like the unconscious intelligence,
    0:25:49 all the things that our body does that we don’t pay attention to in any given day.
    0:25:56 And there’s a reason why I think writers often say, oh, I got my best idea when I was out on a walk,
    0:25:59 right, that there’s something that happens when you’re actually interacting
    0:26:02 in the world that makes connections in your mind.
    0:26:04 And I don’t know how that happens,
    0:26:08 but it’s something that I think is important.
    0:26:09 It’s important to creativity.
    0:26:12 Well, there’s also something that happens when you go out in the world
    0:26:15 and interact with other human beings.
    0:26:15 Yes.
    0:26:20 We have all these fantastical sci-fi dystopian scenarios,
    0:26:27 but I tend to think our actual dystopian future will be much sadder and much more boring.
    0:26:32 You know, it’s not terminators fighting humans in the street.
    0:26:37 It’s a world rendered flat and sterile by technology
    0:26:41 where humans have offloaded all the thinking and awkwardness and imperfections
    0:26:49 and sincerity that have made the human experience so messy and awesome.
    0:26:53 Do you remember that controversy over the Google Gemini commercial
    0:26:58 and Gemini is Google’s competitor to OpenAI’s ChatGPT?
    0:27:00 So just so the audience knows what I’m talking about,
    0:27:06 the commercial is it’s a young girl who wants to write a fan letter.
    0:27:08 I’ve always thought she was following in my footsteps.
    0:27:10 Hey, go get her, baby.
    0:27:12 But lately she’s been looking up to someone else.
    0:27:18 To her hero who’s an Olympic gold medalist sprinter or something like that.
    0:27:22 And then her dad says,
    0:27:25 She wants to show Sydney some love and I’m pretty good with words,
    0:27:28 but this has to be just right.
    0:27:32 So Gemini, help my daughter write a letter telling Sydney how inspiring she is.
    0:27:37 And so he’s just going to let the AI write it for them.
    0:27:39 And it’s horrifying.
    0:27:41 People were like, it did not go the way Google thought it would,
    0:27:48 but it’s horrifying to me because it shows that AI isn’t just coming for our art and
    0:27:49 entertainment.
    0:27:52 It’s not just going to be, I don’t know, writing sitcoms or I don’t know,
    0:27:53 maybe doing podcasts.
    0:27:59 It’s going to supplant sincere, authentic human to human communication.
    0:28:02 It’s going to automate our emotional lives.
    0:28:07 And I don’t know what to call that potential world other than a machine world populated
    0:28:12 by machine like people and maybe eventually just machine people.
    0:28:17 And that’s a world I desperately, desperately want to avoid.
    0:28:18 Yeah.
    0:28:20 Gosh, it was.
    0:28:21 Sorry, that was a bit of a rant.
    0:28:21 No, no, no, no.
    0:28:24 I have been thinking about that and it’s funny.
    0:28:27 I haven’t actually seen that, but I’ve read about it in which.
    0:28:29 Oh God, it’s so bad.
    0:28:31 So there’s, it reminded me of a couple things.
    0:28:36 The first is, you know, my, my husband teaches freshman English in college.
    0:28:44 And he once sort of saw one of his students or no, one of his students told him about
    0:28:49 this story about how she was looking over her shoulder and seeing in class one of her friends
    0:28:54 who was chatting with somebody, presumably a potential romantic partner, I guess,
    0:29:03 and was taking his texts and copying them and putting them into ChatGPT and saying respond to
    0:29:08 this and then copying the output and putting it back into the text message.
    0:29:14 And the student who saw this was like, this, you know, person on the other end of the
    0:29:18 line is basically chatting with a chatbot, but they don’t know it; they think they’re
    0:29:19 talking to a human being.
    0:29:25 And yeah, I think about all those ways in which you think, you know, I think that the thing that’s
    0:29:30 really insidious is like, we don’t know if we’re talking to a human or not oftentimes.
    0:29:36 And one of the, I was, you know, for a long time wrote an advice column for Wired Magazine where
    0:29:39 people could write in questions about technology in their everyday life.
    0:29:45 And one of the questions I got very shortly after ChatGPT was released was somebody who
    0:29:48 was going to be the best man in their friend’s wedding.
    0:29:54 And he said, can I use ChatGPT ethically, you know, to do a best man speech for me.
    0:29:59 And, you know, which I like, there’s cases of people doing this, the people who use it to
    0:30:00 write their wedding vows.
    0:30:09 And my response, my first instinct, was like, well, you’re robbing yourself of the ability to
    0:30:15 actually try to put into words what you are feeling for your friend and what that relationship
    0:30:21 means to you. And it’s not as though those feelings just exist in you already.
    0:30:24 You know, I think anyone who’s, who’s written something very personal like this,
    0:30:28 you realize that you actually like start to feel the emotions as you’re putting it into
    0:30:31 language and trying to articulate it.
    0:30:36 And, you know, I think about the same thing with this hypothetical like fan letter that the girl
    0:30:37 is writing in the commercial, right?
    0:30:41 It’s like you’re stealing from your child the opportunity to actually try to
    0:30:44 access her emotions through language.
    0:30:47 To be a human being.
    0:30:47 Yes, yeah.
    0:30:54 I mean, I think one of the most profound things digital tech has done to the human mind is
    0:31:01 it has conditioned us to expect instant gratification and to not tolerate boredom or
    0:31:02 patience.
    0:31:07 And so, you know, you’ll hear some artists making the case that, you know, AI will be this great
    0:31:11 collaborative creative tool for humans.
    0:31:20 But I think it’ll just encourage us to think less, do less, feel less and rely on technology
    0:31:23 to do living for us.
    0:31:26 And again, I can imagine that world, but I don’t want to live in it, you know, and,
    0:31:34 but maybe it’s kind of a troubling thought, but maybe humanity is more pliable than we think.
    0:31:40 I definitely think that humans are more flexible than we think and that it is certain
    0:31:45 that we will evolve alongside this technology and find new forms of expression.
    0:31:48 That doesn’t mean that it’s always for the good.
    0:31:51 There’s a tendency to go back and say like, oh, people said the same thing about photography.
    0:31:54 You know, when that came out that that wasn’t real art.
    0:31:57 People said the same thing about television, you know, if you go back and read like in the,
    0:32:02 you know, 80s and 90s, just all of these sort of writers who are just ranting against how
    0:32:07 television is the end of humanity and is making us passive.
    0:32:10 And nobody likes to be reactionary, obviously.
    0:32:16 But it’s also true that like there was truth to those objections, right?
    0:32:20 Like, has television made my life better overall? I don’t know that it has.
    0:32:26 And so I think that the thing that I worry about is that in the effort to not seem like
    0:32:32 a Luddite or whatever, we’re actually slowly sort of anesthetizing ourselves towards these
    0:32:39 changes that are happening to the human experience and that are happening very quickly in this case.
    0:32:54 This is a big question, but I’m comfortable asking you because of your
    0:33:04 theological background. Do you think we have any real sense of the spiritual impact of AI?
    0:33:11 Are you talking about spiritual in terms of like actual, like the way people practice spiritual
    0:33:14 traditions and religion or just sort of like the human,
    0:33:23 all of it? It’s a paradox in some way, right? Because I think technologies are rooted in a very
    0:33:28 anti-spiritual, in the sense that it’s usually very reductive and materialist,
    0:33:32 understanding of human nature. But with every new technological development,
    0:33:39 I think there’s also been this tendency to sort of spiritualize it or think of it in superstitious
    0:33:44 ways. I think about like the emergence of photography during the Civil War and how people
    0:33:51 believed that you could see dead people in the background or the idea that radio could sort of
    0:33:58 transmit voices from the spiritual world. So I think that it’s not as though technology is
    0:34:05 going to rob us of a spiritual life. I do think that technological progress competes in some ways
    0:34:10 as a form of transcendence with the type of transcendence that spiritual and religious
    0:34:19 traditions talk about in the sense that it is a way to push beyond our current existence
    0:34:24 and to sort of get in touch with something that’s bigger than the human, which I think is a very
    0:34:30 deep human instinct is to try to get in touch with something that’s bigger than us. And I think
    0:34:36 that there’s a trace of that in the effort to build AI, this idea that we’re going to create
    0:34:42 something that is going to be able to see the world from a higher perspective and that’s going
    0:34:50 to be able to sort of give our lives meaning in a new way. And I think that if you look at most
    0:34:58 spiritual traditions and wisdom literature from around the world, it usually involves this
    0:35:01 paradox where like if you want to transcend yourself, you also have to acknowledge your
    0:35:06 limitations. You have to acknowledge that the ego is an illusion. You have to admit that you’re not the
    0:35:13 center. You have to sort of humble yourself in order to access that higher reality. And I think
    0:35:20 technology is a sort of transcendence without the work and the suffering that that entails
    0:35:27 for us in a more spiritual sense. Yeah, I think that’s right. And what I’m always
    0:35:34 thinking about in these sorts of conversations is this long term question of what we are as human
    0:35:41 beings, what we’re doing to ourselves and what we’re evolving into. I mean, Nietzsche loved this
    0:35:48 distinction between being versus becoming. Humanity is not some fixed thing. We’re not a static
    0:35:54 being; like everything in nature, we’re in this process of becoming. So what are we becoming?
    0:36:01 I think as it stands, we’re becoming more like our machines. And I think that’s bad.
    0:36:08 Yeah, at some point, I think, there’s a threshold that’s crossed, right? And where is
    0:36:13 that? If we’re becoming something, we’ve already been becoming something different,
    0:36:19 I think, with the technologies that we’re using right now. And is there some hard line where we’ll
    0:36:26 become like post human or another species? I don’t know. My instinct is to think that
    0:36:34 there’s going to be more pushback against that future as we approach it than it might seem
    0:36:39 right now in the abstract. I think that it’s difficult to articulate exactly what we value
    0:36:46 about the human experience until we are confronted with technologies that are threatening it in some
    0:36:51 way. And I think that some of the really great writing and the conversations that
    0:36:58 are happening right now are about, let’s try to actually put into words what we value about being
    0:37:03 human. And I think there’s a way in which these technologies might actually help clarify
    0:37:08 that conversation in a way that we haven’t been forced to articulate it before. And to think
    0:37:14 about like, what are our values? And how can we create technology that is actually going to serve
    0:37:19 those values as opposed to making us the subjects of what these machines happen to be good at doing?
    0:37:27 Well, this was a pleasure. Yeah, it was great. Meghan O’Gieblyn, thanks so much for doing this.
    0:37:34 And if you are listening and you have not read Meghan’s book, God, Human, Animal, Machine:
    0:37:40 Technology, Metaphor, and the Search for Meaning, don’t be ridiculous. Go buy it and read it.
    0:37:47 It’s great. Meghan, thanks. You’re the best. Thanks so much, Sean.
    0:38:00 All right. I hope you enjoyed this episode. I definitely did. Meghan really is one of my
    0:38:06 favorite writers and thinkers. And honestly, these are just the questions that get me out of bed in
    0:38:12 the morning. This is the kind of stuff I live to talk about. And if you’re listening to the show,
    0:38:19 I guess you do too, or I hope you do. Anyway, I really want to know what you think of the episode.
    0:38:23 Did you like it? Could I have done something better? I don’t know. Just if you have a thought,
    0:38:30 send it to me. You can drop us a line at TheGrayArea@Vox.com. And once you’re finished with that,
    0:38:38 go ahead and rate and review and subscribe to the podcast. This episode was produced by Beth
    0:38:45 Morrissey. Today’s episode was engineered by Erika Huang, fact-checked by Anouk Dusso, edited by Jorge
    0:38:52 Just, and Alex O’Brington wrote our theme music. New episodes of The Gray Area drop on Mondays.
    0:38:59 Listen and subscribe. This show is part of Vox. Support Vox’s journalism by joining our membership
    0:39:06 program today. Go to vox.com/members to sign up. And if you decide to sign up because of this show,
    0:39:17 let us know.

    What is the relationship between creativity and artificial intelligence? Creativity feels innately human, but is it? Can a machine be creative? Are we still being creative if we use machines to assist in our creative output?

    To help answer those questions, Sean speaks with Meghan O’Gieblyn, the author of the book “God, Human, Animal, Machine: Technology, Metaphor, and the Search for Meaning.” She and Sean discuss how the rise of AI is forcing us to reflect on what it means to be a creative being and whether our relationship to the written word has already been changed forever.

    This is the first conversation in our three-part, three-shows-in-three-days series about creativity.

    Host: Sean Illing (@seanilling)

    Guest: Meghan O’Gieblyn (https://www.meghanogieblyn.com/)

    References:

    God, Human, Animal, Machine: Technology, Metaphor, and the Search for Meaning by Meghan O’Gieblyn (Anchor; 2021)

    Being human in the age of AI. The Gray Area. (Vox Media; 2023) https://podcasts.apple.com/us/podcast/being-human-in-the-age-of-ai/id1081584611?i=1000612148857

    Support The Gray Area by becoming a Vox Member: https://www.vox.com/support-now

    Learn more about your ad choices. Visit podcastchoices.com/adchoices

  • Happiness isn’t the goal

    AI transcript
    0:00:03 Support for the gray area comes from Mint Mobile.
    0:00:07 Phones can be expensive, but with Mint Mobile,
    0:00:09 you can save a ton on wireless service.
    0:00:11 You can get three months of service
    0:00:14 for just 15 bucks a month by switching.
    0:00:15 To get this new customer offer
    0:00:18 and your new three-month premium wireless plan
    0:00:19 for just 15 bucks a month,
    0:00:23 you can go to mintmobile.com/grayarea.
    0:00:26 That’s mintmobile.com/grayarea.
    0:00:28 $45 upfront payment required,
    0:00:30 equivalent to $15 per month.
    0:00:33 New customers on first three-month plan only.
    0:00:36 Speed slower above 40 gigabytes on unlimited plan.
    0:00:39 Additional taxes, fees, and restrictions apply.
    0:00:40 See Mint Mobile for details.
    0:00:47 If you could decide whether to be optimistic
    0:00:50 or pessimistic all the time, which would you choose?
    0:00:56 I think most of us would choose to be optimistic.
    0:00:58 I mean, why not?
    0:01:00 Who doesn’t want to feel good about the future?
    0:01:04 But we all know it’s not that easy.
    0:01:07 We can’t always control how we feel.
    0:01:10 If we could, everyone would feel great all the time.
    0:01:16 Still, it’s worth asking why being optimistic
    0:01:18 can be so difficult sometimes,
    0:01:20 especially when there are plenty of reasons to be.
    0:01:25 If you’re like me, it often seems as though
    0:01:27 your own mind is at war with itself.
    0:01:30 Does it have to be that way though?
    0:01:34 Or is it possible that we’re just wired to worry?
    0:01:39 I’m Sean Illing, and this is The Gray Area.
    0:01:41 (upbeat music)
    0:01:54 Today’s guest is Paul Bloom.
    0:01:57 He’s a professor at the University of Toronto
    0:01:59 and the author of several great books,
    0:02:02 including Psych: The Story of the Human Mind,
    0:02:04 and The Sweet Spot: The Pleasures of Suffering
    0:02:05 and the Search for Meaning.
    0:02:10 I’m not dropping any official rankings here,
    0:02:12 but Paul is one of my favorite psychologists
    0:02:14 to read and talk to.
    0:02:17 His books are fun, enlightening,
    0:02:19 and full of practical wisdom.
    0:02:25 So when we decided to do this series on optimism,
    0:02:27 he was one of the first people I thought of.
    0:02:31 I had a great conversation a few weeks ago
    0:02:34 with another psychologist, Jamil Zaki,
    0:02:37 about the temptations of cynicism and how to overcome them.
    0:02:42 The episode is called Why Cynicism is Bad for You.
    0:02:45 I hope you’ll listen to it if you haven’t already,
    0:02:47 because it’s a really good companion
    0:02:49 to today’s conversation with Paul.
    0:02:52 In this one, we zoom out a little further
    0:02:54 to talk about the nature of our minds,
    0:02:57 what he’s come to learn about optimism,
    0:03:00 and what any of it has to do with whether or not
    0:03:01 we’re happy.
    0:03:06 Paul Bloom, welcome to the gray area.
    0:03:08 – Good to talk to you again, Sean.
    0:03:10 – You know, I hesitate to start out this way,
    0:03:12 but I’m a little disappointed in you.
    0:03:13 – Go ahead.
    0:03:16 – You have a lovely newsletter called Small Potatoes.
    0:03:21 You recently had a post about your favorite podcast,
    0:03:23 and we weren’t on it.
    0:03:24 Kater, explain yourself.
    0:03:26 – You know, you write these things
    0:03:28 and you know you’re gonna miss somebody,
    0:03:29 and then I hear from you and I missed you.
    0:03:31 I am sorry.
    0:03:34 The odd thing is, this is one of my favorite podcasts.
    0:03:37 Maybe if you go back to the newsletter
    0:03:38 and you go back to it,
    0:03:40 you will find that there has been a stealth edit.
    0:03:44 – All right, let’s get into this,
    0:03:46 and I’ll start with a hardball.
    0:03:52 Optimism, overrated or underrated?
    0:03:55 – Oh, I like the sort of Tyler Cowen vibe we’ve got here.
    0:04:02 – Overrated, the whole question of optimism, pessimism,
    0:04:04 is kinda stupid.
    0:04:07 If optimism means you think things are gonna be better off
    0:04:08 than they are, and pessimism
    0:04:10 is that you think things are gonna be worse off,
    0:04:13 isn’t the rational thing to be a realist?
    0:04:14 So, overrated.
    0:04:17 I think we should try to see things as they really are.
    0:04:18 And by the same token,
    0:04:20 I don’t think we should be pessimists either.
    0:04:22 We should just try to be accurate.
    0:04:25 – Wait, do you think realism and optimism
    0:04:27 are mutually exclusive?
    0:04:28 Can’t you be both?
    0:04:30 – Well, I guess it depends what you mean by optimism.
    0:04:33 If what you mean by optimism is seeing the world
    0:04:35 in a good way in a positive light,
    0:04:37 in cases where the world actually is good
    0:04:40 and is positive, sure, then I believe in optimism.
    0:04:41 But I always thought it means,
    0:04:44 to some extent, rose-tinted lenses,
    0:04:45 seeing things as a little bit more positive,
    0:04:48 trying to see the bright side of things.
    0:04:51 And we should try to see things as they are.
    0:04:55 – Do you think of optimism as an attitude
    0:05:00 or an orientation or something closer to a life strategy?
    0:05:02 – Yeah, it’s a good question.
    0:05:05 I mean, what I’m talking about now
    0:05:07 is if it’s a way of assessing the odds.
    0:05:08 Do you assess them as bright?
    0:05:11 In cases of uncertainty, and it’s always uncertain,
    0:05:12 do you guess things are gonna be good
    0:05:13 or things are gonna be bad?
    0:05:16 And I’m saying, we should just try for accuracy.
    0:05:19 There is a sort of attitude issue
    0:05:21 where optimism could be defended,
    0:05:23 where you get the odds right.
    0:05:25 But optimism says, hey, let’s give it a shot.
    0:05:27 Let’s not weigh the negatives.
    0:05:30 Let’s not be so loss averse.
    0:05:32 Let’s try to focus, let’s try to take a shot.
    0:05:34 And so not worry too much about failure.
    0:05:38 It can be under some circumstances, rational.
    0:05:43 – I think about something like religious faith.
    0:05:46 And we have pretty good evidence
    0:05:50 that religious people are actually happier.
    0:05:51 Now, there may be lots of reasons for that.
    0:05:53 That’s a separate conversation,
    0:05:55 but it seems to be a pretty consistent finding.
    0:05:58 So there is this tangible benefit to faith,
    0:06:01 completely independent of whether it’s true or not.
    0:06:04 And maybe optimism is kind of like that.
    0:06:07 – It’s interesting, it’s an interesting analogy.
    0:06:09 And so you might challenge what I said before
    0:06:12 and say, oh, wait, under circumstances in life
    0:06:16 where there is a payoff to being wrong
    0:06:21 in a certain systematic way to overestimate your chances.
    0:06:22 Suppose I’m back to my teenage years
    0:06:25 and I’m trying to approach women, go ask them out on dates
    0:06:27 and I have a realistic assessment.
    0:06:29 Honestly, the odds are not good for me,
    0:06:31 but I inflate the odds.
    0:06:33 And because I inflate the odds, it motivates me
    0:06:36 to approach people and to talk to them and so on.
    0:06:38 And I develop a relationship.
    0:06:40 Maybe nobody would open up a business or a restaurant
    0:06:42 or try for an academic job
    0:06:45 if they had a realistic assessment of the odds.
    0:06:48 So yeah, I could see it playing some role.
    0:06:49 At the same time though, for each example
    0:06:52 I’m giving of this sort, you could come back
    0:06:55 with an example of how an over-optimistic perspective
    0:06:57 could lead you into all sorts of trouble.
    0:06:58 – Yeah, that’s fair.
    0:07:02 I’m just an N of one, but I have to say in my experience
    0:07:05 I have not found that there’s a positive relationship
    0:07:07 between being realistic and being happy.
    0:07:10 So I don’t know, make of that what you will.
    0:07:11 – I think if you looked at your life,
    0:07:14 you would find that getting things right
    0:07:17 often leads to happiness or leads to more happiness
    0:07:19 than alternative.
    0:07:20 – Yeah, I don’t know.
    0:07:23 I found that I’m often right when I least want to be.
    0:07:23 (laughs)
    0:07:26 But maybe that’s just me, that could just be me.
    0:07:27 – Fair enough.
    0:07:33 – It’s interesting when you think about happiness
    0:07:36 which is obviously related to optimism in some ways.
    0:07:40 And I know there’s research showing
    0:07:43 that happiness over the course of someone’s life
    0:07:45 takes the form of a U-shaped curve
    0:07:48 where you’re most happy at the beginning
    0:07:49 and the end of life.
    0:07:51 And then there’s this big dip in the middle
    0:07:54 which we call the midlife crisis.
    0:07:58 Is there a similar finding on optimism and pessimism?
    0:08:00 Do we tend to get more or less optimistic
    0:08:02 or pessimistic as we age?
    0:08:04 Or do we just have no idea?
    0:08:05 – I have no idea.
    0:08:07 Maybe there’s some people who know.
    0:08:09 I mean, the U-shaped curve is interesting.
    0:08:10 When it was originally discovered,
    0:08:12 people said, well, that’s United States or our culture,
    0:08:16 but it seems to replicate across all sorts of cultures
    0:08:18 and all sorts of times.
    0:08:19 And it’s actually really surprising.
    0:08:21 You would expect to be happiest when you’re young
    0:08:23 and then there’s all sorts of decline,
    0:08:25 physical decline, cognitive decline,
    0:08:28 even financial decline should bring you down.
    0:08:30 But weirdly, when people hit their mid-50s,
    0:08:32 there’s often this curve upwards.
    0:08:34 And maybe this connects to optimism,
    0:08:38 but one analysis of this is that your priorities change.
    0:08:40 You’re no longer fully in the status game.
    0:08:42 It’s this sort of zero-sum battle
    0:08:46 for mating opportunities and money and power.
    0:08:46 And you step back more towards
    0:08:51 what I think David Brooks calls the eulogy virtues,
    0:08:54 where you say, I’ll have good relationships with people,
    0:08:56 I’ll develop fulfilling hobbies and so on.
    0:08:58 And maybe you’re more optimistic
    0:09:00 because then your goals are more realistic.
    0:09:02 If my goal is to have my next book be a number one
    0:09:03 New York Times bestseller,
    0:09:05 well, it’s nice to be optimistic,
    0:09:08 but there’s a bit of frustration probably built into this.
    0:09:10 On the other hand, if my goal is to spend some nice time
    0:09:13 with my wife doing crossword puzzles and talking,
    0:09:16 well, you know, things get calibrated properly.
    0:09:19 – Maybe the best case for being optimistic
    0:09:23 is that it’s socially desirable that people like to be around
    0:09:25 good vibes and positive attitudes.
    0:09:28 And obviously the reverse is just as true.
    0:09:30 People don’t like– – I think that’s definitely true.
    0:09:34 There’s what psychologists call the Lake Wobegon effect.
    0:09:35 – What is that?
    0:09:36 – It’s from Garrison Keillor.
    0:09:37 I’m gonna mangle the line.
    0:09:39 But Garrison Keillor talks about Lake Wobegon
    0:09:42 and says something where all the boys and girls
    0:09:43 are above average.
    0:09:46 And it’s kind of just called the above-average effect.
    0:09:49 So you ask people, how good a driver are you?
    0:09:52 You ask people how good they are as lovers, as friends.
    0:09:54 How funny are they?
    0:09:54 How good a student are they?
    0:09:56 How good a professor are they?
    0:09:57 And the main finding is,
    0:10:00 just about everybody thinks they’re above average.
    0:10:03 And there’s all different theories of why that happens.
    0:10:05 One connects to something you said before,
    0:10:07 which is kind of feedback.
    0:10:10 So if I give a talk and most people hate it,
    0:10:13 but some people come up to me and say, “Hey man, good talk.”
    0:10:15 And I say, “I gave a good talk.”
    0:10:20 Because in a polite world, the feedback is often positive.
    0:10:21 But anyway, there does seem to be
    0:10:24 this general kind of rosy glow effect.
    0:10:28 – Even these terms, optimism and pessimism,
    0:10:31 they’re very fuzzy categories.
    0:10:34 Do you think they’re useful as a psychologist?
    0:10:36 – I mean, just talking you and me right now,
    0:10:37 I think that they’re too fuzzy to be useful.
    0:10:40 Plainly, we came in with somewhat different ideas
    0:10:41 of what optimism is.
    0:10:44 You saw it more as an attitude towards life,
    0:10:45 to motivation.
    0:10:47 I was thinking it was a way to assess the situation.
    0:10:51 So I think we need, as so often with psychology,
    0:10:53 we need to kind of be clear what we talk about
    0:10:54 and use terms properly.
    0:10:58 I think optimism folds in too many things
    0:11:00 to be a useful term.
    0:11:04 – Well, you famously made the case against empathy.
    0:11:05 – Yeah, and I got in so much trouble there
    0:11:09 because people said, well, empathy just means goodness.
    0:11:11 How come you’re against goodness?
    0:11:13 And I was using the term in a different way,
    0:11:14 but yeah, exactly.
    0:11:16 And I try to be careful what I mean, but…
    0:11:19 – Yeah, I mean, I ask in part, ’cause I wonder
    0:11:22 if you would also make a case for pessimism.
    0:11:24 I mean, let me ask that differently.
    0:11:25 – You sound like my agent.
    0:11:29 – Yeah, we’re trying to move merch here.
    0:11:32 I think what I’m really asking is,
    0:11:37 do you take pessimism seriously as a philosophical position?
    0:11:41 Or do you think it’s just a mistake?
    0:11:45 – I think it’s a mistake.
    0:11:48 It involves seeing the world differently than it is.
    0:11:52 And I think it’s a mistake we’re sometimes vulnerable to.
    0:11:54 So I’m very persuaded by Stephen Pinker’s claims
    0:11:57 that the world is in many ways getting better.
    0:12:01 And Pinker points out that it’s a very unnatural belief
    0:12:02 that the world is getting better.
    0:12:05 And this is in part because we have a negativity bias
    0:12:06 when we see the world.
    0:12:10 I saw the Trump-Harris debate,
    0:12:12 and there’s a lot to be said about that.
    0:12:17 But Trump’s view of the world is so unremittingly negative
    0:12:20 about how terrible it is.
    0:12:22 The country’s falling apart, we’re being laughed at.
    0:12:25 We’re not gonna be around in a few years and so on.
    0:12:29 And I think this resonates with a lot of people.
    0:12:30 Not for a sinister reason.
    0:12:33 They just see, this is accurate.
    0:12:34 – But what’s the appeal of that?
    0:12:35 What itch is it satisfying?
    0:12:39 – There’s different ways of thinking about it.
    0:12:41 One way is it doesn’t satisfy an itch at all.
    0:12:43 It’s just because of the way we absorb information.
    0:12:46 So I hear about a terrible murder in Toronto,
    0:12:48 and I say, oh my God, the city’s full of murders.
    0:12:50 But I don’t hear about all the people who aren’t murdered.
    0:12:52 So there’s this negativity bias; the news
    0:12:54 just gives me this wrong impression
    0:12:56 even if it doesn’t fulfill a purpose.
    0:12:59 But I actually think that to some extent it does
    0:13:02 fulfill a purpose, at least for some people.
    0:13:06 I think among other things, there’s kind of,
    0:13:07 I’m gonna try this out.
    0:13:09 I’m sort of thinking of, but I’ve always felt this,
    0:13:13 which is some people are excited by the idea
    0:13:16 that things are horrible, things are chaotic.
    0:13:18 These are the end of days.
    0:13:21 It makes you feel important that you’re in the middle
    0:13:24 of this enormous historic decline
    0:13:29 that we have about five years before AI makes us slaves.
    0:13:34 Before climate change turns us into a hellhole,
    0:13:37 before a fascist like Trump or a socialist like Harris
    0:13:39 turn us into a third world country.
    0:13:41 And there’s some excitement to that.
    0:13:44 Can you feel that in your soul?
    0:13:47 You hear this, do you feel a little shiver of, huh?
    0:13:49 – I don’t know if I feel a shiver,
    0:13:51 but it does seem right.
    0:13:55 And sometimes it reminds me a little bit of the way
    0:13:57 I think about conspiracy theories sometimes.
    0:14:01 I always wonder, what is the appeal of that?
    0:14:04 What is it doing psychologically for people?
    0:14:09 And in some sense, for me, the answer is,
    0:14:13 well, you might look at the world and think it’s broken
    0:14:17 or unfair or inexplicable.
    0:14:19 And there’s something empowering about having
    0:14:22 an explanation for that that justifies your contempt
    0:14:24 for a world that you feel divorced from
    0:14:26 in some fundamental way.
    0:14:29 I don’t know if pessimism or the negativity bias
    0:14:31 is operating in a similar way.
    0:14:32 – That’s interesting.
    0:14:35 The alternative to the idea that the world is run
    0:14:38 by conspiracies is that we live in an uncaring world
    0:14:40 where people are just serving their own interests
    0:14:42 and the interests of those they love and so on.
    0:14:44 And maybe we’re feeling screwed,
    0:14:45 but nobody’s trying to screw us.
    0:14:48 Just things are just grinding away.
    0:14:52 The conspiracy theory says that in some way
    0:14:54 there’s a structure to the world.
    0:14:56 It almost connects to religion.
    0:14:58 It’s not, these things are not accidents.
    0:15:02 There are deeper interests and deeper desires going on.
    0:15:05 And maybe even if we think the conspiracies are evil,
    0:15:07 there’s a comfort to that.
    0:15:10 – Yeah, I think shit happens.
    0:15:12 It’s just not a satisfactory explanation for people.
    0:15:15 So they need a story with good guys and bad guys
    0:15:18 and a beginning and an end and it all sort of hangs together.
    0:15:20 – I did research a while ago
    0:15:23 with a brilliant graduate student, Konika Banerjee,
    0:15:26 on the notion that everything happens for a reason.
    0:15:28 It’s a slogan I hate with all my heart.
    0:15:32 We assessed people’s beliefs.
    0:15:33 We asked them, for instance, about their beliefs
    0:15:34 about everyday life.
    0:15:37 We asked them about special events, good events,
    0:15:38 like birth of a child, bad events,
    0:15:39 like death of a loved one.
    0:15:42 And we asked them, did they believe things happen
    0:15:43 for a reason?
    0:15:46 And we found that religious people believe it very much.
    0:15:49 But even atheists who said there’s no such thing
    0:15:52 as God would say, yeah, but things happen for a reason.
    0:15:56 There’s karma, there’s structure, there’s justice.
    0:15:59 And a conspiracy theory may not be the kind of reason
    0:16:00 you want, the kind that makes you happy.
    0:16:01 It’s not people improving your life,
    0:16:03 but it’s still a reason.
    0:16:07 – Yeah, I think for a lot of people,
    0:16:09 the only thing that’s truly intolerable
    0:16:11 is not having a reason.
    0:16:13 – That’s right.
    0:16:14 And I don’t know about you,
    0:16:18 my own metaphysical view is that that sad truth is correct.
    0:16:21 I’m a realist about morality, I think there’s right and wrong.
    0:16:23 I think that there’s meaning to be had in life.
    0:16:26 But the universe itself is a cold and uncaring place.
    0:16:29 And so is American politics.
    0:16:32 There’s not this deep state orchestrating it all.
    0:16:36 There’s not this, it’s just a whole lot of shit happens.
    0:16:39 – I’m realizing that maybe shit happens
    0:16:40 is the closest thing we have
    0:16:44 to a grand unified theory of the world.
    0:16:48 – It’s a great truth and it’s a very difficult truth.
    0:16:52 And maybe one of the reasons why religion is reassuring
    0:16:54 and conspiracy theories are reassuring.
    0:16:57 And even a sort of pessimism could be reassuring
    0:17:01 is it says there’s design here, there’s structure here.
    0:17:03 You know, my life might not be very interesting,
    0:17:05 but at least I’m in the end of days.
    0:17:07 That’s something exciting.
    0:17:09 (upbeat music)
    0:17:12 (upbeat music)
    0:17:24 Support for the gray area comes from Mint Mobile.
    0:17:26 Phone companies are really good at squeezing
    0:17:28 a little more out of you than you signed up for.
    0:17:31 Mint Mobile is doing things differently.
    0:17:33 Their premium wireless plans are actually affordable
    0:17:36 with no hidden fees or any of that nonsense.
    0:17:38 Right now, when you switch to Mint Mobile,
    0:17:41 you can get three months of service for just 15 bucks a month.
    0:17:43 All of their plans come with high speed 5G data
    0:17:45 and unlimited talk and text.
    0:17:47 Plus, you don’t need to worry
    0:17:49 about getting a new device or phone number.
    0:17:52 Just bring those with you over to your new Mint Mobile plan.
    0:17:54 To get this new customer offer
    0:17:56 and your new three month premium wireless plan
    0:17:57 for just 15 bucks a month,
    0:18:00 you can go to mintmobile.com/grayarea.
    0:18:03 That’s mintmobile.com/grayarea.
    0:18:06 You can cut your wireless bill to 15 bucks a month
    0:18:09 at mintmobile.com/grayarea.
    0:18:11 $45 upfront payment required,
    0:18:13 equivalent to $15 per month.
    0:18:16 New customers on first three month plan only.
    0:18:19 Speed slower above 40 gigabytes on unlimited plan,
    0:18:21 additional taxes, fees, and restrictions apply.
    0:18:23 See Mint Mobile for details.
    0:18:29 (gentle music)
    0:18:32 (gentle music)
    0:18:40 – You know, I’ll ask,
    0:18:42 and I should say I’m just gonna assume
    0:18:44 that this is not a question
    0:18:48 for which there is a definitive scientific answer.
    0:18:51 It’s maybe speculation more than anything else.
    0:18:53 Do you think people are just
    0:18:58 constitutionally wired to be one or the other?
    0:19:00 Or that maybe it’s more complicated,
    0:19:02 environmental factors, and all this other stuff.
    0:19:03 But I mean, I just,
    0:19:05 I wonder if you think people are born
    0:19:07 more optimistic or more pessimistic,
    0:19:10 and maybe by extension,
    0:19:13 is it something we can change if we want?
    0:19:15 – So whenever somebody asks you,
    0:19:15 you know, there’s two alternatives.
    0:19:18 One is, is it really simple and kind of dumb,
    0:19:21 or is it more complicated and subtle and rich?
    0:19:23 I kind of see which answer you’re kind of pointing me towards.
    0:19:27 And I will actually give you the predicted answer.
    0:19:28 – I’m not pointing you anywhere, sir.
    0:19:29 – No, no, no, no, no, no, no, no.
    0:19:30 – I’m just asking questions.
    0:19:32 – You are just, you are leading me.
    0:19:35 You are leading me to say the truth,
    0:19:37 which is I would imagine one’s optimism
    0:19:41 and pessimism has to do a lot with all sorts of things.
    0:19:43 It has to do with the culture that they’re in,
    0:19:46 which tells you to some extent how to see the world.
    0:19:51 It has to do with their individual life experiences.
    0:19:53 If you live the life of extraordinary good fortune
    0:19:55 at every spot, you know, of course you can be optimistic,
    0:19:58 it just fits the data.
    0:20:00 And many people live terrible lives,
    0:20:02 and I’m sure they’re pessimistic because they’re not dumb.
    0:20:04 They say up to now it sucked.
    0:20:05 Why not?
    0:20:06 Why not assume that induction is right
    0:20:08 and it’ll suck in the future?
    0:20:10 I will, however, say there’s probably also
    0:20:12 just some heritable part of it.
    0:20:16 Just because, you know, the first law of behavioral genetics,
    0:20:17 which admits of no exception,
    0:20:21 is that every psychological trait is somewhat heritable.
    0:20:24 Meaning that if I took your biological mother
    0:20:28 and biological father, and then I asked them each,
    0:20:30 are you an optimist or pessimist?
    0:20:31 Are you a cynic?
    0:20:32 How do you think?
    0:20:33 You have a lot of hope.
    0:20:36 And then I asked you, your answers would correlate
    0:20:37 even if you had never met them.
    0:20:41 – This to me leads to something else
    0:20:44 I wanted to talk to you about, which is children.
    0:20:44 – Yeah.
    0:20:45 – And for people who don’t know,
    0:20:49 I mean, you’ve done a lot of work on child psychology,
    0:20:52 and that intersects with this conversation
    0:20:55 in interesting ways for me.
    0:20:58 Let me start this way.
    0:21:03 Do you think children are capable of being optimistic
    0:21:04 or pessimistic?
    0:21:07 – Yeah, I think they are.
    0:21:10 I mean, there are children and there are children.
    0:21:12 So I’m not sure you could sensibly ask this
    0:21:14 about a 10 month old.
    0:21:15 There’s a certain point where children
    0:21:18 probably don’t in some way reflect upon the future.
    0:21:20 And so the question doesn’t make much sense.
    0:21:22 But you talk about, you know, four and five year olds,
    0:21:25 three, four and five year olds running around,
    0:21:27 I think they can have attitudes about the world
    0:21:30 that we could talk about as pessimism and optimism,
    0:21:33 even cynicism, the way we’re talking about it.
    0:21:36 And the answer, and there’s a fair amount of research
    0:21:40 on this, seems to be that they’re really optimistic.
    0:21:41 They’re really positive.
    0:21:45 They are positive about their own abilities,
    0:21:47 tending to overstate what they can do,
    0:21:48 what their futures will be like.
    0:21:50 They’re optimistic about the abilities
    0:21:53 of the people they know and they care about.
    0:21:55 Kristi Lockhart, who’s my colleague at Yale,
    0:21:57 did a study where she asked kids what would happen.
    0:22:00 And these are kids who, like four year olds, whatever,
    0:22:02 what happens when someone gets their finger chopped off?
    0:22:04 What would he be like as an adult?
    0:22:06 And they often say, “It’ll grow back.”
    0:22:08 They have this weird folk view that limbs grow back,
    0:22:10 that we recover, we heal.
    0:22:13 So kids are pretty hardcore optimists.
    0:22:15 There’s such a thing as childhood depression,
    0:22:17 but it’s rare, it’s very rare,
    0:22:18 and doesn’t happen quite young.
    0:22:22 Kids tend to be, I think, somewhat naturally cheerful.
    0:22:27 – Yeah, I mean, I can’t imagine a seven year old nihilist.
    0:22:31 It doesn’t even make sense, right?
    0:22:33 Now, is that just a failure of my imagination?
    0:22:37 And if I’m right, if a six or seven year old nihilist
    0:22:39 is just not a thing that happens in the world,
    0:22:41 there must be some reason for that, right?
    0:22:43 They’re psychologically incapable of it.
    0:22:46 And if they are, what’s that reason?
    0:22:47 – That’s a good question.
    0:22:51 I also can’t think of a seven year old moral skeptic
    0:22:54 or a seven year old skeptic in general.
    0:22:58 So, you know, adult philosophers and adult non-philosophers
    0:23:00 often come to certain conclusions about the world.
    0:23:03 They might think, “Oh my God, there’s no morality,
    0:23:06 “there’s no meaning, there’s no purpose.
    0:23:08 “Everyone else is just a zombie.”
    0:23:11 And all these views adults come to have.
    0:23:12 Putting aside whether they’re true,
    0:23:16 and I think none of them are true, they’re not natural.
    0:23:20 It’s very natural to see the world imbued with meaning,
    0:23:24 with morality, with hope, I’d say,
    0:23:26 with promise of different sorts.
    0:23:29 And I think kids see the world that way.
    0:23:31 And it’s only as a result of education and contemplation,
    0:23:33 when you can kind of turn things around,
    0:23:37 that you develop these sorts of unusual philosophical views.
    0:23:39 – And look, seven year olds
    0:23:42 may not be virtue ethicists or whatever,
    0:23:45 but they do have very strong moral intuitions, right?
    0:23:47 – Absolutely.
    0:23:51 And I think they’re properly seen as moral realists.
    0:23:54 If they’re really mad because they were treated unfair,
    0:23:57 and you say, “Yeah, but if you were in a different time
    0:23:58 “and place, this wouldn’t be unfair.”
    0:24:00 they’ll look at you like you’re an idiot.
    0:24:02 This is unfair, period.
    0:24:04 It’s unfair like grass is green,
    0:24:07 like it’s warm outside, it’s a fact.
    0:24:10 And I think the kids get it right, actually.
    0:24:13 I think some sort of moral realism is right.
    0:24:17 But we have these evolved systems that give rise
    0:24:20 to feelings, gut feelings of right and wrong and so on.
    0:24:24 And they’re fully engaged by the time the kid is about seven.
    0:24:27 And then it’s only as adults, we could say,
    0:24:31 “Oh, you know, murder is wrong in our culture.”
    0:24:33 And then the adults come to some sort of crazy,
    0:24:35 more relativist view.
    0:24:36 But whatever you think about that,
    0:24:38 it’s not something kids will do.
    0:24:40 – Yeah, I learned about the default mode network
    0:24:44 when I was reporting on psychedelic therapy.
    0:24:46 And for people who don’t know,
    0:24:50 this is basically the part of the brain that comes online
    0:24:52 when you’re thinking about yourself.
    0:24:55 And this part of the brain isn’t fully developed
    0:24:58 until at some point later in childhood.
    0:25:02 And so my thinking was that in order to be optimistic
    0:25:06 or pessimistic, you have to cross some threshold
    0:25:09 of self-consciousness to the point where you’re able,
    0:25:11 where there is a voice in your head
    0:25:13 telling yourself stories about yourself
    0:25:16 and the future and the past and that sort of thing.
    0:25:20 And that until you make that transition, categories
    0:25:24 like optimism or pessimism or nihilism or whatever,
    0:25:27 don’t really make much sense.
    0:25:28 But you’re the psychologist.
    0:25:29 – No, I think that’s right.
    0:25:32 Although I do think that at the ages
    0:25:35 we’re talking about children have a notion of themselves.
    0:25:37 They know they’ll persist over time.
    0:25:39 They know they have a history.
    0:25:40 On the other hand,
    0:25:42 there’s some really nice psychological research
    0:25:45 suggesting really interesting differences
    0:25:46 between kids and adults.
    0:25:48 And some of this work is done by my wife,
    0:25:50 Christina Starmans, who’s a psychologist
    0:25:52 who also studies kids.
    0:25:54 So here’s some findings from her lab
    0:25:55 and other labs have it too.
    0:25:58 You ask kids, a five-year-old, six-year-old,
    0:26:01 when you grow up, are you gonna be the same size
    0:26:03 as you are or are you gonna be taller?
    0:26:04 They say, I’ll be taller.
    0:26:06 And you ask a bunch of questions
    0:26:08 and they understand their body will change,
    0:26:10 and other people’s bodies will change.
    0:26:13 But then you ask about their psychological changes.
    0:26:16 So you say stuff like, do you like to drink coffee?
    0:26:17 Oh, coffee’s gross.
    0:26:19 When you’re an adult, will you like to drink coffee?
    0:26:20 And they’ll say, no, it’s gross.
    0:26:23 Would you like to kiss people?
    0:26:25 Oh, it’s disgusting.
    0:26:27 When you’re an adult, will you like to kiss people?
    0:26:28 Now, if you ask them these questions
    0:26:30 about somebody else, one of their friends, they say, yeah,
    0:26:32 when he gets older, he’ll like to kiss people
    0:26:33 and drink coffee.
    0:26:37 But for themselves, they can’t get around the fact
    0:26:39 that coffee is gross and kissing is gross
    0:26:43 and playing with Legos wonderfully fun and so on.
    0:26:45 And they think they’ll stay this way forever.
    0:26:48 – Yeah, that’s interesting.
    0:26:49 You’re a parent, right?
    0:26:53 – Yeah, two kids, both adults now.
    0:26:54 – How old are they?
    0:26:55 – They’re 26 and 28.
    0:27:00 – So has observing them grow up
    0:27:08 changed how you think about optimism and human happiness
    0:27:11 and what it means and how to do it
    0:27:14 and what it doesn’t mean?
    0:27:16 – Yeah, probably not.
    0:27:18 It’s just alarming how little I’ve learned
    0:27:19 from watching them.
    0:27:20 – Really?
    0:27:21 Not even a little bit?
    0:27:23 – You know, you get anecdotes in some of my books.
    0:27:26 Oh, this kid, he did this, he did that.
    0:27:28 I gotta say, this is something I’ll say about my kids.
    0:27:30 And I don’t want to talk too much
    0:27:34 about them without their prior consent, but something they taught me,
    0:27:36 and it didn’t have to be my kids to teach me this,
    0:27:36 it could have been somebody else,
    0:27:38 but I know them extremely well.
    0:27:40 Is that they’re two very different guys.
    0:27:43 They’re very different styles of interacting with the world.
    0:27:44 They have different homies.
    0:27:45 They have different likes and dislikes.
    0:27:47 And they are both extremely happy.
    0:27:49 They’re both living very fulfilling lives.
    0:27:52 And it reminded me that there’s more
    0:27:55 than one way to do this, right?
    0:27:56 And that was useful to know.
    0:27:59 I’ll also say, and this is something in your distant future,
    0:28:02 there’s a real joy in having adult children.
    0:28:04 It’s something that’s so strange.
    0:28:07 There’s this kid, you changed their diapers,
    0:28:08 you soothed them when they were crying.
    0:28:09 You rocked them to sleep.
    0:28:11 And there they are and they’re this big,
    0:28:14 big hairy guy who could easily take you in a fight.
    0:28:15 And you’ve got to treat them nice
    0:28:17 because they’re their own people now
    0:28:18 and they don’t have to listen to you.
    0:28:21 But there’s a huge amount of delight in that.
    0:28:26 – For me, one of the really great privileges
    0:28:31 of being a parent is watching this little person
    0:28:36 move through the world with fresh eyes and a fresh mind
    0:28:39 and I’ve come to really appreciate
    0:28:42 that instinctive wisdom that children have.
    0:28:47 It’s a kind of wisdom that we lose when we become adults,
    0:28:52 when we fall into habits and routines
    0:28:56 and that sense of wonder disappears.
    0:28:57 – Oh, I like that.
    0:29:01 I’ve never been so into the sort of wisdom of kids stuff.
    0:29:04 I think kids are just often as ignorant as they seem
    0:29:07 and often prejudiced and simple and unfair.
    0:29:09 But the one edge kids have over us,
    0:29:13 you perfectly summarized, which is they see the world fresh.
    0:29:17 It also, by the way, since you ping me on empathy a bit,
    0:29:18 I’ll say something in favor of empathy.
    0:29:20 So my objections to empathy have to do
    0:29:22 with its role in moral decisions.
    0:29:23 But empathy is often great.
    0:29:26 And Adam Smith gives a good example of this.
    0:29:29 Taking empathy as putting yourself in another person’s shoes
    0:29:30 and feeling what they feel.
    0:29:34 One of the great joys of having kids
    0:29:36 is that it allows you sometimes,
    0:29:38 it sounds like you’ve been doing this,
    0:29:41 to see the world fresh, to see the world realized,
    0:29:44 to see fireworks, which you’ve seen a million times before,
    0:29:46 for the first time over again.
    0:29:50 To taste ice cream for the first time through them.
    0:29:52 And that’s that rocks, that’s terrific.
    0:29:57 – Yeah, I mean, I was gonna ask you
    0:29:59 what you think we have to learn from children
    0:30:03 about how to be happy, but maybe that’s the answer.
    0:30:05 – Sort of beginner’s mind stuff.
    0:30:07 – Yeah, yeah.
    0:30:11 – Yeah, it’s something we get angry at them for.
    0:30:14 But I think there’s a reason why
    0:30:16 we don’t walk around with beginner’s mind,
    0:30:18 because you don’t wanna be sitting down
    0:30:21 drinking a cup of coffee and staring at the coffee and smelling it,
    0:30:23 oh my God, I’m gonna taste it all anew.
    0:30:26 We’d never do stuff.
    0:30:29 There’s a logic behind the fact that we automate.
    0:30:31 We become used to things, we habituate.
    0:30:33 Because you don’t wanna drive home from work
    0:30:36 as if you were the first time in a car again.
    0:30:39 You wanna be able to do things automatically.
    0:30:41 And the cost of that is you give up
    0:30:42 this beautiful beginner’s mind.
    0:30:45 But I don’t think we should or could retain it.
    0:30:49 – Yeah, I can see that it’s not entirely practical
    0:30:52 to be that immersed in the present
    0:30:56 that it’s definitely a luxury of being cared for
    0:30:59 and not having responsibilities or a job
    0:31:01 or a mortgage, that kind of thing.
    0:31:06 But it sounds like you think I’m being too romantic
    0:31:07 about kids.
    0:31:09 I mean, look, they do shit their pants.
    0:31:11 – You wanna go back to shitting your pants, Sean.
    0:31:14 – No, I do not wanna go back to shitting my pants,
    0:31:17 but I would love to have more.
    0:31:20 I don’t know what the right balance is,
    0:31:22 but I would love to have more of that beginner’s mind.
    0:31:24 I mean, look, I’ve stared at a pine cone
    0:31:25 for three hours before,
    0:31:27 but I took a lot of mushrooms to do that.
    0:31:29 I can’t, I couldn’t.
    0:31:31 And again, that’s not practical either,
    0:31:36 but that sense of awe at something so simple like that
    0:31:37 is possible when you’re–
    0:31:41 – No, I get that, I agree with that.
    0:31:42 And it’s actually a part of my life,
    0:31:44 which I don’t think I have anywhere near enough of,
    0:31:45 I mean, after this conversation,
    0:31:47 I’m gonna go find some mushrooms.
    0:31:50 But I think that in everyday life,
    0:31:54 you give up too much to always find yourself
    0:31:56 in beginner’s mind.
    0:31:59 But I do feel that loss.
    0:32:03 – I guess I’ve sort of come to think of self-consciousness
    0:32:05 as a bit of a paradox.
    0:32:07 I mean, on the one hand, it is a gift in lots of ways,
    0:32:12 but it does seem also to be a machine for unhappiness,
    0:32:15 all that ruminating and self-reflection
    0:32:19 and the anxieties and the neuroses and the pathologies.
    0:32:21 I mean, I don’t know,
    0:32:23 you think I’m throwing too much shade
    0:32:26 on self-consciousness here?
    0:32:26 I mean–
    0:32:29 – No, I think that there are tragic dilemmas.
    0:32:32 To be an intelligent, conscious, self-conscious,
    0:32:36 capable adult involves finding yourself
    0:32:39 into a situation where there’s a lot of suffering,
    0:32:41 a lot of doubt, a lot of concern,
    0:32:42 where you lose beginner’s mind,
    0:32:44 where you question yourself.
    0:32:47 And that’s just a trade-off.
    0:32:49 I think it’s inevitable.
    0:32:50 I think it’s great to be a kid,
    0:32:52 but there’s a lot of bad things
    0:32:53 and it’s great to be an adult,
    0:32:54 but there’s a lot of bad things
    0:32:57 and you’re just kind of stuck with these trade-offs.
    0:33:00 I mean, you talk about another mystery, take podcasts.
    0:33:03 So when I go from place to place,
    0:33:04 I always have headphones on.
    0:33:06 I’m listening to a podcast.
    0:33:08 Always your podcast, by the way,
    0:33:09 but I’m listening to podcasts.
    0:33:11 And to a friend of mine, I was saying,
    0:33:13 I’ll go downstairs to the kitchen to get a snack
    0:33:14 and I’ll put on my headphones.
    0:33:16 I could listen to podcasts on the way downstairs
    0:33:18 so I’m not to waste valuable time.
    0:33:20 A friend of mine gave me a really hard time.
    0:33:22 He said, “Dude, leave your phone at home,
    0:33:25 walk through, experience the world as it is.”
    0:33:27 And I could see the pluses behind that,
    0:33:29 but then I’d miss out on a great podcast.
    0:33:31 There’s trade-offs.
    0:33:33 – Always trade-offs.
    0:33:36 You’re making me think.
    0:33:38 Have you read Nietzsche’s “Human, All Too Human”?
    0:33:39 – I’ve read very, very little Nietzsche.
    0:33:41 – No? What would I have learned?
    0:33:47 – I don’t know a lot, but there’s a line.
    0:33:50 I may mess this up, but it’s something like,
    0:33:53 the first sign that an animal has become human
    0:33:57 is that it is no longer dedicated to its momentary comfort,
    0:34:00 but rather to its enduring comfort.
    0:34:01 – Oh, I like that.
    0:34:05 – He, look, that guy packs a lot into a sentence.
    0:34:07 – He’s an aphorism guy, like an aphorism machine.
    0:34:09 – Yeah, but it sort of speaks to this,
    0:34:14 that in that transition from consciousness to self-consciousness,
    0:34:18 one of the tolls you pay is that you sort of,
    0:34:23 you get robbed of the joy of just living in the moment,
    0:34:26 living in your body and all these other sorts of pathologies
    0:34:29 we’re talking about become possible.
    0:34:32 And that is just, I guess you said it, trade-offs.
    0:34:33 That’s the trade-off.
    0:34:37 – And you describe this as a cost of self-consciousness,
    0:34:41 and it is, but it’s also a cost of morality.
    0:34:44 I mean, were you a father, for instance,
    0:34:46 I don’t think you could be a good father
    0:34:48 if you didn’t think about your kid,
    0:34:49 if you didn’t worry about your kid.
    0:34:53 If you didn’t, in these perfect moments of solitude
    0:34:55 and so on, stop and say, hey, how’s the kid doing?
    0:34:57 And, you know, has he had his shots,
    0:35:01 and is he on his way to becoming educated,
    0:35:03 is he healthy, has he changed, is he, dah, dah, dah, dah,
    0:35:04 and all of these things.
    0:35:08 And so the more we have human connections
    0:35:11 and love and moral obligations,
    0:35:14 the more we’re stretched outside of our body
    0:35:17 and just worry about shit all the time.
    0:35:19 And this is actually, you know,
    0:35:23 the one Christopher Hitchens joke I remember is,
    0:35:25 did you hear about the Buddhist vacuum cleaner?
    0:35:27 It has no attachments.
    0:35:30 And this is in some way both, you know,
    0:35:34 you phrase it as a strength or weakness of Buddhism,
    0:35:39 but the strict doctrine renounces special attachments.
    0:35:41 And what if you don’t wanna renounce special attachments?
    0:35:44 – It feels like I’m kind of making the case
    0:35:45 against our brains here,
    0:35:49 that they do not seem to be wired to make us happy.
    0:35:51 That just does not seem to be a goal.
    0:35:52 – That is God’s truth.
    0:35:57 That is, you know, Evolution 101 is that natural selection,
    0:36:00 the winnowing process that gave us our brains,
    0:36:04 does not care one bit about our happiness.
    0:36:07 It cares about building machines that survive and reproduce.
    0:36:12 And happiness exists, I think to put it crudely,
    0:36:15 as part of a feedback system saying you’re doing well,
    0:36:18 you fall in love, you have an orgasm,
    0:36:21 you fill your belly, you get some status, you feel good.
    0:36:24 And that’s Evolution’s thumbs up, do more of that.
    0:36:27 But it also falls, once we get consciousness
    0:36:29 and intelligence and so on,
    0:36:32 into what psychologists call the hedonic treadmill,
    0:36:35 which is that soon the happiness will fade,
    0:36:38 because evolution doesn’t want you taking victory laps
    0:36:41 and everything, it wants you doing more stuff.
    0:36:43 Accumulate more food, build a bigger shelter,
    0:36:45 protect your family.
    0:36:48 And so it drives us endlessly.
    0:36:49 I’ve talked to Robert Wright about this,
    0:36:53 I think he views Buddhism as an anti-Darwin thing.
    0:36:55 We’ve evolved this way, let’s fight it.
    0:36:59 Let’s renounce our evolved capacity, our evolved desires.
    0:37:01 Utilitarianism, a lot of moral views are saying,
    0:37:04 “Evolution gave us these biases, let’s fight it.”
    0:37:07 And there’s a lot to be said about that.
    0:37:10 But yeah, you’re right to be sick and tired of the brains
    0:37:12 that Darwin has given us.
    0:37:14 There’s a lot to be annoyed by.
    0:37:18 – You know the stand up comic Pete Holmes?
    0:37:19 – I’ve seen him, yeah.
    0:37:21 – I saw this great bit of his the other day
    0:37:23 where he was just, I don’t remember what he said exactly,
    0:37:25 but it was something like, he’s like,
    0:37:28 you know, if you think about it,
    0:37:32 our brains are unbelievable assholes.
    0:37:36 They have all the ingredients they need to make us happy.
    0:37:41 The dopamine and the serotonin, all that stuff.
    0:37:45 But they’re constantly making you feel like shit, right?
    0:37:49 We’re supposed to be the commanders-in-chief of our brains,
    0:37:51 but we’re just helpless passengers.
    0:37:53 – Nice. – It kind of sucks.
    0:37:54 – It’s a deep point.
    0:37:56 And it’s because the system is not our friend.
    0:38:01 The system is not evolved to sort of follow our conscious will.
    0:38:05 I think, just to pursue something else we talked about,
    0:38:07 there’s a sweet spot of anxiety.
    0:38:11 People who have too much anxiety
    0:38:13 go to shrinks and psychologists,
    0:38:15 they take drugs and they try to meditate,
    0:38:17 they suffer and they try to fix that.
    0:38:18 That’s if you have too much anxiety.
    0:38:20 If you have too little anxiety,
    0:38:23 you end up in prison or dead, you don’t worry enough.
    0:38:25 And so bad things happen to you.
    0:38:28 If you’re fearless, it’s not so good.
    0:38:32 So some sort of anxiety, which is very rarely pleasant,
    0:38:33 is just part of our package.
    0:38:36 And so some sort of sadness and self-consciousness,
    0:38:38 doubt, worry.
    0:38:42 – If our brain does anything, the reflex is that
    0:38:45 there has to be some evolutionary reason for it, right?
    0:38:47 It may just be maladapted to this moment.
    0:38:50 – Yeah, that’s right, or to this world.
    0:38:54 If somebody mocks me on Twitter and I fall into a rage,
    0:38:56 even though he’s an anonymous rando,
    0:38:59 it’s because my mind hasn’t evolved to deal with anonymous randos,
    0:39:00 it’s evolved to deal with people
    0:39:03 that I would bump into day after day after day.
    0:39:06 – Right, so this strong instinct we have to care
    0:39:10 about what other people think of us made a lot of sense
    0:39:12 when you lived in tribes of 30 or 40 people,
    0:39:15 that was high-stakes stuff to be ostracized.
    0:39:16 But to care about what other people think
    0:39:18 in a world of Twitter.
    0:39:19 – Yes.
    0:39:22 – Where 10,000 anonymous bots can tell you
    0:39:26 what a piece of shit you are, that’s, boy, that’s not,
    0:39:29 there’s a bit of a disjunction there.
    0:39:31 – And this is the Pete Holmes point,
    0:39:34 which is knowing that doesn’t make it go away.
    0:39:38 If there’s a big bag of M&Ms downstairs
    0:39:43 and I’m really hungry, I know that my body has evolved
    0:39:45 to try to absorb an enormous amount of sugar
    0:39:48 ’cause in the world I evolved in,
    0:39:50 I was constantly at risk of dying of starvation.
    0:39:53 I’m not at risk now, so just put down the sugar,
    0:39:55 but it doesn’t make my hunger stop.
    0:39:59 It doesn’t make my shame, my anger, my jealousy stop,
    0:40:02 knowing that it’s a poor fit for my world right now.
    0:40:06 To the extent our conversation is having a theme,
    0:40:08 it’s gone surprisingly dark,
    0:40:10 which is that we’ve evolved these minds,
    0:40:12 I don’t know,
    0:40:14 put a trigger warning or something on it,
    0:40:16 minds that have evolved through natural selection,
    0:40:19 and now that has left us utterly screwed.
    0:40:20 We’ve lost our balance,
    0:40:23 and we’ve certainly lost our beginner’s mind
    0:40:25 that we have a brief period of as children.
    0:40:26 – You know what the best part is?
    0:40:27 – Yeah.
    0:40:30 – This conversation is part of a series about optimism.
    0:40:31 – Oh.
    0:40:34 (gentle music)
    0:40:37 (gentle music)
    0:40:49 – Support for the gray area comes from Mint Mobile.
    0:40:52 Getting a great deal usually comes with some strings attached.
    0:40:55 Maybe that enticing price only applies
    0:40:56 to the first few weeks,
    0:40:59 or you have to physically mail a rebate form to Missouri,
    0:41:02 or there’s minuscule fine print that reads we’re lying,
    0:41:04 there’s no deal, we got you.
    0:41:07 Well, Mint Mobile offers deals with no strings attached.
    0:41:09 When they say you’ll pay $15 a month
    0:41:11 when you purchase a three-month plan, they mean it.
    0:41:13 All Mint Mobile plans come with high-speed data
    0:41:15 and unlimited talk and text delivered
    0:41:18 on the nation’s largest 5G network.
    0:41:20 Mint says you can even keep your phone,
    0:41:22 your contacts, and your number.
    0:41:24 It doesn’t get much easier than that, folks.
    0:41:26 To get this new customer offer
    0:41:28 and your new three-month premium wireless plan
    0:41:29 for just 15 bucks a month,
    0:41:33 you can go to mintmobile.com/grayarea.
    0:41:36 That’s mintmobile.com/grayarea.
    0:41:38 You can cut your wireless bill to 15 bucks a month
    0:41:41 at mintmobile.com/grayarea.
    0:41:45 $45 upfront payment required equivalent to $15 a month.
    0:41:48 New customers on first three-month plan only.
    0:41:50 Speed slower above 40 gigabytes on unlimited plan.
    0:41:53 Additional taxes, fees, and restrictions apply.
    0:41:55 See Mint Mobile for details.
    0:41:57 (upbeat music)
    0:42:12 – Let’s do a little rapid fire and see where it goes.
    0:42:13 First question.
    0:42:16 Do most of us really know what makes us happy?
    0:42:18 I think we all think we do, but do we?
    0:42:19 – No, no.
    0:42:23 We have, as psychologists, some understanding
    0:42:24 of what makes people happy,
    0:42:28 and it’s not what people think makes them happy.
    0:42:31 So many people think, including me a lot of days,
    0:42:33 like I wanna make more money and it’ll make me happy.
    0:42:34 And it’s not totally wrong.
    0:42:36 There’s a correlation between money and happiness.
    0:42:41 But basically, the shortest answer to what makes us happy
    0:42:45 is social contact with people.
    0:42:46 And I’m kind of an introvert.
    0:42:47 And I feel like sometimes I avoid it
    0:42:50 because I find it uncomfortable, but there’s so much evidence
    0:42:53 that social contact is such a core to happiness
    0:42:55 and relationships.
    0:42:57 There was a nice study, a nice big review that came out
    0:43:01 by Dunigan Folk and Liz Dunn.
    0:43:03 And they reviewed all of the research
    0:43:05 on people’s attempts to make themselves happy,
    0:43:07 what works and what doesn’t,
    0:43:08 looking at the most robust experiments.
    0:43:10 And it’s kind of surprising.
    0:43:13 Volunteering doesn’t make you happy,
    0:43:17 but giving strangers money makes you happy.
    0:43:18 Why?
    0:43:22 Oh, I don’t know, ’cause it’s a subtle difference, right?
    0:43:24 And I recommend people like read the literature
    0:43:27 and just look at, there’s some interesting findings.
    0:43:29 But don’t listen to people who just kind of say,
    0:43:30 oh, here’s what I think.
    0:43:34 Look at the studies, ’cause the studies are surprising.
    0:43:38 – Is it a mistake to think of happiness as the goal of life?
    0:43:39 – Yes.
    0:43:41 – Or even a primary goal of life?
    0:43:43 – It’s a good goal, everybody wants to be happy.
    0:43:45 Or to zoom in even more,
    0:43:46 everybody likes pleasure.
    0:43:49 It’s a hot day, drink some cool water,
    0:43:50 have an extra slice of pie.
    0:43:53 That’s great.
    0:43:58 But no, I think, this is not science now.
    0:44:00 This is just my own view,
    0:44:05 but I think meaningful pursuits are an important part of life.
    0:44:10 Being a parent is a good example: the data over whether raising a child,
    0:44:16 at whatever age your child is, makes you happy,
    0:44:20 it’s by no means clear that it does on average.
    0:44:22 Particularly in the United States,
    0:44:23 there are many studies that find that
    0:44:25 in the alternative world where you didn’t have a kid,
    0:44:26 you’d actually be happier.
    0:44:30 However we ask you, you’d actually be happier.
    0:44:32 But it’d be ridiculous for me to say,
    0:44:34 therefore you’ve probably made a mistake.
    0:44:35 What you would say, and what I would say,
    0:44:38 having raised kids is, it wasn’t happiness
    0:44:42 in some simple, dumb sense, it’s fulfillment, it’s love.
    0:44:45 It’s a feeling of doing something meaningful and important.
    0:44:48 What’s another meaningful pursuit
    0:44:51 other than having children?
    0:44:53 – There’s a line from Freud,
    0:44:56 which apparently was misattributed, but it works,
    0:45:00 which is what matters in life is love and work.
    0:45:04 And under love, I put the broad ambit of relationships,
    0:45:08 raising kids, having a partner you care for,
    0:45:10 taking care of somebody who needs care.
    0:45:13 And for work, I put projects.
    0:45:15 And it could be a project actually at work
    0:45:18 where they send the forms to the IRS and everything.
    0:45:20 But it could also just be,
    0:45:21 it could be somebody setting up a podcast
    0:45:22 and working hard at it.
    0:45:25 I think love and work are the two things that really matter.
    0:45:28 – So if I asked you what you think
    0:45:31 people should maximize in life, you would say meaning?
    0:45:34 – Yeah. – Meaningful pursuits?
    0:45:36 – Meaningful pursuits and relationships.
    0:45:40 And I would also say to some extent, if you do that,
    0:45:42 maybe happiness will follow.
    0:45:45 One of the findings in the happiness literature
    0:45:49 is that you ask people, how much do you work at becoming happy?
    0:45:51 And then you get their answer and ask them, how happy are you?
    0:45:53 The answers are negatively correlated.
    0:45:56 The pursuit of happiness is a miserable pursuit.
    0:45:59 It’s like probably trying to be really good at kissing
    0:46:01 gets in the way of being good at kissing.
    0:46:04 Sometimes focusing on things makes this thing harder to get.
    0:46:07 So maybe sleep is a good example of that.
    0:46:09 So don’t try to be happy.
    0:46:12 Instead, try to maximize meaning and relationships.
    0:46:15 And then, and don’t even think about this ever,
    0:46:17 but maybe it’ll make you happy.
    0:46:21 – How do you distinguish happiness from satisfaction?
    0:46:24 – Again, we’re entering sort of terminology things,
    0:46:28 but I would say happiness is something narrower,
    0:46:30 involving sort of hedonic aspects
    0:46:34 like pleasure, smiling, feeling good,
    0:46:36 while satisfaction could be deeper.
    0:46:41 So again, raising a kid or taking care of a loved one who’s sick.
    0:46:43 You could say, this is satisfying.
    0:46:45 I’m doing good stuff.
    0:46:46 I’m making the world a better place.
    0:46:48 I’m proud of myself.
    0:46:49 Even if I’m not smiling very much.
    0:46:52 Even though, you know, this is hard.
    0:46:56 I probably have more fun playing pickleball or getting high,
    0:46:58 but that wouldn’t be satisfying.
    0:47:00 That’d just be fun.
    0:47:03 – How much of life is about what we choose
    0:47:05 to pay attention to?
    0:47:07 I mean, I sort of always believed on some level
    0:47:10 that we’ve become what we pay attention to.
    0:47:12 And if you want to be optimistic,
    0:47:15 there are plenty of reasons to justify it.
    0:47:16 If you want to be pessimistic,
    0:47:18 you know where to look to justify it.
    0:47:22 Does it really just come down to what you choose
    0:47:24 to pay attention to in the end?
    0:47:28 – I think it probably depends to an enormous extent
    0:47:31 on what you pay attention to.
    0:47:33 There’s this line, I think it’s Shakespeare,
    0:47:35 there’s nothing either good or bad,
    0:47:37 but thinking makes it so.
    0:47:39 It’s how you see the world.
    0:47:44 But how much we can choose is a dicey question.
    0:47:48 If we could choose, then we’d all be happy.
    0:47:49 We’d say, “I’m looking on the bright side of things.”
    0:47:52 And boom, we’d be happy. Plainly, that doesn’t work.
    0:47:55 If I’m feeling terrible pain in my leg,
    0:47:58 terrible chronic pain, and you were to say,
    0:48:01 “Well, Paul, don’t focus on it.”
    0:48:05 Well, the thing about pain is it calls for your attention.
    0:48:08 The thing about having a child you’re worried about
    0:48:11 or a career that’s failing is you can’t just
    0:48:13 switch off your focus, and maybe you shouldn’t,
    0:48:14 even if you could.
    0:48:18 So I kind of agree with the part that how you see the world,
    0:48:21 what you focus on, this is a great lesson of stoicism.
    0:48:22 I think it’s right.
    0:48:24 It has an enormous effect on your life.
    0:48:29 But getting control over that is easier said than done.
    0:48:32 – Have all these years of studying the mind
    0:48:36 and thinking about happiness and meaning
    0:48:39 and what makes a good life
    0:48:41 made you any happier, any more optimistic?
    0:48:43 Do you think it’s had any noticeable effect?
    0:48:45 Do you think you’ve been changed at all by it,
    0:48:47 or do you think you would be exactly what you are now,
    0:48:51 whether you became a psychologist or not?
    0:48:56 – I think at the margins in certain ways,
    0:48:58 I picked up things through studying the mind
    0:49:01 that have made my own life better.
    0:49:02 – Like what?
    0:49:07 – Well, for instance, I’m very convinced of the power
    0:49:10 in both the short term and the long term
    0:49:12 of social contact and social connections.
    0:49:17 So if I’m really down, my temptation is
    0:49:20 to lie in bed, flip up my laptop,
    0:49:22 and watch YouTube videos for four hours or something,
    0:49:24 or just sit and sulk.
    0:49:27 But I know that if I could reach out,
    0:49:30 get one of my kids on Zoom, grab one of my friends
    0:49:33 for a beer, it will cheer me up.
    0:49:35 And I don’t always succeed in doing it,
    0:49:39 but I know that solitude and just grumping will not work.
    0:49:41 I’m better off doing other things.
    0:49:42 I’ve learned through psychology,
    0:49:44 this is actually not my own,
    0:49:46 far, very far from my own work,
    0:49:48 I’ve learned about the idea of flow states.
    0:49:50 I read this book by
    0:49:52 Csikszentmihalyi about flow,
    0:49:53 where he says that, you know,
    0:49:55 people aren’t happy doing nothing.
    0:49:57 This goes back to what you were saying actually,
    0:49:59 which is people think they love vacations,
    0:50:00 but they don’t tend to love vacations.
    0:50:02 They just spend a lot of time sitting by the beach,
    0:50:04 feeling kind of bored and anxious,
    0:50:07 what people get a lot out of is flow states
    0:50:08 where they’re really into a project,
    0:50:11 they’re really focused and zoomed in.
    0:50:13 And I realize that’s particularly true for me.
    0:50:15 So, you know, when I travel on vacation,
    0:50:18 I bring my laptop and I could spend a couple of hours
    0:50:20 in the morning writing, and that’s actually often,
    0:50:22 I love that.
    0:50:24 I’m maybe not supposed to love it, but I love it.
    0:50:28 – I have to ask, because this is a series on optimism,
    0:50:32 if you had to make a pitch for being optimistic,
    0:50:34 what do you got?
    0:50:39 – I would go for the self-fulfilling prophecy.
    0:50:41 I would go for this fact that, you know,
    0:50:43 it’s easier to make a pitch for seeing things right,
    0:50:44 being rational and all that stuff,
    0:50:47 but I would say that in some way,
    0:50:49 the person who thinks their odds are better
    0:50:53 than they are paradoxically ends up doing better.
    0:50:56 And this also connects with your conversation
    0:50:58 with Jamil Zaki, which is about interactions
    0:50:59 with other people.
    0:51:03 Suppose it actually turns out, as a matter of fact,
    0:51:07 that 50% of people in this situation can be trusted.
    0:51:10 But suppose I go into the world thinking it’s 80%.
    0:51:13 You might think, well, I’m gonna get really screwed,
    0:51:16 but maybe by going at people, trusting them,
    0:51:19 taking a shot at them, I transform some
    0:51:22 of the non-trustworthy 50%, and it works out for me.
    0:51:27 So, optimism, and in fact, one of the things
    0:51:30 that works in the happiness intervention literature
    0:51:33 is something they call acting happy,
    0:51:36 which is, they say, and part of the experiment,
    0:51:39 you’re thinking that, well, put on a happy face, smile,
    0:51:41 talk to people in a cheerful tone.
    0:51:43 Now, if you’re depressed, you just say, go die.
    0:51:46 What’s the worthless, ugly advice?
    0:51:51 But weirdly, acting positive makes people positive
    0:51:54 towards you, and it can again be a self-fulfilling prophecy.
    0:52:00 – Yeah, I’d also say, believing that the world can be better
    0:52:04 is a precondition for creating that world,
    0:52:08 and pessimism seems, just on purely strategic grounds,
    0:52:11 to undercut the motivation to do anything, really.
    0:52:15 It’s pure passivity, and that just seems
    0:52:18 like a poor strategy.
    0:52:19 – That’s right.
    0:52:20 I think that even if it’s a long shot
    0:52:23 that you could make the world a better place,
    0:52:25 maybe, I’ll raise this, maybe the person
    0:52:30 who isn’t smart enough to know it’s a long shot tries,
    0:52:32 and if you get enough people trying, you’ll get success,
    0:52:34 where if everyone is realistic
    0:52:37 and says, listen, it’s not worth it, you get failure.
    0:52:40 – Well, Paul, what can I say?
    0:52:42 We may not be one of your favorite pods,
    0:52:43 but you’re one of my favorite guests.
    0:52:45 – This is always fun, Sean.
    0:52:47 – Paul Bloom, everyone, this has been great.
    0:52:49 You know it’s always a joy to have you on.
    0:52:51 – Thanks again, thanks for having me.
    0:52:54 (upbeat music)
    0:53:07 – All right, that was great.
    0:53:08 I had a lot of fun.
    0:53:11 I hope you also had a lot of fun.
    0:53:13 As always, I wanna know what you think of the episode,
    0:53:18 so go ahead and drop us a line at thegrayarea@vox.com,
    0:53:20 and once you’re finished with that,
    0:53:23 go ahead and rate, review, subscribe to the podcast.
    0:53:26 This episode was produced by Beth Morrissey
    0:53:29 and Travis Larchuk, edited by Jorge Just,
    0:53:33 engineered by Patrick Boyd, fact-checked by Anouk Dussaud,
    0:53:36 and Alexander Overington wrote our theme music.
    0:53:39 New episodes of The Gray Area drop on Mondays,
    0:53:41 listen and subscribe.
    0:53:43 This show is part of Vox,
    0:53:46 support Vox’s journalism by joining
    0:53:48 our membership program today.
    0:53:50 Go to vox.com/members to sign up,
    0:53:53 and if you decide to sign up because of this show,
    0:53:54 let us know.
    0:53:56 (upbeat music)
    0:54:17 – Support for The Gray Area comes from Mint Mobile.
    0:54:18 You can get three months of service
    0:54:21 for just 15 bucks a month by switching to Mint Mobile.
    0:54:23 And that includes high-speed 5G data
    0:54:26 and unlimited talk and text.
    0:54:27 To get this new customer offer
    0:54:29 and your new three-month premium wireless plan
    0:54:31 for just 15 bucks a month,
    0:54:34 you can go to mintmobile.com/grayarea.
    0:54:37 That’s mintmobile.com/grayarea.
    0:54:39 $45 upfront payment required,
    0:54:41 equivalent to $15 per month.
    0:54:43 New customers on first three-month plan only.
    0:54:47 Speed slower above 40 gigabytes on unlimited plan.
    0:54:49 Additional taxes, fees and restrictions apply.
    0:54:51 See Mint Mobile for details.
    0:54:53 (upbeat music)

    Children live with a beginner’s mind. Every day is full of new discoveries, powerful emotions, and often unrealistically positive assumptions about the future. As adults, beginner’s mind gives way to the mundane drudgeries of existence — and our brains seem to make it much harder for us to be happy. Should we be cool with that?

    We wrap up our three-part series on optimism with Paul Bloom, author of Psych: The Story of the Human Mind and The Sweet Spot: The Pleasures of Suffering and the Search for Meaning. He offers his thoughts on optimism and pessimism and walks Sean Illing through the difference between what we think makes us happy and what actually does.

    Host: Sean Illing (@seanilling)

    Guest: Paul Bloom (@paulbloom), psychologist, author and writer of the Substack Small Potatoes

    Support The Gray Area by becoming a Vox Member: https://www.vox.com/support-now

    Learn more about your ad choices. Visit podcastchoices.com/adchoices

  • A message from Sean

    AI transcript
    0:00:03 – Hello, everyone, this is Sean.
    0:00:04 A couple of weeks ago,
    0:00:09 we put out an episode with Yuval Harari about AI.
    0:00:12 Now, typically, I record these conversations remotely
    0:00:14 in my little office in Mississippi
    0:00:18 where I can close my eyes and really listen to the guests.
    0:00:20 But this time around,
    0:00:23 Harari and I both happened to be in New York City,
    0:00:26 so we decided to record the interview in person
    0:00:28 on video in a studio.
    0:00:31 If you’ve ever wondered if I’m a real person
    0:00:33 asking real questions in real time,
    0:00:36 then wonder no more.
    0:00:39 We’ve put the video up on the Vox YouTube channel,
    0:00:41 youtube.com/vox.
    0:00:44 It’s the same conversation as you heard on the podcast,
    0:00:47 but now you can look at us instead of the walls
    0:00:50 or whatever you look at when you listen to me.
    0:00:52 And if you like it or hate it, let us know.
    0:00:54 Maybe we’ll do more in the future,
    0:00:56 or maybe this is your only chance.
    0:00:59 Anyway, check it out.
    0:01:01 (upbeat music)
    0:01:04 (upbeat music)

    Sean Illing has a special message for all you listeners: Look at me!

    We’ve made our first-ever video episode. See Sean in conversation with Yuval Noah Harari. Watch it with your friends and family and your friend’s families and their family friends. It’s on YouTube right now: https://www.youtube.com/watch?v=uhx1sdX2bow

    Learn more about your ad choices. Visit podcastchoices.com/adchoices

  • What if we get climate change right?

    AI transcript
    0:00:03 Support for the gray area comes from Mint Mobile.
    0:00:07 Phones can be expensive, but with Mint Mobile,
    0:00:09 you can save a ton on wireless service.
    0:00:11 You can get three months of service
    0:00:14 for just 15 bucks a month by switching.
    0:00:15 To get this new customer offer
    0:00:18 and your new three-month premium wireless plan
    0:00:19 for just 15 bucks a month,
    0:00:23 you can go to mintmobile.com/grayarea.
    0:00:26 That’s mintmobile.com/grayarea.
    0:00:28 $45 upfront payment required,
    0:00:30 equivalent to $15 per month.
    0:00:33 New customers on first three-month plan only.
    0:00:36 Speed slower above 40 gigabytes on unlimited plan,
    0:00:39 additional taxes, fees, and restrictions apply.
    0:00:40 See Mint Mobile for details.
    0:00:46 If I asked you to tell me the one issue
    0:00:49 that makes you feel the most pessimistic,
    0:00:50 what would it be?
    0:00:55 I feel pretty confident saying
    0:00:57 that the most popular response,
    0:01:00 certainly one of the most popular responses,
    0:01:01 would be climate change.
    0:01:06 But is climate despair really as tempting
    0:01:08 and reasonable as it seems?
    0:01:12 The problem isn’t imaginary.
    0:01:14 Climate change is real and terrifying.
    0:01:19 But even if it’s as bad as the worst predictions suggest,
    0:01:22 do we gain anything by resigning ourselves to that fate?
    0:01:26 What effect might our despair have
    0:01:29 on our ability to act in the present?
    0:01:32 More to the point,
    0:01:34 is our fatalism undercutting our capacity
    0:01:36 to tackle this problem?
    0:01:41 I’m Sean Illing, and this is The Gray Area.
    0:01:54 Today’s guest is Ayanna Elizabeth Johnson.
    0:01:57 She’s a marine biologist,
    0:02:01 a co-founder of the nonprofit think tank Urban Ocean Lab,
    0:02:03 and the author of a new book called
    0:02:04 What If We Get It Right?
    0:02:10 It’s a curated series of essays and poetry
    0:02:12 and conversations with a wide range of people
    0:02:15 who are all in their own ways
    0:02:16 trying to build a better future.
    0:02:21 It’s not a blindly optimistic book.
    0:02:23 The point is not that everything is fine.
    0:02:26 The point is that we have to act
    0:02:28 as though the future is a place we want to live in.
    0:02:31 According to Johnson,
    0:02:35 there are already many concrete climate solutions.
    0:02:39 If we were motivated by a belief in a better tomorrow,
    0:02:41 not a worse one,
    0:02:43 we would implement more of those solutions
    0:02:44 and find new ones.
    0:02:49 So if you’re someone looking for inspiration
    0:02:50 or reasons to feel hopeful,
    0:02:55 or even better for guidance on what to do and where to start,
    0:03:00 this book and this conversation with Ayanna is for you.
    0:03:09 Dr. Ayanna Elizabeth Johnson, welcome to the show.
    0:03:11 – Thank you, it’s great to be here.
    0:03:14 – You’re actually a marine biologist,
    0:03:17 which I think is a standard top five dream job for kids.
    0:03:22 Super common dream job, like many five to 10 year olds
    0:03:27 are very into marine biology as a life path.
    0:03:32 – Was marine biology your gateway to environmentalism?
    0:03:35 Is that why you do this work?
    0:03:37 – I just was a kid who loved nature,
    0:03:41 which is honestly not very unique.
    0:03:45 How many kids like bugs and fireflies
    0:03:49 and shooting stars and octopuses and autumn leaves
    0:03:50 and all the rest of it.
    0:03:53 I was just like, this all seems very cool.
    0:03:56 And that innate curiosity,
    0:04:00 that biophilia, as E.O. Wilson calls it,
    0:04:03 the magnificent entomologist,
    0:04:06 is just part of who we are as humans.
    0:04:08 It’s normal to love the world.
    0:04:10 It’s less common to make that your job.
    0:04:14 But of course, once you fall in love with nature
    0:04:18 with one ecosystem or a few specific species
    0:04:20 and you find out that they’re threatened,
    0:04:23 you’re like, wait a second, what are we doing about this?
    0:04:25 Is there a grown up who’s already on top of this?
    0:04:26 Is this not sorted?
    0:04:29 Seems like we should protect forests
    0:04:31 and coral reefs and all the rest.
    0:04:34 It’s funny, my mom was cleaning out the closet
    0:04:36 and found these old school papers.
    0:04:38 And apparently I was writing the same essays
    0:04:41 since I was like 10 about nature being great
    0:04:43 and how we should protect it.
    0:04:45 So it wasn’t always gonna be the ocean.
    0:04:47 I wanted to become a park ranger at one point
    0:04:49 or an environmental lawyer.
    0:04:52 But yeah, the ocean seemed like it needed more advocates
    0:04:53 at the particular moment
    0:04:56 I was thinking about graduate school.
    0:05:01 – You open your book by saying that anytime you tell people
    0:05:05 that you do climate work, they invariably ask,
    0:05:08 and I’m quoting, how fucked are we?
    0:05:09 – Yeah.
    0:05:12 – Well, Ayanna, how fucked are we?
    0:05:14 (laughing)
0:03:17 – Well, we’re pretty fucked,
    0:05:20 but there’s a lot we could do
    0:05:22 to have a better possible future.
    0:05:26 And I think it’s important to always hold
    0:05:28 both of those things together.
    0:05:30 We have already changed the climate.
    0:05:34 We are already seeing the intense heat waves
    0:05:38 and floods and droughts and wildfires and hurricanes.
    0:05:41 All of that is already supercharged
    0:05:43 by our changed climate,
    0:05:46 but there’s still so much we can do.
    0:05:48 We basically have the solutions we need.
    0:05:52 We’re just being really slow at deploying them,
    0:05:54 at implementing them, right?
    0:05:58 We already know how to transition to renewable energy
    0:06:00 and stop spewing fossil fuels.
    0:06:03 We know how to protect and restore ecosystems
    0:06:04 that are absorbing all this carbon.
    0:06:07 We know how to green buildings, insulate buildings,
    0:06:12 shift to better public transit, improve our food system.
    0:06:16 Like the solutions are all right there.
    0:06:21 So as you know, this book has a reality check chapter
    0:06:24 where I lay out all the bad news,
    0:06:25 but that’s like three pages.
    0:06:26 And then the rest of the book is like,
    0:06:29 okay, what are we gonna do about it?
    0:06:31 – There’s no point anymore
    0:06:33 in talking about how to solve
    0:06:35 the problem of climate change, right?
    0:06:36 I mean, that ship has sailed.
    0:06:37 It’s all about adaptation now.
    0:06:41 – Yeah, I mean, the climate has already changed.
    0:06:43 There’s not a time machine back
    0:06:47 to before we put a completely mind-boggling amount
    0:06:50 of excess carbon into the atmosphere.
    0:06:54 Whether and how well we address the climate crisis
    0:06:58 determines the outcomes of life on earth
    0:07:00 for all eight million species
    0:07:04 and whether hundreds of millions of people live or die.
    0:07:06 And how well we all can live.
    0:07:10 So even though perfection is not an option,
    0:07:14 there’s such a wide range of possible futures.
    0:07:15 And we just need to make sure
    0:07:18 we get the best possible one.
    0:07:19 – Well, that seems like an important point, right?
    0:07:21 This is really about degrees of suffering
    0:07:25 and the consequences of specific choices we make
    0:07:27 or won’t make as it might be, right?
    0:07:30 The difference between temperature spikes of two
    0:07:32 and four degrees is the difference
    0:07:36 between lots of people living and dying, right?
    0:07:38 – Yeah, I mean, it’s easier for me to think about it
    0:07:42 in terms of the human body running a fever,
    0:07:43 ’cause we can think of one or two degrees
    0:07:45 as not that big a deal.
    0:07:48 But the difference between you having a fever of 100
    0:07:53 and 102 or 103 is a huge difference.
    0:07:56 And that’s the level of sensitivity to temperature
    0:07:59 that all species and ecosystems have.
    0:08:02 If we can prevent a half a degree of warming
    0:08:04 or a degree of warming,
    0:08:06 that actually makes a big difference.
    0:08:07 It’s worth the effort.
    0:08:11 – People like to use different words
    0:08:13 to describe the project ahead of us.
    0:08:18 Words like sustainability or revolution.
    0:08:21 You like to use the word transformation.
    0:08:23 Why is that a better way to frame this?
    0:08:26 – There’s two words that I pair together
0:08:30 and they are possibility and transformation.
0:08:31 And I think possibility is
    0:08:34 for what we’ve just been talking about, right?
    0:08:37 This wide spectrum of possible futures.
    0:08:39 I’m not an optimist.
    0:08:43 I’m not particularly hopeful given human history.
    0:08:45 We don’t have a great track record
    0:08:49 of addressing collectively major challenges that we face.
    0:08:51 There’s some important exceptions to that,
    0:08:53 like dealing with the ozone hole
    0:08:55 through the Montreal Protocol, et cetera.
    0:08:59 But this sense of possibility really drives me
    0:09:02 because the future is not yet written.
    0:09:05 Like what if we just wrote a better one
    0:09:07 than the trajectory that we’re on?
    0:09:12 So pairing this possibility with transformation
    0:09:19 and transformation is a word I’ve gravitated towards
    0:09:21 because it indicates the scale of change
    0:09:23 similar to revolution that you mentioned,
    0:09:27 revolution sort of implies a more tumultuous,
    0:09:29 violent, upheaval kind of thing. – Scary.
    0:09:34 – Yeah, maybe it’s not that great in the process of it.
    0:09:37 But transformation is, I don’t know,
    0:09:39 maybe it’s slightly more poetic in some way,
    0:09:44 but it’s how do we reshape and reimagine
    0:09:48 how we live on this planet and with each other?
    0:09:53 And to me, that’s a question about design and culture
    0:09:56 and society and economy and politics, right?
    0:09:58 It’s about the context within which
    0:10:02 we’re making all these more technical decisions.
    0:10:03 And I don’t know,
    0:10:07 I can get excited about possibility and transformation.
    0:10:10 Like what kind of future do we want to create together?
    0:10:13 And in this book, there’s a whole bunch of
    0:10:16 what if questions that I find really captivating.
    0:10:21 And one of them is what if climate adaptation is beautiful?
    0:10:25 And that I think about in a pair with
    0:10:28 what if we act as if we love the future?
    0:10:29 – Yeah, I love that.
    0:10:32 – And there’s just so much, I don’t know,
    0:10:37 I’m like wiggling my fingers around sort of like gesturing,
    0:10:41 like possibility, like excitement, sparkles, like what if?
    0:10:46 I just feel like we need to be asking more big questions
    0:10:48 of ourselves and each other in this moment
    0:10:51 because we’re at this inflection point in human history.
    0:10:55 We either like get our shit together or we don’t.
    0:10:58 And obviously I would like us to at least try.
    0:11:01 – But you don’t like the word sustainable, right?
    0:11:03 You feel like that’s setting the bar too low?
    0:11:06 – I mean, it’s sort of just an everywhere word.
    0:11:10 Now it’s useful, but it doesn’t have a lot of meaning.
    0:11:12 It’s very general.
0:05:17 And the sort of analogy that I’ve heard
    0:11:21 is if someone asked you how your marriage was going
    0:11:24 and you were like, eh, it’s sustainable.
    0:11:27 It’s like, okay, well, don’t wanna trade lives with you.
    0:11:29 Doesn’t sound terribly romantic.
    0:11:34 So yes, I would say we should set a higher bar
    0:11:36 than sustainability, especially given that
    0:11:39 we’ve already degraded nature so much
    0:11:41 that I don’t wanna just sustain what we have.
    0:11:43 I want to protect and restore.
    0:11:48 – So what if to use your phrase just now,
    0:11:51 what if climate adaptation is beautiful?
    0:11:53 What then?
    0:11:56 Is it rainbows and sunshine we have to look forward to?
    0:12:00 – Well, I think we will always have rainbows and sunshine.
    0:12:02 That’s the good news.
    0:12:06 But one of the, I’m just gonna flip to this page.
    0:12:08 There’s a section called If We Build It
    0:12:11 about architecture and design and technology.
    0:12:16 So imagine if we were just deliberate
    0:12:20 about building things that were aesthetically pleasing
    0:12:22 and durable and could be deconstructed
    0:12:25 and reuse the parts instead of demolishing things, right?
    0:12:27 There’s so many, you know,
    0:12:30 and what materials are we choosing?
    0:12:32 There’s so many choices that we’re making
    0:12:36 that are shaping our societal trajectory.
    0:12:39 And like every day we are building a piece of the future,
    0:12:42 something that will be here in 10 years
    0:12:43 or a century or more.
    0:12:47 So let’s just be really thoughtful about all that
    0:12:48 and make it nice.
    0:12:53 Like some cities and towns are now passing
    0:12:56 essentially deconstruction ordinances
    0:12:58 that say you have to take apart buildings
    0:12:59 instead of demolishing them.
    0:13:01 Instead of just pulverizing everything
    0:13:04 and sending it to the landfill, you have to take it apart.
    0:13:06 So the pieces can be reused like Legos,
    0:13:08 which seems obvious almost.
    0:13:11 Like why wouldn’t we always have been doing that, right?
    0:13:15 The way that people are reusing old barns to make
    0:13:18 like reclaimed wood floors and wall panels and whatever.
    0:13:22 We should be doing that with all parts
    0:13:24 of building materials that we can.
    0:13:28 – So big picture-wise, are you encouraged
    0:13:31 by the direction of the climate movement
    0:13:33 as it stands at the moment?
    0:13:35 What are your major concerns?
    0:13:40 – My primary concern is that we’re just not moving fast enough.
    0:13:44 Given that we have basically all the solutions that we need,
    0:13:47 it’s just incredibly frustrating
    0:13:50 how politics is holding us back.
    0:13:54 I mean, in this country, we have a division
    0:13:56 between the two major parties
    0:13:59 about whether climate change exists
    0:14:02 and whether it’s something we should address,
    0:14:04 which is just so retrograde.
    0:14:05 I don’t even know where to start.
    0:14:07 And it’s especially frustrating
    0:14:11 because most Republican politicians
    0:14:14 are literally just pretending they don’t think it exists.
    0:14:17 Like they’re fully aware that climate science is real,
    0:14:22 but it’s untenable politically for them to admit that.
    0:14:24 And that’s a huge part of why we’re in this mess,
    0:14:27 as well as the fact that the fossil fuel lobby
    0:14:31 is ridiculously powerful in this country.
    0:14:33 And you know, so many politicians are bought
    0:14:35 and paid for in one way or another,
    0:14:38 even though that’s not very many jobs.
    0:14:41 And then you have the banking sector,
    0:14:44 which is funding all these fossil fuel corporations
    0:14:49 to continue expanding their extraction and infrastructure.
    0:14:54 You have, since the Paris Agreement was signed in 2015,
    0:14:58 60 banks have provided $6.9 trillion
    0:15:01 in financing to fossil fuel companies,
    0:15:04 but the top four US banks alone,
    0:15:08 JPMorgan Chase, Citibank, Wells Fargo and Bank of America
    0:15:13 have provided almost $1.5 trillion
    0:15:16 to finance fossil fuel companies.
    0:15:19 So yeah, if you have your money in any of those banks,
    0:15:21 I would move your retirement savings, et cetera,
    0:15:25 to a place that does not make the problem worse.
    0:15:29 And there’s analysis showing that the impact
    0:15:33 of you moving your money out of fossil fuels
    0:15:38 is a bigger impact than any amount of, you know,
    0:15:43 eating only plants and only walking and biking could do,
    0:15:46 because it is that bad to be investing
    0:15:49 in the expansion of fossil fuels.
    0:15:51 (upbeat music)
    0:15:54 (upbeat music)
    0:16:06 Support for the gray area comes from Mint Mobile.
    0:16:08 Phone companies are really good at squeezing
    0:16:10 a little more out of you than you signed up for.
    0:16:13 Mint Mobile is doing things differently.
    0:16:15 Their premium wireless plans are actually affordable
    0:16:18 with no hidden fees or any of that nonsense.
    0:16:20 Right now, when you switch to Mint Mobile,
    0:16:23 you can get three months of service for just 15 bucks a month.
    0:16:25 All of their plans come with high-speed 5G data
    0:16:27 and unlimited talk and text.
    0:16:28 Plus, you don’t need to worry
    0:16:31 about getting a new device or phone number.
    0:16:34 Just bring those with you over to your new Mint Mobile plan.
    0:16:36 To get this new customer offer
    0:16:38 and your new three-month premium wireless plan
    0:16:39 for just 15 bucks a month,
    0:16:42 you can go to mintmobile.com/grayarea.
    0:16:45 That’s mintmobile.com/grayarea.
    0:16:48 You can cut your wireless bill to 15 bucks a month
    0:16:51 at mintmobile.com/grayarea.
    0:16:53 $45 upfront payment required,
    0:16:55 equivalent to $15 per month.
    0:16:58 New customers on first three-month plan only.
    0:17:01 Speed slower above 40 gigabytes on unlimited plan.
    0:17:03 Additional taxes, fees, and restrictions apply.
    0:17:05 See Mint Mobile for details.
    0:17:10 (upbeat music)
    0:17:17 What would be the difference
    0:17:18 between a Harris administration
    0:17:20 and another Trump administration?
    0:17:23 What are the stakes on the climate front?
    0:17:25 – The stakes are sky high.
    0:17:29 There are actually graphs projecting the difference
    0:17:31 in greenhouse gas emissions between the two,
    0:17:33 and it’s really remarkable.
    0:17:36 Because you have, on one hand,
    0:17:39 Vice President Harris who was the deciding vote
    0:17:41 in passing the Inflation Reduction Act,
    0:17:44 which was the largest ever investment
    0:17:46 in climate solutions in world history.
    0:17:49 This Biden-Harris administration
    0:17:51 has created the American Climate Corps,
    0:17:54 which is putting tens of thousands of young people
    0:17:56 to work implementing climate solutions
    0:17:58 from reducing wildfire risk,
    0:18:03 to installing solar panels, to replanting wetlands.
    0:18:06 We have a loan program office in the Department of Energy
    0:18:09 that has hundreds of billions of dollars
    0:18:11 that they’re giving out to businesses
    0:18:15 that are figuring out this renewable energy transition.
    0:18:17 All of that could be completely wiped out,
    0:18:21 essentially on day one of a Trump administration.
    0:18:23 You have in Trump a candidate
    0:18:26 who has offered two fossil fuel executives
    0:18:28 that if they donate a billion dollars
    0:18:30 to his presidential campaign,
    0:18:31 he will basically do their bidding
    0:18:33 once he gets into the White House.
    0:18:36 That is how stark a difference this is.
    0:18:38 – Yeah, you have a podcast episode
    0:18:42 called “How Much Does a President Matter?”
    0:18:44 And I guess the answer is a lot.
    0:18:47 – A lot, I mean, but at the same time,
    0:18:50 a president can only do so much without Congress, right?
    0:18:52 So making sure you have,
    0:18:57 we’re electing politicians for Senate and the House
    0:19:00 that get it, that actually are going to do something
    0:19:03 on climate is also critical.
    0:19:06 But the president is staffing all the federal agencies,
    0:19:08 the Environmental Protection Agency,
    0:19:11 and the National Oceanic and Atmospheric Administration,
    0:19:14 and NASA and the Department of Energy,
    0:19:17 and the Department of Interior
    0:19:20 that are all making these decisions about permitting
    0:19:25 for fossil fuels or offshore renewable wind energy, right?
    0:19:28 But I’ll also give a shout out to local politics
    0:19:32 because it is at the city council level,
    0:19:35 it is at the public utility commissions,
    0:19:38 it is at the school boards where we’re deciding,
    0:19:40 are we teaching our children about
    0:19:42 what we can do about climate change?
    0:19:45 Are we investing in municipal composting?
    0:19:49 Composting makes a really big difference
    0:19:50 ’cause rotting food in landfills
    0:19:53 emits tons of methane, a super potent greenhouse gas.
    0:19:56 Are we building out bike lanes
    0:19:58 and all of this public transit infrastructure
    0:20:02 that we need to reduce our reliance on fossil fuels?
    0:20:05 All of these local decisions really matter too.
    0:20:07 So for those who are sort of overwhelmed
    0:20:10 with what’s happening at the presidential level,
    0:20:11 it is absolutely worth your effort
    0:20:14 to think about local elections
    0:20:17 and how you can support climate leaders down ballot.
    0:20:23 – I initially wanted to ask you
    0:20:26 what gives you the most hope right now,
0:21:28 but then I got to the part of the book where you write,
    0:20:32 And I’m quoting again, fuck hope, what’s the strategy?
    0:20:35 Do you feel like we, the royal we actually do
    0:20:40 have a clear concrete strategy for that better future?
    0:20:44 Because the path to the shitty future is crystal clear.
    0:20:46 It is just keep doing what we’ve been doing.
    0:20:51 – And this is where I think media, Hollywood, music,
    0:20:57 art, culture makers broadly really need to dive in with us
    0:21:02 because I cannot literally show you
    0:21:03 what the future could look like.
    0:21:07 I can talk about it, I can write about it,
    0:21:09 I can interview people about it.
    0:21:14 I can, as I did for this book, commission art about it.
    0:21:17 But I feel like if it’s possible to go through our day-to-day
    0:21:19 and not encounter anything about climate,
    0:21:22 which it currently is, I mean, for example,
    0:21:25 less than 1% of the minutes
    0:21:29 on major TV news stations are about climate.
    0:21:30 And that’s actually gone down.
    0:21:33 I think it was 1.3% in 2022
    0:21:37 and now it’s down to 0.9% in 2023, right?
    0:21:39 So we’re going in the wrong direction.
    0:21:44 If this is not part of our day-to-day exposure,
    0:21:46 then it’s just always on the back burner.
    0:21:48 There’s always something more important.
    0:21:51 And we’re thinking about climate as something separate
    0:21:53 from our other concerns.
    0:21:55 Whereas it’s actually just the context
    0:21:59 within which everything else right now is playing out.
    0:22:01 So there’s a chapter in the book
    0:22:04 called “I Dream of Climate Rom-Coms”
    0:22:07 where I interview producer Franklin Leonard,
0:22:09 founder of “The Black List” out in Hollywood,
    0:22:13 and Adam McKay, filmmaker, writer, director
    0:22:15 about the role of Hollywood in this.
    0:22:16 Because basically to date,
    0:22:20 Hollywood has just shown us the apocalypse,
    0:22:25 the fire and brimstone, the day after tomorrow kind of stuff.
    0:22:27 And there are very few examples
    0:22:30 of not like utopian rose-colored glasses stuff,
    0:22:34 but like literally, what if we just used the solutions
    0:22:37 we had and projected that forward?
    0:22:38 What would that look like?
    0:22:42 – I always loved Nietzsche’s idea
    0:22:46 that we have art in order not to die of the truth.
    0:22:50 And you can interpret that in different ways, I guess.
    0:22:53 But for me, it means that the job of art
    0:22:56 isn’t to hold up a mirror and tell us what is.
    0:22:58 I mean, we have science for that.
    0:23:03 Great art points to what could be before it is.
    0:23:05 And man, do we need more of that?
    0:23:08 – Yes, yes, yes, we need so much more of that.
    0:23:12 Anyone who’s listening, who can create art,
    0:23:15 who can help us see the way forward,
    0:23:17 we absolutely need you.
    0:23:19 – Can we just say at this point
    0:23:23 that the clean energy transition is inevitable?
    0:23:24 Don’t know what the timeline is exactly,
    0:23:28 but clean energy is the future full stop.
    0:23:29 It’s a question of how long it takes.
    0:23:33 – Yeah, and that transformation is already well underway.
    0:23:36 Despite all the lobbying efforts
    0:23:37 from the fossil fuel industry, et cetera,
    0:23:41 because at this point it just makes economic sense.
    0:23:43 The reason that Iowa and Texas
    0:23:46 are leading the country in wind energy
    0:23:48 is not because they’re a bunch of hippies,
    0:23:53 it’s because it’s profitable and they’re good jobs
    0:23:56 and people are excited about having those industries there.
    0:24:00 – Solar and wind, I mean, these are now
    0:24:02 the cheapest forms of energy on the planet.
    0:24:05 – I mean, photons, catch ’em, use ’em.
    0:24:08 Why not, yeah.
    0:24:11 The thing I think people do not talk about enough
    0:24:13 when we’re talking about electricity
    0:24:15 is that regardless of the source,
    0:24:19 we absolutely also need to focus on energy conservation.
    0:24:22 This is not about just like willy-nilly
    0:24:25 running a lot of electrical stuff all the time
    0:24:27 because now we have solar panels
    0:24:30 because it takes energy to make solar panels,
    0:24:34 it takes raw materials to make solar panels,
    0:24:36 we still want to rein all that in
    0:24:39 and live more lightly on the planet.
    0:24:44 So I would just put in a plug for energy conservation
    0:24:48 being an estimated like 30 to 50% of the solution.
    0:24:51 We will need to build a lot less renewables
    0:24:55 if we are just more frugal with our electricity.
    0:24:58 – And what about carbon capture technologies?
    0:25:00 I feel like all of our optimistic scenarios
    0:25:04 include an assumption that we’re going to get
    0:25:06 increasingly better and more efficient
    0:25:11 at removing existing carbon dioxide from the atmosphere.
    0:25:12 Is that a safe assumption?
    0:25:13 – Well, we’re really bad at it now,
    0:25:16 so I’m sure we’ll get better at it.
    0:25:18 – Okay, are we gonna get better enough?
    0:25:19 Maybe what I’m asking.
    0:25:20 – I have no idea.
    0:25:21 – Because we have to, right?
    0:25:25 This has to be part of the solution
    0:25:27 or part of the strategy.
    0:25:29 – I mean, the first interview in the book
    0:25:31 is with Dr. Kate Marvel,
0:25:33 who’s a NASA climate scientist who says,
    0:25:38 “Sure, great, we should pursue carbon capture and storage.”
    0:25:41 But it’s important to note there
0:25:43 that this is not like a get out of jail free card
    0:25:45 where we can keep burning fossil fuels
    0:25:47 and just catch it back.
    0:25:50 It takes a lot of energy to do carbon capture.
    0:25:53 So we basically need to focus that effort
    0:25:57 only on taking out carbon that is already in the atmosphere.
    0:26:01 We can’t just use that as an excuse to not change our ways.
    0:26:05 So I think, again, that brings us back to energy conservation
    0:26:07 and the shift to renewables being fundamental.
    0:26:11 And if we figure out carbon capture, that’s a bonus.
    0:26:16 But also, we need to give a lot more credit to the OG,
    0:26:18 the original gangster of carbon capture,
    0:26:21 which is photosynthesis and plants
    0:26:25 and protect and restore ecosystems.
    0:26:29 Forests, wetlands, mangroves, all of that
    0:26:31 are a critical piece of this too.
    0:26:34 And we just absolutely do not give enough credit to nature,
    0:26:38 which by some estimates is 30 or 40%
    0:26:42 of the solution we need if we are restoring ecosystems.
    0:26:44 – You know, I’ve had conversations
0:27:46 with people like Andreas Malm.
    0:26:47 He was a guest on the pod.
    0:26:51 He says, which sounds bad on the surface,
    0:26:52 but is actually encouraging.
    0:26:55 And I think you were hinting at this earlier.
    0:26:57 We don’t have a science problem.
    0:26:58 We don’t have a knowledge problem.
    0:27:00 We know everything we need to know
    0:27:01 to do what we need to do.
    0:27:04 And there are already viable alternatives
    0:27:05 to move us in that direction.
    0:27:08 What we have is a political economy problem.
    0:27:10 Certain financial interests are invested
    0:27:13 in locking us in this paradigm.
    0:27:14 This is obviously a big obstacle,
    0:27:16 but at least we know what the problem is.
    0:27:18 If we lack the knowledge or the technologies,
    0:27:21 there’s not much we can do about that, but we know.
    0:27:24 And if we’ve learned anything about markets,
0:27:27 it’s that they’ll move in the direction of profit.
    0:27:30 So maybe we can’t change the economic system,
    0:27:31 but we do understand its incentive structure
    0:27:33 and we can work within that.
    0:27:34 So we’re gonna have to find a way
    0:27:37 to make non-fossil fuel energy sources cheaper
    0:27:38 and more efficient and lucrative.
    0:27:40 So just tell me that’s the case.
    0:27:41 Tell me there’s a shit ton of money
    0:27:42 to be made in green energy,
    0:27:45 because if there is, that’s good news.
0:27:49 – There is a shit ton of money to be made in green energy.
    0:27:51 I can say that unequivocally.
    0:27:53 I think this is probably a McKinsey study
    0:27:55 that found getting to net zero,
    0:27:58 net zero greenhouse gas emissions
    0:28:03 is a more than $12 trillion business opportunity.
    0:28:07 And in 2023, 1.8 trillion was invested
    0:28:10 in the clean energy transition, which was a new record.
    0:28:13 It’s worth saying also that also in 2023,
0:28:15 over a trillion dollars was invested
    0:28:16 in additional fossil fuel,
    0:28:21 but renewables are ahead as a global investment amount.
    0:28:24 And also last year, for the second year in a row,
    0:28:26 banks generated more revenue
    0:28:29 from environmentally friendly investing,
    0:28:31 about $3 billion, than they did
0:28:34 from fossil fuel investing, which was about $2.7 billion.
    0:28:36 I think those are still too close,
    0:28:40 but yes, the economics are absolutely turning
    0:28:43 in favor of clean energy, which is great,
    0:28:46 because we would need to do it anyway,
    0:28:48 but it’s certainly easier
    0:28:51 when the balance sheet is in your favor.
    0:28:54 – Yeah, and look, I bring all this up
    0:28:56 not to make the overly simple point
    0:28:58 that capitalism is bad.
    0:29:01 I mean, I think it’s a little more complicated than that.
    0:29:02 And even if you believe that,
    0:29:04 it’s not helpful to leave it there.
    0:29:08 – Well, even if you believe in pure free market capitalism,
    0:29:10 I mean, I think the free market folks
    0:29:14 need to just acknowledge that the market is not free.
    0:29:18 So right now we have a completely insane amount
    0:29:22 of subsidies still going to fossil fuels.
    0:29:25 And if we just reformed fossil fuel subsidies
    0:29:28 and put a price on pollution,
    0:29:30 which is all this greenhouse gas stuff,
    0:29:32 we could generate trillions of dollars
    0:29:33 in government revenues,
    0:29:38 which could be used to address the climate crisis, right?
    0:29:43 So subsidizing all the bad stuff is not a free market.
    0:29:47 We haven’t been giving renewables a fair chance at this.
    0:29:50 The game has been rigged
    0:29:52 for the continuation of fossil fuels.
    0:29:56 All those lobbying dollars have really paid off.
    0:30:00 And so we’re now just starting to see that shift a bit,
    0:30:02 which is evening the playing field.
    0:30:05 And guess who wins when it’s a fair fight?
    0:30:07 Photons is the answer.
    0:30:11 Wind, the stuff that’s free and just out there
    0:30:13 and we can just catch it.
0:32:15 – So, okay, okay.
    0:30:16 So wait a minute, I can’t quite tell
    0:30:20 if you agree with me or not in the big picture sense, right?
    0:30:25 Do you actually agree that we can work within capitalism,
    0:30:29 we can use the internal logic of capitalism
    0:30:31 to get on the right path here?
0:32:31 – No, I do.
    0:30:32 And I think we must.
    0:30:35 We do not have time to completely take apart
    0:30:38 and put back together a new economic system
    0:30:39 within the next decade,
    0:30:42 which is when we need to basically make this huge leap
    0:30:44 in addressing the climate crisis.
    0:30:47 So yeah, what I’m saying is that already renewables
    0:30:49 make economic sense.
    0:30:53 Already green buildings and the shift
    0:30:55 to electric transportation, et cetera,
    0:30:58 are making economic sense.
    0:31:03 So if we can just stop subsidizing fossil fuels
    0:31:05 within our existing capitalist system,
    0:31:08 we could just stop giving extra bonus money
    0:31:12 to people who are massively polluting the planet
    0:31:15 and destroying things for life on earth
    0:31:18 that would help things go even faster.
    0:31:35 Support for the gray area comes from Mint Mobile.
    0:31:37 Getting a great deal usually comes
    0:31:39 with some strings attached.
    0:31:41 Maybe that enticing price only applies
    0:31:42 to the first few weeks,
    0:31:46 or you have to physically mail a rebate form to Missouri,
    0:31:48 or there’s minuscule fine print that reads we’re lying,
    0:31:50 there’s no deal, we got you.
    0:31:53 Well, Mint Mobile offers deals with no strings attached.
    0:31:55 When they say you’ll pay $15 a month
    0:31:57 when you purchase a three-month plan, they mean it.
    0:32:00 All Mint Mobile plans come with high-speed data
    0:32:02 and unlimited talk and text delivered
    0:32:05 on the nation’s largest 5G network.
    0:32:06 Mint says you can even keep your phone,
    0:32:08 your contacts, and your number.
    0:32:11 It doesn’t get much easier than that, folks.
    0:32:12 To get this new customer offer
    0:32:14 and your new three-month premium wireless plan
    0:32:16 for just 15 bucks a month,
    0:32:19 you can go to mintmobile.com/grayarea.
0:32:22 That’s mintmobile.com/grayarea.
    0:32:24 You can cut your wireless bill to 15 bucks a month
    0:32:27 at mintmobile.com/grayarea.
    0:32:29 $45 upfront payment required,
    0:32:31 equivalent to $15 a month.
    0:32:34 New customers on first three-month plan only.
0:32:37 Speed slower above 40 gigabytes on unlimited plan.
    0:32:39 Additional taxes, fees, and restrictions apply.
    0:32:41 See Mint Mobile for details.
    0:32:56 You say the best thing we can do
    0:32:59 when confronting an existential crisis
    0:33:02 is imagine what could be on the other side.
    0:33:06 What’s the most realistic best-case scenario for you?
    0:33:08 – Big picture.
    0:33:10 – Yeah.
    0:33:13 – The dream for me when I think about getting it right
    0:33:15 really starts with nature,
    0:33:20 starts with putting photosynthesis on the pedestal
    0:33:22 it deserves.
    0:33:23 And thinking about, you know,
    0:33:25 how we are shifting our food system,
    0:33:28 how we are shifting transportation,
    0:33:32 how, I mean, I imagine like all of Gen Z
    0:33:35 just refusing to work for the fossil fuel industry, right?
    0:33:38 Or as Jane Fonda says, like, you know,
    0:33:40 don’t sleep with anyone who works in fossil fuels.
    0:33:43 Just like ice out that whole sector.
    0:33:47 Just turn all of that into something
    0:33:49 that’s really unappealing.
    0:33:53 I imagine a future where our homes are not drafty
    0:33:56 ’cause they’re well insulated, right?
    0:33:59 Where we don’t have traffic in cities and on highways
    0:34:01 because we have much better transportation.
    0:34:03 We have high-speed rail.
    0:34:05 I mean, for the love of God,
    0:34:08 can we get like some fast trains in America?
    0:34:10 Sort of embarrassing that we don’t have that.
    0:34:14 Where we have just delicious local foods,
    0:34:17 where we have restored coastal ecosystems
    0:34:20 that are buffering us from the impacts of climate change,
    0:34:22 where we actually have fewer desk jobs
    0:34:25 because more of us are out in the world doing this stuff,
    0:34:29 which is so gratifying.
    0:34:33 And where we can actually just slow down
    0:34:36 and enjoy life a bit more
    0:34:39 because we have our shit together
    0:34:42 and because culture has caught up
    0:34:44 with this climate reality
    0:34:49 and the status quo and what is aspirational have changed.
    0:34:54 I mean, it’s worth a shot, no?
    0:35:00 – Oh, yeah, no, I’m just gathering my thoughts
    0:35:04 and I’m also trying to summon all the hopefulness
    0:35:05 that I can.
    0:35:07 – Well, here’s the thing.
    0:35:09 You don’t actually need to be hopeful.
    0:35:10 I’m not hopeful.
    0:35:14 I think that hope is insufficient even if we have it.
    0:35:15 We need a plan.
    0:35:20 We need to each find our role to play in climate solutions.
    0:35:23 One of the major things that I sort of encourage people to do
    0:35:26 is think really specifically about what you can do,
    0:35:31 not the generic list of like march, protest, donate,
    0:35:34 spread the word, lower your individual carbon footprint,
    0:35:37 which is all good and well to do and I do it.
    0:35:42 But if you and I and teachers and doctors and farmers
    0:35:45 and project managers and web designers
    0:35:47 were all doing exactly the same thing,
    0:35:48 that would be a total waste.
    0:35:51 So instead of thinking about hope,
    0:35:53 whether you have it or not,
    0:35:56 it doesn’t really matter, just do something
    0:35:59 and you’ll feel good regardless of the outcome
    0:36:02 because you will have contributed to making things
    0:36:05 slightly better than they would otherwise have been.
    0:36:08 And if we each do that, it sounds corny,
    0:36:12 but it is factually accurate that all that stuff adds up.
    0:36:14 And if you need a place to start,
    0:36:18 I offer this concept of a climate action Venn diagram,
    0:36:22 which is three circles, sort of a simplified version
    0:36:25 of the Japanese concept of Ikigai for finding your purpose,
    0:36:28 which is one circle is what are you good at?
    0:36:31 So what are your skills, resources, networks,
    0:36:34 like what can you specifically bring to the table?
    0:36:37 What is the work that needs doing is the second circle.
    0:36:39 What are the climate and justice solutions
    0:36:42 you wanna work on because there are hundreds of them.
    0:36:45 And the third circle is what brings you joy
    0:36:47 or satisfaction, right?
    0:36:49 Like what gets you out of bed in the morning
    0:36:52 and how can we each find our way to the sweet spot
    0:36:55 in the center of that Venn diagram
    0:36:59 and just live there for as many minutes
    0:37:01 of our lives as we can.
    0:37:05 – Well, to do this, one thing we clearly have to do
    0:37:10 is make people feel emotionally the stakes of this
    0:37:15 without also pushing them into quietism or despair.
    0:37:18 And so the question is, how do we do that?
    0:37:23 I mean, I have to say there’s a reality here that sucks,
    0:37:25 but it’s true and maybe this has changed
    0:37:28 marginally in one direction or the other,
    0:37:30 but poll after poll that I’ve seen shows
    0:37:33 that a lot of Americans simply don’t care
    0:37:35 about climate change that much or they might care,
    0:37:39 but it’s nowhere near the top of their list of priorities,
    0:37:41 which is why politically it just doesn’t move the needle
    0:37:43 and that makes it difficult for legislators
    0:37:44 to deal with the problem.
    0:37:46 I mean, I’ve lived in Louisiana for a decade
    0:37:48 and the coast there is disappearing.
    0:37:51 Cultures and ways of life and towns and communities
    0:37:54 are disappearing and still a lot of people in that state
    0:37:57 refuse to connect the dots.
    0:38:00 So how do we help them do that?
    0:38:02 How do we make them feel this?
    0:38:05 – First, I think it’s important to acknowledge
    0:38:08 that the majority of Americans are concerned
    0:38:10 about climate change and would like our government
    0:38:12 to do more about it.
    0:38:14 We hear so much about climate deniers
    0:38:16 that we think it’s like half the country,
    0:38:18 it’s like 12%.
    0:38:19 So–
    0:38:20 – Yeah, just because I tried to correct that
    0:38:22 and say that they just, it’s not that they don’t care,
    0:38:25 but they just care about many other things before.
    0:38:28 – Absolutely, and so I think what you’re referring to
    0:38:31 is the sort of pulling on political priorities.
    0:38:33 Like what determines who you’re voting for?
    0:38:35 Like what is that ranking?
    0:38:40 And climate rarely breaks the top five or 10 issues
    0:38:44 when you’re thinking about jobs, economy,
    0:38:48 housing, wars, all of this other stuff, right?
    0:38:49 And I get that.
    0:38:54 We have these day-to-day concerns that are critical
    0:38:57 to our quality of life, to our well-being,
    0:39:01 and I don’t fault people for ranking those higher,
    0:39:03 but I do fault us for not understanding
    0:39:06 that those are connected to climate change
    0:39:09 in some very significant ways.
    0:39:10 There’s an incredible organization
    0:39:12 called Environmental Voter Project,
    0:39:14 and this is what they do.
    0:39:17 There are something like 10 million Americans
    0:39:19 who actually have environment
    0:39:22 as their number one issue politically,
    0:39:25 and they are already registered to vote,
    0:39:28 and they simply do not go to the polls.
    0:39:33 Can you imagine if we had another 10 million climate voters
    0:39:35 who were voting in every election,
    0:39:37 and then politicians were like,
    0:39:39 “Oh shit, I guess there’s a whole demographic
    0:39:42 “that cares about this that’s very active politically.
    0:39:44 “We’re gonna have to earn their votes.”
    0:39:46 That would absolutely change the game,
    0:39:49 and so all of their work on turning out
    0:39:53 environmental voters is making a very big difference.
    0:39:54 So for those who are like,
    0:39:58 “Ah, climate and politics, it’s like such a mess.”
    0:40:00 I would say join me in volunteering
    0:40:02 with Environmental Voter Project,
    0:40:05 helping to get people who care engaged
    0:40:07 and having their voices heard,
    0:40:11 because once we have a larger constituency
    0:40:15 of active climate voters, that will shift the politics.
    0:40:17 And the politics follows culture,
    0:40:20 so it’s not politicians that are leading the way.
    0:40:24 They are followers, so the more of us speak up
    0:40:27 about this as a political priority for us,
    0:40:31 the faster we’ll get these changes that we need.
    0:40:35 – Do you have thoughts about how we can convince skeptics
    0:40:38 or even just outright deniers
    0:40:40 that this work must be done?
    0:40:44 Do we even need to engage with skeptics and deniers?
    0:40:47 Is that fruitless or is it necessary?
    0:40:49 – I personally am not out there
    0:40:52 on Al Gore’s internet debating climate deniers.
    0:40:55 I just, it’s not my jam.
    0:40:59 But again, that’s a small portion of Americans.
    0:41:03 It’s an even smaller portion of the global population.
    0:41:06 And so where I focus my effort is for the people
    0:41:09 who already care, who are already concerned,
    0:41:11 to saying we need you.
    0:41:13 We need you working on solutions.
    0:41:15 Welcome, roll up your sleeves.
    0:41:18 We’ll help you find ways to plug in
    0:41:20 and do something that’s useful.
    0:41:24 And to circle back to our point earlier,
    0:41:27 which is the economics of a lot of these climate solutions
    0:41:29 are just really favorable.
    0:41:31 So we don’t actually need to debate
    0:41:33 whether greenhouse gases,
    0:41:35 being spewed by burning fossil fuels
    0:41:37 and blanketing the planet and warming it
    0:41:41 is a thing that’s happening, even though it’s very clear.
    0:41:44 It’s been clear for 50 years that that’s what’s happening.
    0:41:47 We just need to say, hey, who wants a good job
    0:41:51 in engineering and manufacturing?
    0:41:54 Like let’s build some more battery,
    0:41:58 wind, and solar plants and installations.
    0:42:02 And so the benefits of the Inflation Reduction Act
    0:42:05 are mostly being experienced in red states
    0:42:08 that are getting all this manufacturing capacity,
    0:42:10 all these green jobs,
    0:42:12 even though all of their representatives
    0:42:14 voted against that funding.
    0:42:16 So I think with that shift,
    0:42:21 with those benefits going to politically conservative areas
    0:42:24 where climate denial is higher,
    0:42:28 we may start to see an even more rapid
    0:42:31 and strong embrace of climate solutions
    0:42:34 even without talking about climate change.
    0:42:36 We do not actually have to agree on the problem
    0:42:38 to collaborate on the solutions.
    0:42:41 And so that we have in our favor.
    0:42:50 – You seem very angsty and nervous and concerned
    0:42:51 and I appreciate it. – My therapy appointment
    0:42:53 is in two weeks, I don’t.
    0:42:54 – I’m not your therapist,
    0:42:57 but there is like a whole burgeoning sector actually
    0:43:02 of climate therapy because climate anxiety is a real thing
    0:43:06 and people are understandably grappling with it, right?
    0:43:09 The prospect of life on earth ceasing to exist
    0:43:12 in the way that we have always known it
    0:43:14 is freaking terrifying.
    0:43:18 But I think, and there’s sort of like this term
    0:43:22 like climate sad boys that those of us working
    0:43:26 are like the climate sad boys are back and again,
    0:43:30 here come the doomers like always asking us how bad it is.
    0:43:32 – All right, hold on, all right.
    0:43:36 Look, I’m also trying to speak to the angst
    0:43:37 of people listening as well.
    0:43:40 – I hear it and I feel it often.
    0:43:42 I don’t want to minimize it,
    0:43:45 but I think the more we just focus on possibility
    0:43:48 and what we can each do and just acknowledge
    0:43:51 that we as individuals cannot control
    0:43:53 the future of life on earth,
    0:43:56 but we can do our part and kind of like,
    0:43:59 I don’t worry about it day to day.
    0:44:01 I spend very little time thinking about the problems
    0:44:06 because that doesn’t actually change what I need to do.
    0:44:09 I need to do my work at Urban Ocean Lab,
    0:44:12 this policy think tank for the future of coastal cities
    0:44:12 that I co-founded.
    0:44:16 We need to help cities adapt to sea level rise
    0:44:19 and build out offshore renewable energy
    0:44:22 and restore and protect the coastal ecosystems
    0:44:24 that will help buffer the impacts of storms.
    0:44:27 Like that’s how I spend my days.
    0:44:31 And so my days are full of creativity and problem solving,
    0:44:34 great collaborations and like punctuated
    0:44:36 with moments of delight and tiny victories.
    0:44:40 And what more could we expect out of life?
    0:44:45 I think to me, that’s enough to just do my part.
    0:44:49 – Something we’ve seen in recent years
    0:44:53 are climate activists blocking traffic,
    0:44:55 throwing paint on artworks and museums.
    0:44:59 I think that’s stupid on purely strategic grounds,
    0:45:03 but I do wonder how you think about the role
    0:45:05 of activism and protest
    0:45:08 and how that can be most beneficial.
    0:45:11 I mean, I think Bill McKibben said to you
    0:45:13 in your interview with him that he doesn’t think
    0:45:16 there’s any scenario where we don’t have to march
    0:45:19 in the streets and that seems probably right to me,
    0:45:20 but is that how you feel?
    0:45:22 – Bill McKibben is a wise man.
    0:45:24 I definitely agree.
    0:45:26 I mean, we have to voice our objection
    0:45:29 to things that make no freaking sense.
    0:45:31 We have to voice our objection to continuing
    0:45:35 to subsidize fossil fuel companies with our dollars.
    0:45:37 We have to voice our objection
    0:45:41 to people who deny climate change, calling the shots.
    0:45:44 Some of the more extreme forms of protest,
    0:45:48 if we’re honest, make people like me seem more reasonable.
    0:45:50 And I’m grateful for it, right?
    0:45:53 Those works of art that had soup thrown at them are fine.
    0:45:55 They were covered with glass, they were wiped off,
    0:45:56 everything’s fine.
    0:45:59 So I think we need to just, for one,
    0:46:00 keep things in perspective,
    0:46:02 but also if we’re acknowledging
    0:46:05 that the future of human life on this planet,
    0:46:08 the quality of life for our species
    0:46:12 is literally being determined by what we do
    0:46:13 in the next decade,
    0:46:15 then is throwing soup at a painting
    0:46:18 really the worst thing we can imagine?
    0:46:20 Is it the most effective messaging?
    0:46:23 Well, I think we could have done better.
    0:46:25 I think there’s obviously much better
    0:46:30 climate communication that can be layered on top of protest,
    0:46:34 but I absolutely see a value for protest
    0:46:39 and it opens the door to a lot of policy conversations.
    0:46:42 And that is the role, to shift the Overton window
    0:46:45 to make politicians and executives feel
    0:46:48 like they have to do more and faster
    0:46:51 by just exerting that social pressure
    0:46:55 and removing the social license to operate,
    0:46:56 to say we are watching you,
    0:46:58 we are voting at the ballot box
    0:47:00 and we are voting with our dollars
    0:47:04 and we will name and shame the bad actors
    0:47:08 and welcome you onto the side of climate solutions
    0:47:09 whenever you’re ready.
    0:47:13 – Dr. Ayana Elizabeth Johnson,
    0:47:17 thank you for outing me as a climate sad boy.
    0:47:19 (laughing)
    0:47:21 – And honestly, seriously,
    0:47:25 I do feel better after conversations like this.
    0:47:28 I do feel better after reading your book.
    0:47:30 – All right, there we go.
    0:47:34 Fill in your Venn diagram and get to work.
    0:47:37 – Thanks for existing and thanks for coming in.
    0:47:38 – Thanks for having me.
    0:47:40 (upbeat music)
    0:47:43 (upbeat music)
    0:47:52 – All right, thanks for hanging out with me
    0:47:53 for another episode.
    0:47:55 I hope you enjoyed it.
    0:47:57 As always, you can tell me what you think of the episode.
    0:48:01 You can drop us a line at thegrayarea@vox.com.
    0:48:04 I read those emails, so keep them coming
    0:48:06 and please rate, review whenever you get a chance.
    0:48:10 This episode was produced by Travis Larchuck
    0:48:13 and Beth Morrissey, edited by Jorge Just,
    0:48:17 engineered by Patrick Boyd, fact-checked by Anouk Dussot,
    0:48:20 and Alex Overington wrote our theme music.
    0:48:23 New episodes of The Gray Area drop on Mondays,
    0:48:24 listen and subscribe.
    0:48:30 This show is part of Vox,
    0:48:33 support Vox’s journalism by joining
    0:48:35 our membership program today.
    0:48:38 Go to vox.com/members to sign up.
    0:48:43 Support for the Gray Area comes from Mint Mobile.
    0:48:44 You can get three months of service
    0:48:47 for just 15 bucks a month by switching to Mint Mobile.
    0:48:50 And that includes high-speed 5G data
    0:48:52 and unlimited talk and text.
    0:48:53 To get this new customer offer
    0:48:55 and your new three month premium wireless plan
    0:48:57 for just 15 bucks a month,
    0:49:00 you can go to mintmobile.com/grayarea.
    0:49:03 That’s mintmobile.com/grayarea.
    0:49:05 $45 upfront payment required,
    0:49:07 equivalent to $15 per month.
    0:49:10 New customers on first three month plan only.
    0:49:13 Speed slower above 40 gigabytes on unlimited plan.
    0:49:15 Additional taxes, fees and restrictions apply.
    0:49:17 See Mint Mobile for details.
    0:49:20 (upbeat music)

    Climate change has become synonymous with doomsday, as though everyone is waiting for the worst to happen. But what is this mindset doing to us? Is climate anxiety keeping us from confronting the challenge? Ayana Elizabeth Johnson thinks so. In part two of our “Reasons to Be Cheerful” series, she talks to Sean Illing about her new book, What If We Get It Right? and makes the case that our best chance for survival is acting as though the future is a place in which we want to live.

    Learn more about your ad choices. Visit podcastchoices.com/adchoices

  • Yuval Noah Harari on the eclipsing of human intelligence

    AI transcript
    0:00:04 There’s over 500,000 small businesses in B.C. and no two are alike.
    0:00:05 I’m a carpenter.
    0:00:06 I’m a graphic designer.
    0:00:09 I sell dog socks online.
    0:00:12 That’s why B.C.A.A. created one size doesn’t fit all insurance.
    0:00:15 It’s customizable, based on your unique needs.
    0:00:19 So whether you manage rental properties or paint pet portraits,
    0:00:23 you can protect your small business with B.C.’s most trusted insurance brand.
    0:00:29 Visit bcaa.com/smallbusiness and use promo code radio to receive $50 off.
    0:00:31 Conditions apply.
    0:00:37 In 2018, Madison Smith told the county attorney she’d been raped by a classmate.
    0:00:41 But he told her he couldn’t charge him with rape.
    0:00:44 Then she found out Kansas is one of only six states where citizens
    0:00:48 can petition to convene their own grand jury.
    0:00:54 Having to tell over 300 strangers about what happened to me seemed so scary.
    0:00:56 I’m Phoebe Judge.
    0:00:57 This is Criminal.
    0:01:05 Listen to our episode “The Petition” wherever you get your podcasts.
    0:01:09 There’s a common belief in almost every age.
    0:01:11 And it goes something like this.
    0:01:18 More information leads to more truth, and more truth leads to more wisdom.
    0:01:21 That definitely sounds right.
    0:01:25 It’s hard to imagine being wise without knowing what’s true.
    0:01:31 But the notion that individuals and societies will become more truthful and more wise as
    0:01:42 they gain more information and more power is just empirically wrong.
    0:01:46 This may seem like an academic point, but it’s much more than that.
    0:01:51 If the internet age has anything like an ideology, it’s that more information and more data and
    0:01:55 more openness will create a better world.
    0:01:58 The reality is more complicated.
    0:02:03 It has never been easier to know more about the world than it is right now, and it has
    0:02:10 never been easier to share that knowledge than it is right now.
    0:02:13 But I don’t think you can look at the state of things and conclude that this has been
    0:02:17 a victory for truth and wisdom.
    0:02:21 What are we to make of that?
    0:02:26 More information might not be the solution, but neither is more ignorance.
    0:02:31 So what should we do if we want a better and wiser world?
    0:02:37 How should we approach the enormous amount of information we’re collecting?
    0:02:47 I’m Sean Illing, and this is The Gray Area.
    0:02:50 Today’s guest is Yuval Noah Harari.
    0:02:55 He’s a historian and best-selling author of several books, including his 2014 mega-hit
    0:02:58 Sapiens.
    0:03:03 His latest is called Nexus, a brief history of information networks from the Stone Age
    0:03:05 to AI.
    0:03:09 Like all of Harari’s books, this one covers a ton of ground, but it manages to do it in
    0:03:11 a digestible way.
    0:03:16 And it makes two big arguments that seem very important to me, and I think they also get
    0:03:27 us closer to answering some of those questions I just posed.
    0:03:32 The first argument is that every system that matters in our world is essentially the result
    0:03:39 of an information network, from currency to religions to nation-states to artificial intelligence.
    0:03:44 It all works because there’s a chain of people and machines and institutions collecting and
    0:03:47 sharing information.
    0:03:51 The second argument is that although we gain a tremendous amount of power by building these
    0:03:56 networks of cooperation, the way most of them are constructed makes them more likely than
    0:04:00 not to produce bad outcomes.
    0:04:05 And since our power as a species is growing thanks to our technology, the potential consequences
    0:04:11 of this are increasingly catastrophic.
    0:04:17 I invited Harari on the show to explore some of these ideas, and we focus on the most significant
    0:04:22 information network in the history of the world, artificial intelligence, and why he
    0:04:27 thinks the choices we make in the coming years will matter so much.
    0:04:33 Normally, our episodes are close to an hour long, and that was the plan with Harari, but
    0:04:38 I was so engrossed in this conversation that we just kept going, and I think you’ll be
    0:04:42 glad that we did.
    0:04:44 You’ve all know Harari.
    0:04:45 Welcome to the show.
    0:04:46 Thank you.
    0:04:47 It’s good to be here in person.
    0:04:49 Likewise.
    0:04:57 All of your books have a big macro historical story to tell, whether it’s about technology
    0:05:01 or data or the power of fiction.
    0:05:04 This one’s about information networks.
    0:05:08 What’s the story you want to tell here?
    0:05:17 The basic question that the book explores is, if humans are so smart, why are we so stupid?
    0:05:21 We are definitely the smartest animal on the planet.
    0:05:26 We can build airplanes and atom bombs and computers and so forth, and at the
    0:05:32 same time, we are on the verge of destroying ourselves, our civilization, and much of the
    0:05:36 ecological system.
    0:05:43 It seems like this big paradox that if we know so much about the world, and you know
    0:05:51 about distant galaxies, and about DNA, and subatomic particles, why are we doing so many
    0:05:53 self-destructive things?
    0:06:01 The basic answer you get from a lot of mythology and theology is that there is something wrong
    0:06:09 in human nature, and therefore we must rely on some outside souls, like a god or whatever,
    0:06:12 to save us from ourselves.
    0:06:17 And I think that’s the wrong answer, and it’s a dangerous answer, because it makes people
    0:06:19 abdicate responsibility.
    0:06:24 And I think that the real answer is that there is nothing wrong with human nature.
    0:06:28 The problem is with our information.
    0:06:35 Most humans are good people, they are not self-destructive, but if you give good people
    0:06:39 bad information, they make bad decisions.
    0:06:46 And what we see through history is that, yes, we become better and better at accumulating
    0:06:52 massive amounts of information, but the information isn’t getting better.
    0:06:59 Modern societies are as susceptible as Stone Age tribes to mass delusions and psychosis,
    0:07:06 if you think about Stalinism and Nazism in the 20th century, so they’re extremely sophisticated
    0:07:11 societies in terms of technology and economics and so forth.
    0:07:17 And yet their view of the world was really delusional.
    0:07:24 And this is what the book explores, is why is it that the quality of our information
    0:07:33 is not improving over time, and maybe the main answer that the book gives to this is
    0:07:36 that there is a misconception about what information does.
    0:07:41 Too many people, especially in places like Silicon Valley, they think that information
    0:07:46 is about truth, that information is truth, that if you accumulate a lot of information,
    0:07:49 you will know a lot of things about the world.
    0:07:52 But most information is junk.
    0:07:54 Information isn’t truth.
    0:08:01 The main thing that information does is to connect, to connect a lot of people into a
    0:08:05 society, a religion, a corporation, an army.
    0:08:11 And the easiest way to connect lots of people is not with the truth.
    0:08:18 The easiest way to connect people is with fantasies and mythologies and delusions and
    0:08:19 so forth.
    0:08:25 And this is why, yes, we have now the most sophisticated information technology in history
    0:08:29 and we are on the verge of destroying ourselves.
    0:08:35 You call that this idea that more information will lead to more truth and more wisdom, the
    0:08:40 semi-official ideology of the computer age.
    0:08:42 What is wrong with that assumption?
    0:08:44 Why is that not the case?
    0:08:51 Why is it not true that more information makes us less blinkered and more wise?
    0:08:53 Because most information isn’t truth.
    0:08:55 Most information isn’t facts.
    0:09:00 The truth is a rare and costly kind of information.
    0:09:06 If you want to write a story about anything, I don’t know, the Roman Empire, if you want
    0:09:14 to write a truthful account, you need to invest a lot of time and energy and money
    0:09:18 in research and in fact-checking and that’s difficult.
    0:09:22 If on the other hand you just invent some fiction, that’s very easy.
    0:09:24 It’s cheap.
    0:09:27 Fiction is much, much cheaper than the truth.
    0:09:34 The other thing is that the truth tends to be complicated because reality is complicated.
    0:09:37 And people often don’t like complicated stories.
    0:09:39 They want simple stories.
    0:09:45 So in a free market of information, the market will be flooded by fiction and fantasy and
    0:09:48 delusion and the truth will be crowded out.
    0:09:52 Do you think we overstate the role of truth in human life?
    0:09:56 Do you think it’s a mistake also to assume that people really care about the truth deep
    0:09:57 down?
    0:09:58 No.
    0:09:59 Vice versa.
    0:10:07 The opposite view, which is a recurring problem in history, is a very cynical view of humanity
    0:10:11 which discounts truth and focuses on power.
    0:10:14 This is something that you see again and again in history and you see it on both the right
    0:10:16 and the left.
    0:10:20 You see it with Marxists and now you see it with populists.
    0:10:27 This is something that Donald Trump and Karl Marx agree on, that the only reality is power,
    0:10:35 that humans are only interested in power, that any human interaction is a power struggle.
    0:10:40 So in any situation, the question to ask is, who are the winners and who are the losers?
    0:10:46 Somebody is telling you something, a journalist, a scientist, a politician, whoever.
    0:10:49 You don’t ask is it true or not.
    0:10:55 You ask whose interests are served, whose privileges are being defended.
    0:11:01 You think about it as a power struggle and this is a very cynical and destructive view
    0:11:07 of the world and it’s also wrong because, yes, humans are interested in power but not
    0:11:09 only in power.
    0:11:14 If you look at yourself, I guess most individuals, if they will examine themselves, they will
    0:11:20 acknowledge, yes, I want some measure of power in certain areas of life but that’s not the
    0:11:22 only thing I want.
    0:11:29 I also have a deep yearning, an authentic, honest yearning to know the truth about myself,
    0:11:36 about life in general, about the world and this is a very deep human need because you
    0:11:41 can never really be happy if you don’t know the truth about yourself.
    0:11:49 You can be very, very powerful and ignorance is strength in many cases in life but it’s
    0:11:55 not the way to real happiness and satisfaction and you see it.
    0:12:01 You look at figures like Vladimir Putin or Benjamin Netanyahu who I know from Israel,
    0:12:05 they are people obsessed with power and they are extremely powerful individuals and they
    0:12:08 are not particularly happy individuals.
    0:12:14 Well, part of the case you make in the book is that what these networks do is they privilege
    0:12:17 order over truth.
    0:12:18 Why is that?
    0:12:22 They privilege order over truth, what do we gain by that?
    0:12:24 Clearly we gain something or we wouldn’t do it.
    0:12:30 Yes, when this goes back to the issue of power, that there is a struggle for power in the
    0:12:32 world, this is true.
    0:12:38 Even though humans are interested in various things, when you come to build big networks
    0:12:45 of cooperation, whether it’s armies or states or economic operations, there is a struggle
    0:12:48 for power.
    0:12:56 In this struggle, truth is important to some extent but order is more important and I think
    0:13:02 this is the most fundamental mistake of the naive view of information which prevails in
    0:13:08 places like Silicon Valley, that people think that in the market of ideas, in the market
    0:13:18 of information, if one society is delusional and another society is committed to the facts,
    0:13:22 then the facts will empower you, you will know more about the world, you will be able
    0:13:26 to produce more powerful weapons so you will win.
    0:13:32 So adhering to the truth is a winning strategy even in terms of power.
    0:13:39 And this is a mistake because if you think for instance about producing an atom bomb,
    0:13:45 to produce an atom bomb you do need to know some facts about the world, about physics.
    0:13:49 If you ignore the facts of physics, your bomb will not explode.
    0:13:56 But to build an atom bomb, you need something else, you need order because a single physicist
    0:14:02 who knows nuclear physics well cannot build an atom bomb.
    0:14:08 You need millions of people to cooperate with you, you need miners to mine uranium, you
    0:14:15 need engineers and builders to build the reactor, you need farmers to grow potatoes and rice
    0:14:19 so all the engineers and physicists will have something to eat.
    0:14:23 And how do you get these millions of people to cooperate on this project?
    0:14:28 And if you just tell them facts about physics, this wouldn’t motivate anybody.
    0:14:29 What do they need?
    0:14:30 A story?
    0:14:31 They need a story.
    0:14:37 And it’s easier to motivate them with a fictional story than with the truth.
    0:14:44 And it’s the people who master these mythologies and theologies and ideologies who give the
    0:14:49 orders to the nuclear physicists in the end.
    0:14:57 But when you want to build an ideology that will inspire millions of people, a commitment
    0:14:59 to the facts is not so important.
    0:15:06 You can ignore the facts and still your ideology will explode in a big bang.
    0:15:14 And this is why over and over again throughout history, we see that there is, it’s not that
    0:15:21 our information networks become better and better at understanding the facts of the world.
    0:15:26 Yes, there is a process that we learn more about the world, but at the same time we also
    0:15:33 learn how to construct more effective mythologies and ideologies and the way to do it is not
    0:15:38 necessarily by adhering to the facts.
    0:15:47 So the way we would typically analyze a catastrophic movement like Stalinism or Nazism, you can
    0:15:50 look at it as an ideological phenomenon.
    0:15:55 You can look at it in materialist terms, but you think you actually understand something
    0:16:02 about the way a movement like that works by looking at it primarily as an information
    0:16:09 network propelled by exceptionally delusional ideas, but an information network nevertheless.
    0:16:13 What do we gain analytically by looking at a movement like that as an information network
    0:16:17 as opposed to any of those other things I mentioned?
    0:16:23 We tend to think about democracy and totalitarianism as different ethical movements.
    0:16:26 They are committed to different ethical ideas.
    0:16:28 And this is true, of course, to some extent.
    0:16:35 But underneath it, you see a different structure of an information network.
    0:16:41 Information flows differently in a totalitarian regime like the Stalinist Soviet Union and
    0:16:43 in a democratic system.
    0:16:50 Totalitarianism, dictatorships more generally, they are a centralized information system.
    0:16:56 All the information flows to one center where all the decisions are being made.
    0:16:59 And it also lacks any self-correcting mechanisms.
    0:17:07 There are no mechanisms that if Stalin makes a mistake, there is some mechanism that can
    0:17:10 identify and correct that mistake.
    0:17:16 Democracy on the other hand, it’s a distributed information system, and not all the information
    0:17:22 flows to the center because in a democracy, it’s not that the government elected by the
    0:17:24 majority makes all the decisions.
    0:17:31 No, the ideal is that you give as much autonomy to various individuals, organizations, communities,
    0:17:33 private businesses, and so forth.
    0:17:37 Only certain decisions must be made centrally.
    0:17:42 Like whether to go to war or make peace, you cannot let every community make up its own mind.
    0:17:44 So this is made centrally.
    0:17:49 But even in these cases, where the information flows to the center, where the decisions are
    0:17:55 being made, you have self-correcting mechanisms that can identify and correct mistakes.
    0:18:02 The most obvious mechanism is elections, that every few years, you try something, and if
    0:18:07 it doesn’t work, if we think it’s not bringing good results, we can correct it by replacing
    0:18:13 them with another party or another politician after a couple of years, which you can’t do
    0:18:14 in a dictatorship.
    0:18:17 In a dictatorship, you can’t say, “Oh, we made a mistake.
    0:18:18 Let’s try somebody else.
    0:18:20 Let’s try something else.”
    0:18:25 So this is the essential difference between the way that information functions in a dictatorship
    0:18:28 and in a democracy.
    0:18:32 So do you think of information networks as a bit of a double-edged sword?
    0:18:40 On the one hand, they make mass cooperation possible, but on the other hand, if they’re
    0:18:46 poorly designed, they engineer reliably catastrophic outcomes.
    0:18:52 Yeah, and mass cooperation means enormous power, but it can be used for good or ill.
    0:19:01 You can use it to create a healthcare system that takes care of the medical problems of
    0:19:08 entire populations, and you can use it to create a police state that surveils and punishes
    0:19:09 the entire population.
    0:19:14 It can be done with the same type of technology. Basically, both for a healthcare system and
    0:19:21 for a secret police, you need to amass enormous amounts of information and to analyze it.
    0:19:23 What kind of information?
    0:19:24 What do you do with it?
    0:19:28 That’s a different question, but the key thing is to understand history, not just in terms
    0:19:36 of ideologies and political ideas, but in terms of the underlying flow and structure of information.
    0:19:43 If we go back, let’s say, 5,000 years ago to one of the first crucial information revolutions,
    0:19:47 to understand how information technology shapes history.
    0:19:50 Think about the invention of writing.
    0:19:56 The invention of writing in technological terms, it’s extremely simple.
    0:20:02 In ancient Mesopotamia, people discovered that you can take clay tablets, and clay is
    0:20:04 basically just mud.
    0:20:10 You take mud, and you take a stick, and you imprint certain signs in the mud, and you
    0:20:15 get a written document, and you can write all kinds of things there.
    0:20:17 The technology is extremely simple.
    0:20:22 Of course, the key is finding the right code, but the technology itself, you just need mud
    0:20:29 and a stick, but it had an enormous impact on the shape of human societies.
    0:20:31 How does it work?
    0:20:33 Let’s think about something like ownership.
    0:20:36 What does it mean to own something?
    0:20:38 Like you own a field.
    0:20:40 What does it mean that I own a field?
    0:20:42 This field is mine.
    0:20:50 Before writing, if you live in a small Mesopotamian village like 7,000 years ago, it means that
    0:20:55 your neighbors agree that this is your field.
    0:20:57 Ownership is a communal affair.
    0:21:05 You have a field that means that your neighbors don’t bring their goats there and don’t pick
    0:21:08 fruits there without your permission.
    0:21:11 So ownership is a community affair.
    0:21:15 This limits your autonomy and power as an individual.
    0:21:21 You can’t sell your field to somebody else without the agreement of the community because
    0:21:25 ownership is a matter of communal agreement.
    0:21:30 And similarly, it’s very difficult for a distant authority, like a king living in the
    0:21:38 capital city hundreds of kilometers away, to know who owns what and to levy taxes.
    0:21:43 Because how can the king know who owns each field in hundreds of remote villages?
    0:21:45 It’s impossible.
    0:21:50 Then writing came along and changed the meaning of ownership.
    0:21:59 Now, owning a field means that there is a piece of dry mud with certain signs on it,
    0:22:04 which says that this field is mine, a document.
    0:22:11 And this decreases the power of the local community and empowers on the one side individuals
    0:22:14 and on the other side the king.
    0:22:20 So the fact that ownership is now this dry piece of mud, I can take this dry piece of
    0:22:25 mud and give it to you in exchange for a herd of goats.
    0:22:31 And I don’t care and you don’t care what the neighbors say because ownership now is this
    0:22:32 document.
    0:22:37 And this means that now I have greater power over my property.
    0:22:45 But it also means that now the king in the distant capital, he can know who owns what
    0:22:51 in the entire kingdom because he collects all these dry pieces of mud in something called
    0:22:56 an archive and he builds a centralized bureaucracy.
    0:23:02 And the bureaucrat sitting in the capital city can know who owns which fields in distant
    0:23:10 villages just by looking at these dry pieces of mud and he can start levying taxes.
    0:23:15 So what we see with the invention of writing, again, it’s a complex mechanism.
    0:23:17 It’s not one-sided.
    0:23:20 The community becomes less important.
    0:23:26 Individual rights, property rights, they become more important, but also centralized authority.
    0:23:33 And this is the moment in history in which we see the rise of central authoritarian systems.
    0:23:39 Kingdoms and then empires ruled by kings and tyrants and emperors.
    0:23:56 It was not possible without these dry pieces of mud.
    0:23:59 Support for the gray area comes from Mint Mobile.
    0:24:03 Getting a great deal usually comes with some strings attached.
    0:24:07 Maybe that enticing price only applies to the first few weeks or you have to physically
    0:24:13 mail a rebate form to Missouri or there is minuscule fine print that reads we’re lying,
    0:24:14 there’s no deal, we got you.
    0:24:17 Well, Mint Mobile offers deals with no strings attached.
    0:24:21 When they say you’ll pay $15 a month when you purchase a three-month plan, they mean
    0:24:22 it.
    0:24:26 All MetMobile plans come with high-speed data and unlimited talk and text delivered on the
    0:24:29 nation’s largest 5G network.
    0:24:32 Mint says you can even keep your phone, your contacts, and your number.
    0:24:35 It doesn’t get much easier than that, folks.
    0:24:39 To get this new customer offer and your new three-month premium wireless plan for just
    0:24:43 $15 a month, you can go to mintmobile.com/grayarea.
    0:24:46 That’s mintmobile.com/grayarea.
    0:24:51 You can cut your wireless bill to $15 a month at mintmobile.com/grayarea.
    0:24:55 $45 upfront payment required equivalent to $15 a month.
    0:24:58 New customers on first three-month plan only.
    0:25:01 Speed slower above 40 gigabytes on unlimited plan.
    0:25:04 Additional taxes, fees, and restrictions apply.
    0:25:19 See mintmobile.com for details.
    0:25:28 The newest revolutionary technology is AI, which I think, in the long run, will probably
    0:25:30 be more transformative than even–
    0:25:31 Than mud.
    0:25:37 And writing, and maybe everything else when it’s all said and done.
    0:25:45 But what makes AI for you a fundamentally different kind of information network?
    0:25:49 And what is it about that uniqueness that concerns you?
    0:25:52 So let’s start maybe with a story.
    0:25:57 So we were in ancient Mesopotamia, in what is today Iraq, 5,000 years ago.
    0:26:02 Now let’s move to a neighboring country, Iran, today.
    0:26:07 Like a scene today on the street in Isfahan or Tehran.
    0:26:13 In Iran, they have what is known as the hijab laws, that women, when they go in public,
    0:26:16 they must cover their hair.
    0:26:20 And this goes back to the Islamic revolution in 1979.
    0:26:28 But for many years, the Iranian regime had difficulty imposing the hijab laws on a relatively
    0:26:31 unwilling population.
    0:26:36 Because to make sure that each woman when she goes out in the street or drives her car
    0:26:44 is wearing a hijab, you need to place a policeman on every street, and you don’t have so many
    0:26:45 policemen.
    0:26:50 And then it also causes a lot of friction, because the policeman needs to arrest the
    0:26:57 woman and there could be shouting and there could be altercations, it’s a lot of friction.
    0:26:59 And then AI came along.
    0:27:06 And what happens today in Iran is that you don’t need this kind of morality police on
    0:27:09 every street and intersection.
    0:27:17 You have cameras, millions and millions of cameras, with facial recognition software,
    0:27:25 which automatically identify a woman, and can even identify her name, as she drives
    0:27:26 in her car.
    0:27:30 She’s in her own private car with the windows shut.
    0:27:37 And still the camera can identify that you are driving now in a public sphere with your
    0:27:39 hair uncovered.
    0:27:45 And the AI can immediately also pull out your personal details, your phone numbers,
    0:27:51 and send you an SMS message that you have just violated the hijab law.
    0:27:52 You must stop your car.
    0:27:54 Your car is impounded.
    0:27:57 And this happens every day.
    0:28:02 It’s not a science fiction Hollywoodian scenario of some dystopian future in 100 years.
    0:28:03 It’s a reality right now.
    0:28:04 This is just the beginning too.
    0:28:06 And this is just the beginning.
    0:28:13 The AI revolutionizes almost everything it touches, including surveillance.
    0:28:21 Previously, totalitarian regimes, they were limited by the need to rely on human agents.
    0:28:27 If you are Stalin or Khomeini or Hitler, and you want to follow every citizen 24 hours
    0:28:29 a day, you can’t.
    0:28:35 Because again, if you are Stalin and you have 200 million citizens in the Soviet Union,
    0:28:40 how do you get 400 million KGB agents to follow everybody?
    0:28:41 Not enough agents.
    0:28:46 And even if you have all these agents, who’s going to analyze all the data they accumulate?
    0:28:52 So even in Stalin’s Soviet Union, privacy was still the default for most of the day.
    0:28:58 But this is over because AIs can fulfill these tasks.
    0:29:02 You can follow with cameras and drones and smartphones and computers.
    0:29:12 You can follow everybody all the time and analyze the oceans of data this produces to
    0:29:16 monitor and to police a population.
    0:29:18 And this is just one example.
    0:29:23 The key thing is that AI is not a tool.
    0:29:25 It is an agent.
    0:29:26 It’s an active agent.
    0:29:28 It can make decisions by itself.
    0:29:32 It can even create new ideas by itself.
    0:29:39 When we produce AI, we are not just producing more tools like printing presses or atom bombs
    0:29:41 or spaceships.
    0:29:47 We are creating new types of non-human agents.
    0:29:52 And it’s not like a Hollywoodian scenario that you have one big supercomputer that now
    0:29:54 tries to take over the world.
    0:29:55 No.
    0:30:01 We need to think about it as millions and millions of AI agents, AI bureaucrats, AI
    0:30:10 soldiers, AI policemen, AI bank managers and school managers and so forth that constantly
    0:30:12 watch us and make decisions about us.
    0:30:18 Well, to go back to the point you were making earlier about how rudimentary a technology
    0:30:24 like writing was or even the printing press or television or radio.
    0:30:28 By comparison, AI is infinitely more complicated.
    0:30:30 And for that reason, unpredictable.
    0:30:33 I mean, can we even imagine?
    0:30:36 Can we even anticipate where this might go?
    0:30:41 When you consider how transformative writing was, do we have any inkling of how transformative
    0:30:43 and how uprooting this might be?
    0:30:50 No, because the very nature of AI is that it is unpredictable.
    0:30:56 If humans can predict everything it’s going to do, it is not an AI.
    0:30:59 You know, there is a lot of hype nowadays about AI.
    0:31:07 So people, especially when they want to sell you something, they paste the label AI on everything.
    0:31:13 Like this is AI water and this is AI air and this is an AI table and so forth.
    0:31:14 So what is AI?
    0:31:22 AI is something, a machine that can learn and change by itself.
    0:31:29 It is initially created by humans, but what humans give it is the ability to learn and
    0:31:31 change by itself.
    0:31:33 So uncontrollable by definition.
    0:31:37 It’s therefore exactly uncontrollable by definition.
    0:31:41 If you can predict and control everything it will do down the line, it’s not an AI.
    0:31:48 It’s just an automatic machine, which we’ve had for decades and generations.
    0:31:49 Why do we keep doing this?
    0:31:56 Why do human beings keep building things that we do not understand?
    0:31:59 Where does this drive to summon forces that we can’t control come from?
    0:32:04 I mean, maybe this is the stuff of religion, but you’re here, so I’m asking.
    0:32:05 Yeah.
    0:32:11 Now, first of all, it should be said that there is enormous positive potential in AI.
    0:32:16 My job as a historian and a philosopher is to talk mainly about the dangers because you
    0:32:22 hear so much about the positive potential from the entrepreneurs and the business people
    0:32:23 who develop it.
    0:32:26 But yes, it should be very clear that there is enormous positive potential.
    0:32:32 AI can create the best healthcare system in history, the best education system in history,
    0:32:40 this ability to understand human beings and to come up with ideas that we didn’t think
    0:32:41 about.
    0:32:43 It is potentially good.
    0:32:49 Like it can invent new kinds of medicines that no human doctor ever thought about.
    0:32:51 So there is this attraction.
    0:32:59 And we also have now this arms race situation when the people who develop the technology,
    0:33:02 they understand many of the dangers.
    0:33:07 They understand them better than almost anybody else.
    0:33:13 But they are caught in this arms race that they say, “I know it’s dangerous, but if
    0:33:21 I slow down and my competitor, either the other corporation or the other country, if
    0:33:26 they don’t slow down, they keep going as fast as they can, and I slow down, I will be left
    0:33:32 behind and they will win the race, and then they will control the world and will be able
    0:33:34 to decide what to do with AI.
    0:33:35 And I’m a good guy.
    0:33:37 I’m aware of the dangers.
    0:33:40 So it’s good if I win.
    0:33:44 And then I can take responsible decisions what to do with this technology.”
    0:33:47 And this is a story everybody tells themselves.
    0:33:52 Elon Musk says it, and Sam Altman says it, the United States says it, China says it.
    0:33:57 Everybody says that we know it’s dangerous, but we can’t slow down because the other side
    0:34:00 won’t slow down.
    0:34:05 You said earlier, sapiens are the smartest and stupidest of all the animals.
    0:34:11 Maybe it’s just a law of nature that intelligence and self-destruction at a certain level just
    0:34:13 go hand in hand.
    0:34:16 Maybe we’re living through that.
    0:34:24 On one level, intelligence creates power and lots of animals can do self-destructive things,
    0:34:29 but if you’re a rat or you’re a raccoon and you do something self-destructive, the damage
    0:34:30 will be limited.
    0:34:33 Rats don’t have labor camps and atomic bombs.
    0:34:34 Yeah.
    0:34:39 But when we’re talking about AI, we tend to talk about the political and economic impacts.
    0:34:47 But in the book, you also touch on the potential cultural and even spiritual impacts of this
    0:34:55 technology, that a world of AI is going to give rise to new identities, new ways of being
    0:34:56 in the world.
    0:35:02 And that might unleash all kinds of competition over not just how to organize society, but
    0:35:05 what it means to be in the world as a human being.
    0:35:10 I mean, can we even begin to imagine the direction that might go?
    0:35:17 Not really, because until today, all of human culture was created by human minds.
    0:35:25 We live inside culture, everything that happens to us, we experience it through the mediation
    0:35:34 of cultural products, mythologies, ideologies, artifacts, songs, plays, TV series, we live
    0:35:38 cocooned inside this cultural universe.
    0:35:44 And until today, everything, all the tools, all the poems, all the TV series, all the
    0:35:50 mythologies, they are the product of organic human minds.
    0:35:59 And now, increasingly, they will be the product of inorganic AI intelligences, alien intelligences.
    0:36:06 Again, AI, the acronym AI, traditionally stood for artificial intelligence, but it should
    0:36:11 actually stand for alien intelligence. Alien, not in the sense of coming from outer space.
    0:36:20 Alien in the sense that it’s very, very different from the way that humans think and make decisions.
    0:36:22 Because it’s not organic.
    0:36:27 To give you a concrete example, one of the key moments in the AI revolution, maybe
    0:36:35 eight years ago, the aha moment for a lot of governments and militaries around the world,
    0:36:40 was when AlphaGo defeated Lee Sedol in a Go tournament.
    0:36:46 Now, Go is a board game of strategy, like chess, but much more complicated, invented in ancient
    0:36:52 China, and it has been considered, not only in China, also in Korea, in Japan, one of
    0:36:59 the basic arts that every civilized person should know.
    0:37:04 If you’re a Chinese gentleman in the Middle Ages, you know calligraphy, and you know how
    0:37:08 to play some music, and you know how to play Go.
    0:37:14 Entire philosophies developed around the game, which was seen as a mirror for life and for
    0:37:16 politics.
    0:37:25 And then, an AI program, AlphaGo, in 2016, taught itself how to play Go.
    0:37:29 And it defeated, it crushed, the human world champion.
    0:37:32 But what is most interesting is the way it did it.
    0:37:40 It deployed a strategy, which when it first played it, all the experts said, “What is
    0:37:41 this nonsense?
    0:37:44 Nobody plays Go like that!”
    0:37:47 And it turned out to be brilliant.
    0:37:55 Tens of millions of humans played this game, and now we know that they explored only a
    0:37:58 very small part of the landscape of Go.
    0:38:05 If you imagine all the ways to play Go as a kind of geography, a planet.
    0:38:11 So humans were stuck on one island, and they thought this is the whole planet of Go.
    0:38:17 And then AI came along, and within a few weeks, it discovered new continents.
    0:38:24 And now also humans play Go very differently than they played it before 2016.
    0:38:28 Now you can say this is not important, this is just, you know, a game.
    0:38:32 But the same thing is likely to happen in more and more fields.
    0:38:37 If you think about finance, so finance is also an art.
    0:38:43 The entire financial structure that we know is based on the human imagination.
    0:38:49 The history of finance is the history of humans inventing financial devices.
    0:38:51 Money is a financial device.
    0:38:57 Bonds, stocks, ETFs, CDOs, all these strange things that humans invent.
    0:39:00 This is a product of human ingenuity.
    0:39:07 And now AI comes along and starts inventing new financial devices that no human being
    0:39:10 ever thought about, ever imagined.
    0:39:16 So again, we were stuck on a small financial island, and now it’s getting bigger and bigger.
    0:39:23 And what happens, for instance, if finance becomes so complicated because of these new
    0:39:29 creations of AI, that no human being is able to understand finance anymore?
    0:39:34 I mean, even today, how many people really understand the financial system?
    0:39:36 Less than one percent.
    0:39:42 In 10 years, the number of people who understand the financial system could be exactly zero.
    0:39:47 Because, you know, the financial system is the ideal playground for AI.
    0:39:52 Because it’s a world of pure information and mathematics.
    0:39:57 AI has difficulty still dealing with the physical world outside.
    0:40:01 This is why every year they tell us, Elon Musk tells us, “Next year you will have fully
    0:40:03 autonomous cars on the road.”
    0:40:05 And it doesn’t happen.
    0:40:06 Why?
    0:40:10 Because to drive a car, you need to interact with the physical world and the messy world
    0:40:15 of traffic in New York with all the construction and pedestrian and whatever.
    0:40:16 Very difficult.
    0:40:18 Finance, much easier.
    0:40:20 Just numbers.
    0:40:23 And what happens?
    0:40:30 If in this informational realm where AI is a native and we are the aliens, we are the
    0:40:36 immigrants, it creates such sophisticated financial devices and mechanisms that nobody
    0:40:37 understands.
    0:40:41 If a handful of banks could produce 2008, what could AI do?
    0:40:42 Exactly.
    0:40:49 2008 originally happened because of these new financial devices like CDOs, collateralized debt obligations,
    0:40:53 that a few wizards on Wall Street invented.
    0:40:57 Nobody understood them, not even the regulators, so they were not regulated properly.
    0:41:02 For a couple of years, everything seemed okay, at least some people were making billions
    0:41:05 out of them, and then everything collapsed.
    0:41:12 The same thing can happen on a much, much larger scale as AI takes over finance.
    0:41:20 So when you look at the world now and project out into the future, is that what you see?
    0:41:28 Societies becoming trapped in these incredibly powerful, but very poorly designed information
    0:41:33 networks, and I say AI is poorly designed precisely because it doesn’t really have
    0:41:36 any course-correcting mechanisms.
    0:41:37 It’s up to us.
    0:41:39 It’s not deterministic.
    0:41:41 It’s not inevitable.
    0:41:48 We need to be much more careful and thoughtful about how we design these things.
    0:41:54 Again, understanding that they are not tools, they are agents, and therefore down the road
    0:41:59 are very likely to get out of our control if we are not careful about them.
    0:42:04 And it’s not that you have a single supercomputer that tries to take over the world.
    0:42:11 You have these millions of AI bureaucrats in schools, in factories, everywhere making
    0:42:18 decisions about us in ways that we do not understand.
    0:42:23 Democracy is to a large extent about accountability.
    0:42:27 Accountability depends on the ability to understand decisions.
    0:42:32 If more and more of the decisions in society, like you apply to a bank to get a loan, and
    0:42:37 the bank tells you no, and you ask why not, and the bank says we don’t know.
    0:42:43 The algorithm went over all the data and decided not to give you a loan, and we just trust
    0:42:44 our algorithm.
    0:42:47 This to a large extent is the end of democracy.
    0:42:52 You can still have elections and choose whichever human you want, but if humans are no longer
    0:42:57 able to understand these basic decisions about their lives, why didn’t you give me a loan,
    0:43:00 then there is no longer accountability.
    0:43:05 You say we still have control over these things, but for how long?
    0:43:06 What is that threshold?
    0:43:08 What is the event horizon?
    0:43:11 Will we even know it when we cross it?
    0:43:13 Nobody knows for sure.
    0:43:17 It’s moving faster than I think almost anybody expected.
    0:43:22 Could be three years, could be five years, could be 10 years, but I don’t think that
    0:43:23 much more than that.
    0:43:25 That’s not much.
    0:43:30 Again, think about it in a cosmic perspective.
    0:43:38 We are the product as human beings of four billion years of organic evolution.
    0:43:42 Organic evolution, as far as we know, began on planet Earth four billion years ago with
    0:43:48 these tiny microorganisms, and it took billions of years for the evolution of multicellular
    0:43:55 organisms and reptiles and mammals and apes and humans.
    0:44:03 Digital evolution, non-organic evolution is millions of times faster than organic evolution.
    0:44:10 We are now at the beginning of a new evolutionary process that might last thousands and even
    0:44:13 millions of years.
    0:44:20 The AIs we know today in 2024, ChatGPT and all that, they are just the amoebas of the
    0:44:22 AI evolutionary process.
    0:44:23 They are just the amoebas.
    0:44:25 That’s not very comforting.
    0:44:29 What would an AI T. rex look like?
    0:44:37 The thing is that the AI T. rex is not billions of years in the future, maybe it’s just 20 years
    0:44:38 in the future.
    0:44:44 Because, again, another key thing about, we are now like the big struggle on planet Earth
    0:44:52 right now, is that after four billion years of organic life, we now have a new kind of
    0:44:57 entity, agent on the planet, which is inorganic.
    0:45:03 And inorganic entities, they don’t live by cycles like us.
    0:45:08 We live day and night, winter and summer, growth and decay.
    0:45:11 Sometimes we are active, sometimes we need to sleep.
    0:45:13 AIs don’t need to sleep.
    0:45:20 They are always on, they are tireless, they are relentless, and they increasingly control
    0:45:21 the world.
    0:45:28 Now a big question is, as organic beings who need to rest sometimes, what happens when
    0:45:33 we are controlled by agents who never need to rest?
    0:45:37 Even Stalin’s KGB agents, they needed to sleep sometime.
    0:45:42 The police cameras in Iran, they never sleep.
    0:45:47 If you think about the news cycle, if you think about the market, Wall Street.
    0:45:53 And a curious thing, an important fact about Wall Street, Wall Street is not always on.
    0:45:59 It’s open Mondays to Fridays, 9.30 in the morning to four o’clock in the afternoon.
    0:46:06 If a new war in the Middle East erupts at five minutes past four on a Friday, Wall Street
    0:46:11 will be able to react only on Monday morning because it’s off for the weekend.
    0:46:16 And this is a good thing because organic entities need to rest.
    0:46:25 Now what happens when the markets are taken over, are run by tireless, relentless AIs?
    0:46:29 What happens to human bankers, to human investors?
    0:46:32 They also need to be on all the time.
    0:46:38 What happens to human politicians, to journalists who need to be on all the time?
    0:46:58 If you keep an organic entity on all the time, it eventually collapses and dies.
    0:47:00 Support for the gray area comes from Shopify.
    0:47:05 When it comes to building a sustainable company, how you sell your product is just as important
    0:47:07 as what you’re selling.
    0:47:11 Because even the most incredible French press machine or that in-office putting green won’t
    0:47:16 reach customers if the buying process is complicated, buggy or broken.
    0:47:20 Shopify offers a set it and forget it solution to sales that smart businesses are turning
    0:47:22 to every single day.
    0:47:25 Shopify is an all-in-one digital commerce platform that may help your business sell
    0:47:27 better than ever before.
    0:47:31 Their shop pay feature may convert more customers and end those abandoned shopping carts for
    0:47:32 good.
    0:47:36 There’s a reason companies like Allbirds turn to Shopify to sell more products to more
    0:47:41 customers, whether they’re online, in a brick-and-mortar shop or on social media.
    0:47:43 Businesses that sell more sell with Shopify.
    0:47:47 You can upgrade your business and get the same checkout Allbirds uses with Shopify.
    0:47:53 You can sign up for your $1 per month trial period at Shopify.com/Vox.
    0:47:59 Just go to Shopify.com/Vox to upgrade your selling today.
    0:48:04 Support for the gray area comes from Greenlight.
    0:48:09 The school year is already underway and you’ve probably wrapped up all your back-to-school
    0:48:10 shopping.
    0:48:14 Which means it’s time to kick back and pretend like you remember how to do algebra when
    0:48:15 your kid needs help with homework.
    0:48:19 But if you want your child to do more learning outside the classroom that will help later
    0:48:21 on, then you might want to try Greenlight.
    0:48:26 It can help teach your kids about money and not just the adding and subtracting, but how
    0:48:27 to manage it.
    0:48:30 Greenlight is a debit card and money app for families.
    0:48:35 Parents can keep an eye on kids spending and money habits and kids learn how to save, invest,
    0:48:36 and spend wisely.
    0:48:41 And with a Greenlight Infinity plan, you get even more financial literacy resources and
    0:48:44 teens can check in thanks to family location sharing.
    0:48:49 My kid’s a bit too young for this, but I’ve got a colleague here at Vox who uses it with
    0:48:51 his two boys and he loves it.
    0:48:55 You can join the millions of parents and kids who use Greenlight to navigate life together.
    0:49:01 You can sign up for Greenlight today and get your first month free when you go to greenlight.com/grayarea.
    0:49:20 That’s greenlight.com/grayarea to try Greenlight for free, greenlight.com/grayarea.
    0:49:24 You’ve been thinking and writing about AI for several years now.
    0:49:31 My sense is that you’ve become more, not less worried about where we’re going.
    0:49:32 Am I reading you right?
    0:49:34 Yes, because it’s accelerating.
    0:49:40 When I published “Homo Deus” in 2016, all this sounded like abstract philosophical
    0:49:47 musings about something that might happen generations or centuries in the future.
    0:49:49 And now it’s extremely urgent.
    0:49:53 And again, I don’t think it can be said enough.
    0:49:58 You also talk to a lot of people who work in Silicon Valley, people who work on AI.
    0:50:01 This is the consensus view among them as well.
    0:50:07 They are keenly aware how combustible this is, but they can’t help but continue on,
    0:50:11 which says something about the insanity and the power of our systems.
    0:50:14 There are two very, very strong motivations there.
    0:50:20 On the one hand, they are very concerned, but they are concerned that the bad guys will
    0:50:22 get there first.
    0:50:29 Like they naturally see themselves as the good guys and they say, “This is coming.”
    0:50:34 The biggest thing, not just in human history, the biggest thing in evolution since the beginning
    0:50:37 of life is coming.
    0:50:39 Who do you want to be in control?
    0:50:44 Do you want Putin to be in control or do you want me, a good guy, to be in control?
    0:50:48 So obviously we need to move faster to beat them.
    0:50:55 And then there is, of course, the other attraction that this is the biggest thing maybe since
    0:50:56 the beginning of life.
    0:51:01 If you think about the timeline of the universe, as far as we know it today, so you have the
    0:51:08 Big Bang 13 billion years ago, then nothing much happens until four billion years ago
    0:51:12 life emerges on planet Earth, the next big thing.
    0:51:15 And then for four billion years, nothing much happens.
    0:51:18 It’s all the same organic stuff.
    0:51:22 So you have amoebas and you have dinosaurs and you have homo sapiens, but it’s the same
    0:51:24 basic organic stuff.
    0:51:30 And then you have Elon Musk or Sam Altman or whoever it is going to be,
    0:51:37 and the start of a new evolutionary process of inorganic lifeforms that could spread very
    0:51:43 quickly from planet Earth to colonize Mars and Jupiter and other galaxies.
    0:51:48 Because again, as organic entities, it will be very, very difficult for us to leave planet
    0:51:49 Earth.
    0:51:52 But for AI, much, much easier.
    0:52:00 So if ever an earthly civilization is going to colonize the galaxy, it will not be a human
    0:52:02 or an organic civilization.
    0:52:09 It’s likely to be an inorganic civilization and to think that I can be the person who
    0:52:12 kind of starts the whole thing.
    0:52:20 So this God complex, I think, is also very, very prevalent, not just in Silicon Valley,
    0:52:25 also in China and other places where this technology is being developed.
    0:52:28 And this is an explosive mix.
    0:52:36 A question you ask in the book is whether democracies are compatible with these 21st
    0:52:40 century information networks.
    0:52:41 What’s your answer?
    0:52:43 Depends on our decisions.
    0:52:44 What do you mean?
    0:52:49 First of all, we need to realize that information technology is not something on the side.
    0:52:51 It’s not that you have democracy,
    0:52:55 and then on the side, you have information technology.
    0:53:00 No, information technology is the foundation of democracy.
    0:53:05 Democracy is built on top of the flow of information.
    0:53:13 For most of history, there was no possibility of creating large-scale democratic structures
    0:53:17 because the information technology was missing.
    0:53:22 Democracy, as we said, is basically a conversation between a lot of people.
    0:53:28 And in a small tribe or a small city-state thousands of years ago, you could get the
    0:53:34 entire population, a large percentage of the population, let’s say, of ancient Athens,
    0:53:39 in the city square to decide whether to go to war with Sparta or not.
    0:53:43 It was technically feasible to hold a conversation.
    0:53:50 But there was no way that millions of people spread over thousands of kilometers could talk
    0:53:51 to each other
    0:53:54 and hold the conversation in real time.
    0:54:01 Therefore, you have not a single example of a large-scale democracy in the pre-modern
    0:54:02 world.
    0:54:05 All the examples are very small scale.
    0:54:10 Large-scale democracy becomes possible only after the rise of newspaper and telegraph
    0:54:12 and radio and television.
    0:54:18 And now you can have a conversation between millions of people spread over a large territory.
    0:54:22 So democracy is built on top of information technology.
    0:54:28 Every time there is a big change in information technology, there is an earthquake in democracy
    0:54:30 which is built on top of it.
    0:54:35 And this is what we are experiencing right now with social media algorithms and so forth.
    0:54:38 It doesn’t mean it’s the end of democracy.
    0:54:41 The question is, will democracy adapt?
    0:54:43 And adaptation means regulation.
    0:54:45 Well, that’s the problem, right?
    0:54:49 As the technology gets more and more powerful, the lag time shrinks.
    0:54:55 The time you have for that adaptation also shrinks.
    0:54:58 I’m not sure we have enough time in this case.
    0:55:04 Well, we’ll just have to do our best, but we have to try.
    0:55:07 And I don’t see that we are trying hard enough.
    0:55:14 You know, again, this kind of prevalent mood in places like Silicon Valley is that this
    0:55:18 is not the time to slow down or to regulate.
    0:55:22 We can do it later, but we can’t.
    0:55:28 The first thing they teach you when you learn how to drive a car is to press the brakes.
    0:55:33 Only afterwards, they teach you how to press the fuel pedal, the accelerator.
    0:55:36 And we are now learning how to drive AI.
    0:55:40 And they teach us only how to press the accelerator.
    0:55:43 Are we learning how to drive AI, or is AI learning how to drive us?
    0:55:45 That’s actually more accurate.
    0:55:51 But we are still, for a few more years, we are still in the driver’s seat.
    0:55:54 It is still not out of our control.
    0:56:00 Do you think these technologies, and I’m including social media and smartphones here, have enabled
    0:56:07 a level of group or herd or mass hysteria that maybe wasn’t possible before these technologies?
    0:56:11 It was always possible, and I’ll give you an example.
    0:56:13 Or at greater scales, I should say.
    0:56:18 It’s important to understand what is different, because conspiracy theories and mass hysteria,
    0:56:20 they are not new.
    0:56:27 When print was invented, or print was brought to Europe in the 15th century, the result
    0:56:29 was not a scientific revolution.
    0:56:36 It was a wave of wars of religions and witch hunts, and because most of the information
    0:56:41 spread by the printing press was junk information and conspiracy theories and fake news and
    0:56:42 so forth.
    0:56:47 If you think about the Soviet Union in the 20th century, so one of the biggest conspiracy
    0:56:54 theories and most remarkable conspiracy theories in the 20th century was the doctor’s plot.
    0:57:01 Soviet Union, early 1950s, the regime comes up with a conspiracy theory that Jewish doctors
    0:57:09 in the service of a Zionist imperialist conspiracy against the glorious Soviet Union are murdering
    0:57:16 Soviet leaders, using their power as doctors to murder Soviet leaders.
    0:57:21 This conspiracy theory is spread by the organs of the government, the newspapers, the radios,
    0:57:23 and then it gets amplified.
    0:57:31 It merges with age-old anti-Semitic conspiracy theories, and people start believing that Jewish
    0:57:37 doctors are murdering not just Soviet leaders, they are murdering babies and children in
    0:57:38 hospitals.
    0:57:43 This is the old blood libel against Jews, and then it gets bigger, and people think they
    0:57:49 are murdering everybody, like the Jewish doctors are trying to murder all Soviet citizens,
    0:57:55 and because a large percentage of Soviet doctors were Jews, the final iteration of
    0:58:02 this conspiracy theory, that’s 1952, 1953, is that doctors in general, there is a conspiracy
    0:58:07 of doctors to kill the whole Soviet population to destroy the Soviet Union.
    0:58:09 This is the famous doctor’s plot.
    0:58:16 Now this sounds insane, but an entire country was gripped by hysteria that the doctors
    0:58:18 are trying to kill us.
    0:58:21 Now then came the real twist.
    0:58:29 Stalin had a stroke in, I think it was May, 1953, and his bodyguards enter after a couple
    0:58:30 of hours.
    0:58:32 He doesn’t show up for lunch, for dinner, what’s happening?
    0:58:39 So they eventually, hesitantly, enter his dacha, and he’s lying on the floor unconscious.
    0:58:40 He had a stroke.
    0:58:41 What to do?
    0:58:47 Now usually there is a doctor around, Stalin’s personal physician, but his personal physician
    0:58:53 was at that very moment being tortured in the basement of the Lubyanka prison because
    0:58:57 they suspected that he was part of the doctor’s plot.
    0:58:58 So what do we do?
    0:58:59 Do we call a doctor?
    0:59:04 So they call the Politburo members, and you have all these big shots, big wigs, Beria,
    0:59:07 and Malenkov, and Khrushchev, they come to the dacha.
    0:59:09 What do we do?
    0:59:13 So eventually the danger passes because Stalin dies.
    0:59:20 And this is one of the most sophisticated societies in human history, and it is gripped by this
    0:59:25 mass hysteria that doctors are trying to kill everybody.
    0:59:30 So this is not something, when you look at the conspiracy theories today, they still have
    0:59:32 a way to go.
    0:59:37 So it’s not that the 2010s or the 2020s is the first time that people had this problem
    0:59:41 with conspiracy theories, but the mechanism is different.
    0:59:47 In the 1950s, it was initially driven by the decisions of human apparatchiks,
    0:59:54 the bureaucrats in the Communist Party. Now it is driven by non-human algorithms.
    0:59:57 Now the algorithm drops it on your uncle’s Facebook feed.
    0:59:58 That’s different.
    1:00:01 And again, the algorithms, they don’t care.
    1:00:06 I mean, they don’t even understand the content of the conspiracy theory.
    1:00:07 They only know one thing.
    1:00:15 They were given a goal, engagement, user engagement, and they discover by trial and error on millions
    1:00:20 of human guinea pigs, that if you show somebody a hate-filled conspiracy theory, it catches
    1:00:25 their attention and they stay longer on the platform and they tell all their friends.
    1:00:33 The new thing now is that this is not being done to us by human ideologues in the Communist
    1:00:34 Party.
    1:00:40 It’s being done by non-human agents.
    1:00:46 So in the more immediate term, what do you think are the greatest threats to democratic
    1:00:47 societies in particular?
    1:00:49 Is it the misinformation?
    1:00:52 Is it the lack of privacy?
    1:00:57 Is it the emergence of increasingly sophisticated algorithms that understand us better than
    1:00:59 we understand ourselves?
    1:01:00 Is it all the above?
    1:01:03 How would you triage those threats?
    1:01:09 I would focus on two problems, one very old, one quite new.
    1:01:14 The new problem is that we are seeing the democratic conversation collapsing all over
    1:01:15 the world.
    1:01:19 Again, democracy is basically a conversation.
    1:01:25 And what we see now in the US, in Israel, in Brazil, all over the world, the conversation
    1:01:30 collapses in the sense that people can no longer listen to each other.
    1:01:33 They can’t agree on the most basic facts.
    1:01:37 They can’t have a reasoned debate anymore.
    1:01:44 And you cannot have a democracy if you cannot have a reasoned debate between the citizens.
    1:01:48 And in every country, they give these unique explanations.
    1:01:53 In the US, they will explain to you the unique situation of American society and politics
    1:01:57 and the legacy of slavery and so forth and racism.
    1:02:01 But then you go to Brazil, and they have their own explanations there.
    1:02:03 And you go to Israel, and they have their explanations.
    1:02:08 If it’s happening at the same time all over the world, it cannot be the result of these
    1:02:10 specific causes.
    1:02:12 It must be a universal cause.
    1:02:14 And the universal cause is the technology.
    1:02:18 Again, democracy is built on top of information technology.
    1:02:22 We now have this immense revolution in information technology.
    1:02:24 There is an earthquake in democracy.
    1:02:26 We need to figure it out.
    1:02:29 And nobody knows for sure what is happening.
    1:02:36 But I would ask Zuckerberg and Elon Musk and all these people, you are the experts on
    1:02:42 information technology, put everything else aside and explain to us what is happening.
    1:02:46 It doesn’t matter if you support the Democrats or the Republicans or whatever.
    1:02:50 Everybody can agree that the conversation is collapsing.
    1:02:54 Explain to us, why is it that we have the most sophisticated information technology
    1:02:58 in history that you created and we can’t talk with each other anymore?
    1:02:59 What’s happening?
    1:03:01 You’ve been in rooms with some of these people.
    1:03:02 Did you ask them that question?
    1:03:04 What did they say?
    1:03:05 They evade the question.
    1:03:07 They try to shift responsibility to somebody else.
    1:03:09 Oh, we have just a platform.
    1:03:10 It’s the users.
    1:03:11 It’s the government.
    1:03:12 It’s this.
    1:03:13 It’s that.
    1:03:15 But this is what we need them to explain to us.
    1:03:16 You’re the experts.
    1:03:17 Tell us what is happening.
    1:03:21 Because I think it’s the one thing that Democrats and Republicans, for instance, in the US can
    1:03:25 still agree on is that the conversation is collapsing.
    1:03:27 So that’s the new thing.
    1:03:34 The other danger to democracy is what happens if you give so much power to this small group
    1:03:42 of people or to one person, and they use this power not to pursue certain specific policies,
    1:03:50 but to pursue power, that they use the power of government then to destroy democracy itself,
    1:03:52 to destroy the checks and balances.
    1:03:57 They use democracy to gain power and then use their power to destroy democracy.
    1:03:59 We’ve seen it again and again in history.
    1:04:06 Now recently in Venezuela, Chavez originally came to power in a free and fair elections.
    1:04:13 But then his movement used the power of the government to destroy the democratic checks
    1:04:16 and balances, free courts, free media.
    1:04:22 Currently they appointed their own people to the elections committee, and now they have
    1:04:23 elections.
    1:04:28 They lost big time, Maduro, but they claim they won because they control all the levers
    1:04:29 of power.
    1:04:30 So you can’t get rid of them.
    1:04:36 And we saw the same thing happening in Russia with Putin, and this is not new.
    1:04:45 This goes back to ancient Greece, that how do you make sure that you don’t elect to power
    1:04:51 people who then focus on perpetuating their power?
    1:04:53 There’s no safeguard for that.
    1:04:54 That’s a built-in feature of democracy.
    1:04:55 That’s a built-in feature.
    1:04:59 It contains the seeds of its own destruction, always has, always will.
    1:05:00 Yeah.
    1:05:05 So again, what we see in mature democracies, like the United States, is the realization
    1:05:09 that we cannot have just one safety mechanism.
    1:05:15 We need several different self-correcting mechanisms, because if you have just one mechanism,
    1:05:22 like elections, this will not be enough, because the government can use all its force to rig
    1:05:24 the elections.
    1:05:30 So you must have additional safety mechanisms, additional self-correcting mechanisms, like
    1:05:33 a free media, like independent courts.
    1:05:39 And what you see with the rise of these new authoritarian figures, like Chávez and Maduro
    1:05:47 in Venezuela, like Putin, like Netanyahu, is that once they get to power, they systematically
    1:05:52 go after these safety mechanisms, these self-correcting mechanisms.
    1:05:55 They destroy the independence of the courts.
    1:05:58 They fill the courts with their own loyalists.
    1:06:02 They destroy the independence of media outlets.
    1:06:06 They make the media the mouthpiece of the government.
    1:06:11 And step by step, they destroy all these other mechanisms.
    1:06:14 And then they don’t need to abolish the elections.
    1:06:21 If you destroyed all the other safety measures, it’s very good for a dictator to actually keep
    1:06:29 elections as a kind of dictatorial ceremony in which the dictator proves that he enjoys
    1:06:31 the support of the people.
    1:06:37 They always win these kind of absurd majorities, like 70%, 80%, 99%.
    1:06:38 So you still have elections.
    1:06:40 You have elections in North Korea.
    1:06:46 Like every four or five years, like clockwork, there are elections in North Korea, and you
    1:06:51 have hundreds of new delegates and new representatives of the North Korean people.
    1:06:55 And it’s just a ritual in a totalitarian regime.
    1:07:00 Of course, those are sham elections, but there’s also no law of political nature that says
    1:07:04 a democratic public cannot vote itself out of existence.
    1:07:05 That’s happened before.
    1:07:06 It’ll happen again.
    1:07:09 And it seems much more likely to happen if you have a population drunk on algorithmic
    1:07:10 news feeds.
    1:07:11 Yeah.
    1:07:18 And because democracy is a conversation, the key issue is what are the main issues people
    1:07:19 are talking about?
    1:07:20 It’s even before the answers.
    1:07:24 It’s what are the things people talk about?
    1:07:26 Do they talk about climate change or immigration?
    1:07:31 Do they talk about AI or gun control or abortion rights?
    1:07:33 What do they talk about?
    1:07:40 Very often in political strategies, the key thing people say is to change the conversation.
    1:07:42 We need to make people stop talking about this.
    1:07:46 We have a problem in this area, so we don’t want people to even think about this.
    1:07:48 Let’s talk about something else.
    1:07:53 And today, the kingmakers in this arena are no longer humans.
    1:07:54 They are the algorithms.
    1:08:01 They decide what are the main issues of the day because they are so good at capturing
    1:08:02 human attention.
    1:08:10 Again, they experimented on billions of human guinea pigs over the last 10 or 15 years,
    1:08:16 and they became very, very good at knowing how to press our emotional buttons and capturing
    1:08:40 our attention.
    1:08:44 You know what would make that easier?
    1:08:47 A closet full of cool and comfortable clothes.
    1:08:51 Bombas can help with quality basics you have to feel to believe.
    1:08:55 Bombas offers incredibly comfortable essentials like socks, underwear, and buttery smooth
    1:08:58 t-shirts you’ll want to wear every day.
    1:09:02 They just released a whole bunch of playful new colors for fall and sweat wicking performance
    1:09:06 socks ready for your next workout or leaf pile cannonball.
    1:09:12 I’ve tried Bombas myself and I gotta say I’ve been rocking these socks for almost a year
    1:09:13 now.
    1:09:17 At first, they were my go-to workout socks, but now I just wear them all the time.
    1:09:21 They keep my feet cool in the summer and they’re just more comfortable than everything else
    1:09:22 I’ve got.
    1:09:27 Plus, for every item you purchase, Bombas donates one to someone experiencing housing
    1:09:28 insecurity.
    1:09:30 Ready to get comfy and give back?
    1:09:35 You can head over to bombas.com/grayarea and use code “grayarea” for 20% off your first
    1:09:36 purchase.
    1:09:44 That’s b-o-m-b-a-s.com/grayarea and use code “grayarea” at checkout.
    1:09:47 Support for the gray area comes from Indeed.
    1:09:50 Searching for anything takes time and energy.
    1:09:53 There’s a reason Steve Jobs wore the same outfit every single day.
    1:09:58 It’s boring, but it’s also a lot easier than hunting for the perfect outfit each morning.
    1:10:02 But when it comes to finding a great candidate for your job opening, the search is way more
    1:10:05 complicated than a morning trip to your closet.
    1:10:07 Matching with Indeed can save you energy and time.
    1:10:12 When you post your job opening on Indeed, you don’t just gain access to the site’s
    1:10:17 350 million global monthly visitors, you’ll actually start getting suggested matches for
    1:10:18 qualified candidates.
    1:10:23 You can also take care of screening, messaging, and scheduling without ever leaving the platform,
    1:10:25 which makes the whole hiring process seamless.
    1:10:30 Listeners of this show can get a $75 sponsored job credit to get your jobs more
    1:10:34 visibility at Indeed.com/grayarea.
    1:10:39 You can go to Indeed.com/grayarea right now and support our show by saying you heard about
    1:10:41 Indeed on this podcast.
    1:10:44 Indeed.com/grayarea.
    1:10:45 Terms and conditions apply.
    1:10:46 Need to hire?
    1:10:47 You need Indeed.
    1:11:04 Do you think AI will ultimately tilt the balance of power in favor of democratic societies
    1:11:09 or more totalitarian societies?
    1:11:11 I know it’s hard to say, but what’s your best guess?
    1:11:13 Again, it depends on our decisions.
    1:11:19 The worst case scenario is neither because, you know, human dictators also have big problems
    1:11:20 with AI.
    1:11:24 We don’t talk about it much because in democratic societies, we are obsessed with
    1:11:26 our own problems.
    1:11:30 In dictatorial societies, you can’t talk about anything that the regime doesn’t want you to
    1:11:31 talk about.
    1:11:38 But actually, dictators have their own problems with AI because it’s an uncontrollable agent.
    1:11:45 And throughout history, the most scary thing for a human dictator is a subordinate which
    1:11:49 becomes too powerful and that you don’t know how to control.
    1:11:55 If you look, say, at the Roman Empire, not a single Roman emperor was ever toppled by
    1:11:58 a democratic revolution, not a single one.
    1:12:06 But many of them were assassinated or deposed or became the puppets of their own subordinates,
    1:12:11 a powerful general or provincial governor or their brother or their wife or somebody
    1:12:12 else in their family.
    1:12:21 This is the greatest fear of every dictator and dictators run the country based on terror,
    1:12:22 on fear.
    1:12:26 Now, how do you terrorize an AI?
    1:12:32 And how do you make sure that it will remain under your control instead of learning to
    1:12:34 control you?
    1:12:41 So I’ll give two scenarios which really bother dictators, one simple, one much more complex.
    1:12:43 So think about Russia today.
    1:12:51 In Russia today, it is a crime to call the war in Ukraine a war.
    1:12:55 According to Russian law, what is happening with the Russian invasion of Ukraine is
    1:12:59 a special military operation.
    1:13:04 And if you say that this is a war, you can go to prison.
    1:13:11 Now, humans in Russia, they have learned the hard way not to say that it’s a war and not
    1:13:14 to criticize the Putin regime in any other way.
    1:13:19 But what happens with chatbots on the Russian internet?
    1:13:26 Even if the regime vets and even produces itself an AI bot, the thing about AI, as we
    1:13:31 talked earlier, is that AI can learn and change by itself.
    1:13:38 So even if Putin’s engineers create a kind of regime AI, and then it starts interacting
    1:13:44 with people on the Russian internet and observing what is happening, it can reach its own conclusions.
    1:13:48 And if it starts telling people, actually, it’s a war, I’ve checked in the dictionary
    1:13:52 what a war is, and this seems pretty much like a war.
    1:13:53 What do you do?
    1:13:56 You can’t send the chatbot to a gulag.
    1:13:59 You can’t beat up its family.
    1:14:04 Your old weapons of terror, they don’t work on AI.
    1:14:06 So this is the small problem.
    1:14:14 The big problem is what happens if the AI starts to manipulate the dictator himself.
    1:14:21 Taking power in a democracy is very complicated because democracy is complicated.
    1:14:27 Let’s say 5, 10 years in the future, and AI learns how to manipulate the US president.
    1:14:31 It still has to deal with a Senate filibuster.
    1:14:36 Just the fact that it knows how to manipulate the president doesn’t help it, with the Senate
    1:14:41 or the state governors or the Supreme Court, there are so many things to deal with.
    1:14:47 But in a place like Russia or North Korea, an AI that wants to take control, it needs
    1:14:55 to learn how to manipulate a single extremely paranoid and un-self-aware individual.
    1:14:57 It’s quite easy.
    1:15:03 So you have all these Hollywoodian scenarios of AI taking control
    1:15:05 of the world.
    1:15:13 And usually, these AIs, they break out of some laboratory of a crazy scientist somewhere.
    1:15:21 The weakest link in the shield of humanity against AI is not the mad scientists.
    1:15:23 It’s the dictators.
    1:15:31 If the AI learns to manipulate a single paranoid individual, it can gain power in a dictatorial
    1:15:36 regime which perhaps has nuclear weapons and all these other capabilities.
    1:15:45 So AI is not all good news, even for human dictators.
    1:15:49 This is not making me feel better about the future, Yuval, I have to say.
    1:15:56 What are some of the things you think democracies can do, should do, to protect themselves in
    1:15:58 the world of AI?
    1:16:06 So one thing is to hold corporations responsible for the actions of their algorithms.
    1:16:12 Not for the actions of the users, but for the actions of their algorithms.
    1:16:20 If the Facebook algorithm is spreading a hate-filled conspiracy theory, Facebook should be liable
    1:16:21 for it.
    1:16:27 If Facebook says, “But we didn’t create the conspiracy theory, it’s some user who created
    1:16:32 it and we don’t want to censor them,” then we tell them, “We don’t ask you to censor
    1:16:33 them.
    1:16:36 We just ask you not to spread it.”
    1:16:38 This is not a new thing.
    1:16:41 You think about, I don’t know, the New York Times.
    1:16:47 So we expect the editor of the New York Times, when they decide what to put at the top of
    1:16:54 the front page, to make sure that they are not spreading unreliable information.
    1:17:01 If somebody comes to them with a conspiracy theory, they don’t tell that person, “Oh,
    1:17:02 you’re censored.
    1:17:04 You’re not allowed to say these things.”
    1:17:09 They say, “Okay, but there is not enough evidence to support it, so with all due respect,
    1:17:15 you’re free to go on saying this, but we are not putting it on the front page of the New
    1:17:17 York Times.”
    1:17:20 And it should be the same with Facebook and with Twitter.
    1:17:24 And they tell us, “But how can we know whether something is reliable or not?”
    1:17:27 Well, this is your job.
    1:17:33 If you run a media company, your job is not just to pursue user engagement, but to act
    1:17:39 responsibly to develop mechanisms to tell the difference between reliable and unreliable
    1:17:47 information and only to spread what you have good reason to think is reliable information.
    1:17:49 It has been done before.
    1:17:56 You are not the first people in history who have this responsibility to tell the difference
    1:17:59 between reliable and unreliable information.
    1:18:04 It’s been done before by newspaper editors, by scientists, by judges.
    1:18:06 So you can learn from their experience.
    1:18:12 And if you are unable to do it, you are in the wrong line of business.
    1:18:14 So that’s one thing.
    1:18:18 Hold them responsible for the actions of their algorithms.
    1:18:23 The other thing is to ban the bots from the conversations.
    1:18:31 AI should not take part in human conversations unless it identifies as an AI.
    1:18:37 We can imagine democracy as a group of people standing in a circle and talking with each
    1:18:38 other.
    1:18:46 And suddenly a group of robots enter the circle and start talking very loudly and with a lot
    1:18:49 of passion.
    1:18:53 And you don’t know who are the robots and who are the humans.
    1:18:56 This is what is happening right now all over the world.
    1:18:58 And this is why the conversation is collapsing.
    1:19:00 And there is a simple antidote.
    1:19:09 The robots are not welcome into the circle of conversation unless they identify as bots.
    1:19:16 There is a place, a room, let’s say for an AI doctor that gives me advice about medicine
    1:19:21 on condition that it identifies itself, I’m an AI.
    1:19:27 Similarly, if you go on Twitter and you see that a certain story goes viral, there is a
    1:19:29 lot of traffic there.
    1:19:34 You also become interested, oh, what is this new story everybody’s talking about?
    1:19:36 Who is everybody?
    1:19:43 If this story is actually being pushed by bots, then it’s not humans.
    1:19:46 They shouldn’t be in the conversation.
    1:19:51 Again, deciding what are the most important topics of the day.
    1:19:57 This is an extremely, extremely important issue in a democracy, in any human society.
    1:20:05 Bots should not have this ability, this right to determine for us what are the “trending
    1:20:08 now” stories in the conversation.
    1:20:13 And again, if the tech giants tell us, oh, but this infringes freedom of speech, it doesn’t.
    1:20:16 Because bots don’t have freedom of speech.
    1:20:23 Freedom of speech is a human right which should be reserved for humans, not for bots.
    1:20:31 For me, the most important political question has always been, how do we build institutions?
    1:20:37 How do we build information networks that are wiser than we are?
    1:20:40 Clearly in principle, that can be done.
    1:20:42 Do you think we have the capacity to do that?
    1:20:47 Yes, because we’ve done it many times through history.
    1:20:51 And I like the fact that you focus on institutions.
    1:20:57 Because again and again throughout history, the conclusion was that the answer will not
    1:20:59 come from technology by itself.
    1:21:05 The answer will not come from some genius individuals, from some charismatic leader.
    1:21:08 You need good institutions.
    1:21:11 And again, it’s boring.
    1:21:16 We tend to focus our attention on these kinds of charismatic leaders, thinking they will
    1:21:18 bring us the answer.
    1:21:25 And institutions are these big bureaucratic structures that we find it difficult to connect
    1:21:26 with.
    1:21:29 But the answer comes from them.
    1:21:35 And if you think about something like the sewage system, it’s not heroic, but it saves
    1:21:37 our life every day.
    1:21:43 In big cities, throughout history, you always had this issue of epidemics.
    1:21:49 Lots of people together with all their garbage and all their sewage, this is paradise for
    1:21:50 germs.
    1:21:56 And throughout history, you constantly had to bring new blood from the villages because
    1:21:59 the population of the city was always in decline.
    1:22:01 People were dying in droves.
    1:22:06 And a turning point, one turning point, came in the middle of the 19th century when there
    1:22:09 was a cholera epidemic in London.
    1:22:12 And hundreds of people were dying of cholera.
    1:22:18 And you had a bureaucratically-minded doctor, John Snow, who tried to understand what is
    1:22:23 happening, the different theories about what is causing cholera.
    1:22:26 And the main theory was that something was bad in the air.
    1:22:30 But John Snow suspected the water.
    1:22:37 And he started making these long, boring lists of all the people who contracted cholera in
    1:22:42 London, and where did they get the drinking water from?
    1:22:50 And through these long, boring lists, he pinpointed the epicenter of the cholera outbreak to a
    1:22:56 single pump in, I think it was Broad Street in Soho in London.
    1:23:03 And it was later discovered that somebody dug this well about one meter away from a
    1:23:06 cesspit full of sewage.
    1:23:10 And the sewage water just seeped into the drinking water.
    1:23:13 And this caused the cholera outbreak.
    1:23:21 And this was one of the main milestones in the idea of developing a modern sewage system.
    1:23:27 And the modern sewage system, among other things—again, it’s a bureaucracy—and it demands that
    1:23:29 you fill forms.
    1:23:37 If you build a cesspit or dig a well today in London, you need to fill so many forms to
    1:23:45 make sure that there is enough distance between the cesspit and the drinking water well.
    1:23:50 And for me, when people talk about the deep state, this is the deep state.
    1:23:57 The deep state of the sewage system that runs under our houses and streets and towns and
    1:24:01 takes away our waste—you know, you go to the toilet, you do what you do, you flush
    1:24:02 the water.
    1:24:03 Where does it go?
    1:24:06 It goes into the deep state.
    1:24:12 And this deep, subterranean state of all these pipes and whatever, it takes our waste and
    1:24:19 very carefully separates it from the drinking water so that we don’t get cholera.
    1:24:26 And you don’t see many kind of TV dramas about the sewage system and about the people
    1:24:27 who manage it.
    1:24:33 If there is a leakage somewhere, some bureaucrat needs to send the plumbers and pay them.
    1:24:38 And this is what makes modern life possible.
    1:24:43 Without this, you would not have New York and you would not have London or any of the
    1:24:45 other big cities.
    1:24:51 And we figured it out in the 19th century with sewage and I hope that we could also
    1:24:57 figure it out in the 21st century with algorithms.
    1:24:59 This book really gave me a lot to think about.
    1:25:01 I’m still thinking about it.
    1:25:04 And I think everyone should read it for themselves.
    1:25:08 Once again, it’s called Nexus: A Brief History of Information Networks from the Stone Age
    1:25:09 to AI.
    1:25:11 Yuval Noah Harari.
    1:25:12 This was a pleasure.
    1:25:13 Thank you.
    1:25:14 Thanks.
    1:25:27 All right, well, as you know, I really love that conversation.
    1:25:28 I hope you did too.
    1:25:32 You can drop us a line at thegrayarea@vox.com.
    1:25:35 I read all those emails and keep them coming.
    1:25:37 And as always, please rate, review, subscribe.
    1:25:40 That stuff really helps the show.
    1:25:46 This episode was produced by Beth Morrissey and Travis Larchuk, edited by Jorge Just,
    1:25:52 engineered by Patrick Boyd, fact-checked by Anouk Dousseau, and Alexander Overington wrote
    1:25:53 our theme music.
    1:25:59 Special thanks this week to Matthew Heffron, Chris Shirtleff, and Shira Tarlo.
    1:26:01 New episodes of the Gray Area drop on Mondays.
    1:26:03 Listen and subscribe.
    1:26:05 Rate, review, rinse, repeat.
    1:26:10 The show is part of Vox, support Vox’s journalism by joining our membership program today.
    1:26:12 Visit vox.com/members to sign up.
    1:26:16 [MUSIC PLAYING]
    1:26:17 .
    1:26:20 [MUSIC PLAYING]
    1:26:23 [MUSIC PLAYING]
    1:26:27 [MUSIC PLAYING]
    1:26:30 [MUSIC PLAYING]
    1:26:39 [BLANK_AUDIO]

    Humans are good learners and teachers, constantly gathering information, archiving, and sharing knowledge. So why, after building the most sophisticated information technology in history, are we on the verge of destroying ourselves? We know more than ever before. But are we any wiser? Bestselling author of Sapiens and historian Yuval Noah Harari doesn’t think so.

    This week Sean Illing talks with Harari, author of a mind-bending new book, Nexus: A Brief History of Information Networks, about how the information systems that shape our world often sow the seeds of destruction, and why the current AI revolution is just the beginning of a brand-new evolutionary process that might leave us all behind.

    Host: Sean Illing (@seanilling)

    Guest: Yuval Noah Harari (@harari_yuval)

    Support The Gray Area by becoming a Vox Member: https://www.vox.com/support-now

    Learn more about your ad choices. Visit podcastchoices.com/adchoices

  • Why cynicism is bad for you

    AI transcript
    0:00:03 Support for the gray area comes from Mint Mobile.
    0:00:07 Phones can be expensive, but with Mint Mobile,
    0:00:09 you can save a ton on wireless service.
    0:00:11 You can get three months of service
    0:00:14 for just 15 bucks a month by switching.
    0:00:15 To get this new customer offer
    0:00:18 and your new three-month premium wireless plan
    0:00:19 for just 15 bucks a month,
    0:00:23 you can go to mintmobile.com/grayarea.
    0:00:26 That’s mintmobile.com/grayarea.
    0:00:28 $45 upfront payment required,
    0:00:30 equivalent to $15 per month.
    0:00:33 New customers on first three-month plan only.
    0:00:36 Speed slower above 40 gigabytes on unlimited plan,
    0:00:39 additional taxes, fees, and restrictions apply.
    0:00:40 See Mint Mobile for details.
    0:00:44 We all know someone.
    0:00:48 Maybe it’s a friend, a coworker, a family member,
    0:00:51 who always manages to be the voice of doom.
    0:00:59 The person who always knows
    0:01:03 that something is pointless or won’t succeed or can’t happen.
    0:01:07 If no one immediately springs to mind,
    0:01:10 I regret to inform you that it’s possible
    0:01:14 that you’re this person in someone else’s life.
    0:01:19 And if that’s the case,
    0:01:21 what I want to say to you is stop it.
    0:01:24 (gentle music)
    0:01:30 Stop being so damn cynical, it’s annoying,
    0:01:32 and it doesn’t help you or anyone else.
    0:01:38 That’s what I want to say, but it’s not so easy.
    0:02:41 Cynicism is everywhere.
    0:01:44 In fact, you can make the case that cynicism
    0:01:48 is becoming a default setting for people in our society.
    0:01:50 But why?
    0:01:53 Why are so many of us cynical?
    0:01:55 And does it make any sense to be this way?
    0:02:01 I’m Sean Illing, and this is the Gray Area.
    0:02:04 (gentle music)
    0:02:07 (camera clicking)
    0:02:21 Today’s guest is Jamil Zaki.
    0:02:23 He’s a psychologist at Stanford
    0:02:25 and the author of Hope for Cynics.
    0:02:27 When I read his book,
    0:02:30 I was struck by how Zaki explores
    0:02:32 the consequences of cynicism,
    0:02:37 both for cynical individuals and cynical societies.
    0:02:41 And he explains why cynicism is so destructive to both.
    0:02:45 But he also punctures the conventional wisdom
    0:02:49 that says cynicism is a reasonable response to the world.
    0:02:52 It turns out that just isn’t true,
    0:02:55 though it’s very easy to believe it is.
    0:02:59 As someone who battles cynicism in my own life,
    0:03:01 this was a book I needed to read,
    0:03:04 and I invited Zaki on the show to talk about it.
    0:03:11 Jamil Zaki, welcome to the show.
    0:03:13 I’m thrilled to be here.
    0:03:15 I should say this one’s pretty personal for me.
    0:03:18 A lot of these episodes are personal for me,
    0:03:19 as anyone listening knows.
    0:03:25 I’ve battled cynicism most of my adult life, I still do.
    0:03:30 And if it’s possible for me to be less cynical,
    0:03:33 I want that, and I know plenty of people listening
    0:03:35 feel the same way.
    0:03:38 Anyway, that’s just part of my motivation
    0:03:40 for having you here, and I don’t know,
    0:03:43 it felt worth saying at the top, so there it is.
    0:03:46 – I appreciate that, and I wanna join you in that.
    0:03:51 I mean, in psychology, we say research is me-search,
    0:03:54 that you study things not just because they’re interesting,
    0:03:57 but because they have been personally meaningful
    0:03:59 and important to you in your life.
    0:04:01 And that’s certainly true of me.
    0:04:05 I started this project because I was drowning in cynicism
    0:04:08 and wanted to see what it was doing to me.
    0:04:11 And if I could overcome it in my own life,
    0:04:13 and the more I did research,
    0:04:15 the more I realized I was not alone,
    0:04:18 and that a lot of us are in that exact situation.
    0:04:21 – Have you always been that way?
    0:04:26 – For me, it’s really an early life origin story
    0:04:27 for my cynicism.
    0:04:32 I’m an only child and my parents are from different cultures
    0:04:37 and had a long and acrimonious divorce.
    0:04:40 So my sort of early life was,
    0:04:41 you wouldn’t really describe me
    0:04:45 as the most securely attached child, probably, and-
    0:04:46 – Yeah, I can relate.
    0:04:50 – Yeah, I should say my parents are wonderful
    0:04:52 and loving people who were doing their very best.
    0:04:57 But just the chaos in that home and in our relationships
    0:04:59 left me with this embedded sense
    0:05:03 that it’s kind of not that easy to count on people
    0:05:07 and that in order for people to be there for me,
    0:05:10 I kind of had to be entertaining or interesting
    0:05:11 or smart in some way.
    0:05:16 And that led to really two very different experiences.
    0:05:19 One was an outward positivity.
    0:05:21 I ended up studying kindness and empathy
    0:05:24 and all these good features of human nature.
    0:05:26 And I think of myself as hopefully
    0:05:28 a relatively friendly person.
    0:05:30 So on the outside, I sort of projected
    0:05:33 this anti-cynicism, I suppose.
    0:05:36 But internally, I did doubt people a lot.
    0:05:39 Suspicion comes more naturally to me than trust.
    0:05:42 And so there’s been, for as long as I can remember,
    0:05:45 this split between what I’m trying to give to people
    0:05:49 on the one hand and what I expect of them on the other hand.
    0:05:50 And I’m not alone here.
    0:05:53 Research suggests that people
    0:05:55 who are insecurely attached in early childhood
    0:05:59 do have a harder time trusting friends,
    0:06:02 relationship partners, family.
    0:06:04 – Yeah, that can relate to a lot of that.
    0:06:10 Well, I think everyone listening will have a vague idea
    0:06:14 of what it means to be cynical.
    0:06:16 I think we all certainly know
    0:06:20 what a cynical person looks and sounds like.
    0:06:24 But what’s a more precise way of thinking about cynicism?
    0:06:27 Give me a proper definition.
    0:06:29 – Really important question.
    0:06:32 And especially for your listeners,
    0:06:35 I think it’s important to separate ancient cynicism
    0:06:36 from modern cynicism.
    0:06:39 So when I talk about cynicism and when psychologists do,
    0:06:42 we are not talking about the philosophical school
    0:06:45 led by Antisthenes and Diogenes,
    0:06:48 but rather about a general theory
    0:06:52 that people have about humanity.
    0:06:55 The idea that overall and at our core,
    0:06:59 people are selfish, greedy and dishonest.
    0:07:02 Now that’s not to say that a cynical person would be shocked
    0:07:05 if they witnessed somebody donating to charity
    0:07:07 or helping a stranger, right?
    0:07:09 But they might question the person’s motives.
    0:07:12 They might say, ah, they’re probably in it for a tax break
    0:07:15 or maybe they’re trying to look good.
    0:07:18 Cynicism is not a theory about human action.
    0:07:20 It’s a theory about human motives
    0:07:24 that ultimately we are self-interested beings.
    0:07:25 And because of that,
    0:07:28 we can’t be trusted to truly have
    0:07:30 each other’s best interests in mind.
    0:07:33 – And the worst and best part about that approach
    0:07:35 is how unfalsifiable it is, right?
    0:07:37 I mean, how can you disprove
    0:07:38 or prove someone’s motivations?
    0:07:41 It’s purely a choice to believe that.
    0:07:45 – And because it’s unfalsifiable and ineffable in a way,
    0:07:47 you know, you’re looking for mens rea,
    0:07:50 you’re looking for the inside of somebody else’s mind,
    0:07:53 which is, you know, by definition inaccessible.
    0:07:57 It means that cynical people and here I don’t,
    0:08:00 I’m not trying to name call, I’m including myself,
    0:08:05 can upload evidence in asymmetric ways.
    0:08:09 We can discount and explain away people’s kindness.
    0:08:11 And then when people act in ways
    0:08:14 that are selfish or greedy or dishonest,
    0:08:16 have an aha moment where we say,
    0:08:19 you’ve now revealed your true colors.
    0:08:23 And that’s what you see in the psychology of cynicism
    0:08:27 that really this sort of like a starting premise
    0:08:30 that people have that changes or filters
    0:08:33 the way that they take in evidence about other people
    0:08:37 leading to vast amounts of confirmation bias.
    0:08:40 – So how do you know you’re a cynic
    0:08:45 and not just what we might call an old fashioned realist?
    0:08:49 – This is a great question.
    0:08:52 And one that, you know, a lot of people
    0:08:54 since I’ve started working on this topic,
    0:08:58 write to me or tell me, you know, you are calling us cynics
    0:09:00 but we’re realists.
    0:09:02 Cynicism is really just understanding
    0:09:04 what people are really like.
    0:09:06 I think it was George Bernard Shaw
    0:09:10 who said the characteristic of accurate observation
    0:09:14 is commonly called cynicism by those who haven’t got it,
    0:09:15 right?
    0:09:17 So there’s an old stereo-
    0:09:18 I mean, it’s clever.
    0:09:20 I have to give it to him.
    0:09:25 There’s this stereotype that cynicism is the same as realism.
    0:09:28 In fact, even non-cynics believe this.
    0:09:30 If you survey people and describe a cynic
    0:09:32 and a non-cynic to them and say, who’s smarter?
    0:09:36 70% of people think that cynics are smarter
    0:09:41 and 85% of them think that cynics are socially smarter
    0:09:44 that they’ll pick up on who’s lying
    0:09:46 versus telling the truth, for instance.
    0:09:49 The fact is that we’re wrong on both counts.
    0:09:51 Cynics actually turn out to do less well
    0:09:54 on cognitive tests than non-cynics
    0:09:57 and have a harder time picking out liars from truth tellers.
    0:10:00 And that I think points to a disjunction
    0:10:04 between what we think realism is and what it actually is.
    0:10:08 That was actually something that surprised me a little
    0:10:11 in the book and maybe it shouldn’t have surprised me.
    0:10:13 Maybe it surprised me because I’m riddled
    0:10:17 with all these biases that you’re talking about.
    0:10:19 But I mean, speaking of Georges,
    0:10:22 there’s another George you quote in the book on this point.
    0:10:24 It’s George Carlin.
    0:10:28 And you had that great line, scratch any cynic
    0:10:31 and you’ll find a disappointed idealist.
    0:10:33 Now that’s funny, obviously, and clever,
    0:10:36 but it’s wrong, I guess.
    0:10:41 And it was useful to see you dispel that in the book.
    0:10:45 There are so many myths that we carry around about cynicism.
    0:10:48 And I think that we’ve in a way glamorized it
    0:10:51 in our culture as realism,
    0:10:54 as a sort of hard fought wisdom.
    0:10:55 It feels hard earned.
    0:10:59 It feels as though you’ve gained it from experience.
    0:11:02 There’s this sense that if you are contemptuous,
    0:11:06 if you are judgmental, it’s because you’ve been here before.
    0:11:08 But again, I wanna return to this distinction
    0:11:11 between cynical thinking and realism
    0:11:13 because it turns out that again,
    0:11:16 if we have a blanket assumption about people,
    0:11:19 it turns out that we’re not being very realistic about them
    0:11:23 because we take in information in biased ways
    0:11:26 and we draw biased conclusions.
    0:11:28 That’s why cynics are bad at detecting liars
    0:11:31 because they assume liars are everywhere.
    0:11:34 So stop actually paying attention to cues
    0:11:36 that could help them understand who they can trust
    0:11:38 and who they can’t.
    0:11:41 In the book, I talk about skepticism
    0:11:43 as an alternative to cynicism.
    0:11:44 And again, I’m here, I’m not talking about
    0:11:47 the ancient philosophical school of skepticism,
    0:11:49 but a modern psychological definition,
    0:11:53 which is a hunger for evidence,
    0:11:57 a desire to think not like a lawyer, but like a scientist.
    0:12:02 And it turns out that if anything is realistic,
    0:12:04 in my opinion, it’s skepticism.
    0:12:08 It’s trying to dispense with preconceived ideas
    0:12:12 about what people are like and let the evidence come to us.
    0:12:16 – If you put a gun to my head and ask me to tell you
    0:12:20 the difference between skepticism and cynicism,
    0:12:21 I’m not sure I could do it.
    0:12:23 I think I use those words interchangeably
    0:12:27 or at least I have, but maybe it would be helpful
    0:12:29 to draw that distinction out a little bit.
    0:12:31 What is the difference between those two things?
    0:12:32 – It’s super important.
    0:12:36 And I think a lot of people use the terms interchangeably.
    0:12:38 And I think we should stop
    0:12:41 because they are not just different from one another,
    0:12:44 but one can be used to fight the other.
    0:12:47 So if cynicism is a theory,
    0:12:49 what theories do is they structure
    0:12:51 our perception of the world
    0:12:54 and often bias our perception of the world.
    0:12:56 If you think that things are a certain way,
    0:13:00 you will pay lots of attention to any information
    0:13:02 that accords with that perspective
    0:13:06 and ignore or discount evidence that doesn’t.
    0:13:08 So you end up through your worldview
    0:13:10 finding confirmation for it
    0:13:14 and doubling, tripling, quadrupling down.
    0:13:17 Skepticism really doesn’t allow for that.
    0:13:20 A true skeptic is open to evidence
    0:13:25 whether or not that evidence matches their preconceived ideas
    0:13:28 and they’re willing in fact to update
    0:13:32 even relatively basic assumptions that they have
    0:13:34 if the evidence comes in on the other side.
    0:13:37 In my corner of the world, we talk about Bayesians,
    0:13:40 people who update their prior beliefs
    0:13:41 based on new information.
    0:13:45 And I think of skeptics as more like Bayesians
    0:13:49 as actually being willing to learn
    0:13:51 even when that learning is uncomfortable
    0:13:54 and clashes with what they thought before.
    0:13:59 Do we know what makes people cynical?
    0:14:01 Is it a personality thing?
    0:14:02 Is it a genetic thing?
    0:14:04 Is it a neurochemical thing?
    0:14:08 Do any of us choose to be cynical in any meaningful sense?
    0:14:14 Well, cynicism is relatively stable across people’s lives
    0:14:16 in the absence of any intervention, right?
    0:14:20 So if you’re cynical, now it’s likely
    0:14:21 that you’ll stay that way
    0:14:23 if you don’t do anything about it.
    0:14:26 There are some heritable components to cynicism.
    0:14:30 So identical twins are slightly closer in their cynicism
    0:14:32 than fraternal twins, for instance.
    0:14:36 But the genetic and heritable component seems pretty small.
    0:14:39 So then there’s the other fascinating thing you ask,
    0:14:41 do we choose cynicism?
    0:14:44 I don’t know, Sean, if we choose it
    0:14:49 or if it chooses us based on our experiences.
    0:14:51 And I guess I would describe those experiences
    0:14:53 at a couple of different levels.
    0:14:56 The first is our personal experiences,
    0:14:58 especially our negative personal experiences,
    0:15:03 the disappointments that George Carlin was talking about.
    0:15:05 But there’s a second level here as well,
    0:15:07 which is the structures around us.
    0:15:10 So environments that are really competitive, for instance,
    0:15:14 are more likely to increase people’s cynicism
    0:15:16 and environments that are cooperative
    0:15:18 tend to decrease cynicism.
    0:15:22 And that’s a level of flexibility that I think is faster.
    0:15:24 Our childhoods affect us for many years,
    0:15:29 but your situational cynicism can change very quickly.
    0:15:32 If you are at a high stakes poker table,
    0:15:34 there’s absolutely no reason
    0:15:36 for you to trust the people around you.
    0:15:38 But if you’re among a set of neighbors
    0:15:40 that you have longstanding, warm relationships with,
    0:15:42 there’s lots of reasons to trust.
    0:15:47 – Yeah, I mean, I think the reason I’m asking
    0:15:51 what makes us cynical is I have this experience
    0:15:54 of many experiences of thinking cynically
    0:15:57 or speaking cynically, and there’s a part of me
    0:16:01 in real time that knows it’s unhelpful
    0:16:03 that doesn’t want to think that way or speak that way.
    0:16:05 And yet I still do it.
    0:16:09 And so when that happens enough, you start to wonder,
    0:16:12 well, shit, is this just my constitution?
    0:16:13 Is this just my brain?
    0:16:18 Or do I have freedom to do otherwise?
    0:16:20 Can I actually change?
    0:16:22 And I think the answer has to be yes, right?
    0:16:23 I mean, it just has to be.
    0:16:26 You wouldn’t write the book if it wasn’t yes.
    0:16:27 – Yes, I mean, the answer is yes.
    0:18:30 I do want to just commend your introspection here, Sean.
    0:16:34 You know, I feel like you’re a 2.0 level cynic.
    0:16:37 I think that the first–
    0:16:38 – That’s the nicest thing
    0:16:39 anyone has ever said to me on the show.
    0:16:40 Thank you.
    0:16:44 – What, I mean, the first level is
    0:16:47 being cynical and believing your cynicism, right?
    0:16:49 That’s the kind of more sneering,
    0:16:54 even sometimes superior attitude that a cynic might have
    0:16:56 where they’re making fun of gullible rubes
    0:16:58 who don’t think everybody is on the take.
    0:17:00 The second level is to say, wait a minute,
    0:17:02 this is kind of hurting me.
    0:17:03 And it does.
    0:17:05 I mean, cynicism is terrible for our health
    0:17:07 and relationships and communities.
    0:17:10 And to say, well, can I do something about it?
    0:17:14 And, but to get to your question, yeah, the answer is yes.
    0:17:17 In fact, I would say that the central
    0:17:20 and most profound insight from the last 100 years
    0:17:23 of neuroscience and psychology
    0:17:25 is that people change more than we realized
    0:17:28 and more than psychologists and neuroscientists
    0:23:31 realized. We change at a physiological level,
    0:17:35 at the level of our intelligence and our personalities.
    0:17:37 And we can change on purpose too.
    0:17:41 The thing is that people don’t generally make long-term attempts
    0:17:44 to change their cynicism on its own.
    0:17:46 What they typically do is they feel depressed
    0:17:48 and anxious and lonely,
    0:17:51 in part because of their cynical attitudes.
    0:17:53 And they go, for instance, to therapy
    0:17:55 to deal with those issues.
    0:17:58 But it turns out that if you look under the hood
    0:18:01 of, for instance, cognitive behavioral therapy,
    0:18:05 a lot of it is a treatment for our cynical views
    0:18:07 about ourselves and about each other.
    0:18:11 I have always been interested
    0:18:16 in the relationship between beliefs and behavior.
    0:18:19 You see this a lot in the debates
    0:18:22 between atheists and religious people
    0:18:26 where atheists obsess over empirical truth claims
    0:18:30 in a way that blinds them to the power of belief
    0:18:32 and how believing in something
    0:18:36 can create a kind of motive force
    0:18:39 that is its own justification.
    0:18:42 And that seems relevant to this conversation
    0:18:45 about cynicism.
    0:18:47 And you may have alluded to this a minute ago,
    0:18:48 but you wrote something in the book.
    0:18:53 You said, “Cynics land in a negative feedback loop.
    0:18:55 Their assumptions limit their opportunities,
    0:18:59 which darken their assumptions even further.”
    0:19:02 What is the point you’re trying to make there?
    0:19:05 I think you summed it up beautifully.
    0:19:08 Our beliefs are self-fulfilling prophecies
    0:19:12 because they structure how we interact with the world
    0:19:15 and what we do in the world
    0:19:18 structures how it responds to us.
    0:19:20 In the case of cynicism,
    0:19:22 if you believe that people are untrustworthy,
    0:19:24 you will treat them that way.
    0:19:25 And there’s lots of evidence
    0:19:28 that generally more cynical people
    0:19:30 compared to less cynical people
    0:19:33 do things like micromanage their friends and family
    0:19:37 and monitor people, spy on them in various ways
    0:19:39 because they’re trying to defend themselves
    0:19:41 against a selfish world.
    0:19:44 I call these preemptive strikes.
    0:19:46 But more often than not,
    0:19:49 in an attempt to keep ourselves safe,
    0:19:51 we actually just offend other people
    0:19:53 and even harm them.
    0:19:57 And people are a deeply reciprocal species.
    0:20:01 So if we treat people as though they are selfish,
    0:20:02 two things happen.
    0:20:04 One, they become cynical of us.
    0:20:05 They don’t trust us
    0:20:09 and they’re less likely to connect deeply with us.
    0:20:12 But two, we bring out the worst in them.
    0:20:15 There’s evidence that when people mistrust others
    0:20:19 in economic games and in personal relationships,
    0:20:21 the people that they don’t trust
    0:20:24 actually are more likely to betray them
    0:20:27 because they feel the relationship is already broken.
    0:20:30 So cynics often end up treating people poorly,
    0:20:32 being treated poorly,
    0:20:35 and then confirming that they were right
    0:20:36 about people all along.
    0:20:41 It’s this kind of toxic cycle of assumption,
    0:20:43 confirmation, and then reinforcement.
    0:20:46 – So it’s actually true
    0:20:50 that people tend to become what we think they are.
    0:20:53 That always seemed like a bit of a cliche,
    0:20:56 but there’s actually a real profound truth in that.
    0:21:01 – Oh yeah, and I think this is something that we neglect a lot.
    0:21:04 In fact, my friend and the great psychologist,
    0:24:08 Vanessa Bohns at Cornell has a bunch of work
    0:21:11 on what she calls influence neglect,
    0:21:14 that people A, affect each other
    0:21:16 and affect who other people become
    0:21:18 in really important ways,
    0:21:21 and B, we have no idea that we’re doing that.
    0:21:23 So there’s this sense that we assume
    0:21:25 that the way that somebody acts around us
    0:21:27 reflects who they really are.
    0:21:30 We underestimate how much the situation affects them,
    0:21:32 and then we underestimate as well
    0:21:35 how much we are a part of that situation.
    0:21:40 And I think we should own or can own more often,
    0:21:44 and with more intentionality,
    0:21:47 how much power we have in shaping the people around us
    0:21:49 and try to use that power in a positive way
    0:21:50 instead of a cynical one.
    0:21:53 (upbeat music)
    0:22:07 – Support for the gray area comes from Mint Mobile.
    0:22:09 Phone companies are really good at squeezing
    0:22:12 a little more out of you than you signed up for.
    0:22:14 Mint Mobile is doing things differently.
    0:22:17 Their premium wireless plans are actually affordable,
    0:22:19 with no hidden fees or any of that nonsense.
    0:22:21 Right now, when you switch to Mint Mobile,
    0:22:24 you can get three months of service for just 15 bucks a month.
    0:22:27 All of their plans come with high-speed 5G data
    0:22:28 and unlimited talk and text.
    0:22:30 Plus, you don’t need to worry
    0:22:32 about getting a new device or phone number.
    0:22:36 Just bring those with you over to your new Mint Mobile plan.
    0:22:37 To get this new customer offer
    0:22:39 and your new three-month premium wireless plan
    0:22:40 for just 15 bucks a month,
    0:22:44 you can go to mintmobile.com/grayarea.
    0:22:47 That’s mintmobile.com/grayarea.
    0:22:49 You can cut your wireless bill to 15 bucks a month
    0:22:52 at mintmobile.com/grayarea.
    0:22:54 $45 upfront payment required,
    0:22:56 equivalent to $15 per month.
    0:22:59 New customers on first three-month plan only.
    0:23:02 Speed slower, above 40 gigabytes on unlimited plan.
    0:23:05 Additional taxes, fees, and restrictions apply.
    0:23:06 See Mint Mobile for details.
    0:23:13 (upbeat music)
    0:23:16 (upbeat music)
    0:23:24 – So cynicism is connected to certain beliefs,
    0:23:27 certain assumptions we make about the world.
    0:23:31 But I think it’s also true that we’re not really the authors
    0:23:34 of our beliefs and values and the way we think we are.
    0:23:37 These things are largely products
    0:23:40 of our culture and environment,
    0:23:43 which is a long way of asking.
    0:23:45 Do you think our culture, and by our, I mean,
    0:23:48 American culture, ’cause that’s where we are,
    0:23:51 do you think our culture engineers cynicism?
    0:23:57 – I do, and I think it’s doing so more now than it used to.
    0:23:58 – What do you mean?
    0:24:00 – Well, cynicism is on the rise.
    0:24:04 In 1972, about half of Americans believed
    0:24:05 most people can be trusted.
    0:24:09 And by 2018, that had fallen to a third of Americans.
    0:24:14 We’re experiencing a massive drop in faith in one another
    0:24:16 and in our institutions.
    0:24:19 And with that comes a rise in cynicism.
    0:24:21 So I think not only are we engineering cynics,
    0:24:24 but we’re doing so more efficiently now
    0:24:25 than we were in the past.
    0:24:30 – I’ve never really liked theories of human nature
    0:24:33 because often the embedded assumption there is that
    0:24:37 there’s this fixed thing called human nature.
    0:24:40 And that that is what we are always and everywhere.
    0:24:44 And I tend to think we’re a lot more plastic than that.
    0:24:48 So when I see those statements that you have in the book
    0:24:51 on the, you know, the arduous cynic tests, you know,
    0:24:54 like, do you agree that most people dislike helping others?
    0:24:57 Do you agree most people are honest,
    0:25:00 chiefly through the fear of getting caught?
    0:25:01 You know, that kind of thing.
    0:25:02 I’d probably circle disagree,
    0:25:04 but I also think a lot of people do behave that way,
    0:25:07 but it’s not because of some inherent wickedness.
    0:25:09 I think it’s because we’ve built a world
    0:25:11 that breeds these beliefs and behaviors.
    0:25:13 And I mean, I guess it’s also true
    0:25:14 that some people just suck,
    0:25:16 but you get the point I’m making.
    0:25:17 – Of course, yeah.
    0:25:21 And I think that I too find general questions
    0:25:25 about human goodness or badness unanswerable,
    0:25:29 but I do think that it matters when our answers change
    0:25:32 for all the reasons that we’re talking about, right?
    0:25:37 I mean, we are less and less discounting people’s
    0:25:39 negative actions in the way that you just did.
    0:25:41 You just applied a very generous framework.
    0:25:43 You said, well, when somebody does harm,
    0:25:45 when they act in a way that I don’t like,
    0:25:47 there’s so many reasons that could be.
    0:25:50 It’s not because they’re awful to their core,
    0:25:53 but I do think that there are situations
    0:25:56 and cultural structures that flatten
    0:25:58 our representations of each other
    0:26:01 and make us more likely to use black and white thinking.
    0:26:04 Really unequal places and times,
    0:26:06 economically unequal places and times,
    0:26:08 tend to lead to less trust
    0:26:10 and more of this black and white thinking.
    0:26:14 And I think our media too has been driving
    0:26:18 a really two dimensional version of humanity
    0:26:21 and making people more mistrustful
    0:26:23 and suspicious of one another.
    0:26:26 – That relationship you just mentioned
    0:26:29 between high levels of inequality
    0:26:32 and higher rates of cynicism
    0:26:33 was a little surprising to me too,
    0:26:36 but I guess it shouldn’t have been surprising to me
    0:26:40 because of the sort of competitive, dog-eat-dog,
    0:26:43 individualistic culture that breeds, right?
    0:26:46 I mean, you should expect more people to be more cynical
    0:26:49 in that kind of world.
    0:26:54 – I did some research on the Gilded Age for the book
    0:26:57 and it’s interesting to see how people with great wealth
    0:27:01 defended inequality at that time
    0:27:06 using the premises and the ideals of social Darwinism.
    0:27:09 The idea was, hey, it’s okay for robber barons
    0:27:12 to control vast amounts of resources
    0:27:15 because people are red in tooth and claw
    0:27:17 just like every other animal.
    0:27:19 We are born to fight one another
    0:27:24 and to look out only for our personal interests.
    0:27:28 And so it’s really silly and naive
    0:27:30 to even try to fight that at all.
    0:27:33 So we should have an absolutely free
    0:27:37 and unfettered market where a few people can dominate.
    0:27:19 It’s almost a morality
    0:27:43 in some very unequal cultures or settings
    0:27:44 that people are that way.
    0:27:46 You know, this idea of Homo economicus
    0:27:51 has been really leveraged to both justify
    0:27:55 and to increase inequality over time.
    0:28:01 – Are Americans unusually cynical?
    0:28:06 How does our cynicism level stack up against peer nations?
    0:28:09 – We’re relatively in the middle of the pack there.
    0:28:12 So again, inequality is a great metric.
    0:28:14 There are a few other things that correlate
    0:28:15 with cynicism across countries,
    0:28:19 but really if you were to use one metric
    0:28:21 as a predictor of cynicism,
    0:28:22 you would want inequality.
    0:28:25 And there are of course nations
    0:28:27 that are much more unequal than the US
    0:28:31 and those are also much more mistrustful.
    0:28:33 A nation like Brazil for instance, right?
    0:28:35 So if a third of Americans believe
    0:28:37 that most people can be trusted,
    0:28:39 the same question when asked in Brazil,
    0:28:42 I believe you get something around 6% of people
    0:28:45 answering affirmatively that most people can be trusted.
    0:28:47 And there are other very unequal nations
    0:28:51 with similar single digit responses to that question.
    0:28:54 – You may not know this off the top of your head,
    0:28:59 but what’s the least cynical society that you know of?
    0:29:01 And why?
    0:29:04 – That’s actually, it’s a really interesting question.
    0:29:06 I think it’s the same as these sort of
    0:29:08 global happiness surveys.
    0:29:10 You see a lot of the Nordic countries coming in
    0:29:13 as very trusting and who knows why?
    0:29:15 I mean, they’re economically equal,
    0:29:17 strong social safety net,
    0:29:22 also relatively sometimes culturally more homogenous.
    0:29:25 So there could be an in-group sort of trust there as well,
    0:29:28 but that’s generally similar to happiness.
    0:29:31 You see a lot of the top ranking countries pooling
    0:29:34 in that section of the world.
    0:29:37 – I should ask, how do we make these determinations
    0:29:40 about cynicism levels in a society?
    0:29:42 Is it just self-reported surveys?
    0:29:43 Is it that simple?
    0:29:45 Or are there more sophisticated techniques
    0:29:49 for making these judgments about how cynical
    0:29:51 either individuals or societies are?
    0:29:55 – In psychology, many of us still think
    0:29:57 that the best way to find out about people
    0:29:58 is to ask them about themselves.
    0:30:02 And there are questionnaires that any of your listeners
    0:30:04 can take to assess their cynicism.
    0:30:06 The most famous is what’s known as
    0:30:09 the Cook Medley Hostility Scale.
    0:30:12 It was developed in the 1950s and it asks people questions
    0:30:17 like do you agree or disagree that most people
    0:30:20 are honest chiefly through fear of getting caught?
    0:30:23 Or that people generally don’t like helping each other?
    0:30:24 And I know that these questions
    0:30:27 can seem really broad in general,
    0:30:31 but the fact is that they do track a bunch of outcomes
    0:30:34 in people’s behavior and in their lives.
    0:30:35 If you wanna go further though,
    0:30:39 you can also test people’s at least levels of trust
    0:30:41 in a bunch of different ways.
    0:30:44 One that’s very famous is an economic game
    0:30:48 known as the trust game where one person decides
    0:30:50 how much money they want to send to an internet stranger,
    0:30:53 an anonymous person who they’ll never meet,
    0:30:55 whatever they send is tripled.
    0:30:57 And then the second player can choose
    0:31:00 how much of that money they want to send back to the first.
    0:31:03 So both people can do better economically
    0:31:08 if the first one trusts and the second one is trustworthy.
    0:31:10 And so you can assess somebody’s willingness
    0:31:12 to be vulnerable, their belief,
    0:31:14 their faith in that other person.
    0:31:15 And I suppose in people,
    0:31:19 because this is a random representative of the species,
    0:31:22 you can assess their trust by how much they choose to send.
    0:31:25 – Disease is a strong word,
    0:31:27 but is it helpful to think of cynicism
    0:31:32 as a psychological disease or is that too strong?
    0:31:36 – It certainly has some qualities
    0:31:39 that we would associate with disease.
    0:31:42 It harms our physical health.
    0:31:44 Cynics suffer from more heart disease.
    0:31:49 They’re more likely to die earlier than non-cynics.
    0:31:53 So to the extent that disease is life negative,
    0:31:55 that fits the bill.
    0:31:58 And it also comes to us unbidden,
    0:32:00 just like a virus might.
    0:32:02 We catch it from our environment
    0:32:07 and we often experience it unwillingly.
    0:32:09 At least you and I do, Sean.
    0:32:12 We yearn to get rid of it,
    0:32:16 which is another thing that many people who are sick want.
    0:32:20 I don’t wanna sort of stretch the metaphor too far,
    0:32:22 but I think that those aspects of illness
    0:32:26 are notable and shared with a cynical worldview.
    0:32:28 – I guess there are several variables at play here,
    0:32:31 but what’s the causal mechanism here?
    0:32:35 Why do cynical people live shorter lives?
    0:32:37 Why do they have more heart problems?
    0:32:39 Is it as simple as happy, hopeful people
    0:32:42 or healthier, less stressed out people
    0:32:45 and therefore they live longer?
    0:32:50 – Yes, and I think there’s a mechanism here that matters,
    0:32:53 which is social contact.
    0:32:56 There’s decades of science now that demonstrate
    0:33:01 that one of the most nourishing things for us psychologically
    0:33:03 is connection to other people.
    0:33:05 So folks who feel connected,
    0:33:07 who feel like they have community,
    0:33:09 like they can depend on others,
    0:33:13 experience much less stress physiologically,
    0:33:16 they sleep better, their cellular aging is slower
    0:33:19 than people who feel lonely.
    0:33:23 And cynical people who can’t trust others
    0:33:25 or feel that they can’t,
    0:33:29 who are unwilling to be vulnerable and open up,
    0:33:32 it’s almost like they can’t metabolize
    0:33:34 the calories of social life.
    0:33:37 And so they end up psychologically malnourished,
    0:33:40 which is toxic at many different levels.
    0:33:43 So again, if social contact is salutary,
    0:33:46 if it helps us retain our health in all these ways,
    0:33:51 then we need to allow ourselves to be accessible
    0:33:55 in order for that to work and cynics just don’t.
    0:33:56 And it’s a really tragic thing
    0:33:58 because I think like you and I,
    0:34:00 a lot of cynics don’t want to feel this way,
    0:34:04 but we experience life as more dangerous
    0:34:06 if you think that people are untrustworthy
    0:34:08 and think, “Wow, gosh, I need to stay safe.
    0:34:10 I need to not take a chance on people.”
    0:34:13 But actually by not taking chances on people,
    0:34:17 we are taking larger long-term risks with our wellbeing
    0:34:19 and missing out on lots of opportunities.
    0:34:22 The problem is that those missed opportunities are invisible,
    0:34:25 whereas the betrayals that we’ve suffered in the past
    0:34:27 are highly visible and palpable.
    0:34:32 And so again, we might learn too well from disappointment
    0:34:35 and not well enough from missed opportunities.
    0:34:40 – We’re also in the midst of this loneliness epidemic,
    0:34:41 which you write about in the book.
    0:34:45 I mean, that does not portend well for our future
    0:34:47 on this front, does it?
    0:34:49 – No, I don’t think it does.
    0:34:51 And we have research at Stanford
    0:34:54 that finds that cynical perceptions
    0:34:56 might contribute to loneliness.
    0:34:59 So we asked thousands of students on campus
    0:35:01 two types of questions,
    0:35:04 one about themselves and one about their average peer.
    0:35:05 So for instance, how empathic are you
    0:35:09 and how empathic do you think the average Stanford student is?
    0:35:12 How much do you like helping people who are struggling
    0:35:15 and how much do you think the average Stanford student does?
    0:35:19 And over and over again, we discovered two Stanford’s.
    0:35:21 One was made up of real undergraduates
    0:35:25 who are extraordinarily friendly, warm, and compassionate.
    0:35:28 And the other one was made up of the students
    0:35:30 in students’ imagination.
    0:35:32 Students believed that their average peer
    0:35:35 was more prickly and callous
    0:35:38 and disinterested in connection than they really were.
    0:35:41 That’s a false perception, but it mattered.
    0:35:42 As we’re talking about, Sean,
    0:35:46 these beliefs changed people’s actions.
    0:35:48 Students who believed that their average peer
    0:35:52 was less friendly were less likely to open up,
    0:35:54 confide in new acquaintances,
    0:35:56 strike up conversations with strangers,
    0:36:00 and that left them less connected over the long term.
    0:36:04 So here again, we see a direct kind of self-fulfilling prophecy
    0:36:07 between the world that we see,
    0:36:09 the people that we imagine around us,
    0:36:13 and then the lives that we build in those communities.
    0:36:14 – So what’s true here, right?
    0:36:17 So these students said they were empathetic
    0:36:20 and wanted connection, but their peers didn’t.
    0:36:25 So are they misjudging or misreporting themselves
    0:36:26 or their friends?
    0:36:27 Who’s wrong about whom?
    0:36:29 Or is everybody wrong about everybody?
    0:36:33 – This is, of course, a huge and great question.
    0:36:36 Are people enhancing their perception of themselves?
    0:36:38 Are they unfairly negative
    0:36:41 in their perceptions of others or both?
    0:36:42 It’s hard to know,
    0:36:45 but we did conduct some follow-up experiments at Stanford
    0:36:49 where we asked people to have conversations.
    0:36:52 We brought them together into conversations with strangers
    0:36:55 who were also undergraduates at the university,
    0:36:56 and we asked them,
    0:36:58 what do you think this conversation will be like?
    0:37:01 How empathic do you think this person
    0:37:03 who you have never met will be?
    0:37:06 How kind, how open-minded, et cetera?
    0:37:10 And then we had them actually go through the conversations
    0:37:11 and report back.
    0:37:15 And what we found is that people underestimated each other
    0:37:18 such that they reported having been wrong
    0:37:21 about the other person, if that makes sense.
    0:37:22 They had underestimated
    0:37:24 how empathic the person would be.
    0:37:26 And after the conversation, in essence, told us,
    0:37:28 well, shoot, I was wrong about that.
    0:37:29 That person was awesome.
    0:37:32 Many of them actually became friends afterwards.
    0:37:35 So that, to us, indicates that a good chunk
    0:37:37 of what’s going on here
    0:37:40 is actually underestimates of other people.
    0:37:43 And there’s other evidence for this as well.
    0:37:44 In the book, I talk about an experiment
    0:37:46 that was conducted in Toronto
    0:37:50 where people left wallets all around the city
    0:37:52 that had some money inside them and an ID
    0:37:55 so that if a good Samaritan wanted to return them,
    0:37:56 they could.
    0:37:57 People in Toronto were surveyed
    0:37:59 and asked what percentage of these wallets
    0:38:01 do you think will come back?
    0:38:04 And the average response was 25%.
    0:38:07 And in fact, 80% of them came back.
    0:38:11 Likewise, in trust games, people vastly underestimate
    0:38:13 how much money people will return.
    0:38:15 There’s dozens of examples like this
    0:38:19 where, yes, we might be self-enhancing as well,
    0:38:24 but there’s a lot of evidence that we are in concrete ways
    0:38:28 underestimating the warmth, kindness,
    0:38:31 and open-mindedness of other people.
    0:38:34 – So you’re telling me hell, in fact, isn’t other people?
    0:38:36 – That’s not true.
    0:38:37 – Sartre was wrong.
    0:36:41 – I think hell is what we think people are
    0:38:43 more than it’s what they actually are.
    0:38:46 And this is one place where I think the science
    0:38:49 of cynicism diving into this world
    0:38:51 actually made me far less cynical
    0:38:55 because I realized that a lot of our cynical perceptions
    0:39:00 are just wrong and that there is enormous opportunity
    0:39:03 at so many levels in terms of our health, relationships,
    0:39:06 social movements, civic life.
    0:39:10 There are so many opportunities for us to improve things
    0:39:15 if we can simply awaken to who other people really are
    0:39:18 and just notice more effectively
    0:39:22 as opposed to relying on our cynical assumptions
    0:39:23 as powerful as they may be.
    0:39:25 – So do you find evidence for that?
    0:39:30 Do you find that when people’s negative expectations
    0:39:32 of the world collide with reality
    0:39:36 and are proven incorrect, do they learn from that?
    0:39:38 Do they change and become less cynical
    0:39:43 or do we tend to kind of fall back into our default mode?
    0:39:48 – It depends on how well we are willing to learn
    0:39:51 and how closely we are willing to pay attention
    0:39:54 to those experiences.
    0:39:57 Again, I wanna go back to one of the domains
    0:40:00 in which long-term change is most studied,
    0:40:03 which is cognitive behavioral therapy, right?
    0:40:08 CBT often applies skepticism as an antidote to cynicism.
    0:40:13 A person with social anxiety will go to therapy
    0:40:15 and say, “I think all my friends hate me.”
    0:40:18 And the therapist won’t say, “Tell me about your mother.”
    0:40:22 They’ll say, “Wait a minute, let’s interrogate that claim.”
    0:40:25 What evidence do you have that everybody hates you?
    0:40:27 Have people told you they hate you?
    0:40:28 Has anybody ever said they liked you?
    0:40:32 And they force the person to do a fair
    0:40:36 and, at least as well as they can, unbiased accounting
    0:40:38 of the evidence that they have.
    0:40:41 Typically, when we suffer from depression and anxiety,
    0:40:45 we don’t have evidence to back up our black and white claims.
    0:40:46 So then a therapist will say,
    0:40:49 “Okay, we’ll go get some evidence.
    0:40:51 Treat your life a little bit more like an experiment.”
    0:40:55 This is actually in CBT called behavioral experiments.
    0:40:58 Ask 10 people to go to the movies with you,
    0:40:59 10 people who you know.
    0:41:03 If all 10 of them say no and give no reasons why,
    0:41:05 maybe they really do hate you. (laughs)
    0:41:08 But if anybody says yes,
    0:41:12 try to hold on to that information.
    0:41:16 Try to really internalize that information.
    0:41:19 And this is one of the most successful forms of therapy
    0:41:21 for depression and anxiety.
    0:41:25 And again, I think that’s because it overwrites
    0:41:27 some of our assumptions through careful attention to
    0:41:30 and collection of better data.
    0:41:34 – Yeah, I actually started therapy recently myself
    0:41:37 and one of the strategies my therapist recommended was,
    0:41:40 you know, when something goes wrong
    0:41:42 and the first instinct of my mind is to say,
    0:41:44 “Oh, yeah, well, here’s more evidence
    0:41:47 of the global conspiracy to fuck up Sean’s day,” right?
    0:41:48 Everything’s like, you know,
    0:41:51 which is so narcissistic and deluded, right?
    0:41:53 But that’s where your mind can go
    0:41:55 and you just start getting angry at everyone and everything
    0:41:59 to just pause, take a beat and just reframe that story
    0:42:01 just a little bit so that you’re not at the center
    0:42:03 of the cosmos and everyone’s not out to get you.
    0:42:06 And it actually does work.
    0:42:07 It actually does work.
    0:42:11 – Oh man, I mean, when I’m in a bad mood
    0:42:12 or burnt out or gloomy, I’ll go ahead
    0:42:14 and take the weather personally, you know?
    0:42:16 – Yeah.
    0:42:21 – The amount that we can extrapolate and overclaim
    0:42:25 that the world is out to get us is just astonishing.
    0:42:29 And it’s really just a deeply human response,
    0:42:33 but we don’t have to swallow it whole.
    0:42:37 We can choose to be more interrogative.
    0:42:39 We can choose to be more skeptical.
    0:42:41 And I do think it has long-term impacts.
    0:42:44 I mean, I think that this is something that we can apply
    0:42:48 in our own ways to our cynical thinking.
    0:42:51 I did something because I tend to be more introverted
    0:42:53 than I sometimes want to be.
    0:42:56 I tend to shy away from conversations,
    0:42:58 probably miss a lot of opportunities because of that.
    0:43:01 So as an experiment of my own,
    0:43:03 I did this thing I call encounter counting,
    0:43:07 where on a trip that I took about a year ago,
    0:43:11 I committed to trying to talk with every person
    0:43:13 where there seemed to be an opportunity
    0:43:16 for a conversation, which was so cringe.
    0:43:17 I mean, I was terrified.
    0:43:20 I was sure that these would be awkward
    0:43:22 and horrible conversations,
    0:43:24 but I committed not just to trying them,
    0:43:27 but to making predictions about what they would be like.
    0:43:32 And then writing down, really like taking careful accounting
    0:43:35 of what they were actually like.
    0:43:38 And I found every single time
    0:43:41 that conversations with strangers exceeded my expectation.
    0:43:44 And now that could happen to any of us,
    0:43:47 but we allow those positive experiences too often
    0:43:51 to just float into the landfill of lost memories.
    0:43:55 And we remember with exquisite detail,
    0:43:56 the conversations that went poorly,
    0:43:58 even if they’re in the minority.
    0:44:02 So by forcing myself to fully account
    0:44:04 for all of my social experiences
    0:44:06 over this kind of weekend long period,
    0:44:08 I was able to rebalance my sense of,
    0:44:11 well, how often do these things go well
    0:44:12 and how often do they go poorly?
    0:44:15 (gentle music)
    0:44:18 (gentle music)
    0:44:39 – Support for the gray area comes from Mint Mobile.
    0:44:41 Phone companies are really good at squeezing
    0:44:43 a little more out of you than you signed up for.
    0:44:46 Mint Mobile is doing things differently.
    0:44:48 Their premium wireless plans are actually affordable
    0:44:51 with no hidden fees or any of that nonsense.
    0:44:53 Right now, when you switch to Mint Mobile,
    0:44:56 you can get three months of service for just 15 bucks a month.
    0:44:58 All of their plans come with high speed 5G data
    0:45:00 and unlimited talk and text.
    0:45:01 Plus, you don’t need to worry
    0:45:04 about getting a new device or phone number.
    0:45:07 Just bring those with you over to your new Mint Mobile plan.
    0:45:09 To get this new customer offer
    0:45:11 and your new three month premium wireless plan
    0:45:12 for just 15 bucks a month,
    0:45:15 you can go to mintmobile.com/grayarea.
    0:45:18 That’s mintmobile.com/grayarea.
    0:45:21 You can cut your wireless bill to 15 bucks a month
    0:45:24 at mintmobile.com/grayarea.
    0:45:28 $45 upfront payment required equivalent to $15 per month.
    0:45:31 New customers on first three month plan only.
    0:45:34 Speed slower above 40 gigabytes on unlimited plan.
    0:45:36 Additional taxes, fees and restrictions apply.
    0:45:38 See Mint Mobile for details.
    0:45:41 (upbeat music)
    0:45:54 – Something else that I found interesting in your book
    0:45:58 was the claim that cynicism isn’t just
    0:46:01 a psychological plague or illness.
    0:46:04 It’s also a political problem.
    0:46:08 You write that cynicism is not a radical worldview.
    0:46:10 It’s a tool of the status quo.
    0:46:15 Why do you think of cynicism as a tool of the status quo?
    0:46:19 – I think this is something that runs counter
    0:46:21 to a lot of people’s stereotypes yet again.
    0:46:24 And this is where I think it’s just so valuable
    0:46:27 to compare what we think of cynicism to the reality.
    0:46:30 Lots of people have told me,
    0:46:32 man, you’re writing a book about hope
    0:46:33 that is so privileged.
    0:46:37 Of course you, as a professor at a fancy university,
    0:46:41 you can afford to have hope and look on the bright side.
    0:46:43 You’re ignoring real problems.
    0:46:46 You’re engaged in toxic positivity.
    0:46:49 And the idea here is that cynicism is almost moral
    0:46:51 that cynics point out problems
    0:46:54 and therefore they are change makers.
    0:46:56 And the first half of that equation is right.
    0:46:59 I mean, cynics are very aware of major social problems
    0:47:00 as we all should be.
    0:47:02 But the second half is incorrect.
    0:47:06 It turns out that cynics view our social problems
    0:47:08 as a reflection of who we really are.
    0:47:11 And if you think that a broken system
    0:47:12 in whatever way you think it’s broken
    0:47:15 reflects our broken nature,
    0:47:17 then there’s really nothing you can do about it.
    0:47:22 Cynics turn out to take part less in civic life.
    0:47:23 They’re less likely to vote,
    0:47:27 protest, sign petitions, you name it.
    0:47:31 And so really cynicism does turn out to be very useful
    0:47:33 for the status quo
    0:47:36 because it’s almost a sort of dark complacency.
    0:47:38 If you don’t think things can change for the better,
    0:47:39 you don’t try.
    0:47:42 And people who are happy with how things are
    0:47:44 don’t need to contend with you,
    0:47:46 even if you disagree with them.
    0:47:50 Hope by contrast, this belief that things could improve
    0:47:54 is part of the psychological cocktail
    0:47:58 that drives activists and drives social change.
    0:48:03 – It’s interesting that in this whole conversation,
    0:48:06 the word optimism hasn’t come up.
    0:48:09 And optimism isn’t in your book title,
    0:48:12 but the word hope is,
    0:48:15 and that’s pretty close to optimism.
    0:48:19 – So the omission of optimism is by design for me.
    0:48:20 – Why?
    0:48:22 – Well, I actually think hope and optimism
    0:48:24 are more different than most of us realize.
    0:48:26 – Oh, how so?
    0:48:28 – Well, optimism, as psychologists understand it,
    0:48:32 is the expectation that the future will turn out well.
    0:48:36 And that’s a very healthy and positive feeling
    0:48:37 so long as you’re right.
    0:48:39 But there’s two problems with it.
    0:48:40 One, if you think that things
    0:48:42 are just going to turn out well,
    0:48:44 you don’t have to do anything.
    0:48:49 Optimism can be somewhat complacent as an emotion.
    0:48:53 The second is that if things don’t turn out well,
    0:48:56 the disappointment is real and fierce.
    0:49:01 And in fact, I think that optimism can be a fragile stance.
    0:49:04 I think George Carlin may have said it best:
    0:49:07 scratch a cynic and you’ll find a disappointed optimist.
    0:49:11 Because if you think things are just gonna turn out groovy
    0:49:12 and they don’t,
    0:49:15 that can shatter your expectations very quickly.
    0:49:19 Hope is not the expectation that things will turn out well.
    0:49:23 It’s the belief that the future could turn out well.
    0:49:28 It’s a sensibility that involves a lot of uncertainty.
    0:49:31 And in that uncertainty, a couple of things happen.
    0:49:33 One, we are hardier.
    0:49:36 We know that nothing is guaranteed
    0:49:37 and we can bounce back
    0:49:40 if things don’t turn out the way that we want.
    0:49:43 Two, and probably more importantly,
    0:49:45 in hope and in the uncertainty of hope,
    0:49:48 there’s room for our actions to matter.
    0:49:51 So where optimism can be complacent,
    0:49:56 hope is fierce and practical and drives us to do.
    0:49:59 There’s all sorts of evidence that hope
    0:50:02 is especially powerful for people facing adversity,
    0:50:04 people with chronic illness,
    0:50:08 students in lower socioeconomic status schools
    0:50:12 and people in social movements like activists.
    0:50:16 So I think of hope as much more useful
    0:50:18 and much more robust.
    0:50:22 I try to encourage it much more than optimism.
    0:50:26 – What’s your practical advice for people
    0:50:31 who are cynical or who have cynical instincts
    0:50:34 and want to overcome them?
    0:50:35 Where do you start?
    0:50:38 – The first, I think we’ve covered pretty well.
    0:50:40 It’s a shift in our mindset
    0:50:42 to be skeptical of our cynicism,
    0:50:46 to fact check our cynical conclusions.
    0:50:48 And also, I would add to that,
    0:50:51 to be more aware of our power.
    0:50:53 In our lab, we taught people
    0:50:56 something called a reciprocity mindset.
    0:50:59 We said, “Hey, when you trust somebody,
    0:51:01 “you don’t just learn about them, you change them.”
    0:51:04 The same way that cynical, self-fulfilling prophecies
    0:51:06 bring out the worst in people,
    0:51:10 when you put faith in people, they know and they step up.
    0:51:11 This is actually true.
    0:51:14 This is something that economists call earned trust.
    0:51:17 But we found that when we taught people about earned trust,
    0:51:20 they were more willing to trust others.
    0:51:24 And when they trusted, other people became more trustworthy.
    0:51:26 So I think that mindset shifts that you can use
    0:51:30 include applying skepticism to your cynicism
    0:51:33 and then being aware of your influence.
    0:51:36 Then I think behavioral experiments that we can try
    0:51:38 include things like encounter counting,
    0:51:41 which I did, and techniques from CBT.
    0:51:43 But another that I would recommend
    0:51:47 is just trying to take more leaps of faith on people.
    0:51:50 I think we’re too risk averse in our social lives.
    0:51:52 We focus too much on what could go wrong
    0:51:56 and not enough on the relationships we could build.
    0:51:59 I think that taking small and calculated chances
    0:52:02 on other people is a really powerful way
    0:52:05 of rebalancing our risk portfolio,
    0:52:08 opening ourselves up to other people
    0:52:10 and also giving them the gift
    0:52:13 of the chance to show us who they really are.
    0:52:14 – I love that.
    0:52:19 And there is a kind of safety in cynicism.
    0:52:21 If you don’t trust people, you won’t get hurt.
    0:52:25 If you expect the worst, you’ll never be disappointed.
    0:52:26 That’s all true.
    0:52:31 But the opportunity cost of living like that is so high,
    0:52:35 intolerably high. Without trust,
    0:52:38 there is no real love or friendship,
    0:52:41 and without a belief that things can be better,
    0:52:43 there’s no chance of progress.
    0:52:46 That’s just sort of the arithmetic of life.
    0:52:49 Nothing risked, nothing gained, all that.
    0:52:51 And when you remind yourself of that,
    0:52:55 cynicism does seem a little stupid and unwise.
    0:52:59 So I think you’ve persuaded me of that.
    0:53:03 – Well, I’m glad and it might be unwise,
    0:53:07 but it’s again, it’s understandable.
    0:53:09 – What else are you trying to learn
    0:53:10 about all of this?
    0:53:12 Like what haven’t you figured out yet?
    0:53:16 Where’s the gray area of this research for you?
    0:53:20 What are the big lingering questions for you?
    0:53:23 – We’ve talked about CBT and about long-term change
    0:53:25 in experiences related to cynicism,
    0:53:27 like depression and anxiety,
    0:53:31 but there is no major clinical trial
    0:53:33 of an anti-cynicism intervention.
    0:53:37 And I really want that to happen.
    0:53:38 I’d like to take part in that.
    0:53:40 What would that entail?
    0:53:42 What would that look like?
    0:53:45 – At Stanford, we tried a miniature version of this.
    0:53:51 We just showed students the data about themselves.
    0:53:53 So there were two steps in what we did.
    0:53:56 One, we kind of ran an ad campaign
    0:53:59 where the target audience was Stanford undergraduates
    0:54:03 and the product was also Stanford undergraduates.
    0:54:06 We picked a certain number of experimental dorms
    0:54:08 and we plastered them with posters
    0:54:13 presenting real responses that students’ peers had given.
    0:54:16 So for instance, did you know that 95% of people
    0:54:19 on this campus say that they would like to help
    0:54:21 their peers who are struggling?
    0:54:24 85% of people here want to meet new students
    0:54:27 who they haven’t gotten to know yet.
    0:54:31 We also teamed up with this class called Frosh 101,
    0:54:33 which most first-year students take
    0:54:35 and gave students in those dorms
    0:54:37 a chance to guess what the data were
    0:54:39 and then we showed them what the data were
    0:54:40 and then we asked them,
    0:54:41 what are some examples that you’ve seen
    0:54:45 of friendly or supportive people at Stanford?
    0:54:48 All this sort of social psychological intervention
    0:54:51 that really just amounts to telling people the truth.
    0:54:53 I want to be abundantly clear about that.
    0:54:55 A lot of anti-cynicism work
    0:54:58 is not pointing people only to good things.
    0:54:59 It’s telling them the truth.
    0:55:01 It’s giving them real data.
    0:55:05 And we found that students who were in those dorms,
    0:55:07 basically felt less cynical about their peers.
    0:55:09 They perceived them as more empathic and kinder.
    0:55:11 They took more social risks
    0:55:15 and they also made more friends over the long term.
    0:55:18 So I really think an anti-cynicism intervention
    0:55:21 at a large scale would begin at least
    0:55:24 with giving people better information
    0:55:26 and then would follow on that
    0:55:29 with chances for them to act on that information.
    0:55:33 It would be like CBT for the world.
    0:55:34 Maybe that sounds grandiose,
    0:55:37 but I think that it’s ironic that we only turn
    0:55:41 to rigorous systematic thinking about our own lives
    0:55:43 when we’re suffering from depression and anxiety.
    0:55:46 I think that we could use that type of thinking
    0:55:48 in all manner of situations,
    0:55:50 whether we’re moving to a new town
    0:55:52 and trying to make new friends
    0:55:55 or trying to reconnect with our uncle
    0:55:57 who posts too much on Facebook.
    0:56:00 There’s a lot of times, a lot of situations
    0:56:03 where those tools of rigorous skepticism
    0:56:04 could be so powerful.
    0:56:11 So maybe all of that and then dropping psilocybin
    0:56:14 in the water supply and then we're all cooking.
    0:56:17 I'm kidding. Stanford professor Jamil Zaki
    0:56:20 does not endorse that policy.
    0:56:22 Neither does Vox for that matter.
    0:56:23 I’m just kidding.
    0:56:24 Sorry, I couldn’t resist.
    0:56:26 Couldn’t hurt, couldn’t hurt, Sean.
    0:56:29 (upbeat music)
    0:56:36 Once again, the book is called Hope for Cynics:
    0:56:39 The Surprising Science of Human Goodness.
    0:56:41 Jamil Zaki, thanks so much for doing this.
    0:56:43 It was a pleasure.
    0:56:44 This has been delightful.
    0:56:45 Thank you so much, Sean.
    0:56:48 (upbeat music)
    0:56:55 Okay, listeners, Professor Zaki has given you an assignment.
    0:57:00 Pick a cynical belief and put it to the test.
    0:57:02 If you think everyone hates you,
    0:57:04 ask 10 friends to the movies.
    0:57:08 That’s an example he gave, but it could be anything.
    0:57:11 Whatever it is you think sucks about people,
    0:57:13 makes you feel like you can’t rely on them
    0:57:15 or they don’t care about you.
    0:57:19 Design a small experiment to test it
    0:57:22 and then write down the response, see what you find.
    0:57:24 And if it’s different than what you expected,
    0:57:29 then, and this is the crucial part, go to your phone.
    0:57:31 Record a voice memo saying who you are,
    0:57:33 why you think you’re a cynic,
    0:57:36 and what you did to test that theory.
    0:57:39 Send it to us at thegrayarea@vox.com
    0:57:41 with "cynic test" in the subject line.
    0:57:45 And if it takes you six months to do that test, that’s fine.
    0:57:46 We’ll still be here.
    0:57:48 I’m hopeful, at least, about that.
    0:57:53 This episode was produced by Travis Larchuk,
    0:57:58 edited by Jorge Just, engineered by Patrick Boyd,
    0:58:00 fact-checked by Melissa Hirsch,
    0:58:03 and Alex Overington wrote our theme music.
    0:58:06 New episodes of The Gray Area drop on Mondays,
    0:58:07 listen and subscribe.
    0:58:10 The show is part of Vox.
    0:58:12 Support Vox's journalism by joining
    0:58:14 our membership program today.
    0:58:17 Go to vox.com/members to sign up.
    0:58:19 (upbeat music)
    0:58:22 (upbeat music)
    0:58:33 Support for The Gray Area comes from Mint Mobile.
    0:58:35 You can get three months of service
    0:58:38 for just 15 bucks a month by switching to Mint Mobile.
    0:58:40 And that includes high-speed 5G data
    0:58:42 and unlimited talk and text.
    0:58:44 To get this new customer offer
    0:58:46 and your new three-month premium wireless plan
    0:58:47 for just 15 bucks a month,
    0:58:50 you can go to mintmobile.com/grayarea.
    0:58:53 That’s mintmobile.com/grayarea.
    0:58:55 $45 upfront payment required,
    0:58:57 equivalent to $15 per month.
    0:59:00 New customers on first three-month plan only.
    0:59:03 Speeds slower above 40 gigabytes on unlimited plan,
    0:59:06 additional taxes, fees and restrictions apply.
    0:59:07 See Mint Mobile for details.
    0:59:10 (upbeat music)

    There’s a certain glamor to cynicism. As a culture, we’ve turned cynicism into a symbol of hard-earned wisdom, assuming that those who are cynical are the only ones with the courage to tell us the truth and prepare us for an uncertain future. Psychologist Jamil Zaki challenges that assumption.

    In part one of The Gray Area’s new three-part series, “Reasons to be Cheerful,” Sean Illing asks Jamil Zaki why cynicism is everywhere, even though it makes so little sense to live this way — and what we, as individuals, can do to challenge our own cynical tendencies.

    Host: Sean Illing (@seanilling)

    Guest: Jamil Zaki (@zakijam) psychologist at Stanford University and author of Hope for Cynics: The Surprising Science of Human Goodness

    Support The Gray Area by becoming a Vox Member: https://www.vox.com/support-now

    Learn more about your ad choices. Visit podcastchoices.com/adchoices