Kurt Gray: Understanding Outrage to Heal America

AI transcript
0:00:05 it takes a fifth of a second for your fight or flight response to be activated and 20 minutes
0:00:12 for it to calm down. And so part of, I think, managing outrage is just taking some time and space
0:00:19 away. It’s okay. It’s okay to walk away from the situation and say you need a little bit of time
0:00:24 to calm down when you think about it. And it’s okay to schedule it for a different time when
0:00:29 you’re not feeling anger. Like, don’t send emails in anger. Try not to talk about politics when you’re
0:00:34 filled with rage. But I think what I try to do, and to get to the, you know, your point about
0:00:42 thinking about the minds of others, is to think about like how these folks that I disagree with
0:00:48 might feel victimized and the harms that they see. Because once you see that the harms that someone
0:00:53 else is worried about or how they feel like they’ve been victimized, even if I don’t agree
0:00:58 with their position, I can at least see them as a little more human. And that goes a long way.
0:01:06 I’m Guy Kawasaki. This is “Remarkable People.” We’re on a mission to make you remarkable,
0:01:12 so we bring in remarkable guests so that they can pass on their knowledge and wisdom and tactics
0:01:22 and strategies. Today’s remarkable guest is Kurt Gray. He earned his PhD in psychology at Harvard.
0:01:28 He’s currently a professor of psychology and neuroscience at the University of North Carolina
0:01:33 Chapel Hill. Is that where Michael Jordan went to school? You bet, you bet.
0:01:40 He’s the second most famous graduate of the University of North Carolina Chapel Hill.
0:01:48 And he’s the director of, I have never heard of academic labs named like this, but he’s the
0:01:54 director. I got to read this to make sure I got it right. He’s the director of the Deepest Beliefs Lab
0:02:00 and the Center for the Science of Moral Understanding. That’s quite a mouthful.
0:02:08 Yeah, we rebranded. I thought we study everyone’s deepest beliefs, politics, religion, morality,
0:02:16 and why not name the lab what we do? Yeah. It couldn’t just be the University of North Carolina
0:02:26 Neurosciences Lab, could it? Okay, so let’s just ease into this interview, all right? And I’m inspired
0:02:31 by something at the very end of your book where you discuss the work of John Sarrouf. I hope I
0:02:37 pronounced that right. And so you asked him a very easy question, which I’m going to now ask you,
0:02:45 which is, what are the three easy steps to heal America? Could you just get to that in like 60
0:02:52 seconds or so? First, he laughed and he said no. He’s like, it’s too hard, but then he tried to kind
0:02:59 of lay it out. And so I figured out the best three steps from what I could, and they are how to have
0:03:06 better conversations with each other. And so those three steps are connect in conversations, invite
0:03:12 in conversations, and then validate. So those are the three steps. It’s a way of having conversations
0:03:18 about politics that we frequently, well, we don’t have. I’m happy to go through those steps.
0:03:24 Yeah, because there were lots of things in those steps which I found counterintuitive.
0:03:31 For example, you talk about it’s better to ask too many questions than too few. So can you just
0:03:37 tell us, like, connect? What are the power tips for connect? Totally. And all this is against a backdrop
0:03:42 of like, you’re trying to have conversations across differences. It’s hard. You want to do it. You
0:03:46 want to talk to your uncle that you haven’t talked to in a while, or your co-worker who disagrees with
0:03:52 you sometimes. And so the C-I-V is the “civ,” the beginning of more civil conversations. So that’s
0:03:57 where the kind of term comes from. So connect. The first thing you want to do before you talk
0:04:02 about anything contentious is connect with someone as a person. There’s so many things that you could
0:04:07 talk about that aren’t like politics or what you disagree about, right? Food, music, family,
0:04:13 jobs, whatever, you know, weather. I mean, weather is boring, right? And let’s get away from the
0:04:19 weather and let’s ask deep questions. So it turns out that people really like when you ask them
0:04:25 questions, especially surprisingly, deep questions, right? When we think of someone who’s a good
0:04:30 conversationalist, we think of someone who talks a lot. But what we really like is someone who asks
0:04:35 us deep questions. So think about conversations as more like a date. Like if you went on a date with
0:04:40 someone and they just talked at you forever, you’d think, oh my goodness, like what a windbag.
0:04:45 But when someone asks you questions, follow up questions and, you know, about you,
0:04:49 where’d you grow up? Oh, what was it like growing up there? How did that shape you? How did it make you
0:04:54 who you are today? What hopes and dreams do you have about the future? When’s the last time you
0:04:59 cried? These questions might seem crazy to ask someone, but people really appreciate connecting
0:05:08 deeply in conversations. And what about the quantity of questions? Yeah, more than you think is
0:05:12 better, especially follow-up questions, right? So you can imagine having a
0:05:18 conversation with someone, and there’s lots of research to back this up. Someone says, here’s this heartfelt
0:05:25 story of mine. Once my kid’s fish died, very tragic. We had to bury my daughter’s fish
0:05:30 out back and imagine I told you this story of my fish dying. And then instead of you saying, wow,
0:05:34 that must have been really hard. I’m sorry for your fish loss. You know, you’re just like, oh,
0:05:38 yeah, yeah. Anyways, I went fishing the other day, and that’s what reminded me of fish.
0:05:44 You know, like that’d be terrible. And so you want to ask follow-up questions, right? How’s your daughter
0:05:50 dealing with it. Did you have a special ceremony for the fish? And it’s a funny example, but follow
0:05:56 up questions, show that you’re listening. And so I think that is an important thing for connecting.
0:06:04 We had a guest about a week ago who cited many studies about speed dating. And she gave us
0:06:10 all these speed dating tips. And she basically said exactly what you said. Maybe you guys are
0:06:16 both talking about the same studies. But she said, in speed dating, you got to ask a lot of questions
0:06:22 and you got to go deep really fast to stand out from other speed daters. Yeah, just human nature,
0:06:28 right? It’s what humans want. Yeah. Because if you’re just like, hey, pretty hot out, isn’t it?
0:06:36 Okay, sure. You’re not connecting with that. All right. Okay, so that’s connect now. Invite.
0:06:44 Invite. So invite, if you want to talk about some like deep issues, the invitation is something
0:06:50 that people appreciate. And an invitation means something where someone can say no, right?
0:06:55 So if I’m inviting you to a party, you can say no. But if I’m demanding that you go to a party,
0:06:59 that’d be kind of a jerk move, right? I’m having this party and you better come or else.
0:07:04 And I think when we talk about politics and we talk about moral questions, we often do demand.
0:07:09 How could you vote for this person? How do you believe this? Explain to me, right? Like,
0:07:13 justify yourself. No one needs to feel like they’re justifying who they are, what they believe.
0:07:19 And so an invitation is really focused on understanding and the intention there is learning.
0:07:27 So please, I’d love to learn what you think about immigration. And I know maybe we disagree,
0:07:31 but I’m just like, I’m trying to understand and I want to understand where you’re coming from
0:07:35 and what experiences might have led you there. So I’d just really appreciate hearing kind of your views,
0:07:39 even if they’re different from what I think. How about if you say to the person,
0:07:46 can I ask you a question about immigration? That’s good. Yeah, that’s a great place to start.
0:07:55 And a question before the question, that’s perfect. Yeah. Okay, now can I ask you about validation?
0:08:02 Thanks for the invitation. I’m happy to share. See, like it worked. It just worked for us right
0:08:07 now. So you know, you invite someone to share something that is going to be hard for them to
0:08:13 share. With an invitation, you’re putting in a little grace, I like to say, you know,
0:08:17 I talk to a lot of churches. So you’re putting in some grace, like, I’m inviting you.
0:08:21 And I know it’s going to be hard for you to share this thing. And you’re probably going to say
0:08:24 something offensive. You don’t need to say that to someone, but like, come up with a mindset that
0:08:30 someone might say something offensive to you. And then they’ll say something. And then maybe your
0:08:35 first reaction is to say, I don’t think that, or, what about this? Here’s this other thing you
0:08:40 aren’t considering. You want to say no right away, you want to fire back something, but you should
0:08:45 resist that impulse and instead validate. It doesn’t mean agree. It does mean, thanks so much for
0:08:50 sharing that, I understand it’s hard to share. If I’m listening correctly, I think you mean this,
0:08:56 and I’m going to rephrase it in a way that’s very charitable. And then that’s going to make you feel
0:09:00 heard. It’s going to make you feel seen. And it’s going to allow us to have a better conversation
0:09:07 after that. And it’s not part of CIV, but you also make a very big deal about showing that
0:09:14 you’re vulnerable. So how does vulnerability play into this? Yeah, I guess taking a step back,
0:09:20 the goal for all these conversations is to see the humanity in each other, right? To recognize
0:09:26 that we’re all good people. We’re all trying to figure out our lives and our world. And oftentimes
0:09:32 when we’re disagreeing with someone, we, and this is John Sarrouf’s term, we flatten them. We see them
0:09:37 as two dimensional. And so all this is a way of seeing someone in their rich, three dimensionality.
0:09:44 You want to connect, you want to invite, you want to validate, and you want to let them see you as
0:09:51 well as someone who is human. And a big part of that is vulnerability. So I’m going to share my
0:09:55 stories after you’ve shared with me. I feel comfortable with you now because we’ve had a connection.
0:10:01 And so I’m going to share the stories of suffering, of harm, my concerns. And so you can see
0:10:07 that I’m really basing my moral judgments, my concerns in questions about harm, because that’s
0:10:13 where our minds are rooted when it comes to morality and politics and concerns about protection.
0:10:21 So the name of your book is Outrage. And I have to ask a very simple question, which is what causes
0:10:30 people to get outraged? It’s a great question. It’s a pretty simple answer. And the answer is we
0:10:40 get outraged when someone rejects or challenges or defies our understanding of what is harmful
0:10:47 and who is vulnerable to harm. So we get morally outraged when we see someone causing harm
0:10:52 or rejecting our understanding of who or what is harmful.
0:11:02 And would you say that being outraged is highly correlated with justifiable reasons to be outraged?
0:11:08 If you stepped back, would it be irrational to be outraged?
0:11:16 Here’s the thing. So outrage evolves as a way of protecting ourselves. So if in our community,
0:11:22 someone did something that we thought was harmful, we would all band together in anger,
0:11:27 we’d grab our pitchforks, our torches, and we would kick that person out of the community,
0:11:34 or we would punish them or censure them somehow. Except today, we’re no longer faced as much with
0:11:39 kind of things that we all agree are harmful. Instead, we disagree, immigration, abortion,
0:11:45 taxes, different people on the left and right disagree about what’s harmful and what’s worthy
0:11:52 of outrage. And so going back to your question, it’s really hard to know what’s the rational
0:11:58 right thing to be upset about because there are so many people on each side of an issue
0:12:03 who have a legitimate point often. And so I think it’s hard to say what’s rational and what’s not.
0:12:16 We can all agree on some things, but other things we’re very far apart on.
0:12:30 As I was reading your book, this section about outrage, a moral outrage, I just thought, would
0:12:38 you say that the United States reaction to 9/11 was the mother of all examples of moral outrage?
0:12:44 I mean, it’s certainly, there’s a case to be made. So we were harmed. We were harmed by
0:12:53 someone we thought was a true villain, in this case, Bin Laden, and like foreign folks doing harm on
0:12:58 our soil, attacking our institutions. And certainly we were collectively outraged enough, we came
0:13:04 together and then tried to punish the folks who caused that harm. I think that’s a good example.
0:13:11 But isn’t the point of your book that we should look at it from the other perspective and see
0:13:19 what drove them to do 9/11? Or is that a stretch? I think the point of the book is to understand
0:13:25 how everyone has the same moral mind and we’re all driven by outrage. And so I think it’s
0:13:32 justifiable that many Americans were outraged at that act. But I do think that failing to appreciate
0:13:37 the kind of mindset of, let’s say, folks in Afghanistan, folks in Iraq, other folks in the
0:13:43 Middle East is like, you know, America goes in there and causes a lot of harm. I think we maybe
0:13:49 needed to think about how they might feel outraged at America in return, right? There’s always two
0:13:54 sides of any issue. And there’s always people who take the other side. So at least we should have
0:13:57 had a better understanding of what they’re thinking and what they’re feeling.
0:14:05 So do you have a practical tip that when you’re feeling moral outrage, you know,
0:14:11 before you react, you should take these steps or do you just start launching B-52s?
0:14:17 Well, you know, John Sarrouf has this other quote that I really like when he’s working with
0:14:24 divided communities. And he says, “It takes a fifth of a second for your fight or flight response
0:14:32 to be activated and 20 minutes for it to calm down.” And so part of, I think, managing outrage is
0:14:39 just taking some time and space away. It’s okay. It’s okay to walk away from the situation and
0:14:45 say you need a little bit of time to calm down when you think about it. And it’s okay to schedule it
0:14:49 for a different time when you’re not feeling anger. Like, don’t send emails in anger. Try not to
0:14:54 talk about politics when you’re filled with rage. But I think what I try to do, and to get to your
0:15:02 point about thinking about the minds of others, is to think about, like, how these folks that I
0:15:08 disagree with might feel victimized and the harms that they see. Because once you see that the harms
0:15:12 that someone else is worried about or how they feel like they’ve been victimized,
0:15:18 even if I don’t agree with their position, I can at least see them as a little more human. And that
0:15:25 goes a long way. Author to author. I’d like to ask you this question or maybe I should pose it as
0:15:35 author to author. May I ask you a question? Yes, you bet. So did you consider titles for your book
0:15:42 that were not quite violent or negative? Instead of outraged, it would be like harmless or something
0:15:49 like that, the flip side of outraged. I did consider Harmless. I thought that would be nice.
0:15:55 The original title, the kind of secret working title was The Victim Within, which is a negative
0:16:03 but maybe less violent. It turns out no publisher wanted that book. You’re a victim? No one
0:16:07 wants to walk through the bookstore and be like, you know what, I am a victim, even if we might
0:16:12 all feel like it sometimes. But I think we thought that the one-word title, you know, with
0:16:18 a bright orange cover and the aggressive exclamation mark; maybe a question mark
0:16:25 would have been better, right? Outraged? But I think we wanted it to resonate with people, and many people
0:16:30 are feeling outraged today and we wanted to help them make sense of that. Okay, fair enough. And
0:16:38 now I’m going to ask you one more author to author question that I noticed. And you may find this
0:16:46 really bizarre, but on page 190 of your book, I have a question. And let me read a quote. So on
0:16:54 page 190, it says, “Most conservatives generally want to protect black men and most liberals
0:17:00 generally want to protect the police.” Is that an error? Is that a typo? Was it transposed?
0:17:09 No, no, because it is the case that when we think of progressive folks and conservative folks,
0:17:13 we generally think of conservative folks being more concerned about the police. Progressive
0:17:20 folks being more concerned about protecting black men, black lives. But it is the case that most
0:17:26 progressives do think that police officers are good and want to protect them. Oh, okay, I get it
0:17:33 now. Okay, so you were like busting two myths there. Exactly, exactly. And where the disagreement
0:17:38 comes is like the relative concern when those two groups, those two interests are kind of
0:17:45 pit against each other. But most people want to protect most other people. I read that sentence
0:17:52 about four times and I said, is this an error? What is he trying to say? So I stand corrected.
0:17:56 I stand corrected. Thank you very much for clarifying. But you’re right. We have these
0:18:02 myths that the other side is so stuck in their political positions that they don’t care about
0:18:08 the other side or other interests. But most people do generally care for others and want a world where
0:18:14 people are protected. Just these disagreements come down to these trade-offs. The foundation of your
0:18:20 book is something that I never considered, which is I’ve had many social psychologists on this
0:18:25 podcast and there are many explanations for this divide in the United States. But
0:18:33 none of them have ever said something like harm is the key variable and the master key to explain
0:18:40 human morality. So can you explain why harm is such a big deal? Because I never considered it. I
0:18:47 never thought of it like that. I think the reason that harm is the master key of morality is because
0:18:54 we as a species and as individuals, we’re really concerned with protecting ourselves
0:19:01 and we evolve through time as being threatened. If you go out, I make the argument that we’re
0:19:06 a lot more prey than predator. If you go out into the forest and you strip down to your underwear
0:19:11 and you wait for nighttime and you hear a twig snap behind you, you’re not going to think, oh
0:19:16 good, I hope it’s an animal that I can eat. You’re going to think, oh my goodness, right? Something’s
0:19:22 coming to eat me. Something’s coming to kill me. And if you walk through a dark alley and you hear
0:19:26 some shuffling in the darkness, you’re going to think, oh no, there’s something there. There’s
0:19:32 something trying to get me. So we are so attuned to threats because, over evolution, the people who
0:19:37 were not attuned to threats, they died. If you were like, oh look, there’s like something lurking
0:19:41 in the darkness. I’m going to go try to hug it. You got eaten. And so we’re attuned to threats and
0:19:47 in the past, those threats were clear. The predators are people who might kill us.
0:19:53 But today, those threats are a little more ambiguous, right? Are immigrants a source of
0:19:59 income for America? Or are they threats? Are taxes going up? Or is that bad? Is that good? So today,
0:20:03 we’re divided by what’s the most threatening. But at the heart of all our moral judgments,
0:20:08 is this concern about protecting ourselves from harm. And that’s why it’s the master key.
0:20:16 And if someone from 100 years ago took a time machine and came to America today
0:20:23 and saw what we considered harmful threats, would that person just be scratching their
0:20:30 head and saying, listen, you guys, you got to have more serious things to worry about than,
0:20:34 I don’t know, not being able to get into Harvard. That’s not the biggest threat in the world.
0:20:42 I think you’re exactly right. I often think, going back in time, if you went to a parent
0:20:48 in the Industrial Revolution, and their kids were working in a factory, it’s dark, it’s hot,
0:20:53 there’s spinning machinery that could catch your hand and cut it off. And they’re thinking every
0:20:58 day, I hope my child’s going to come home. And then, as you say, you tell them, you’re like,
0:21:04 maybe your kid’s not going to get into this elite school, or maybe your kid is going to
0:21:06 look at the screen too much. They didn’t have a screen, maybe you don’t understand, right?
0:21:10 But if they look at the screen too much and don’t feel fulfilled, the parent would be like,
0:21:16 I just hope they come home with their hands. And so, it’s not saying that these harms we’re
0:21:22 worried about today aren’t real to us and don’t cause concern, but it’s certainly a stretch from
0:21:28 days of yore. So let me ask you something. If you are dealing with someone who you
0:21:36 disagree with, should you be so transparent and ask them what harm they are seeing or what harm
0:21:41 they’re trying to avoid? Can you just be that blunt and ask, what’s threatening you?
0:21:49 I think you can. I think I probably wouldn’t start with that. Going back to the CIV, I would
0:21:55 probably get into the conversation a little more obliquely, talking to them as a person.
0:22:00 But I think it’s totally reasonable, once you’re actually talking about politics, about morality,
0:22:06 to say, what are you most worried about? What do you think the harm is of this policy? People
0:22:12 will tell you, people have a gut feeling that the things they’re against are harmful. And so,
0:22:16 I think they’re very happy to tell you, as long as you don’t throw it back in their face, as long as you
0:22:24 don’t deny the kind of victims that they see. Okay, so I’m a hardcore liberal and probably most of my
0:22:29 audience is liberal, too. Maybe I have two or three conservatives listening to this podcast. So,
0:22:35 I gotta tell you, a lot of the book was just eye-opening to me that there is another argument to
0:22:44 be said about abortion or vaccination or guns. So, can you just quickly explain to us, what’s the
0:22:53 other side’s view? What harm do they see with too much abortion or vaccination or gun control?
0:23:02 Yeah, good question. And I want to preface this with this idea of, look, there are like statistical
0:23:09 truths of who or what is more likely to be harmed in some situations. And so, I think vaccination,
0:23:15 you know, as a scientist, I think it’s a clear good.
0:23:22 There’s lots of evidence to support that. But I think that statistics don’t always resonate
0:23:29 in our minds, and in fact, they seldom do. And what does resonate are kind of stories about harm.
0:23:34 And so, even if we are all concerned about protecting the vulnerable from harm, especially kids,
0:23:42 that you can have powerful stories, visions of what’s harmful that contradict any kind of
0:23:47 statistics. So, let’s say your kid gets a vaccine and falls terribly ill, that’s going to be a
0:23:54 personal experience that overwhelms how many scientists tell you that vaccines are okay.
0:23:59 Or if you hear a story of a friend, I think guns are the same way, right? If you have experience
0:24:04 and you’ve used a gun to defend yourself, that’s really powerful. And guns is also an interesting
0:24:09 case because we could argue about what’s most relevant to the gun debate. If you’re like,
0:24:14 look, the real problem is mental illness, or it’s people not taking care of their guns or putting
0:24:18 them in safes, then we shouldn’t be worried about how many guns are there. We should be worried
0:24:23 about responsible gun ownership. And if someone comes after your family, or if the zombie apocalypse
0:24:28 comes, you want to have a handgun. And so, you see those threats and you see guns as a way of
0:24:33 protecting yourself against those threats. So, it just always comes down to this kind of this
0:24:41 worldview about what best protects us and our family. Would you say that it’s accurate that
0:24:49 conservatives and liberals would almost completely agree on the desirability of protecting kids?
0:24:55 Yes. I think we agree on the desirability of protecting most people. We just make different
0:25:02 assumptions about what protects them. So, a liberal would make the argument that vaccination
0:25:07 protects kids, and a conservative might make the argument that vaccination is going to cause
0:25:14 autism. So, they’re both trying to protect kids. That’s exactly right. Wow. But let me ask you,
0:25:20 Kurt, is there a point too far? I understand seeing the other sides, but are you telling me
0:25:27 I’m supposed to see the other side for Hitler and Putin and Musk and Trump? I mean, RFK says,
0:25:31 don’t get vaccinated. Is there a point where you just say, I don’t think so?
0:25:39 I think there is a point. There is a point where you say this is a step too far. But I do want to
0:25:46 separate the morality and humanity of everyday people from maybe political elites. So, this book
0:25:51 is not written as like being an apologist for Putin. In fact, in one chapter, I say like,
0:25:57 Putin thinks he’s a victim, and obviously he’s not. That idea is crazy. But I do think that many
0:26:03 people in America have so many kind of different assumptions about who’s vulnerable to harm from
0:26:08 social media, from cable news. And I think most people left or right just want to protect their
0:26:14 family and want to have a better world to live in. And maybe some folks make assumptions or
0:26:19 misunderstand the science. Questions of vaccination are probably more complex than most people realize,
0:26:26 even if they’re uniformly, I think, a good thing for society. And so, I think we should just
0:26:34 see the humanity on the other side, even if we want to argue against them. But that understanding
0:26:39 of humanity doesn’t have to extend to Elon Musk if you don’t want it. This is about reconnecting
0:26:43 with your friends and family and coworkers. And if people don’t want to have a reasoned conversation
0:26:50 with Kanye or Elon or Donald Trump, I think that’s okay. This is about healing our kind of
0:26:56 everyday fractures. Although, I do want to say there are people who think that you can extend
0:27:02 the bridge even further. So, Daryl Davis is a black blues musician. I talk about him in the book.
0:27:07 He devotes his life to befriending KKK members and getting them to hang up their robes. So,
0:27:13 should we require a black man to talk to a KKK member? No. But the fact that he did it,
0:27:19 I think, makes society better because he deradicalizes them and brings them towards the middle. So,
0:27:24 those conversations are good. But I think a very subtle and important point is that
0:27:32 where do you draw the line between communication and understanding and empathy and vulnerability
0:27:38 and crossing over into persuasion? Because once you cross into persuasion,
0:27:45 it changes the CIV, right? Right. And so, in general, when you have these conversations,
0:27:49 for them to go well, you should be trying to understand and not win. Because if you try to
0:27:55 win, as they say, you’ve already lost. But you can persuade someone through understanding, which I think is
0:28:00 actually the only way to do it. Because if I come at you and I say, “Look, your positions are wrong.
0:28:05 I heard some statistics on Fox News and here’s how, like, you’re following the wrong party
0:28:09 and you’re a sheep and let me tell you the right statistics,” that’s not going to persuade. That’s
0:28:13 going to send you in the other direction, right? You’re going to get angry. You’re an idiot. Like,
0:28:17 you don’t know the right facts. And this is why facts don’t work to persuade because people have
0:28:23 different facts. But if instead I say, “Look, here’s why I believe what I believe. It’s because of how
0:28:28 I was raised. This really formative thing happened to me as a child. I felt very vulnerable and I
0:28:34 got assaulted by a migrant, or I used a gun to defend myself, these positions on the right.
0:28:40 I think you’ll think, “Oh, okay. Like, I see where you’re coming from,” and now you can generalize my
0:28:45 own stories and feelings of harm maybe to other folks on my side. And that could be persuasion,
0:28:49 but not because I’m trying to persuade. Because I’m trying to just show you who I am.”
0:28:59 But isn’t there an inherent flaw in using stories this way? Because stories are not necessarily
0:29:05 statistically or scientifically valid, right? You could say, “Yeah, my uncle smokes cigarettes
0:29:12 every day for 70 years. He never got lung cancer. So cigarettes, according to my story, are safe.”
0:29:17 Yeah. No, that’s a good point. And I teach my students in psychology and science that
0:29:26 stories, just anecdotes, are not data. And the point is very apt. But our minds do not
0:29:33 work well around statistics. And they should work better. And I teach my students to think about
0:29:39 statistics, right? But in a political conversation, it doesn’t work. I wish that it worked. Then we
0:29:44 could agree on like a common set of statistics or facts, but we don’t. Because each side has
0:29:51 statistics and they disagree about which statistics are most relevant. So there’s a statistic that
0:29:57 somewhere between 100,000 and 1.2 million times every year, guns are used in self-defense.
0:30:05 And there’s a broader set of statistics: even more times in America each year, guns harm people
0:30:09 not in self-defense. But I could say, look, as a conservative, like those other statistics,
0:30:13 they’re not relevant because they’re not using guns correctly. They’re not well-trained. It’s not
0:30:18 the right kind of guns. And there’s always a way to, I think of like Neo in The Matrix,
0:30:22 you dodge around which facts are real. And so at the end of the day, if you want to understand
0:30:27 someone, you need to put statistics aside, at least at first. First connect with someone
0:30:33 as a person, and then you can talk about statistics. So do you basically tell your students that
0:30:40 stories are a very powerful way to communicate and persuade, but you should also understand
0:30:50 that when you’re on the receiving side of the story, you have to ask, is this valid in general?
0:30:57 I think people already do that. I think if I tell you a story, as I have, so here’s how a gun
0:31:03 rights advocate would come at it, why they would be pro-gun: because they used a handgun to defend themselves,
0:31:09 or their kid got sick and they thought it was because of the vaccine. That’s a story where you would
0:31:14 spontaneously say, that’s not valid. That’s not the right story. That’s not generalizable. People
0:31:18 already do that. And so I think the step we’re missing is to say, wow, that must have been hard.
0:31:24 Thanks for sharing that. I can understand where you’re coming from. And then you can talk about
0:31:29 some broader statistics. I think once you’ve shared your own story, so the person knows where
0:31:34 you’re coming from. And there’s lots of programs actually to bring lawmakers together across the
0:31:41 country. And those programs, they allow storytelling initially, and then afterwards, they have lawmakers
0:31:46 think about statistics and policies and effecting the most good. It's a kind of narrow range of issues
0:31:52 typically, but statistics are still part of it; they just don't happen first, if that makes sense.
0:32:00 In Silicon Valley, one of the most common stories is that you don’t need a college education because
0:32:06 Bill Gates, Steve Jobs, and Mark Zuckerberg don’t have an undergraduate college degree,
0:32:14 so they don’t need it. You don’t need it. Drop out of college. And that is statistically a very,
0:32:20 very misleading story, right? Right. I think about in terms of the stock market too, right?
0:32:26 Yeah. Sometimes there’s big crashes, but statistically speaking, it just makes sense
0:32:32 to invest your money in a stock market because it goes up. But morality is not, it’s not like
0:32:39 other facts. It’s not like other statistics. If you have a deep moral conviction and I say,
0:32:43 “Hey, there’s a statistic out there,” and it suggests that your moral conviction is wrong,
0:32:47 it’s not that you’re not going to be like, “Oh, you know what? You’re right. I guess I’ll just
0:32:52 give up my view,” because that’s not how we’re built, right? Our moral convictions, they tie
0:32:56 communities together. They make us feel like we’re good people. And so maybe you can have
0:33:00 moral change over time and people do. People shift left to right or right to left. But like,
0:33:05 it’s not the same kind of thing of what’s the most reliable car you can buy. It’s not the same
0:33:09 kind of thing as what should you believe about abortion. Okay, so now I’m kind of reversing
0:33:16 my direction on stories. Can you tell us how to optimize the use of stories? So let’s say I’m
0:33:23 sold. I’ve seen the light. I love stories versus facts. So how do I use stories most effectively?
0:33:28 Before we get there, I think one thing that’s useful to note is that political operatives,
0:33:32 the people that I’m saying you don’t need to have quite as much empathy for as everyday people,
0:33:37 they understand the power of storytelling and fears and harms. These are the things that motivate
0:33:42 the base that get out donations, that get out the vote. These people are out there,
0:33:47 they’re coming for you, they’re coming for America. So I think stories do work.
0:33:52 And the other side uses them, whether you’re on the left or the right, the other side is always
0:33:58 using them to almost for evil to drive division, especially during elections. And so I think it’s
0:34:05 useful for us to know that they work. And I think even drawing from those kind of like the success
0:34:11 of fear and threat, the stories that work best for bridging divides are, as we mentioned a
0:34:17 little bit earlier, stories that kind of reveal your vulnerability and reveal that you’re a good
0:34:24 person. If I tell you the story and I talk about my family. And so here’s a story of like why I
0:34:31 want to bridge divides and see the humanity in the other side. And even though I like hang out
0:34:38 mostly with progressives, I have family in Nebraska and they very clearly love me and they
0:34:43 love me as a kid. I was a stepchild. I was a foreigner. I came down when I was like seven and
0:34:49 10. And they opened their arms to me and accepted me as part of the family, despite these facts.
0:34:56 And even as I got older and I realized I disagreed on issues, I couldn’t write them off and say that
0:35:02 they’re just, you know, stupid or evil because I know that’s not true. And so it’s my personal
0:35:08 experience of like feeling loved and having them kind of sacrifice for me, drive across the country
0:35:15 to see me get married. These are the things that make me feel connected. And so my own story for
0:35:18 why I’m a pluralist and why I think we should bridge divides is like grounded in my personal
0:35:23 experience and my feelings of love and openness. So I think those are the stories that are optimized
0:35:30 in a sense, like baring yourself and showing who you are personally. Up next on Remarkable People.
0:35:34 And so I think we’re exhausted about fighting about what the best way to do it is and we’re
0:35:39 exhausted about the kind of loudest voices on social media and on cable news. Even though I
0:35:44 think we’re kind of like there’s like terrible addiction to kind of social media and cable news
0:35:50 like we would be happier if we put it aside for a little bit, took some time, rested, slept and
0:36:00 connected with people on a human basis. And that might make us feel less exhausted.
0:36:09 Thank you to all our regular podcast listeners. It’s our pleasure and honor to make the show for
0:36:15 you. If you find our show valuable, please do us a favor and subscribe, rate and review it.
0:36:20 Even better, forward it to a friend, a big mahalo to you for doing this.
0:36:25 Welcome back to Remarkable People with Guy Kawasaki.
0:36:30 You brought it up so I gotta ask, did you ever get baptized?
0:36:39 I never got baptized. As the story in the book goes, I was in a Sunday school in Nebraska
0:36:45 for the lessons on baptism. And I was like a seven-year-old or 10-year-old or something sitting
0:36:49 in the basement. And the teacher says, can anyone tell me what happens to people who aren't baptized?
0:36:55 I'm the only kid, obviously, who's not baptized. And some kid puts up his hand, and the teacher says,
0:37:03 yes. And the kid says, they go to hell. And that’s what everyone believed. But I felt like it was
0:37:09 a bit of a setup, like you’re just like telling me I’m going to hell. And I was a little offended
0:37:14 initially. But then I took some time. I was like hot on it for a little bit. But I took some time
0:37:20 to think about it. I went away a little bit. And I realized kind of what harms do they see? Well,
0:37:26 they see me going to hell. And they love me. And they don’t want me burning in a pit of fire for
0:37:33 eternity. That’s dad. That’s a harm. And now that I see this concern, which is a little offensive,
0:37:38 actually as a way of trying to protect me and reach out to me. And so I see it in a new light.
0:37:42 And I’m still not baptized. I don’t think I’m going to hell. But I do appreciate their concern
0:37:45 for me, even if I disagree with their assumptions about the world.
0:37:55 I got to tell you, I love that story. I was baptized. I was probably 40 years old or something.
0:38:03 It’s never too late. I got baptized at 40. I took up hockey at 44. I took up surfing at 60.
0:38:08 You, Kurt, take your time. You got a lot of times. There you go. I’ve already had a couple
0:38:15 concussions. So I don’t know about hockey, but definitely surfing. You put together two words
0:38:23 that I have never seen put together in my life. And I want you to explain the concept of moral
0:38:30 humility. What do those two words have to do with each other? So there has been a big
0:38:37 trend, a movement towards intellectual humility, which is this idea of maybe I don’t know how the
0:38:44 world works. And so you can make people feel intellectually humble very easily if I say,
0:38:48 imagine a helicopter, do you know how it works? And you're like, yeah, I got it. There's like
0:38:53 two rotors. It’s cool. And then I say, could you please draw out a helicopter in sufficient
0:38:58 detail that you could like explain how it works? And then you like get your pen and paper and you
0:39:02 start drawing the helicopter and you’re like, I don’t know. I have no idea. That creates some
0:39:08 humility or even like how does the toilet work, whatever that creates humility because you realize
0:39:13 intellectually, you don’t know everything that you thought you knew. Moral humility is that
0:39:20 understanding about a moral issue. And it's a harder one, I think, but I think you can still have it.
0:39:25 And I think it’s still important if you acknowledge that like, well, maybe there’s someone on the other
0:39:34 side of an issue about guns or abortion or taxes, that maybe has a bit of knowledge or a bit of
0:39:40 opinions that I might learn from that really changes the conversation, right? Because now they’re
0:39:45 not evil. They just they think differently. And even if I don’t agree with it, like I could still
0:39:50 learn a little bit from them. And every time I've had conversations, like I had an Uber ride with a
0:39:55 Christian nationalist, I don’t agree with Christian nationalism. But at the end of this 20 minute
0:40:00 conversation I had where I asked him a lot of questions, I connected, I invited, I validated.
0:40:05 I learned a lot about his position and I still don’t agree with it. But I still learned about
0:40:09 what he’s thinking about. And I appreciated that learning. So I think humility can help.
0:40:16 Okay, Kurt, you opened another door here. Tell me, what did you learn from a Christian nationalist?
0:40:21 I want to hear this. I want to be morally humble and learn this.
0:40:27 So a Christian nationalist was an Uber driver. He had his own business. He has his family.
0:40:31 And he was describing to me like what he thought about the state, what he thought about the church.
0:40:40 And I was surprised in a sense how tolerant he actually was of other faiths. He thought that
0:40:44 like Christianity should be the kind of national religion. It’s definitional in the idea of a
0:40:48 Christian nationalist. But he’s like, you can be Muslim. Maybe it’s not as prominent as being
0:40:54 Christian. And I didn’t realize that he would be supportive of that idea. And he also had like more
0:40:59 nuanced beliefs on the economy than I thought. He’s like, oh, I’m kind of libertarian, but not in
0:41:05 the following ways. And so I think I learned about the kind of economic and social complexity
0:41:10 in this view that I didn’t appreciate. And that made me kind of reflect on like, well,
0:41:15 what do I think of the connection between the state and the church? And there’s lots of tax
0:41:20 breaks for churches. And do I think that’s reasonable? I’m not sure. But he’s advancing
0:41:24 arguments for like why that’s meaningful to him and like the connection between the church and
0:41:28 supporting his family. So just make me think of there’s nuance around these issues. And it didn’t
0:41:34 change my core convictions. But it did make me realize that there’s unanswered questions you
0:41:41 could have about these topics. And did you give him five stars and a tip? I did give him five
0:41:46 stars. And here's why. Here's why. Okay. So we had a 20-minute conversation, this Christian nationalist and I.
0:41:52 I asked questions. I invited, I validated. And then as we’re driving up the ramp to the airport,
0:41:55 we start talking about abortion. And abortion really comes down to your like assumptions
0:41:59 about when life begins. And this may be a whole other conversation. But interestingly,
0:42:03 evangelical Christians used to think that life started at birth, not at conception,
0:42:08 which is like a whole turnaround in the 70s. But I digress. So we’re having conversation about
0:42:14 abortion. He’s very pro life. And he starts saying that anyone who’s pro choice is kind
0:42:19 of aligned with the Nazis. What? Yeah. And you mentioned it as well, it doesn’t take long for
0:42:25 someone to connect a moral position they disagree with to the Nazis, right? It’s so easy in discourse
0:42:31 today. It’s like natural, even because they’re like the paradigm of evil. And I teach a class
0:42:35 on this, I teach undergrads to have reasonable conversations about contentious issues. And I
0:42:42 say, Listen, we can’t have this conversation. If you’re going to compare half of America to the
0:42:46 Nazis, it’s just not acceptable, right? That’s not fair. We can’t have a good faith conversation
0:42:54 about morality if this is going to happen. And he pauses. And he says, you know what, you're
0:43:00 right. I'm sorry. I'm going to take that back. All he was trying to say, giving him some grace
0:43:06 here to explain his views, like the invitations and validation, is that he's worried that people
0:43:11 who don't respect the sanctity of fetal life will slide down a slippery slope, that's kind of my words,
0:43:16 and in other areas will neglect life. And that's going to be bad for everyone in
0:43:21 society, especially kids and the vulnerable. And the idea of a slippery slope is a reasonable
0:43:28 philosophical argument that people on both sides have. And I could understand where that was coming
0:43:35 from. And five stars, he took it back. He had some moral humility himself, because he’s like,
0:43:39 that’s wrong. I’m sorry. But that only happened because I tried to understand and ask questions.
0:43:44 And so I think it’s a good example of how these things can, if not change people, at least allow
0:43:48 them to have respect for the other side in a way that probably didn’t happen for him.
0:43:54 I’m a college professor, he probably thinks I’m like the indoctrinating enemy. But now I think
0:44:01 there’s some respect that hopefully persists over time. And why didn’t this story make the book?
0:44:07 It happened after the book was written. Oh, bummer. But I have a Substack post on it if folks
0:44:12 want to read it. But it's one of my favorite stories, because I took everything in the book
0:44:18 and I put it into practice. Yeah, it encapsulates the whole book, right? Yeah, exactly, exactly.
0:44:24 You never know what happens in an Uber. That’s another important lesson here. My last section
0:44:33 of questions is I want to explore the concept of being mentally exhausted. So what causes mental
0:44:41 exhaustion? Well, am I asking questions that are too easy? So in the book, I talk about this idea
0:44:48 of the exhausted majority. When it comes to politics, most people are, like you say,
0:44:53 mentally exhausted, we’re tired of the shouting, we’re tired of all the anger, we’re tired of the
0:45:01 division. And even if it’s a gut reflex, we just want to live our lives. And we just want to feel
0:45:08 like the country is working for us by us. And we disagree about that, obviously, about who’s the
0:45:13 most effective leader. But most people want cheaper gas prices, want cheaper milk prices,
0:45:18 want schools that flourish, want healthcare that helps them. And so I think we’re exhausted about
0:45:22 fighting about what the best way to do it is and we’re exhausted about the kind of loudest voices
0:45:27 on social media and on cable news. Even though I think we’re kind of like there’s like terrible
0:45:32 addiction to kind of social media and cable news, like we would be happier if we put it aside for
0:45:38 a little bit, took some time, rested, slept and connected with people on a human basis. And that
0:45:44 might make us feel less exhausted. So your recommendation is put social media aside, don’t
0:45:52 doom scroll and try to interact on a personal basis using CIV. Exactly. Exactly. And even if you use
0:45:58 social media, we have some research on this, that if you unfollow the most divisive accounts,
0:46:04 like that’s going to make you feel better. Just those like 10 people that are conflict
0:46:09 entrepreneurs, a nice term from Amanda Ripley, meaning people who are making money by making you angry.
0:46:16 Put it aside and actually have conversations with your Uber driver, someone who bowls with you,
0:46:20 the guy who like runs the whole landscaping operation around here. He’s certainly more
0:46:26 conservative than I am. We talk about kids and we talk about how he’s doing, talk about like his house
0:46:30 and he’s planting trees. There’s so many things that we’ve in common that’s not about politics.
0:46:34 And then if we get to politics, we can like tiptoe around a little respectfully,
0:46:38 but those in-person conversations are much better than screaming into platforms.
0:46:43 You made the argument that you should unfollow the 10 most divisive people.
0:46:50 I could make the case you should unfollow the 10 people that you most agree with, right? Because
0:46:58 put you outside the echo chamber. Yeah, I think that’s not a bad idea either. I think, again,
0:47:02 you should be striving, maybe this is a mindset for life, right? You should be striving to learn
0:47:07 in general. And if you’re just hearing anything that you already believe parroted back,
0:47:12 it’s not a way to learn. But with social media, I mean, the algorithm is so fine-tuned to induce
0:47:16 outrage. Like, I don’t even know if social media is the right place to learn. You can read reports
0:47:20 by think tanks. I think that’s probably boring for most people. But I think if you’re going to use
0:47:26 social media, follow like the US parks, cat memes, I don’t know, anything’s better than politics on
0:47:31 social media. In fact, one of my studies shows that people who follow politics on social
0:47:36 media and pay attention to virality metrics, like how many retweets, how many shares,
0:47:40 media, they often have symptoms of PTSD that are above the clinical threshold.
0:47:47 Crazy. Like, that’s how bad it is to like politics and care about virality on social media. Don’t
0:47:53 do it. The guy who just time traveled 100 years, he’s going to be saying, yeah, 100 years ago,
0:47:59 we were worried about being killed. And now you’re worried about PTSD from reading posts on social
0:48:06 media. True. Although, you know, 100, I don't know how many, 150 years ago, right, the Salem witch
0:48:11 trials, like all sorts of panic about witches. And they didn’t have social media, right? But they
0:48:15 had the Bible and they had like preachers saying that like Satan’s coming for them and things are
0:48:20 terrible. So they still had a kind of panic. It didn’t happen as fast as social media. But I think
0:48:26 it’s still a kind of like deeply human thing to be terrified of threats and to band together and
0:48:34 get outraged. Okay, this is my last question. And this last question may not go over well with your
0:48:41 relatives in Nebraska, but let’s just pretend that you become the chairman of the Democratic
0:48:47 National Committee. What would you do? I would tell stories of harm because I know that’s effective,
0:48:57 I suppose. I mean, I think deep in my heart, I think that the best way to run America is how it
0:49:03 was founded, which is as a pluralistic democracy. And it’s totally fair to have convictions. But I
0:49:08 think some of the best bills that were passed are bipartisan bills, like overhauling the criminal
0:49:15 justice system, to be kinder, to be less punitive, to be less racially biased, but also just give
0:49:21 individuals a second chance, no matter what their race is. And so I would try, I think, to build
0:49:26 broad-scale support. This is why I would never be elected, by the way, or chosen as the DNC
0:49:32 chairman, because I’m not like motivating the base enough. But I think big social movements pass.
0:49:40 Civil rights, women’s suffrage, right? Because there is bipartisan support. There’s understanding
0:49:45 on both sides. There’s allies that see, same with gay rights. It’s like John McCain’s daughter
0:49:50 is gay, and he could see the humanity in those folks, and then there’s this widespread support.
0:49:55 I think for moral progress to happen, I think it needs to be a more united effort than what we’re
0:50:04 seeing today. And as you said, and I misunderstood, quote, “Most conservatives generally want to protect
0:50:11 black men, and most liberals generally want to protect the police.” And right? So we’re probably
0:50:20 more similar than we are different, if we could just stop being outraged. Exactly. We are so
0:50:26 similar. If you take a law book and you flip to a random page, we probably agree on 99.99 percent
0:50:32 of those pages of law. And even when it comes to contentious issues, we generally do agree.
0:50:36 What pushes us apart is our partisan filters and what the media tells us. So if we just go
0:50:40 within ourselves and think about our own minds and our own concerns about harm and other people’s
0:50:45 concerns about harm, it brings us a lot closer together. All right, Kurt, thank you so much.
0:50:51 This has been such an interesting episode, and I learned a lot reading your book. I would love
0:51:00 to see all your theories and concepts put into practice so that we do have a more civil, CIV
0:51:08 in caps, society. So thank you very much, Kurt. I appreciate this very much. And you are a remarkable
0:51:15 person. So we will be in touch, okay? Sounds good and appreciated. All right. So I’m Guy Kawasaki.
0:51:22 This has been the Remarkable People Podcast with the Remarkable Kurt Gray. And I hope you learned
0:51:29 some very tactical and practical ways to bring society together. I certainly did. And remember
0:51:34 his book is called Outraged, even though I think it should be called Harmless.
0:51:41 But don’t worry about that because he had to get the book published. And if publishers
0:51:48 want to call it Outraged, so be it. And focus on the big issues, right? That's right. Outrage sells.
0:51:56 Outrage sells. All right. Thank you very much. Have a great week, everybody. Mahalo and aloha.
0:52:02 Thank you to my crew, Madisun Nuismer, Shannon Hernandez, and Jeff Sieh. That's the
0:52:08 Remarkable People team behind me, Kurt. And we’re all trying to help people be remarkable.
0:52:15 This is Remarkable People.

Step into the fascinating world of moral psychology with Kurt Gray, professor of psychology and neuroscience at UNC Chapel Hill, who explores the psychology of outrage and moral understanding. As director of the Deepest Beliefs Lab and the Center for the Science of Moral Understanding, Kurt unveils how we can bridge America’s deepest divides through his groundbreaking CIV approach – Connect, Invite, and Validate. His new book ‘Outraged’ challenges us to understand both sides of moral conflicts and find common ground in our shared humanity.

Guy Kawasaki is on a mission to make you remarkable. His Remarkable People podcast features interviews with remarkable people such as Jane Goodall, Marc Benioff, Woz, Kristi Yamaguchi, and Bob Cialdini. Every episode will make you more remarkable.

With his decades of experience in Silicon Valley as a Venture Capitalist and advisor to the top entrepreneurs in the world, Guy’s questions come from a place of curiosity and passion for technology, start-ups, entrepreneurship, and marketing. If you love society and culture, documentaries, and business podcasts, take a second to follow Remarkable People.

Listeners of the Remarkable People podcast will learn from some of the most successful people in the world with practical tips and inspiring stories that will help you be more remarkable.

Episodes of Remarkable People organized by topic: https://bit.ly/rptopology

Listen to Remarkable People here: **https://podcasts.apple.com/us/podcast/guy-kawasakis-remarkable-people/id1483081827**

Like this show? Please leave us a review — even one sentence helps! Consider including your Twitter handle so we can thank you personally!

Thank you for your support; it helps the show!

See Privacy Policy at https://art19.com/privacy and California Privacy Notice at https://art19.com/privacy#do-not-sell-my-info.
