Tania Israel: The Science of Political Unity

AI transcript
0:00:14 I’m Guy Kawasaki and this is Remarkable People and we’re on a mission to make you remarkable.
0:00:19 Today we have a very remarkable guest. Her name is Tania Israel, and she’s a professor
0:00:25 at UCSB and a very accomplished author and podcaster.
0:00:32 She has a new book out called Facing the Fracture, and I really enjoyed her book and it has already
0:00:38 changed some of my actions in terms of social media consumption and news consumption.
0:00:47 So without further ado, here’s Tania Israel.
0:00:49 Welcome to the show, Tania.
0:00:50 Thank you so much, Guy.
0:00:53 I’m delighted to be here.
0:00:58 We are delighted to have you and we can talk about eating ice cream in Santa Barbara.
0:00:59 What’s the name of that
0:01:01 great ice cream place down there?
0:01:09 Yes, I’ve stood in line many an hour at McConnell’s to get ice cream.
0:01:15 Let’s get into the topic because you’re writing about bubbles and polarization is so important.
0:01:18 So first of all, let’s get a basis here.
0:01:24 In reality, just how polarized is American society today?
0:01:27 There’s a couple of different ways of thinking about it.
0:01:34 One is that we are more polarized than we have ever been in recent history and that
0:01:36 is true.
0:01:43 We are slightly farther apart in our opinions than we ever have been, but that ideological
0:01:51 polarization, that difference between our opinions is only one aspect of how we’re divided.
0:01:57 But the thing that’s really most corrosive right now is what we call affective polarization,
0:02:02 which is the animosity and the hostility and the lack of trust that we’re feeling toward
0:02:05 people who are on the other side of the political spectrum.
0:02:09 And that has increased dramatically in recent years.
0:02:14 So that’s actually, I would say, the bigger problem right now, rather than the difference
0:02:20 in opinions. In a democracy, you would expect that we would have some variety of views.
0:02:26 So are you saying if I’m driving around my neighborhood and I interact with a random
0:02:32 person historically, are we farther apart than ever and more polarized than ever?
0:02:38 Or if we were to just sit down and not try to frame ourselves as liberal or conservative
0:02:43 or right or left, we would find out that we pretty much all want to make the world a
0:02:49 better place for our kids and we want to enjoy our life and accomplish things and have fun.
0:02:56 And that’s true, whether you’re right, left, purple, green, blue or red, what’s the reality
0:02:57 here?
0:03:04 The reality is that we actually have really distorted perceptions of people who are on
0:03:06 the other side of the political spectrum.
0:03:11 So we think that other people are more extreme than they actually are.
0:03:13 We think that they’re less informed.
0:03:15 We think that they’re irrational.
0:03:18 We might think that they are hostile.
0:03:21 And in reality, those perceptions are pretty distorted.
0:03:28 It’s what we call the perception gap between what the reality is of these three-dimensional
0:03:35 human beings who are out there in the world versus these very narrow stereotypes that we
0:03:36 have in our minds.
0:03:41 And pray tell, how did we get to this situation?
0:03:45 There are a number of contributors to it, but there are some things that have really
0:03:47 amped it up over time.
0:03:53 But I’m going to start quite a few years back, when humans were evolving.
0:04:00 In human evolution, we lived in small groups of people or tribes, and it was really important
0:04:01 that we did that.
0:04:08 We couldn’t survive on our own, so we were protecting ourselves and each other from not
0:04:14 only the elements and saber-toothed tigers, but also from the other tribes who were in
0:04:17 competition with us for scarce resources.
0:04:23 So it was really a life and death kind of situation where it was important that we protected
0:04:29 ourselves and that we could identify those other tribes and stay away from them.
0:04:34 And that’s when our brains developed, and we are overlaying these old brains onto our
0:04:37 current political and social situation.
0:04:43 That is just some of the underlying mechanics of what’s going on in our brains.
0:04:47 You add to that things like the media and the news.
0:04:52 So the way the news is covered now, there’s 24-hour news channels.
0:04:58 In order to have enough to cover in 24 hours and to keep us paying attention, it’s so
0:05:04 important to keep us emotionally activated so everything is urgent and breaking.
0:05:09 And they’ve got to pose the other side as being a real danger so that we’re really
0:05:13 paying attention to the things that we think are so problematic.
0:05:18 So the media is focusing on the extremity and the hostility.
0:05:24 And then social media on top of that really spreads the vitriol so effectively.
0:05:30 And I’m so curious to hear how you are now using social media differently based on what
0:05:31 you read.
0:05:37 I have basically given up on social media.
0:05:42 My attitude was always very pragmatic and utilitarian in the sense that I wanted to
0:05:45 use social media as a means to an end.
0:05:53 So I wanted to use it as a promotional platform, a marketing platform, an evangelizing platform.
0:05:57 I never wanted to get my news from social media.
0:06:02 And I’m happily married, I have four kids, I got Madison, I got lots of close friends.
0:06:06 I’m not looking to get more friends or anything.
0:06:12 So I’ve had a weird attitude toward social media my whole life, and my use of it is getting
0:06:13 more and more reduced.
0:06:19 I tend to push as opposed to pull social media.
0:06:21 How do you use social media now?
0:06:24 I do have to pay attention to it somewhat because I talk about it.
0:06:28 And so I need to have a sense of what’s going on.
0:06:32 That’s causing me to look at it more than I would typically want to.
0:06:38 My favorite use of social media is to stay connected to friends and family, to know what’s
0:06:39 going on with people.
0:06:45 There are friends from elementary school who I would not be connected with at this point
0:06:47 if it weren’t for social media.
0:06:52 And I love seeing what’s going on in people’s lives when people are posting pictures of their
0:06:54 kids wearing Halloween costumes.
0:06:55 That’s wonderful.
0:07:02 I do enjoy seeing the Swifties on TikTok; they are so much fun.
0:07:04 So that’s always great.
0:07:11 But what I try to do is not be sucked into it and not be on it as a default.
0:07:15 And that’s one of the things I think people are sometimes just spending so much time on
0:07:23 it and interacting primarily with social media accounts rather than with full human beings.
0:07:28 And so one of the things I’ve done with my phone is I just created a little charging station
0:07:29 near my front door.
0:07:35 And so I’ll just plug in my phone there and not carry it around with me when I’m at home.
0:07:38 And it’s not counting my steps when I do laundry then.
0:07:42 But I realize, well, that’s OK because I’m still doing the steps even if they’re not
0:07:43 being counted.
0:07:47 So we’ve come to rely on our phones for so many things. If you want to know what the
0:07:48 weather is like,
0:07:52 you look at your phone instead of stepping outside. But there are things that we can
0:07:55 do to just change our relationship with it.
0:07:57 We don’t have to throw our phones away.
0:08:01 We don’t have to completely go off of social media.
0:08:06 But actually, many people have changed their relationships with social media and with their
0:08:11 phones to try to reduce some of the polarization and conflict that they’re feeling.
0:08:17 So you’re not alone in making some of those shifts and engaging with other human beings
0:08:19 in different ways.
0:08:26 Do you think that social media can truly change a person’s mind?
0:08:30 There’s this worst-case scenario: I used to think one way,
0:08:35 then I went on social media, and now I’ve been completely reversed, and oops, what reversed
0:08:38 me was a Russian bot.
0:08:42 Are the Russian bots talking to the people who already believe what they say?
0:08:47 So it’s just a big liberal echo chamber and a big conservative echo chamber, but we’re
0:08:50 not really changing anybody’s minds.
0:08:55 We sometimes think that when we see somebody post something and we’re like, “Oh, no, no,
0:08:56 no, that’s wrong.
0:09:01 If they only saw this study, if they only heard things framed in this way and recognized
0:09:05 the hypocrisy of that, then they would do a complete 180.”
0:09:07 And that’s just not going to happen.
0:09:13 All the research shows that not only are you not going to change somebody’s mind by countering
0:09:19 them on social media, but you’re actually more likely to push them farther away and farther
0:09:20 into the extremes.
0:09:24 So it’s really not so much about changing minds.
0:09:32 It might get people more committed to their views and not even to their views about an
0:09:38 issue, but what we see even more and what’s more likely to go viral is when people post
0:09:42 about the people on the other side of the issue.
0:09:48 And this is where I really see social media as problematic is that it’s increasing that
0:09:54 affective polarization, which is really creating so many divisions in families and communities
0:09:58 and just in our democracy as a whole.
0:10:04 Personally, people do not want to live near, work with or have a family member marry somebody
0:10:06 who’s in a different political party.
0:10:08 So let me give you a hypothetical.
0:10:14 So let’s suppose you’re on social media and somebody posts a comment to your post and
0:10:21 you are, let’s say you’re advocating for LGBTQ+ rights and somebody says, “I believe, according
0:10:28 to the Bible, that being a lesbian or transgender is absolutely evil and we need to fix these
0:10:31 people, we need to retrain them, we need to redo them.”
0:10:35 And I mean, something just like a totally heinous comment.
0:10:36 What do you do?
0:10:41 Me, I just block them, I don’t even think twice, I just move on. But what do you do,
0:10:42 do you try to engage?
0:10:48 Do you try to change their mind or do you think, why am I trying to argue with a Russian bot?
0:10:49 Yeah.
0:10:54 First of all, I’m not so likely to engage with them on social media, but if it’s somebody
0:11:01 who I know, then I’m more likely to pick up the phone and call them and say, “Hey, I saw
0:11:06 you posted this thing, I’m interested to talk a little bit more about that and I’m curious
0:11:08 to hear more about where you’re coming from.”
0:11:11 I tend to think the only thing you can post on social media that’s helpful is
0:11:17 to say, “Thanks for sharing that, I’m interested to hear more, let’s have a conversation.”
0:11:22 But actually trying to counter somebody on social media, that just is more likely to
0:11:24 get into more conflict.
0:11:27 I really like your work with bubbles.
0:11:30 So let’s get into bubbles for a little bit.
0:11:35 So first of all, how many bubbles does a person live in?
0:11:41 Do you live in multiple bubbles or are you in one bubble that just dominates your life?
0:11:46 I think people have different levels of bubble-ness in their lives.
0:11:51 Certainly what I was talking about with the tribal nature of politics, that political
0:11:57 identities have become so much more bubble-like and ironclad bubble-like, but really only
0:12:01 for people who identify with a political party.
0:12:07 There’s some really interesting work where More in Common has looked at where people are
0:12:12 on the political spectrum and there are people who are more extreme on the left, more extreme
0:12:13 on the right.
0:12:18 There are traditional Democrats and Republicans, but most people are in what they call the
0:12:20 exhausted majority.
0:12:24 People in the exhausted majority are not only not connected to a political party, they don’t
0:12:27 even want to talk to people who are.
0:12:32 They’re so tired of the vitriol, and the sad thing is that they’re really disengaging from
0:12:34 our democracy and from voting.
0:12:41 They are not in that kind of partisan bubble that we see more dedicated Democrats and Republicans
0:12:48 in, but it really depends on what your identities are and how strongly committed you are to
0:12:53 an identity and an identity in contrast to other identities.
0:12:56 You might be a member of the LGBTQ community.
0:12:58 You might have an ethnic identity.
0:13:04 There are a lot of different things that can shape how you see yourself, but it’s really
0:13:11 a question of whether or not you are seeing that as being against or in conflict or contrast
0:13:15 with another group or identity.
0:13:20 What do people do when they are in bubbles that conflict?
0:13:27 To take a real tactical example for me, I’m obviously in a very liberal Democratic bubble.
0:13:29 You and I are in the same bubble.
0:13:36 I go surfing, and I know I’m surfing with people who are anti-vax, quasi-QAnon, and
0:13:38 I enjoy surfing with them.
0:13:44 I have a surfing bubble that conflicts with my liberal Democrat bubble.
0:13:45 What am I supposed to do with that?
0:13:52 If you continue the bubble metaphor, should I be trying to combine two bubbles or make
0:13:57 one bigger bubble or burst one bubble or what’s the metaphor?
0:13:58 Sure.
0:14:02 It’s great that you have different ways of connecting with people because you’re able
0:14:08 to see the things that you have in common with folks, even who disagree with you very
0:14:10 strongly on some issues.
0:14:13 I think that’s really what we want to do.
0:14:16 Human beings are complicated.
0:14:22 We have more than one dimension, and often we’re reacting to people based only on one
0:14:27 dimension or not even a whole dimension based on a hat or a bumper sticker.
0:14:32 Being able to connect with people in whatever way we can, I think is a really positive thing
0:14:40 and allows us to get out of some of those stereotypes that we have of people who are
0:14:42 different from us in one particular way.
0:14:46 I think really what we want is to get beyond our bubbles. That’s the name of my first
0:14:51 book, Beyond Your Bubble, so I think that the bubbles actually do not serve us.
0:15:01 In your book, you go very deep into news and what the effect news has had and how to treat
0:15:03 news and read news.
0:15:11 Can you just explain to us the role of news in our bubbles and in creating these divides
0:15:12 between people?
0:15:14 What does news do to us?
0:15:15 Sure.
0:15:22 Like I said, news is showing these more extremist and violent and hostile representations of
0:15:27 political parties and people who are in the different political parties.
0:15:31 And we’re getting these narrow views, but also these skewed perspectives.
0:15:37 And journalists actually have skewed perspectives about the American public, thinking that the
0:15:41 American public is more extreme and more hostile than we actually are.
0:15:44 And so they’re sharing that back with us.
0:15:48 And I understand that journalists, they’re trying to tell a story and that’s a more interesting
0:15:53 story than interviewing people who say, “Oh, yeah, you know, I could kind of see things
0:15:55 both ways.”
0:15:57 That doesn’t make as good a story.
0:16:05 And so we have to recognize that when we’re watching the news, we’re watching a narrative.
0:16:10 They’re sharing a story with us and the information that supports that story.
0:16:15 Other people are choosing a different narrative that has different information to support
0:16:18 that, but it ties into our cognitive biases.
0:16:25 So we really gobble up these stories and these stories that sort of frame us as being informed
0:16:32 and correct and reasonable and caring and frame the other side as all the opposite of
0:16:33 that.
0:16:38 So it’s really this interaction of our minds and the media that create this situation.
0:16:44 And I know people who basically have the news on constantly in the background.
0:16:47 And that’s just not healthy for us.
0:16:52 It’s keeping us emotionally activated and it’s also skewing our perceptions of people
0:16:53 on the other side.
0:17:00 I think the best thing that we can do is stay informed, tune into news so that you can get
0:17:04 the information that you need to make your decisions, to know what’s going
0:17:08 on in your community and the country and the world.
0:17:15 But then stop and go about the rest of your day and don’t have that as a constant feed.
0:17:20 Most things are not urgent and breaking and most things can wait until the next day for
0:17:22 us to hear about them.
0:17:30 So to get really tactical, how many minutes a day should I spend reading news or watching
0:17:32 the news?
0:17:37 That is such a great question for you to ask yourself.
0:17:45 To really think about what is it that I need to get out of this and how do I do that?
0:17:49 And actually thinking about it in terms of number of minutes is a great idea.
0:17:55 Maybe you say, “Okay, I’m going to go look at the news because I want to find out what’s
0:17:58 going on at climate week.
0:18:00 And so I want to learn about that.”
0:18:03 And I think that that’s probably eight to ten minutes.
0:18:08 I’m going to get what I need to know about that and maybe set a timer for yourself and
0:18:11 go find out about that and then stop.
0:18:16 So I don’t think everybody’s going to have the same answer to that.
0:18:22 I think even more important is for each of us to be really intentional about how we are
0:18:32 engaging with news and social media.
0:18:40 I want to know, let’s suppose for a second that there is actually something that’s independent
0:18:48 of bubbles and independent of perception and there’s actually “the truth,” right?
0:18:56 So now, how do I separate someone who’s reporting the truth versus someone who has put a bubble
0:18:59 or a bias on top of that?
0:19:01 I’ll give you a tactical example.
0:19:06 I subscribe to Heather Cox Richardson’s newsletter.
0:19:13 And in my mind, she is analyzing and reporting the facts.
0:19:17 She’s not trying to bias one way or the other.
0:19:22 But I can tell you that someone who’s Republican or conservative would look at Heather Cox
0:19:25 Richardson’s writings and say, “She’s completely biased.
0:19:31 She’s a liberal Democrat, professor, academic, whatever.”
0:19:37 So how do I separate what’s true and what’s biased?
0:19:41 The challenge is that things can be both true and biased.
0:19:46 I think sometimes we’re thinking that the biggest problem out there is misinformation,
0:19:52 is deliberately wrong, things that aren’t factual, where people are trying to sway us
0:19:56 in certain ways by putting out falsehoods.
0:20:00 We’re actually exposed to genuine misinformation very little.
0:20:03 That’s very little of what we are consuming.
0:20:10 What’s a much bigger problem is that we’re each consuming a small slice of information.
0:20:12 And that has a bias.
0:18:18 There are great resources out there. AllSides.com is a wonderful website where they’ve
0:18:23 identified the bias of these different sources, and they’ll take a particular
0:18:28 story so you can see how it’s being covered from different news sources.
0:20:35 And I find that tremendously helpful because I think that this question of what’s outside
0:20:40 of my awareness, like what am I missing, is a really important question.
0:20:44 So that’s something I think we could talk about a little bit more.
0:20:50 But first, I want to give an example about how things can be factual, but at the same
0:20:51 time biased.
0:20:56 In the fall of 2021, we had vaccines available.
0:20:58 People were getting vaccinated.
0:21:04 But the story that was out there in the media, almost exclusively about vaccination rates,
0:21:08 was about the difference between Democrats and Republicans.
0:21:13 So what most people knew about that is that Democrats were getting vaccinated at a much
0:21:18 higher rate, around 90%, compared to Republicans, who were getting vaccinated at about 61%.
0:21:23 So that’s the story that really stuck for everybody.
0:21:29 What’s interesting is that at least by January 2022, looking at the CDC website, which I
0:21:36 did because I wrote a piece on this, 86% of Americans had gotten at least one dose of
0:21:40 the COVID vaccine, and I was like, “What?
0:21:44 This is not at all the story that’s being told.
0:21:47 The story that’s being told is about a conflict.
0:21:48 It’s about the differences.
0:21:50 It’s about anti-vaxxers.”
0:21:55 A very small percentage of the people who weren’t getting vaccinated at all were anti-vaxxers.
0:21:58 But that was so much what the story was about.
0:22:05 So people were really solidifying their identities together with their choices about getting
0:22:06 vaccinated.
0:22:14 And I thought, “Wow, what if the media’s story was 86% of Americans got at least one
0:22:18 dose of the COVID vaccine in a time of crisis?
0:22:19 We came together.
0:22:27 We largely made the same choices to protect our health and our communities and our families.”
0:22:34 And that story could have made a big difference in the pandemic if people didn’t feel like
0:22:39 they needed to not get vaccinated in order to be consistent with what they thought other
0:22:41 people in their party were doing.
0:22:47 So the media really contributed to the divisiveness there, not by telling falsehoods.
0:22:50 Everything that they said about vaccination rates, that was accurate.
0:22:51 That was true.
0:22:58 But there was another story that could be told with also accurate information that’s a very
0:23:01 different kind of narrative.
0:23:09 How come some marketing genius at some of these publications, or for that matter, political
0:23:15 consultants, how come they don’t say, “There’s this thing called the exhausted majority and
0:23:20 we’re not going to change the opposition’s mind and they’re not going to change our mind,
0:23:22 but there’s this group in the middle.
0:23:31 So let’s focus on the exhausted majority as a journalistic site or as a political party.”
0:23:36 But I find it very difficult to imagine that there’s going to be efforts targeted towards
0:23:38 the exhausted majority.
0:23:40 Are they just such lousy marketers that they
0:23:44 cannot even conceive of serving that market?
0:23:50 The exhausted majority watches, well, they don’t pay as much attention to news and politics
0:23:52 as people on either end do.
0:23:57 But when they do watch the news, they’re watching network news, which like we used to do in
0:23:58 the olden days.
0:24:05 They’re watching CBS, NBC, ABC, which you might know doesn’t show news 24 hours a day.
0:24:11 It shows news in a small period of time and you can get the basic information about that.
0:24:16 They don’t spend all of their time with commentary, with opinion and shaping that.
0:24:21 So the exhausted majority is much more likely to watch network news than to watch cable news,
0:24:23 which tends to focus on a bias.
0:24:28 They tend to craft a narrative that appeals more to the extremes.
0:24:37 You bring up ABC, NBC and CBS and I can tell you that I cannot remember the last time I
0:24:43 tuned into any of those three networks; like, literally, who watches live TV anymore?
0:24:51 I may be a proof point that I get news from YouTube and I’ve subscribed to certain YouTube
0:24:52 channels.
0:24:56 I’ve not watched network news in decades at this point.
0:24:57 Yeah.
0:25:02 And I think that people in bubbles don’t tend to watch that.
0:25:08 We’re more likely to get our news from these more frankly biased sources.
0:25:12 And I don’t think that’s the worst thing in the world.
0:25:21 I still do that, but I also am aware of the narrative that I am being exposed to.
0:25:27 And I’ll also go seek out other narratives or at least see what’s outside of that awareness
0:25:33 so that I don’t, you know, the thing that we love about our news sources is that they
0:25:38 keep telling us that we’re right, that we’re right in the way we’re thinking about something
0:25:42 and we’re right in the way we’re thinking about the people on the other side.
0:25:43 We love that.
0:25:45 We’re so attracted to that.
0:25:50 And so it’s so important to recognize the ways that we might be right, but there might
0:25:57 be another way of framing this or another way of viewing it so that we don’t then demonize
0:26:02 people who aren’t viewing things exactly the same way that we are.
0:26:09 I want to thank you for bringing the site AllSides to my attention. Until I read your
0:26:10 book,
0:26:13 I didn’t have any idea that it even existed.
0:26:16 And I went to it and I found it very useful.
0:26:21 In fact, I have already bookmarked it, which doesn’t sound like a big deal, but if you
0:26:24 knew me, that is a big deal.
0:26:30 And as an illustrative story for you listeners, there’s this story going around that because
0:26:37 of the anti-abortion laws in Georgia, two women have died because the doctors were afraid
0:26:41 to give them treatment because they might have got arrested because of the anti-abortion
0:26:42 laws.
0:26:50 And AllSides showed the other side of the actual situation of those women.
0:26:57 And it’s not exactly as simple as it’s because of anti-abortion laws that they died.
0:26:59 It’s much more complex than that.
0:27:03 And I looked at that, I said, I had no idea that that’s true.
0:27:06 So I think it’s a very valuable site.
0:27:07 Oh, that’s great.
0:27:09 I’m so glad.
0:27:10 Yeah.
0:27:17 I even downloaded the app, which also takes an act of God for me to do that.
0:27:24 So in this modern world, do you think that AI can play a role in providing us better
0:27:25 news?
0:27:32 Because theoretically, AI has no bias or no emotional attachment to one side or another.
0:27:35 So do you see a role for AI in this?
0:27:41 AI only has the bias of whatever material it’s drawing on, right?
0:27:43 So that doesn’t mean there’s no bias.
0:27:49 It just means that it’s drawing on, I would say, maybe multiple different viewpoints and
0:27:50 narratives.
0:27:52 I’m not an expert in AI.
0:27:58 I use it in a very limited fashion for things like brainstorming titles for articles and
0:27:59 things.
0:28:01 It’s good for some creative things coming up with that.
0:28:07 I am not an early adopter, so I’m going to wait a little while to see what the robots
0:28:08 come up with.
0:28:11 Tania, I’ll give you a little thing you can try.
0:28:21 Go to ChatGPT and ask the question, “Should we teach American youth the history of slavery
0:28:22 in America?”
0:28:28 And you read what ChatGPT says, and I think you will see that it’s quite reasonable
0:28:29 and sentient.
0:28:34 It was eye-opening for me to do that, so try that.
0:28:38 And then you might have a slightly different opinion of AI.
0:28:40 Okay, well, I appreciate that.
0:28:44 I do ask it sometimes because there are things that I’m like, “Yeah, I don’t actually know
0:28:47 what the other side would say about this.”
0:28:49 So I’ll say, “What are the arguments for this thing?”
0:28:55 And it can share some of that, so I think it can help to round that out, just as AllSides
0:28:59 and other kinds of resources give us ways of gathering that information.
0:29:03 I think it’s great when we can use AI as a tool for that as well.
0:29:10 So I would say that, myself included, we are so deep into our bias that we don’t even
0:29:12 know we’re biased.
0:29:16 How do we recognize our cognitive bias?
0:29:23 I think that the first thing is just to recognize that cognitive biases are a very human thing.
0:29:30 And it’s not something where we need to deny that we have some prejudice, that we have
0:29:32 some narrowing of views.
0:29:36 But I think it’s useful to know about three cognitive biases in particular; knowing
0:29:39 about them might help us recognize them.
0:29:45 The first is confirmation bias, which leads us to focus on information that supports what
0:29:50 we already believe to be true and ignore or dismiss information that conflicts with our
0:29:51 existing beliefs.
0:29:52 So that’s what’s going to make us feel like,
0:29:58 “I’m right and informed, and you are wrong and ignorant.”
0:30:04 There’s also one called naive realism, where we think that we are basing our decisions
0:30:11 on logic and rationality, and we think that the other side is illogical and they are biased.
0:30:18 And then the third, and I think possibly the most corrosive, is motive attribution asymmetry,
0:30:25 which is where we see ourselves on our own side as being motivated by caring and love.
0:30:30 And we see the other side as motivated by aggression and hate.
0:30:37 And we see this so much in terms of how we believe that the other side is motivated.
0:30:42 They are trying to take away my rights, they’re trying to restrict people’s behavior.
0:30:48 And both sides think that, but we also all think that our own side and our own reasons
0:30:52 are, “Oh, you know, but we care so much.”
0:30:59 And recognizing that in ourselves, I think, can be really helpful just in terms of not
0:31:06 seeing the other side in all these negative ways, but also not elevating our own motives
0:31:10 and our own views to that level of self-righteousness that we sometimes hear.
0:31:16 So do you have really tactical and practical things that we can do to control these three
0:31:17 things?
0:31:22 I think one of the things is in our relationship with not just social media, but when we hear
0:31:27 ourselves and other people spouting opinions about people on the other side, I think we
0:31:32 can not spread that, and we can correct for that some.
0:31:38 So that if we’re seeing this vitriol out there on social media, don’t share it, don’t engage
0:31:40 with it.
0:31:45 And if you’re wondering, “Gosh, what could their motivation be other than trying to
0:31:50 destroy me and my side,” then that’s something that you can go look for, and that’s maybe
0:31:54 that’s where AI can be helpful just in giving some perspective.
0:32:00 Or if you’ve got people in your life who have a different perspective, ask them to fill
0:32:01 that in a little bit.
0:32:06 Say, “I’m trying to figure out what could motivate people to do this, and can you help
0:32:08 me to understand where you’re coming from?”
0:32:09 Okay.
0:32:17 That $64 million question I want to ask you is, “How do people increase empathy for the
0:32:18 other side?”
0:32:21 Yeah, such a good question.
0:32:28 There’s this fascinating study about empathy where they shared with participants this scenario.
0:32:32 There’s a speaker on campus, and they’re pretty inflammatory.
0:32:38 They say very hostile things about the other political party, and this room is packed full
0:32:39 of people.
0:32:43 There’s all these people attending, but then there’s also all these protesters.
0:32:49 The protesters get really riled up, and they manage to shut down the speaker, but not before
0:32:54 one of the protesters accidentally whacked one of the attendees in the head with a sign.
0:32:58 When I share this on college campuses, people are like, “Were you here last week when this
0:32:59 happened on our campus?”
0:33:04 This is not an unfamiliar kind of scenario, but then they ask people, “What do you think
0:33:05 about this?
0:33:07 How do you feel about the speaker being shut down?
0:33:13 Is it good because they were spouting hate speech, or is it bad because it’s a violation
0:33:14 of free speech?
0:33:17 What do you think about this person who got hit in the head with a sign?
0:33:22 Do you feel like, well, they deserved it because they were part of this following, or are you
0:33:23 worried?
0:33:24 Are you calling 911?
0:33:26 What are you doing there?”
0:33:32 It turns out that, obviously, it makes a difference whether you agree with the speaker or not.
0:33:38 If you’re on their side or not, then you might have a difference of view about those things,
0:33:47 but that only makes a difference if you also are an empathic person, because empathic people
0:33:52 are actually much more biased in a polarized situation like this.
0:33:59 If you are empathic, you really want to censor and shut down the speaker from the other side.
0:34:04 If you are empathic, then you might not care that somebody got hit in the head with a sign
0:34:06 if they’re on the other side.
0:34:13 It’s this weird thing where empathic people are super empathic toward people on their
0:34:19 own side, but seem to show very little concern for those on the other side.
0:34:25 Part of that is because they’re seeing those people on the other side as doing harm to
0:34:28 the people who they’re trying to protect.
0:34:35 When we’re really empathic, we’re feeling very, very protective, but only of our own side.
0:34:36 What do we do about that?
0:34:39 Well, the first thing is to recognize when we’re doing that, to recognize that if we’re
0:34:47 empathic people and we’re caring so much that we might not be caring very broadly.
0:34:53 To try to correct for that in ways, I offer some tools.
0:34:58 There’s a guided imagery that’s in my book, but you can also find it on my website,
0:35:03 taniaisrael.com, that’s about cultivating compassion for people who are on the other
0:35:07 side of the political spectrum.
0:35:13 It’s based on a loving-kindness meditation that I learned from Sharon Salzberg’s recordings,
0:35:19 but I’ve adapted it to really focus on how do we open our hearts to people who are in
0:35:22 a different place than we are politically?
0:35:28 If I caught that right, in my mind, intellectually at least, I would have said if you are an
0:35:33 empathic person, it means that you have empathy for both sides.
0:35:39 But did you not just say that you can be empathic only for your side?
0:35:46 So isn’t that a contradiction in terms if you say I’m an empathetic person, but I can
0:35:49 only be empathetic for one side?
0:35:50 Isn’t that an oxymoron?
0:35:59 I think it just speaks to the fact that empathy alone isn’t enough. If we have that connection
0:36:06 to one side or the other, then we’re going to have to be really careful to not narrow our
0:36:08 empathy in that way.
0:36:15 I think about pairing empathy with equanimity, which is trying to treat all beings in a similar
0:36:17 way.
0:36:24 I think that when we are really leaning into how caring we are and how much concern
0:36:29 we have about people’s well-being, we might just check ourselves.
0:36:36 And notice if we’re only feeling that toward a certain group of people who are like us
0:36:41 or who we see ourselves as trying to protect, and that maybe there’s some empathy that we
0:36:46 can extend toward people who are in disagreement with us.
0:36:58 In your position on the faculty of an academic institution, are you saying that if UCSB invited
0:37:07 a grand wizard from the Ku Klux Klan, or someone from the American Nazi Party, or somebody who’s anti-LGBTQ+
0:37:13 as a speaker, you would support that in the spirit of learning about each other’s sides
0:37:17 and empathy for each other’s sides and fostering communication?
0:37:24 Or is there a line beyond which you just will not go?
0:37:30 What I would primarily support is building a foundation on our campuses of curiosity
0:37:32 and connection.
0:37:39 College campuses are about learning, so what can we be learning, and what can
0:37:46 we be teaching, that will prepare our students? One of the
0:37:51 things we’re trying to do is to protect vulnerable populations from harm.
0:37:56 And I think that that’s a very important thing that we’re trying to do to address historical
0:37:57 injustices.
0:38:03 But we also are trying to create a culture of inquiry and discovery.
0:38:09 We’re trying to prepare students to be future citizens of the world and of our democracy.
0:38:13 And we’re also trying to promote wellness, which includes resilience.
0:38:18 So if you’re trying to do all of those things together, what you want to do is create a
0:38:20 sense of community.
0:38:26 You want to help people to learn those kinds of skills in terms of how we’re dealing with
0:38:30 media and social media, and to be able to reflect on that critically.
0:38:35 We want to be able to recognize our cognitive biases, and we want to be able to interact
0:38:38 with people who have different views than we do.
0:38:45 So if we do that, then when a controversial speaker comes to campus, we’ve got a community
0:38:53 that can hold those conversations with each other around that so that it’s not all about
0:38:57 how we’re interacting with each other in a heated moment.
0:39:02 But it’s really about that foundation that we’ve already laid so that we know how to
0:39:05 cope with polarizing situations.
0:39:11 So what do you think when… let’s say that you’re a billionaire, and you’re a large
0:39:18 donor to a private institution, and you learn that that institution has invited someone
0:39:24 from the PLO or Palestine or something to come speak on your campus, and then you make
0:39:28 a big deal that, “I’m not going to donate $100 million to you anymore because you brought
0:39:30 someone from Palestine to speak.”
0:39:34 What would be your attitude when you hear something like that?
0:39:42 I would probably want to educate our donors about what we’re doing on the campus more
0:39:51 broadly and how we are trying to build these conversations with people, among the students
0:39:56 about various issues, and try to frame things within a certain perspective.
0:40:01 And if I can’t do that, if I can’t justify it in some way, then that reminds me that
0:40:04 I need to think about these things more.
0:40:10 But if you look at, for example, PEN America: they have some really fantastic guidelines about
0:40:13 how to deal with free speech on college campuses.
0:40:18 And I tend to lean toward where they’re coming from about how do we promote free speech,
0:40:24 and how do we maybe counter conflictual speech with just more free speech?
0:40:29 So rather than shutting something down, how do we bring in more conversation about it?
0:40:35 And one of the challenges right now is that we’re sometimes really shying away from addressing
0:40:39 an issue because it’s so conflictual and because people are so worried about being
0:40:42 canceled, about putting a view out there.
0:40:49 Certainly all the conflict on campuses around Israel-Gaza has really shown us where some
0:40:54 of the cracks are in terms of our ability to deal with conflict.
0:40:59 And so I think it just speaks to how much more we need to be doing.
0:41:08 And is there any concern that empathy for the other side may legitimize the other side
0:41:12 and you had no intention of doing that?
0:41:18 Is there a danger that empathy toward the opposition can legitimize the opposition,
0:41:21 or it’s just an unintended consequence?
0:41:26 I think we need to make a distinction between public statements, like what we’re putting
0:41:32 out there publicly, between what we’re doing in conversation with other human beings and
0:41:34 what we’re doing internally.
0:41:40 I think cultivating empathy, taking on different perspectives, doing all these things on an
0:41:45 individual level will certainly help to guide us on those other levels.
0:41:48 But it’s not always the same thing.
0:41:52 People sometimes think that when they’re in dialogue with another person, it should look
0:41:53 like a debate.
0:41:58 It should look like a sort of public thing where they are pulling out all of these slogans
0:42:01 and stats and framing things in certain ways.
0:42:08 You know, in a debate, we never think that the other team is going to hear what we have
0:42:12 to say and go, “Oh my gosh, I never thought about things that way.”
0:42:16 And they’re going to change their minds and come and stand behind our podium.
0:42:21 We don’t think that’s going to happen because debate is a public kind of venue.
0:42:24 It’s not about trying to convince the other team.
0:42:29 It’s about trying to convince an audience, the judge, the voters, the people attending.
0:42:36 So we sometimes think about everything that we’re doing in that kind of public context.
0:42:40 But what I would encourage us to do is start at the individual level.
0:42:43 Let’s start thinking about what do we want to consume?
0:42:48 What do we want to build within ourselves in terms of resilience and intellectual humility
0:42:50 and compassion?
0:42:55 And then that can guide us as we think about engaging with other human beings.
0:43:00 But I don’t think we should think about everything that we do as a public message.
0:43:02 That’s one of the problems with social media.
0:43:08 It’s all about hot takes and putting your views out there in this very one-directional
0:43:09 way.
0:43:16 And we need to get better at that engagement with other humans where there might be some
0:43:19 difference. We might rub up against each other in a way.
0:43:22 And how do we actually navigate that?
0:43:26 We’re not necessarily doing that as much and teaching that as much.
0:43:28 I think social media has something to do with it.
0:43:35 But I also think COVID really separated us out in ways that we need to work on coming
0:43:37 back together.
0:43:39 Up next on Remarkable People.
0:43:44 So it’s so important that if we are trying to shift somebody out of that, that we make
0:43:50 space for them to lean into where we are and to embrace and welcome them into that.
0:43:55 So rather than isolating people, think about, okay, I want to hear more about that.
0:43:56 I’m going to respect that.
0:44:01 I’m going to reflect that back so that that person feels like I understand them.
0:44:02 I care about them.
0:44:04 I connect with them.
0:44:33 And it might not shift their view.
0:44:40 Welcome back to Remarkable People with Guy Kawasaki.
0:44:49 So let me ask you very directly, what are your tips to engage across a divide between
0:44:54 two people or two parties or two perspectives?
0:44:55 Absolutely.
0:45:01 So the first thing that we need to do is to approach things with a genuine curiosity and
0:45:05 desire to know where someone else is coming from.
0:45:11 When I ask people what is it that makes you want to connect with somebody across the divide,
0:45:16 they say often there’s somebody in my life who I want to maintain a connection with,
0:45:19 but we’re having trouble doing that because of our different views.
0:45:22 Some people will say they want to persuade or convince someone else.
0:45:25 Some people want to heal the divide or find common ground.
0:45:30 And some people say I simply cannot fathom how people can think or act or vote as they
0:45:31 do.
0:45:37 No matter which of those things motivates you, the thing that you need to do to accomplish
0:45:44 that goal is to create a warm, caring connection with somebody where you are trying to understand
0:45:45 where they’re coming from.
0:45:50 We often think that what we should do is to share with somebody else where we’re coming
0:45:53 from and get them to understand us.
0:46:00 But whether you’re trying to repair a rupture in your relationship or you’re trying to persuade
0:46:03 somebody, you really need to understand where they’re coming from.
0:46:10 And you need to be respectful of them because otherwise there’s no way that they’re even
0:46:14 going to want to engage with you or care what you have to say or find common ground or even
0:46:18 share with you their deeply held views and values.
0:46:20 Let me also say more about how we do that.
0:46:24 The first thing that we can do is give somebody uninterrupted time to speak.
0:46:27 We really need to listen to other people.
0:46:33 And as we’re listening, rather than in our minds thinking about what’s the thing that
0:46:38 we’re going to say to counter that, we should think, “Okay, what is it that they’re telling
0:46:39 me?”
0:46:43 And to try to really understand and then to share back with them, to reflect back, “Here’s
0:46:44 what I hear you saying.
0:46:46 Did I get that right?”
0:46:51 And make sure that we’re really understanding them, and then encourage them to say more rather
0:46:56 than shutting them down, by saying, “I would love to hear more about how you came to that
0:47:01 view,” or, “Tell me about what that experience meant to you.”
0:47:05 So when we ask people about their meanings and their experiences, rather than saying,
0:47:11 “Where did you get that information?” rather than challenging them, it draws them out more
0:47:15 and creates more of that connection and helps us understand better.
0:47:21 And then when we do share our own views, it’s so much better to share our stories rather
0:47:23 than sharing those stats and slogans.
0:47:30 So if we talk about how we formed our views, was there an experience or a person who really
0:47:33 shifted the way we see things about that?
0:47:40 All of those things are much better to share than research studies and information or the
0:47:42 latest op-ed that we saw.
0:47:49 So it’s really a very human interaction where we’re trying to share and hear in really nuanced
0:47:51 and multi-dimensional ways.
0:47:57 You are the second person to express this theory. I interviewed someone named Mark
0:48:02 Labberton, and he was the CEO or president of Fuller Seminary.
0:48:06 It’s kind of the Harvard Business School of seminaries, in Pasadena.
0:48:12 And we got into the discussion about evangelical Christians and, you know, like Mark, I’m having
0:48:15 a hard time understanding where they’re coming from on this.
0:48:21 And he said, “Guy, instead of asking them what they believe or why they believe it, you ask
0:48:25 them how they came to believe what they believe.
0:48:28 And from that, you’ll open up a conversation.
0:48:34 It’ll be a much better platform for you to bridge this divide with them.”
0:48:39 You are the second person to tell us that and I think it’s such valuable advice.
0:48:40 Yeah, absolutely.
0:48:44 And one of the things also is just going into these conversations.
0:48:48 Some of the examples that you’ve brought in and that people often bring up, they’ll always
0:48:54 say, “How about if somebody believes in QAnon or some other conspiracy theory, or if they’re
0:48:59 a neo-Nazi or a white supremacist or a January 6 person?”
0:49:05 One of the things to recognize is that our minds automatically are going to the most
0:49:10 extreme examples of people who are different from us politically.
0:49:15 But the thing to know is that most people are not conspiracy theorists.
0:49:18 Most people are not at the farthest extremes.
0:49:24 And if we approach everyone as if they are, then we’re going to miss a lot.
0:49:28 Then we’re going to be bringing in some assumptions that might really push them away.
0:49:35 So I think that just recognizing those examples that just come up for us immediately in our
0:49:40 minds gives us a sense of how some of that bias is already playing into our thinking.
0:49:46 What I’m having difficulty wrapping my mind around here is that there’s nobody who believes
0:49:51 in stories more than I do.
0:49:56 And if you read my books, every point I make is backed up with a story.
0:50:00 So I believe in stories, I swear to you.
0:50:06 On the other hand, you just mentioned the abuse of stories because the stories that
0:50:10 are told are the extreme examples.
0:50:16 So how do I put those two things together that stories are powerful but stories are
0:50:19 also so easy to abuse?
0:50:23 What do I do with this conflict in my mind?
0:50:29 I think it’s completely about the difference between the stories that are told about people
0:50:33 and the stories we tell about ourselves.
0:50:39 And so if we can really lean into our own experiences and our own stories and reflect
0:50:46 on our own stories so that we can share more about ourselves than just the opinions that
0:50:49 we are getting from the media.
0:50:54 But if we’re really going into our own stories of our values and our experiences, I think
0:50:58 those are such rich things for us to share with each other.
0:51:02 And to really think about it in the context of those human interactions.
0:51:05 Let me see if I interpreted this right.
0:51:10 Let’s suppose that you encounter a person who is completely anti-vaxx, just absolutely
0:51:16 refuses to get vaccinated because he or she believes that vaccination kills people.
0:51:21 So instead of asking them, what do you believe about vaccination or asking them, why do you
0:51:24 believe that vaccination kills people?
0:51:26 You ask them, how did you come to that?
0:51:33 And they say, well, my great grandfather was in the U.S. Army and he was in this vaccination
0:51:34 study.
0:51:35 He didn’t even know he was in the study.
0:51:40 And the army had this experiment and the vaccination literally killed him.
0:51:43 So that’s why I am anti-vaxx.
0:51:48 I would say if I heard that from somebody, I would have a completely different interpretation
0:51:49 of why they’re anti-vaxx.
0:51:56 It’s not because they’re stupid or whatever; it’s reasonable to be anti-vaxx after something
0:51:57 like that.
0:51:58 Yeah.
0:52:00 I always go back to what’s my goal in this conversation.
0:52:05 If I just want to get some insight, wow, then I’ve gotten some amazing insight by opening
0:52:09 up that door and hearing more about where they’re coming from.
0:52:14 If I am trying to persuade or convince them, then it’s important for me to recognize not
0:52:21 only where they’re coming from, but asking them to give up that belief is a big deal.
0:52:28 It’s not just shifting an opinion about something, but they’ve got an identity maybe built around
0:52:29 that.
0:52:36 They also, very often, we have communities built around our beliefs on things.
0:52:41 And so we might also be asking them to give up their community and maybe a place that
0:52:42 they have in that community.
0:52:47 So it’s so important that if we are trying to shift somebody out of that, that we make
0:52:53 space for them to lean into where we are and to embrace and welcome them into that.
0:52:59 So rather than isolating people, think about, okay, I want to hear more about that.
0:53:00 I’m going to respect that.
0:53:04 I’m going to reflect that back so that that person feels like I understand them.
0:53:06 I care about them.
0:53:08 I connect with them.
0:53:14 And it might not shift their view, but it might give them space to maybe at some point
0:53:21 expand their view or consider also a different view as being valid, even if they don’t change
0:53:22 their own.
0:53:27 So I think even if we’re talking about people who are in a more extreme place on any of
0:53:31 these things, it’s still the same kind of approaches that we can take.
0:53:38 But I love that example of how somebody’s story can, without changing our mind about our own
0:53:44 vaccinations, still shift our understanding of people who take a different stance on it.
0:53:49 Tanya, I have asked everything I want to ask and I just want to thank you, not only for
0:53:56 being on this podcast and helping me and Madison and our listeners become more remarkable,
0:54:00 but I just want to thank you for the work that you’re doing to make the world a better
0:54:06 place because clearly, you are trying to make the world a better place and succeeding.
0:54:09 And so I just want to give you this last section.
0:54:14 So just talk about your book, talk about why people should read your book.
0:54:18 The more people who read your book, I would argue the better the world would be.
0:54:22 So go for it, Tanya.
0:54:23 Absolutely.
0:54:27 The first book that I wrote, Beyond Your Bubble, was about how to have dialogue.
0:54:30 And that was really helpful for people who were ready to talk to other people.
0:54:36 But so many people have said, “I am feeling so distressed about politics and dialogue
0:54:38 is not the challenge that I’m facing.
0:54:42 I don’t want to talk to those people, or I don’t have those people in my life, but I
0:54:50 am really struggling with how just listening to the news raises my blood pressure and my
0:54:53 neighbor’s lawn sign is wigging me out.”
0:55:01 This is a book to help us face those challenges so that we can really empower ourselves to do
0:55:06 something, so that we can improve our health and well-being, which is really being affected
0:55:14 by all of this, and so that if we choose to engage with other people, we can do that effectively.
0:55:19 And we can also engage in our democracy and in our communities more broadly.
0:55:23 The last thing I want to say is, in this book, I also talk about something very important,
0:55:30 which is the bridging movement, because there are over 500 organizations working on strengthening
0:55:33 our social cohesion and our democracy.
0:55:35 And most people don’t know anything about that.
0:55:40 And so I think it’s really important if we want to be not only informed and empowered,
0:55:48 but even optimistic about how in this moment we can take this crisis and use it to motivate
0:55:55 ourselves to be able to face the challenges not only of political division, but just of
0:56:01 what we are facing in our modern society around social media and around community engagement
0:56:03 and a wide range of other things.
0:56:05 So that’s what this book is about.
0:56:13 I will say also, I just started a new podcast literally last week called “Ready to Be Strong.”
0:56:17 And if you’re like, “I don’t have time to read a whole book or listen to you narrate
0:56:21 your whole book,” but you want a taste of some of these topics, I would say, go take
0:56:27 a listen to “Ready to Be Strong.” I’m just trying to put resources out there that are
0:56:32 going to help people who are feeling really stressed about politics these days.
0:56:34 That’s a wrap, Tanya.
0:56:36 Thank you very much for being our guest.
0:56:43 And I hope people take your message to heart and implement what you say, because the world
0:56:44 would be a better place.
0:56:46 I’m Guy Kawasaki.
0:56:52 This has been Remarkable People with Tanya Israel, and I strongly suggest that you check
0:56:53 out her writing.
0:56:55 It’s really great stuff.
0:57:00 And one more plug for a website she mentioned: it’s called AllSides.
0:57:02 I highly recommend that you look at that.
0:57:05 So that’s it for today.
0:57:06 Thank you very much, everyone.
0:57:15 See you next time, Mahalo and Aloha.

In this episode of Remarkable People, join Guy Kawasaki for an enlightening conversation with Dr. Tania Israel, a professor at UCSB and author of Facing the Fracture. Together, they explore the complex landscape of political polarization in America and how to bridge our deepening divides. Tania shares powerful insights on breaking free from our information bubbles, developing genuine empathy across differences, and engaging in meaningful dialogue. Discover practical strategies for managing news consumption, navigating social media, and having productive conversations with those who hold different views. Learn how to move beyond partisan conflicts toward authentic understanding and connection.

Guy Kawasaki is on a mission to make you remarkable. His Remarkable People podcast features interviews with remarkable people such as Jane Goodall, Marc Benioff, Woz, Kristi Yamaguchi, and Bob Cialdini. Every episode will make you more remarkable. 

With his decades of experience in Silicon Valley as a Venture Capitalist and advisor to the top entrepreneurs in the world, Guy’s questions come from a place of curiosity and passion for technology, start-ups, entrepreneurship, and marketing. If you love society and culture, documentaries, and business podcasts, take a second to follow Remarkable People. 

Listeners of the Remarkable People podcast will learn from some of the most successful people in the world with practical tips and inspiring stories that will help you be more remarkable. 

Episodes of Remarkable People organized by topic: https://bit.ly/rptopology 

Listen to Remarkable People here: https://podcasts.apple.com/us/podcast/guy-kawasakis-remarkable-people/id1483081827 

Like this show? Please leave us a review — even one sentence helps! Consider including your Twitter handle so we can thank you personally! 

Thank you for your support; it helps the show!

See Privacy Policy at https://art19.com/privacy and California Privacy Notice at https://art19.com/privacy#do-not-sell-my-info.
