The science of ideology

AI transcript
0:00:04 We all have bad days, and sometimes bad weeks, and maybe even bad years.
0:00:08 But the good news is we don’t have to figure out life all alone.
0:00:11 I’m comedian Chris Duffy, host of TED’s How to Be a Better Human podcast.
0:00:15 And our show is about the little ways that you can improve your life,
0:00:19 actual practical tips that you can put into place that will make your day-to-day better.
0:00:23 Whether it is setting boundaries at work or rethinking how you clean your house,
0:00:29 each episode has conversations with experts who share tips on how to navigate life’s ups and downs.
0:00:32 Find How to Be a Better Human wherever you’re listening to this.
0:01:05 A word you hear a lot these days is ideology.
0:01:12 In fact, you could argue this is the political term of the moment.
0:01:21 When Trump is denouncing the left, he’s talking about gender ideology or critical race theory or DEI.
0:01:28 When the left is denouncing Trump, they’re talking about fascism or Project 2025.
0:01:35 Wherever you look, ideology is being used to explain or justify policies.
0:01:44 And buried in all that is an unstated assumption that the real ideologues are on the other side.
0:01:51 Often, to call someone ideological is to imply that they’re fanatical or dogmatic.
0:01:57 Most of us don’t think of ourselves as ideological for that reason.
0:02:02 And if someone does call you an ideologue, you might recoil a little bit.
0:02:10 I mean, sure, you have beliefs, you have a worldview, but you’re not an ideologue, right?
0:02:15 Maybe this isn’t the best way to think about ideology.
0:02:20 Maybe we don’t really know what we’re talking about when we talk about ideology.
0:02:26 Is it possible that we’re all ideological in ways we don’t recognize?
0:02:33 And if we could see ourselves a little more clearly, might that help us see others more clearly?
0:02:39 I’m Sean Illing, and this is The Gray Area.
0:02:46 Today’s guest is Leor Zmigrod.
0:02:50 She’s a cognitive neuroscientist and the author of The Ideological Brain.
0:02:56 The book makes the case that our political beliefs aren’t just beliefs.
0:03:01 They’re neurological signatures written into our neurons and reflexes.
0:03:08 That’s a fancy way of saying that how we think and what we believe is a product of the way our brains are wired.
0:03:16 To be clear, she isn’t saying that our beliefs are entirely shaped by our biology.
0:03:19 The point isn’t that brain is destiny.
0:03:27 But she is saying that the way our brains handle change and uncertainty may shape not only the beliefs we adopt,
0:03:30 but how fiercely we cling to those beliefs.
0:03:37 A book like this feels especially relevant in such a polarized moment,
0:03:46 because it’s hard to imagine bridging the divides in our society without understanding ourselves and each other much better.
0:03:50 And part of that understanding is knowing what’s really motivating us.
0:03:57 Leor Zmigrod, welcome to the show.
0:04:00 Maybe I will ask you to say that again.
0:04:01 Oh, did I mess it up?
0:04:02 I knew I was going to do it.
0:04:04 I knew I was going to do it.
0:04:06 I was in my head.
0:04:08 All right, let’s try to get home.
0:04:11 Leor Zmigrod, welcome to the show.
0:04:12 Great to be here.
0:04:14 I totally got it right that time, right?
0:04:15 Yeah, you did.
0:04:15 You did.
0:04:15 You did great.
0:04:16 All right.
0:04:23 This is a very interesting book, full of a lot of provocative, compelling claims.
0:04:26 And we are going to get to all of that.
0:04:34 Before we do, I am just curious, what drew you to this question?
0:04:36 Why ideology?
0:04:40 Well, in many ways, ideology is all around us.
0:04:43 But often, we don’t really know what it is, right?
0:04:49 We kind of say, well, an ideology is just a system of beliefs or just a kind of insult.
0:04:54 We use it to kind of demean someone who believes something totally different from us, which we think
0:04:55 is wrong.
0:05:01 And I was really interested in delving into what it means to think ideologically and what
0:05:07 it means for a brain to really be immersed in ideology and whether that’s a kind of experience
0:05:13 that can change the brain, that certain brains might be more prone to taking an ideology and
0:05:16 kind of embracing it in an extreme and intense way.
0:05:23 And that’s why in the book, in The Ideological Brain, I really delve into this question of
0:05:29 what makes people gravitate towards ideologies and what is it about some brains that makes them
0:05:30 especially susceptible?
0:05:36 And in doing that, I’m really interested in thinking about ideology in a more precise way
0:05:41 than we typically do, which is not just as a broad system of beliefs floating
0:05:48 above our heads in an ambiguous way or something that’s purely historical or sociological, but
0:05:53 it’s something that’s really deeply psychological and that we can see inside people’s brains.
0:05:55 Well, let’s take it step by step.
0:05:58 What does it mean to think ideologically?
0:05:59 What is ideology?
0:06:02 How are you defining it?
0:06:05 And how is that different from how people typically define it?
0:06:10 So the way I think about ideology is as really being comprised of two components.
0:06:17 One is a very fixed doctrine, a kind of set of descriptions about the world that’s very
0:06:22 absolutist, that’s very black and white, and that is very resistant to evidence.
0:06:28 So an ideology will always have a certain kind of causal narrative about the world that describes
0:06:31 what the world is like and also how we should act within that world.
0:06:36 It gives prescriptions for how we should act, how we should think, how we should interact
0:06:36 with other people.
0:06:39 But that’s not the end of the story.
0:06:44 To think ideologically is both to have this fixed doctrine and also to have a very fixed
0:06:48 identity that you really kind of judge everyone with.
0:06:54 And that fixed identity stems from the fact that every ideology, every doctrine will have
0:06:55 believers and non-believers.
0:07:04 And so when you think ideologically, you’re really embracing those rigid identity categories and
0:07:09 deciding to exclusively affiliate with people who believe in your ideology and really reject
0:07:10 anyone who doesn’t.
0:07:17 The degree of ideological extremity can really be mapped onto how hostile you are to anyone with
0:07:21 differing beliefs, whether you’re willing to potentially harm people in the name of your
0:07:22 ideology.
0:07:28 You write that, and now I’m quoting, not all stories are ideologies and not all forms
0:07:32 of collective storytelling are rigid and oppressive, end quote.
0:07:34 How do you tell the difference?
0:07:39 How do you, for instance, distinguish an ideology from a religion?
0:07:43 Is there even room for a distinction like that in your framework?
0:07:50 What I think about often is the difference between ideology and culture, because culture can encompass
0:07:56 eccentricities, it can encompass deviation, different kinds of traditions or patterns from the
0:07:57 past.
0:08:03 But it’s not about legislating what one can do or what one can’t do.
0:08:08 The moment we detect an ideology is the moment when you have very rigid prescriptions about what
0:08:10 is permissible and what is not permissible.
0:08:17 And when you stop being able to tolerate any deviation, that’s when you’ve moved from culture,
0:08:23 which can encompass a lot of deviation and kind of reinterpretations, whereas in an ideology,
0:08:29 there is no room for those kinds of nonconformities or differences.
0:08:34 What you’re doing here and what you do in the book that is interesting to me, and novel as
0:08:42 far as I know, is this reframing of ideology more as a style of thinking rather than just
0:08:44 a set of beliefs.
0:08:49 I mean, as you know, like the conventional way to think about ideology has always been to focus
0:08:53 on the content, on what people believe, not how they think.
0:08:55 And you’ve flipped this around.
0:09:02 What does this understanding let us see that other definitions missed?
0:09:10 What that inversion reveals is that embracing an ideology in an extreme way and thinking really
0:09:13 about what are the mechanics of thinking ideologically?
0:09:16 What are the ways in which reason gets shifted?
0:09:18 How emotion gets distorted?
0:09:25 How our biological and kind of even physiological responses to the world get distorted is that
0:09:31 we stop thinking about ideologies as things that just envelop us from outside and that just
0:09:35 kind of are almost tipped into us by external forces.
0:09:42 And we start to see how it’s a much more dynamic process and that we can even see parallels between
0:09:47 ideologues who believe in very different things and partisans to completely different parties to
0:09:53 different missions, but that really it’s how they think that’s very similar, even if what
0:09:54 they think is very different.
0:09:59 I mean, some people might be more ideological than others, but does everyone more or less
0:10:05 have an ideology, even if they don’t think of themselves as having an ideology?
0:10:12 I kind of think about ideological thinking as something more specific, that it’s this antagonism
0:10:19 to evidence, this very kind of tight embrace of a particular narrative about the world and
0:10:23 rules about how the world works and how you should behave within that world.
0:10:31 And so when we think about it as that kind of fixed, rigid set of behaviors, of compulsions,
0:10:35 we see that not everyone is obviously equally ideological.
0:10:41 And I don’t know whether there’s a perfect human being completely without any ideology,
0:10:44 but in the book I do talk about, you don’t think so?
0:10:45 I don’t think so.
0:10:46 We’ll get there, but I don’t think so.
0:10:54 I think that you can be a lot less ideological, and that’s almost the challenge that I
0:10:59 talk about in the book is what does it mean to think non-ideologically about the world,
0:11:02 maybe anti-ideologically about the world?
0:11:04 And what does that look like?
0:11:09 Well, tell me how you test for cognitive flexibility versus rigidity.
0:11:11 What kind of survey work did you do?
0:11:12 What kind of lab work?
0:11:18 So in order to test someone’s cognitive rigidity or their flexibility, one of the most important
0:11:24 things is not just to ask them because people are terrible at knowing whether they’re rigid
0:11:24 or flexible.
0:11:29 The most rigid thinkers will tell you they’re fabulously flexible and the most flexible thinkers
0:11:30 will not know it.
0:11:34 And so that’s why we need to use these kind of unconscious assessments, these cognitive
0:11:41 tests and games that tap into your natural kind of capacity to be adaptable or to resist
0:11:42 change.
0:11:49 And so one test to do this is called the Wisconsin Card Sorting Test, which is a card sorting game
0:11:52 where people are presented with a deck of cards that they need to sort.
0:11:57 And initially they don’t know what the rule that governs the game is, so they try and
0:11:58 figure it out.
0:12:02 And quickly they’ll realize that they should match the cards in their deck according to
0:12:02 their color.
0:12:07 So they’ll start putting a blue card with a blue card, a red card with a red card, and
0:12:10 they’ll get affirmation, the kind of positive feedback that they’re doing it right.
0:12:15 And so they start enacting this rule, adopting it, kind of applying it again and again and again.
0:12:20 And after a while, unbeknownst to them, the rule of the game changes, and suddenly this
0:12:22 color rule doesn’t work anymore.
0:12:28 And so that’s the moment of change that I’m most interested in, because some people will
0:12:30 notice that change and they will adapt.
0:12:33 They will then go looking for a different rule and they’ll quickly figure out that they
0:12:37 should actually sort now the cards according to the shape of the objects on the card.
0:12:39 And fine, they’ll follow this new rule.
0:12:42 Those are very cognitively flexible individuals.
0:12:47 But there are other people who will notice that change and they will hate it.
0:12:48 They will resist that change.
0:12:54 They will try to say that it never happened and they’ll try to apply the old rule despite
0:12:57 getting negative feedback, despite being told that they’re doing it wrong.
0:13:03 And those people that really resist the change are the most cognitively rigid people, that they
0:13:04 don’t like change.
0:13:08 They don’t adapt their behavior when the evidence suggests that they should.
0:13:13 And what’s interesting about this kind of task is that it’s not related to politics at
0:13:14 all, right?
0:13:19 It’s just a game that taps into how people are responding to information, responding to
0:13:20 rules, responding to change.
0:13:27 And we see that people’s behavior on this kind of game really predicts their ideological
0:13:28 rigidities too.
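To make the mechanics of the task concrete, here is a minimal, hypothetical Python sketch of a Wisconsin-style card-sorting game. The feature names, trial counts, and "perseveration" parameter are illustrative assumptions, not the actual protocol or scoring used in this research; the sketch only shows why a player who keeps applying an outdated rule accumulates more errors after the hidden rule silently changes.

```python
# Toy sketch (illustrative assumptions, not the real experimental protocol):
# a hidden sorting rule switches partway through the game, and we count how
# often each simulated player keeps applying the old rule after it stops
# working -- a rough analogue of "perseverative errors" in card-sorting tasks.
import random

FEATURES = ["color", "shape", "number"]

def simulate(perseveration, trials=60, switch_at=30, seed=0):
    """perseveration = probability of repeating the old guess despite negative feedback."""
    rng = random.Random(seed)
    true_rule = "color"          # initial hidden rule
    guess = rng.choice(FEATURES)
    errors_after_switch = 0
    for t in range(trials):
        if t == switch_at:       # unannounced rule change
            true_rule = "shape"
        correct = (guess == true_rule)
        if t >= switch_at and not correct:
            errors_after_switch += 1
        if not correct:
            # a flexible player usually searches for a new rule;
            # a rigid player usually repeats the old one anyway
            if rng.random() > perseveration:
                guess = rng.choice([f for f in FEATURES if f != guess])
    return errors_after_switch

print("flexible player errors after the switch:", simulate(perseveration=0.1))
print("rigid player errors after the switch:   ", simulate(perseveration=0.9))
```

Running it with a low versus a high perseveration value gives a rough feel for the flexible-versus-rigid contrast described above: the rigid player keeps sorting by color long after color stops earning positive feedback.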
0:13:34 Can we say that the point here is that if someone really struggles to switch gears
0:13:40 in a card-sorting game like that, that says something about their comfort with change
0:13:42 and ambiguity in general?
0:13:48 And someone who struggles with change and ambiguity in a card game will probably also have an aversion
0:13:54 to pluralism in politics because their brain processes that as chaotic.
0:13:58 I mean, is that a fair summary of the argument or the logic?
0:14:04 Yeah, broadly it is, because people who resist that change, who resist the uncertainty, who’d like
0:14:08 things to stay the same, that when the rules change, they really don’t like it.
0:14:15 Often that can be translated into, you know, the most cognitively rigid people don’t like
0:14:17 plurality, don’t like debate.
0:14:26 They like a kind of singular source of information, a singular argument about a single theory of
0:14:26 everything.
0:14:33 But that can also, that can really coexist on both sides of the political spectrum.
0:14:40 So when we’re talking about diversity, like that can be a more politicized concept that
0:14:48 you can still find very rigid thinkers being very militant about certain ideas that we might
0:14:48 say are progressive.
0:14:50 So it’s quite nuanced.
0:14:58 Are there particular habits of mind or patterns of behavior that you’d consider warning signs
0:15:02 of overly rigid thinking, things that people can notice in themselves?
0:15:09 Well, it’s funny that you say habits of mind, because in many ways, I think that habits are
0:15:11 the biggest culprits here.
0:15:15 You know, we live in a society that constantly talks about how good it is to have habits and
0:15:18 to have routines that you repeat over and over again.
0:15:24 But actually, habits are the way in which we become more rigid because we become less
0:15:25 sensitive to change.
0:15:28 We want to repeat things exactly in the same way.
0:15:35 And so probably the first step, if you’re wanting to be more flexible in the way you approach
0:15:41 the world, is to take all your habits and routines and interrogate them and think about what it
0:15:47 does to you to be repeating constantly rather than to be exploring and navigating change.
0:15:54 I mean, I think it’s intuitively easy to understand why being extremely rigid would be a bad thing.
0:15:59 Is it possible to be too flexible?
0:16:02 Like, what does that look like at the extreme of flexibility?
0:16:08 If you’re just totally unmoored and just like permanently wide open and like incapable of settling
0:16:12 on anything, that seems bad in a different way.
0:16:12 Yeah.
0:16:13 Yeah.
0:16:19 And what that is, is a kind of immense persuadability, but that’s not flexibility, right?
0:16:25 So there is a distinction there because being flexible is about updating your beliefs in line
0:16:30 with credible evidence, not necessarily adopting a belief just because some authority says so,
0:16:34 but it’s about, you know, seeing the evidence and responding to it.
0:16:40 You write that we possess beliefs, but we can also be possessed by them.
0:16:46 And, you know, that reminds me of Carl Jung’s claim that, you know, we don’t have ideas, ideas
0:16:46 have us.
0:16:49 But what are you getting at here?
0:16:53 Like, what does it mean to say that we’re possessed by beliefs?
0:16:58 Does that mean that we are being animated and controlled by them unconsciously?
0:17:00 Or is it something different?
0:17:10 I think that it means that, I’ll pause here to think about the best, because it’s such
0:17:11 a massive question.
0:17:11 Yeah.
0:17:18 What we see with this science, with the science I’ve been involved in called political neuroscience,
0:17:24 where we use neuroscientific methods to study these questions about people’s political beliefs
0:17:30 and identities, is that the degree to which you espouse really dogmatic ideological beliefs
0:17:38 can get reflected in your body, in your neurobiology, in the way in which your brain responds to the
0:17:40 world at very unconscious levels.
0:17:43 And so it becomes a part of us.
0:17:53 And so there’s a kind of, I’m losing the word, but there’s a kind of expansion or a kind of echoing of your
0:18:00 thought patterns, not just in politics, but that they become part of how you think about anything in the
0:18:04 world and how your body responds and reacts to anything in the world.
0:18:07 And so our politics are not just things outside of us.
0:18:11 They’re really part of how the human body starts to function.
0:18:14 So you think ideologies can really change us physiologically?
0:18:20 What we see in a lot of studies is that, and this is obviously a growing field and there are many more
0:18:27 studies to conduct, but what we see across these experiments is that ideology really conditions
0:18:29 your physiological responses to the world.
0:18:36 So in one experiment, they looked at how much you justify existing systems and existing inequalities.
0:18:44 So some people think that very stark inequalities are bad and unnatural and maybe things that should be
0:18:48 corrected, whereas others think that inequalities are fine.
0:18:52 They’re natural parts of human life and maybe that they’re even good, that they’re desirable things
0:18:53 to have in society.
0:19:00 And what we see is that people who believe that inequalities are bad, we see that those
0:19:06 people, when they look at videos of injustice taking place of someone, for instance, discussing
0:19:11 their experience of homelessness and the adversity of that, their whole bodies react, their heart
0:19:16 rates accelerate, their kind of physiological markers of arousal really spike.
0:19:22 Because they’re biologically disturbed by what they’re seeing, they’re disturbed physically
0:19:24 by the injustice that they see.
0:19:31 In contrast, people who believe that those inequalities are fine, that they’re justifiable, that they
0:19:36 should not change at all, and that we should continue to have stark inequalities in society,
0:19:41 those people, when they see that injustice, their bodies are numb.
0:19:43 They’re physiologically unmoved.
0:19:47 They will not biologically be disturbed by the injustice that they see in front of them.
0:19:55 And so you really see how ideology conditions even our most unconscious, rapid physiological responses.
0:20:16 Support for The Gray Area comes from Mint Mobile.
0:20:19 There are a couple ways people say data.
0:20:20 There’s data.
0:20:21 Then there’s data.
0:20:22 Me, personally?
0:20:24 I say data.
0:20:25 I think.
0:20:26 Most of the time.
0:20:31 But no matter how you pronounce it, it doesn’t change the fact that most data plans cost an
0:20:32 arm and a leg.
0:20:36 But with Mint Mobile, they offer plans starting at just $15 a month.
0:20:38 And there’s only one way to say that.
0:20:40 Unless you say $15, I guess.
0:20:44 But no matter how you pronounce it, all Mint Mobile plans come with high-speed data and
0:20:49 unlimited talk and text delivered on the nation’s largest 5G network.
0:20:52 You can use your own phone with any Mint Mobile plan.
0:20:55 And you can bring along your phone number with all your existing contacts.
0:20:58 No matter how you say it, don’t overpay for it.
0:21:02 You can shop data plans at mintmobile.com slash gray area.
0:21:04 That’s mintmobile.com slash gray area.
0:21:09 Upfront payment of $45 for a three-month, five-gigabyte plan required.
0:21:12 Equivalent to $15 per month.
0:21:14 New customer offer for first three months only.
0:21:17 Then full price plan options available.
0:21:18 Taxes and fees extra.
0:21:20 See Mint Mobile for details.
0:21:27 Support for The Gray Area comes from Greenlight.
0:21:33 School can teach kids all kinds of useful things, from the wonders of the atom to the story of Marbury vs. Madison.
0:21:38 One thing schools don’t typically teach, though, is how to manage your finances.
0:21:42 So those skills fall primarily on you, the parent.
0:21:43 But don’t worry.
0:21:44 Greenlight can help.
0:21:49 Greenlight says they offer a simple and convenient way for parents to teach kids smart money habits,
0:21:53 while also allowing them to see what their kids are spending and saving.
0:21:57 Plus, kids can play games on the app that teach money skills in a fun, accessible way.
0:22:02 The Greenlight app even includes a chores feature, where you can set up one-time or recurring chores,
0:22:07 customized to your family’s needs, and reward kids with allowance for a job well done.
0:22:12 My kids are a bit too young to talk about spending and saving and all that.
0:22:17 But one of our colleagues here at Vox uses Greenlight with his two boys, and he absolutely loves it.
0:22:21 Start your risk-free Greenlight trial today at greenlight.com slash gray area.
0:22:25 That’s greenlight.com slash gray area to get started.
0:22:27 Greenlight.com slash gray area.
0:22:36 Support for the show comes from the podcast Democracy Works.
0:22:40 The world certainly seems a bit alarming at the moment.
0:22:42 And that’s putting it lightly.
0:22:46 And sometimes it can feel as if no one is really doing anything to fix it.
0:22:51 Now, a lot of podcasts focus on that, the doom and gloom of it all,
0:22:53 and how democracy can feel like it’s failing.
0:22:57 But the people over at the Democracy Works podcast take a different approach.
0:23:02 They’re turning their mics to those who are working to make democracy stronger.
0:23:04 From scholars to journalists to activists.
0:23:08 They examine a different aspect of democratic life each week.
0:23:12 From elections to the rule of law to the free press and everything in between.
0:23:17 They interview experts who study democracy as well as people who are out there on the ground
0:23:22 doing the hard work to keep our democracy functioning day in and day out.
0:23:25 Listen to Democracy Works wherever you listen to podcasts.
0:23:29 And check out their website, democracyworkspodcast.com to learn more.
0:23:35 The Democracy Works podcast is a production of the McCourtney Institute for Democracy at Penn State.
0:23:56 Focusing on rigidity does make a lot of sense.
0:24:07 But I can imagine one critique of this being that you risk pathologizing conviction, right?
0:24:12 How do you draw the line between principled thinking and dogmatic thinking?
0:24:17 Because as you know, one of those codes as good and the other codes as bad.
0:24:31 In many ways, I think that it’s not about pathologizing any conviction, but it is about questioning what it means to believe in an idea without being willing to change your mind on it.
0:24:39 And I think that there is, you know, there is a very fine line, right, between what we call principles and what we call dogmas.
0:24:47 And that’s what in many ways I hope that implicitly readers come to think about and interrogate is,
0:24:59 are they holding kind of broad moral values about the world that help them, you know, make ethical decisions, but also being sensitive to context and the specifics of each situation?
0:25:13 Or are they adhering to certain rules without the capacity to take context into account, without being willing to see all the shades of gray that a situation might kind of enable?
0:25:22 And thinking that taking very strong, principled positions is a purely good thing, that’s something I would like to challenge.
0:25:27 I think it gets particularly thorny in the moral domain, right?
0:25:36 Like, no one wants to be dogmatic, but it’s also hard to imagine any kind of moral clarity without something like a fixed commitment to certain principles or values.
0:25:42 And what often happens is, if we don’t like someone’s values, we’ll call them extremist or dogmatic.
0:25:46 But if we like their values, we call them principled.
0:25:56 Yeah, and that’s why I think that a kind of psychological approach to what it means to think ideologically helps us escape from that kind of very slippery relativism.
0:26:03 Because then it’s not just about, oh, where is someone relative to us on certain issues on the political spectrum?
0:26:07 But it’s about thinking, well, what does it mean to resist evidence?
0:26:16 So there is a delicate path there where you can find a way to have a moral compass.
0:26:30 Maybe not the same absolutist moral clarity that ideologies try to convince you exists, but you can have a morality without having really dogmatic ideologies.
0:26:33 We all want things to make sense.
0:26:37 We want things to have a reason or a purpose.
0:26:48 How much of our rigid thinking, how much of our ideological thinking is just about our fear of uncertainty?
0:26:55 Ideologies are, in many ways, our brain’s way of solving the problem of uncertainty in the world.
0:26:59 Because, you know, our brains are these incredible predictive organs.
0:27:10 They’re trying to understand the world, but they’d also like shortcuts wherever possible, because it’s very complicated and very computationally expensive to figure out everything that’s happening in the world.
0:27:13 And so ideologies kind of hand that to you on a silver plate.
0:27:16 And they say, here are all the rules for life.
0:27:17 Here are all the rules for social interaction.
0:27:21 Here’s a description of all the causal mechanisms for how the world works.
0:27:23 There you go.
0:27:29 And you don’t need to do that hard labor of figuring it out all on your own.
0:27:46 And so that’s why ideologies can be incredibly tempting and seductive for our predictive brains that are trying to resolve uncertainty, that are trying to resolve ambiguities, that are just trying to understand the world in a coherent way.
0:27:49 And so it is a kind of coping mechanism.
0:27:57 And what I hope to show in the book is that it’s a coping mechanism with very disastrous side effects for individual bodies.
0:28:01 Well, yeah, I think the main problem is that the world isn’t coherent.
0:28:06 And in order to make it coherent, you have to distort it often.
0:28:12 And I think that’s where this can lead to bad outcomes.
0:28:16 But look, so ideologies are certainly one way.
0:28:22 I mean, maybe the main way we satisfy this longing we have for clarity and certainty.
0:28:30 Do you think there are non-ideological ways to satisfy that longing?
0:28:33 I think so.
0:28:40 But I also think that it’s about recognizing that we have that longing and that ideologies are solutions to that longing.
0:28:56 And maybe by realizing that there’s that constructive element to it, right, that we gravitate towards ideologies, not necessarily because they’re true, but just maybe because they seem at first glance useful or nice or comforting.
0:29:12 And I think that already goes maybe some way at chipping away at the kind of illusion that ideologies try to claim and to establish, which is that they are the only truth and the theories of everything and that there is no other truth.
0:29:23 And so I think that it’s already important to recognize that kind of magnetism that happens between our minds and these ideological myths.
0:29:39 And I think that there are ways to live that don’t require you to espouse ideologies in a dogmatic way, in a way that inspires you to dehumanize other people for the sake of justifying your ideology.
0:29:59 And I think that that lies with thinking about what it means to update your beliefs in response to credible evidence, living in a society that has information and evidence that is accessible to everyone, rather than what is going on now with digital environments,
0:30:10 where that information that you receive is increasingly skewed, is increasingly selective and designed to dysregulate you and to manipulate you rather than to offer you information.
0:30:37 But once you start to battle some of those systemic kind of problems with our information systems, I think you can do a lot of work to learn how to process information, to respond to disagreements in a way that is flexible, in a way that is balanced, in a way that is really focused on evaluating evidence in a kind of balanced way.
0:30:54 I think that in experiments, what we find is that people who are most cognitively rigid will kind of adhere to the most extreme ideologies.
0:31:03 But that doesn’t have to be a purely kind of far-right authoritarianism that we most typically are familiar with.
0:31:05 It can also exist on the left.
0:31:07 There are also left-wing authoritarianisms.
0:31:19 In the studies, we see that the people who are most rigid can exist both on the far left and on the far right, which is important because a lot of times there’s been this assumption that it’s only the political right that can be rigid.
0:31:29 But we see that when we measure people’s unconscious traits, that you can also find that rigidity on the left.
0:31:40 And I hope that that’s a kind of warning signal for a lot of people on the left who think that liberalism and the left are inherently about change and flexibility and progress.
0:31:46 Well, it can also attract rigid minds.
0:32:07 And so you need to think about, if you want to enact progress that maybe has a liberal flavor to it, you need to think about how to avoid those kind of rigid strains, the kind of dogmatic, conformity-minded, authority-minded way of thinking that exists on both sides of the spectrum.
0:32:15 And to that very point, somewhere in the book, you write that every worldview can be practiced extremely and dogmatically.
0:32:26 And I read that, and I just wondered if it leaves room for making normative judgments about different ideologies.
0:32:28 But let me put that in the form of a question.
0:32:37 Do you think every ideology is equally susceptible to extremist practices?
0:32:42 I sometimes get strong opposition from people saying, well, my ideology is about love.
0:32:59 It’s about generosity or about looking after others, kind of positive ideologies that we think surely should be immune from these kind of dogmatic and authoritarian ways of thinking.
0:33:20 But in many ways, what I’m trying to do with this research and in the book is rather than compare ideologies as, you know, these big entities represented by many people, is just to look at people and to look at, well, can we find, are there people who are extremely rigid in different ideologies?
0:33:20 And we do see that in every ideology that has this very strong utopian vision of what life and the world should be, or a very dystopian kind of fear of where the world is going.
0:33:47 And all of those have a capacity to become extreme.
0:34:00 Support for this show comes from Shopify.
0:34:04 When you’re creating your own business, you have to wear too many hats.
0:34:11 You have to be on top of marketing and sales and outreach and sales and designs and sales and finances.
0:34:17 And definitely sales. Finding the right tool that simplifies everything can be a game changer.
0:34:21 For millions of businesses, that tool is Shopify.
0:34:26 Shopify is the commerce platform behind millions of businesses around the world
0:34:30 and, according to the company, 10% of all e-commerce in the U.S.,
0:34:35 from household names like Mattel and Gymshark to brands just getting started.
0:34:40 They say they have hundreds of ready-to-use templates to help design your brand style.
0:34:44 If you’re ready to sell, you’re ready for Shopify.
0:34:49 You can turn your big business idea into reality with Shopify on your side.
0:34:55 You can sign up for your $1 per month trial period and start selling today at Shopify.com slash Vox.
0:34:58 You can go to Shopify.com slash Vox.
0:35:10 Have you ever gotten a medical bill and thought, how am I ever going to pay for this?
0:35:16 This week on Net Worth and Chill, we’re tackling the financial emergency that is the American healthcare system.
0:35:21 From navigating insurance nightmares to making sure your emergency fund actually covers those emergencies,
0:35:25 We’re diving deep into the hidden healthcare costs that no one warns you about.
0:35:35 Most hospitals in the U.S. are actually nonprofits, which means they have to have financial assistance or charity care policies.
0:35:41 So essentially, if you make below a certain amount, the hospital legally has to waive your medical bill up to a certain percent.
0:35:45 Listen wherever you get your podcasts or watch on YouTube.com slash YourRichBFF.
0:35:51 The regular season is in the rear view, and now it’s time for the games that matter the most.
0:35:54 This is Kenny Beecham, and playoff basketball is finally here.
0:36:02 On Small Ball, we’re diving deep into every series, every crunch time finish, every coaching adjustment that can make or break a championship run.
0:36:04 Who’s building for a 16-win marathon?
0:36:06 Which superstar will cement their legacy?
0:36:10 And which role player is about to become a household name?
0:36:15 With so many fascinating first-round matchups, will the West be the bloodbath we anticipate?
0:36:17 Will the East be as predictable as we think?
0:36:19 Can the Celtics defend their title?
0:36:23 Can Steph Curry, LeBron James, Kawhi Leonard push the young teams at the top?
0:36:28 I’ll be bringing the expertise, the passion, and the genuine opinion you need for the most exciting time of the NBA calendar.
0:36:32 Small Ball is your essential companion for the NBA postseason.
0:36:36 Join me, Kenny Beecham, for new episodes of Small Ball throughout the playoffs.
0:36:38 Don’t miss Small Ball with Kenny Beecham.
0:36:40 New episodes dropping through the playoffs.
0:36:43 Available on YouTube and wherever you get your podcasts.
0:37:05 How do you think about causality here, right?
0:37:13 I mean, are some people just constitutionally, biologically prone to dogmatic thinking?
0:37:22 Or do they get possessed, to use your word, by ideologies that reshape their brain over time?
0:37:24 Yeah, this is a fascinating question.
0:37:28 And I think that causality goes both ways.
0:37:35 I think there’s evidence that there are pre-existing predispositions that propel some people to join ideological groups.
0:37:43 And that when there is a trigger, they will be the first to run to the front of the line, kind of in support of the ideological cause.
0:37:49 But that at the same time, as you become more extreme, more dogmatic, you are changed.
0:37:54 You are changed in the way in which you think about the world, the way in which you think about yourself.
0:38:00 You become more ritualistic, more narrow, more rigid in every realm of life.
0:38:01 So that can change you too.
0:38:04 Just to be clear about what you mean by change, right?
0:38:07 When you say it changes our brains, how do you know that?
0:38:11 Are you looking at MRI scans and you can see these changes?
0:38:12 What do those changes look like?
0:38:20 So we don’t yet have, you know, the longitudinal studies required to see complete change.
0:38:32 But we do have other kinds of studies that look at, for instance, what happens when a brain is in a condition which makes it more prone to becoming ideological.
0:38:40 So, for example, what happens when we take people who already have quite radical beliefs, so radical religious fundamentalists.
0:38:45 We put them in a brain scanner and we make them feel very socially excluded.
0:38:52 We heighten that feeling that they’re socially excluded from others, that they’re alienated.
0:38:59 So we take vulnerable minds and we also kind of put them in a more kind of psychologically vulnerable state.
0:39:04 And what we see is that then they become a lot more ideological.
0:39:13 Their brains start imbuing every value as sacred, as something that they would be willing to die for, as something they would be willing to hurt others for.
0:39:26 And we see that these processes are so dynamic that even in conditions where people are stressed out, where they feel lonely, excluded, like there aren’t enough resources to go around.
0:39:51 And what we see is that there are these experiments that show the arrows pointing one way, and also that people who have experienced brain injury, traumatic brain injury to specific parts of the brain, later on are more radical.
0:39:58 That their beliefs are a lot more extreme, that they’ll see a radical idea and they will say that it’s fine.
0:40:25 So through these kind of natural studies of either brain injury or about what happens to a brain that is already radical in those environments, we can get a sense that being in a rigid environment, in an environment that is stressful, that is authoritarian, that tries to put people into that mindset of thinking about every human being as an instrument to an end,
0:40:30 that can change how the brain responds to the world and maybe how it functions too.
0:40:37 So when these circuits get activated, there are corresponding parts of the brain that light up.
0:40:37 Exactly.
0:40:39 And that’s how you can make the connections?
0:40:40 Exactly.
0:40:40 Yeah.
0:40:41 So go ahead.
0:40:42 That’s so fascinating.
0:40:49 Now, look, I know you’re being careful about saying causality runs both ways and surely it does, but I want to push you a little bit.
0:40:56 And how far would you go in saying that genes determine political beliefs?
0:41:03 Would it be too neat to say that people are born with liberal brains or conservative brains?
0:41:11 So what we do know is that there are genetic predispositions to thinking more rigidly about the world.
0:41:17 These predispositions are related to dopamine and how dopamine is expressed throughout your brain.
0:41:30 So that can be about how dopamine is expressed in the prefrontal cortex, the area behind your forehead responsible for the high level decision making, and the dopamine kind of in your reward circuitry in the striatum.
0:41:35 And what we see is that there are genetic traits that make some people more prone to rigid thinking.
0:41:39 But there’s still so much scope for change.
0:41:44 These genetic traits are kind of potentials.
0:41:48 They can activate risk, but they can also really be subdued.
0:41:59 And that’s where we can also look at what happens to minds with those genetic predispositions that grew up in environments and upbringings that were much more liberal or much more authoritarian.
0:42:02 Yes, that is what I found myself thinking a lot about.
0:42:05 This is not straight up determinism you’re doing here.
0:42:08 In your words, we’re talking about probabilities, not fates.
0:42:14 So our biology opens up certain possibilities, inclines us in one direction or another.
0:42:23 But our environment, our stresses, our communities, our family life, all of that can push us different ways.
0:42:29 Can you say a little bit more about this tension between biology and environment?
0:42:38 I think there’s sometimes the sense that, oh, if you’re talking about the kind of biology of ideology, that you’re saying everything is fixed and predetermined.
0:42:44 But actually, there’s huge scope for change and malleability and choice within that.
0:42:56 And what we see and what in my experiments I’ve found is that the best reflection of a person’s cognitive style is not necessarily the ideologies that they grew up with, but the ideologies that they ended up choosing.
0:43:12 So people who chose to enter a dogmatic ideology and kind of embrace it strongly, even though they grew up in a much more secular, presumably non-ideological upbringing, those people were the most cognitively rigid.
0:43:23 Choosing an ideology is the best reflection of your rigidity, whereas people who maybe grew up really ideological but left that environment are the most cognitively flexible people.
0:43:28 More flexible than people who grew up in non-ideological settings and stayed non-ideological.
0:43:33 And so there’s huge range and capacity for choice.
0:43:35 And so our biology doesn’t predetermine us.
0:43:43 It puts us on certain paths for risk or resilience, but then it’s our choices that affect which of our traits get expressed or not.
0:43:48 How much room is there for agency here, right?
0:43:54 Like, if I want to change the way I think, cognitively and politically, can I really do that?
0:43:56 How much freedom do I have?
0:44:07 I think you have an immense amount of freedom, and I think we know that you can change because people do change, and people do change the rigidity with which they approach the world.
0:44:09 They’ve changed their beliefs over time.
0:44:19 And so if we are all lying on a kind of spectrum from flexible thinking to rigid thinking, and we’re all somewhere on that spectrum, we can also all shift our position.
0:44:21 So how do people do that?
0:44:23 How do they go about bringing that change about?
0:44:30 Well, the first way in which we can understand this is by looking maybe what we would say might be the negative change.
0:44:33 What happens, what prompts people to think more rigidly about the world.
0:44:41 And the best way, and maybe the most sinister way, to make people think more rigidly is to stress them out.
0:45:01 And we can do that even in the lab, we can stress a body out by either asking it to do something that would make any person socially nervous, like standing in front of a big group and speaking unexpectedly, or by asking them to do something that would physically stress their body out, like putting their hand in a bucket of ice water.
0:45:09 And that just automatically for any person, you know, all of your body’s resources get channeled to dealing with that physical stressor.
0:45:26 And what we see is that even in that immediate moment, like the kind of three minutes that pass when your body stresses out, you immediately become more rigid in how you solve problems, in how you solve all kinds of mental challenges in these games.
0:45:33 And so you can see that stress is a huge factor that pushes people towards more rigid thinking.
0:45:40 Well, that’s a very profound finding and one I think that maps onto like the historical record.
0:45:56 What you find very often in history is that when material conditions and societies decline, when people get more impoverished and deprived and desperate, they become more vulnerable.
0:46:00 To authoritarian or extremist movements, right?
0:46:14 And maybe part of what’s going on there is these circuits getting induced in people’s brains: that stress, those stressful conditions, primes them to be more susceptible to these ways of thinking.
0:46:42 Yeah, because understanding that a body that is stressed is a body that is more vulnerable to extreme authoritarian dogmatic thinking really helps us understand who is most susceptible and at what times we are all most susceptible, so that we understand why and how maybe malicious agents can take hold of those experiences of stress, of adversity, of precarity or lack of resources.
0:46:58 Or to actually, you know, create ideological rhetoric that stresses us out or that makes us think that there aren’t enough resources to go around, that that’s a profoundly, you know, powerful way to get people to think in a more authoritarian minded way.
0:47:12 Well, we’re in an era, especially in America, but I think this is also true in your neck of the woods, of highly polarized partisan politics.
0:47:16 Do you feel like this research has some particular insight into that?
0:47:22 Do you think absorbing this can actually make us more intelligible to each other?
0:47:25 I think so.
0:47:40 I think, first of all, it allows us, by recognizing that actually people at the very ends of, for instance, the political spectrum, or people of many different ideologies, when taken to the extreme, actually start to resemble each other, I think is probably a very humbling insight.
0:47:49 Because you realize that although you might be feeling like you’re fighting for completely different missions, you’re psychologically engaged in a very similar process.
0:47:57 And so, hopefully, hopefully that is one way to maybe help us understand each other in very polarized times.
0:48:10 But I think that there’s also this really profoundly individual or personal problem here that you have to confront, which is how flexible or how dogmatic are you?
0:48:22 And how would you like to live, you know, rather than just judging other people for their dogmatism, it’s about thinking, well, what are the rules that you impose on yourself or on those around you?
0:48:27 And can those be actually damaging to your mental freedom?
0:48:34 Because those rules we impose on ourselves, yeah, reduce our capacity to think authentically.
0:48:42 The end of your book imagines a mind that’s ideology-free.
0:48:45 Do you really think that’s possible?
0:48:48 Do you think we can live without ideology?
0:48:51 I think we can certainly try.
0:48:53 And I think…
0:48:56 Well played.
0:49:06 I think we can, because I think, you know, starting to shed those ideological, those really harsh ideological convictions with which we,
0:49:14 encounter ourselves and others, is possible, is probably desirable from a psychological perspective, and quite empowering.
0:49:24 It’s also a very difficult process, because to be flexible is not just an end state that you arrive at, and you made it, you’re flexible, that’s it, you’re good.
0:49:35 It’s this continuous struggle I even talk about as a Sisyphean task, because there are so many pressures trying to rigidify you, to narrow your thoughts, that to stay flexible,
0:49:43 to stay in that space of being willing to accept nuance, ambiguity is a really, really hard thing.
0:49:53 Flexibility is very fragile, but I think it’s also really fulfilling to be in pursuit of that more flexible, ideologically free way of being.
0:50:01 The sort of flexibility you’re talking about is, to me, not just an intellectual virtue.
0:50:13 I think it’s also a moral virtue in the sense that it enables us to be more open to ideas and people, and more humble about what we don’t or can’t know.
0:50:28 Do you have any thoughts on, do you have any thoughts after all this research on how to educate children, how to parent, how to teach people to be anti-ideological in the way you defined it?
0:50:32 Yeah, I mean, in many ways…
0:50:36 Sorry, there were too many different ways to answer that question.
0:50:37 Yeah, no, please start over.
0:51:01 I think one of the most profound insights from this research is that when you start to embody flexible thinking in your everyday life, in the way in which you psychologically approach the world, that will bleed into the way in which you evaluate moral space, the political space, the ideological space.
0:51:15 And so if we wanted to cultivate that flexibility in children, in fellow adults, it would be about encouraging that kind of flexibility in all things.
0:51:20 Flexibility, like we talked about, is not just this endless persuadability or a kind of wishy-washiness.
0:51:31 It’s this very active stance where you’re trying to think about things in the most wide-ranging anti-essentialist way.
0:51:48 And so teaching people to think really creatively, of course, we also need to teach them to be critical thinkers, but to be really creative in any domain, not just in art, which tries to demand creativity, but in every realm of life.
0:51:55 Rather than repeating your day in the same way again and again, how do you incorporate change?
0:52:01 How do you incorporate thinking outside the box, breaking down essences into kind of new ways of thinking?
0:52:15 So teaching people to be more flexible, original, creative, imaginative in that way, I think is something that education systems and families can do, and hopefully they should.
0:52:21 As you said earlier, we’re very much in the beginning of this research.
0:52:25 Where does it go from here?
0:52:30 What do you think is the next frontier of political neuroscience?
0:52:43 Where we go from here is to continue to tackle those questions about causality, to really learn to see how ideologies can change the human body, the human brain, how it responds to the world.
0:53:04 And also what we bring to the table. I think continuing to understand that requires studies of people over a long time, over their whole lives, to see how changes in their psychological expressions can map onto their ideological commitments.
0:53:07 And I think now is probably a very good time to do it, because there is a lot of change.
0:53:17 People are both becoming at times more dogmatic, more extreme, but also changing allegiances at paces that maybe we haven’t seen in a long time.
0:53:33 And so this is a great moment to stop thinking about things purely as the political left versus the political right, but evaluate any ideological commitment, whether it’s nationalistic, social, religious, environmental, any kind of ideology, to start to see those parallels.
0:53:36 And kind of like you’ve been hinting at, well, what are the differences?
0:53:40 When does it matter what you think and not just how you think?
0:53:45 So there’s a lot of exciting science to do there, and it’ll be, it’ll be interesting to see where it goes.
0:53:47 I think that’s a good place to leave it.
0:53:54 Once again, the book is called The Ideological Brain: The Radical Science of Flexible Thinking.
0:53:55 This was a lot of fun.
0:53:56 It’s a great book.
0:53:58 Thank you for coming in.
0:53:59 Thank you so much.
0:54:08 All right.
0:54:10 I hope you enjoyed this episode.
0:54:21 One thing I really appreciate about what this book is doing is that it just speaks to how complicated we are and how complicated our beliefs are.
0:54:25 And that should be humbling in a lot of ways.
0:54:28 But as always, we want to know what you think.
0:54:32 So drop us a line at thegrayarea@vox.com.
0:54:40 Or you can leave us a message on our new voicemail line at 1-800-214-5749.
0:54:46 And if you have time, please go ahead, rate, review, and subscribe to the pod.
0:54:58 This episode was produced by Beth Morrissey, edited by Jorge Just, engineered by Christian Ayala, fact-checked by Melissa Hirsch, and Alex Overington wrote our theme music.
0:55:02 New episodes of The Gray Area drop on Mondays.
0:55:03 Listen and subscribe.
0:55:06 This show is part of Vox.
0:55:10 Support Vox’s journalism by joining our membership program today.
0:55:13 Go to vox.com slash members to sign up.
0:55:17 And if you decide to sign up because of this show, let us know.
0:55:17 Thank you.

What do you do when you’re faced with evidence that challenges your ideology? Do you engage with that new information? Are you willing to change your mind about your most deeply held beliefs? Are you predisposed to be more rigid or more flexible in your thinking?

That’s what political psychologist and neuroscientist Leor Zmigrod wants to know. In her new book, The Ideological Brain, she examines the connection between our biology, our psychology, and our political beliefs.

In today’s episode, Leor speaks with Sean about rigid vs. flexible thinking, how our biology and ideology influence each other, and the conditions under which our ideology is more likely to become extreme.

Host: Sean Illing (@SeanIlling)
Guest: Leor Zmigrod, political psychologist, neuroscientist, and author of The Ideological Brain

Listen to The Gray Area ad-free by becoming a Vox Member: vox.com/members

Learn more about your ad choices. Visit podcastchoices.com/adchoices
