AI transcript
0:00:07 Delay your intuition. Don’t try to form an intuition quickly, which is what we normally do.
0:00:12 Focus on the separate points, and then when you have the whole profile,
0:00:15 then you can have an intuition and it’s going to be better.
0:00:26 Welcome to The Knowledge Project. I’m your host, Shane Parrish. In a world where knowledge is power,
0:00:30 this podcast is your toolkit for mastering the best of what other people have already figured out,
0:00:33 so you can use their insights in your life.
0:00:42 Before we get into the interview, I want to tell you about a moment that didn’t make it into the episode.
0:00:49 I first came across Daniel Kahneman’s work in the early 2000s. His impact on me and so many people
0:00:55 around the globe has been unbelievable. By the time I sat down with him in his New York City home in
0:01:01 2019, I had so many questions for him. Kahneman won the Nobel Prize in Economic Sciences in 2002,
0:01:08 yet he never took an economics course. His central message was very simple. If we want to make better
0:01:15 decisions, we need help. Danny died last year on March 27th, 2024. He was 90. This conversation is
0:01:22 now one of the final opportunities to hear directly from one of the most influential thinkers of our time.
0:01:28 I get messages about this episode every week. People come away with new insights on everything from life
0:01:36 to decision making. I re-listened to it recently and it’s timeless. That’s exactly why I’m republishing it.
0:01:42 Consider loss aversion, one of his most important discoveries. Why does losing $100 hurt twice as much
0:01:49 as gaining $100 feels good? The asymmetry affects everything. It affects your stock portfolio, your golf
0:01:55 game. Check your portfolio when it’s down and you’ll start making emotional decisions. A golfer putts better
0:02:02 for par than for birdie. But here’s what happened near the end of our interview. Danny’s phone rang and it
0:02:07 was loud. He’d forgotten to turn it off. We were almost done with the interview at this point, but he answered, and
0:02:13 someone obviously wanted him to give a talk or review a book. He ended the call with words that have stayed
0:02:19 with me since then. My rule is I never say yes on the phone. I’ll get back to you tomorrow. I wanted to
0:02:26 discuss that on air but we ran out of time. As I packed up my gear, I asked him about that. This rule was a trick
0:02:32 to avoid saying yes intuitively. It gave him time to think. He was always bombarded with requests and he
0:02:38 often said yes when he didn’t want to. At first, he would try saying no. That date doesn’t work. That
0:02:43 timeline doesn’t work. But what happened in those moments was it turned into a negotiation. What about
0:02:50 another date? Another timeline? So he hit on this rule. And to me, this is his most practical discovery.
0:02:57 Most people don’t even know about it. This rule lets you reprogram your unconscious mind. Your desired
0:03:03 behavior becomes your default behavior. And that’s incredibly powerful. It changed my life. I now
0:03:08 exercise every day. It’s actually easier than three times a week. The activity, duration, and scope can
0:03:15 change but working out and exercising doesn’t. I think I’ve missed five days in five years at this point.
0:03:20 And I talk about this in my book, Clear Thinking. And the concept has changed so many lives, including
0:03:26 my great friend, Brent Beshore. In episode 196, we talk about this a little. Several parts of this
0:03:31 conversation stuck out when I was re-listening to it. First, we talk about happiness versus satisfaction.
0:03:38 Happiness is feelings. It’s mostly social. Am I with the people who love me and whom I love back?
0:03:43 Satisfaction, on the other hand, is how you feel about your life, your job, your career,
0:03:50 conventional aspects. Danny argued people want satisfaction more than happiness. Second,
0:03:57 changing behavior. Make good behavior easier and bad behavior harder. The insight? All behavior is
0:04:04 equilibrium. Rather than pushing people to change, ask why they aren’t doing it already. Third, behavior
0:04:11 is situational. Want to understand behavior? Look at the situation. When someone acts in ways that don’t
0:04:16 make sense, ask yourself: what would the world have to look like for that behavior to make
0:04:24 sense? Fourth, agents making decisions on your behalf beat you at certain types of decisions. They have no
0:04:31 sunk costs. They have no emotions. Brian Johnson talks about this in episode 188. He turned his health
0:04:38 decisions over to effectively an algorithm because that algorithm makes better decisions than he does.
0:04:44 Fifth, our beliefs are formed by people more than facts. We agree with people we like,
0:04:49 despite the facts. It’s easier to believe a lie from someone you like than a truth from someone you
0:04:57 dislike. We form identity beliefs. Liberal, conservative, Democrat, Republican. They can do no wrong. If they’re
0:05:03 wrong, we’re wrong. And we can’t handle that. Finally, intuition. Danny had talked about this so much,
0:05:08 his answers sounded repetitive. So I framed my question on this to include his typical answer
0:05:15 in the question, forcing him to think a little deeper. Whether this is your first listen or your
0:05:18 third, you’ll come away with ideas that you can use in life.
0:05:40 Daniel, I’m so happy to get the chance to talk to you.
0:05:43 Well, I’m happy to have you here.
0:05:47 What was your childhood like? What were you like as a child?
0:05:56 Oh, my God. That was a long time ago. I was, I was an early child, as you might expect, I suppose.
0:06:04 I was, I thought I’d be a professor when I was like three or four years old because people told me I would
0:06:12 be because I probably spoke with long words and stuff like that. So, and then the rest of my childhood,
0:06:21 I mean, I was five when World War II began. So, and I was a Jew in France. So, I’ve had a difficult
0:06:28 childhood from that point on. But, but what was I like? Yeah, I was a, I was a nerdy child.
0:06:36 I was quite inept physically. Very fortunately for me, when I’d finally moved to Israel at age 12,
0:06:43 they held me up a grade there. And that was all right. But that’s, that’s what I was like.
0:06:47 Are there any particular lessons or memories that stand out for you?
0:06:55 There are two of them that I speak about. So, one is that I was, I was a psychologist very early on.
0:07:04 That was, that was very clear. I, I wrote an essay before I was 11. I remember where, because it was,
0:07:11 it was a German counterattack. It was during that period we were in Paris. And I wrote an essay about
0:07:19 faith and religion. And it was a very pompous essay. I had a little book that was, that was titled
0:07:28 what I, what I write about what I think, something pompous like that. But the essay started with
0:07:36 another pompous thing that I quoted Pascal. My, my sister had passed her exams and I had read,
0:07:44 she’s, you know, she’s studied some Pascal and I had read it. And Pascal had said that faith
0:07:56 is God made sensible to the heart. And, you know, little me, I said, how true. That’s what my essay said.
0:08:04 And then, and then, but then I said, but faith is really hard to get. You don’t sense God all the time. So, that’s what
0:08:12 that’s what religious pomp is for. Cathedrals, organ music. They give you, and I call that ersatz faith,
0:08:21 sort of a substitute faith, because it’s a, it’s a similar feeling. It’s got to do with God. And that’s what you
0:08:30 make do with. That’s, that’s a psychologist. So, it’s clear that, you know, that was my calling. And so,
0:08:35 that’s one significant memory of my childhood.
0:08:36 So you wanted to be a psychologist?
0:08:46 I think so. I think so. I mean, I, you know, I’ve always had that point of view. And later, as a
0:08:52 teenager, I was, you know, interested in all the philosophical issues, like, you know, does God
0:08:58 exist? And what’s good and bad? And stuff like that. And why shouldn’t we masturbate? You know, serious
0:09:06 questions. But, but I discovered that, actually, I was less interested in the question of whether or
0:09:13 not God exists, than in why people believe that he exists? That I thought was interesting. And I
0:09:18 wasn’t particularly interested in the question of what’s good or bad, but I was really interested in
0:09:24 what makes people angry and indignant. So, you know, I’ve had the psychological point of view since,
0:09:26 turns out, since my childhood.
0:09:33 Was there anybody that sort of influenced you to go on to study this? I mean, it’s one thing to have
0:09:39 these dreams as like a 12, 13, 14-year-old boy. It’s another to turn this into, you know, probably
0:09:43 the most eminent career that’s ever happened for a psychologist.
0:09:50 No, not the most eminent career. You know, and I wasn’t sure, actually, that I would do psychology.
0:09:59 And when I took a vocational exam to tell me what I was good at, psychology and economics stood
0:10:07 out. But, you know, that was unexpected. And then I took psychology as an undergraduate and mathematics,
0:10:13 at which I was not particularly good. So, and no, it’s not that I knew at the time that, you know,
0:10:19 I had that calling to be a psychologist. It didn’t occur to me. I thought, you know, I thought I’d be a
0:10:26 professor in one thing or another. I mean, I thought I’d be an academic, but not psychology specifically.
0:10:32 You worked with Amos Tversky for a long time. Are there any particular stories that you remember
0:10:34 about working with him that bring a smile to your face?
0:10:44 Almost everything about working with him brings a smile to my face. You know, he was a very unusual
0:10:51 person. Most people who knew him thought that he was the smartest person that they’d ever met.
0:10:57 And in fact, the famous psychologist Dick Nisbett said that it’s sort of an intelligence test:
0:11:03 when you’re with Amos, how long does it take you to figure out that he’s smarter than you
0:11:09 are? And the faster you figure that out, the smarter you are. So, you know, he was, uh, he was
0:11:17 super bright and very, very funny. He joked a lot. He laughed a lot at his own jokes. And that was
0:11:25 infectious. When I was with him, I was very funny too. More than half of the laughs of my lifetime
0:11:31 I’ve had during the 10 years I worked with him. You know what doesn’t belong in your summer plans? Getting
0:11:37 burned by your old wireless bill. While you’re locking in itineraries, your wireless bill should
0:11:43 be the last thing holding you back. That’s why I recommend the switch to Mint Mobile. No gimmicks,
0:11:49 just a smarter play most people overlook. You make smart trade-offs everywhere else in life. Why not here?
0:11:56 All plans come with high-speed data and unlimited talk and text delivered on the nation’s largest 5G
0:12:01 network. Use your own phone with any Mint Mobile plan and bring your phone number along with all
0:12:07 your existing contacts. It’s a small switch that compounds over time, just like any great decision.
0:12:12 Most people leave money on the table. This is how you stop doing that. If I had needed this product,
0:12:18 it’s what I’d use. The savings are unbeatable. This year, skip breaking a sweat and breaking the bank. Get this
0:12:26 new customer offer and your 3-month unlimited wireless plan for $15 a month at mintmobile.com/knowledgeproject.
0:12:34 That’s mintmobile.com/knowledgeproject. Upfront payment of $45 required, equivalent to $15 a month.
0:12:40 Limited time new customer offer for first 3 months only. Speeds may slow above 35GB on unlimited plan.
0:12:43 Taxes and fees extra. See Mint Mobile for details.
0:12:49 You have an interesting distinction between happiness and satisfaction. Can you walk us through that?
0:12:58 Yeah, sure. I mean, the word happiness is so ambiguous and it means so many things to many people,
0:13:06 but one sensible interpretation of it is that it’s got to do with your emotions, with how you feel,
0:13:12 with the emotional tone of your life, whether it’s a happy life, you know, it’s pleasant to be you.
0:13:19 Life satisfaction is a completely different thing. I mean, life satisfaction is how you feel about your
0:13:24 life when you think about your life. And most of the time, you don’t think about your life, you just live.
0:13:33 But, you know, sometimes you sort of look and that’s when you determine how satisfied you are.
0:13:38 That’s life satisfaction. It’s not satisfaction, it’s life satisfaction.
0:13:43 Should we balance the two? Or how would you think about them? Should we be more happy when we’re younger,
0:13:44 more satisfied when we’re older?
0:13:51 That thought had never occurred to me. When I began to work on this, I started out thinking that
0:14:00 happiness, in that sense of how you feel when you live, that was reality. And that
0:14:07 life satisfaction was just stories that people tell themselves. And the important thing was to be
0:14:13 happy in real time. But later, when we did more research, it turned out that
0:14:19 the circumstances that make people happy and the circumstances that make them satisfied with their
0:14:25 life are not the same. So happiness is mostly social. It’s, you know, it’s being with people
0:14:33 you love and who love you back. That’s, that’s a lot of what happiness is. Life satisfaction is much more
0:14:39 conventional. It’s to be successful. And, you know, so it’s money, education, prestige,
0:14:46 that sort of thing is what life satisfaction is about. So those are two very different things.
0:14:53 I thought that life satisfaction is irrelevant. You know, that’s how I began. And we, we had a research
0:15:00 program where we were, we were trying to, you know, to show that this is the case. But then after a few
0:15:09 years, I realized that what people really want in their life is they don’t seem to care about how happy
0:15:14 they’ll be. They seem to want to be satisfied with their life. They seem to want to have a good story
0:15:21 about their life. And then I was in the position of defining well-being in a way that people
0:15:28 didn’t seem to care particularly about. So that was not a tenable position. So I, I dropped back into
0:15:31 saying that I had no idea how to deal with it.
0:15:37 Was this a result of the research? You did some research that was, I think it said above 70,000,
0:15:40 you don’t become happier, but do you become more satisfied?
0:15:51 No. The research I did with Angus Deaton at Princeton, famous economist, we showed that in terms of
0:15:58 happiness, in terms of emotional tone, positive and negative, having a lot of money doesn’t make you
0:16:06 happier. But being poor makes you miserable. So above that threshold, which was like
0:16:15 $70,000 approximately in the U.S., extra money didn’t make you emotionally happier. But with life
0:16:21 satisfaction, it was a different story. With life satisfaction, that doesn’t satiate. So it’s always
0:16:29 good to have more. Because basically, I think, money is a proxy for success. And it’s a proxy for
0:16:31 subjective success in many cases.
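To see the shape of that result, here is a minimal sketch of the pattern being described: emotional well-being that satiates around a threshold, life satisfaction that keeps rising with income. The threshold echoes the figure mentioned in the conversation, but the functional forms are illustrative assumptions, not the actual Kahneman and Deaton estimates.

```python
import math

# Illustrative only: a curve that flattens ("satiates") above a threshold versus one that doesn't.
# The $70,000 threshold echoes the conversation; the formulas themselves are assumed.

def emotional_wellbeing(income, threshold=70_000):
    # Rises with income, but stops improving once income clears the threshold.
    return math.sqrt(min(income, threshold))

def life_satisfaction(income):
    # Keeps rising with income (roughly with its logarithm): no satiation point.
    return math.log10(income)

for income in (30_000, 70_000, 300_000):
    print(income, round(emotional_wellbeing(income), 1), round(life_satisfaction(income), 2))
# 300,000 scores the same as 70,000 on the first measure, but higher on the second.
```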
0:16:35 So it’s not necessarily about spending it or doing something with it. It’s just a measure.
0:16:41 Just getting it. I mean, you know, you look at all those people, those billionaires working their heads
0:16:46 off. And they’re clearly not doing this because they need more money. They’re trying to get more
0:16:51 money. And they’re trying to get more money because that would be an indication that they’re good at what
0:16:54 they do. I think mostly it’s a proxy.
0:16:59 Do either of those variables, happiness or satisfaction, correlate to longer living?
0:17:06 Both, apparently. But you know, it’s hard to separate. And I haven’t been following. You know,
0:17:14 shortly after deciding that I didn’t know what well-being was, I sort of stopped doing research
0:17:22 on this. So I haven’t been following. But I think there’s clear evidence that being effectively happy,
0:17:27 you know, is very good for you. And you do live a little longer, and you live better, and so on.
0:17:34 And life satisfaction works in the same direction. Whether it’s separable, which of them, you know,
0:17:36 is more important, that I don’t know.
0:17:44 I want to switch gears a little bit and talk about behavior. And I’d love your insight and expansion
0:17:49 upon the idea that we can change behavior. And how do we go about changing our behavior?
0:17:56 Well, you know, I’m not sure I buy the premise. I think changing behavior is extremely difficult.
0:18:02 There are a few tips and, you know, a few guidelines about how to do that. But anybody who is very
0:18:10 optimistic about changing behavior is just deluded. It’s hard to change other people’s behavior. It’s very
0:18:12 hard to change your own. Not simple.
0:18:14 This is what marriage is all about, right?
0:18:19 Yeah, among other things. You know, people, when, when, you know, married people try to
0:18:21 change each other’s behavior.
0:18:23 It’s a lot of dissatisfaction.
0:18:26 They are not on their way to a good marriage, I think.
0:18:28 We’d all be happier with lower expectations.
0:18:34 Yes. I mean, and, and even if you have expectations, don’t try to change them because, you know,
0:18:38 it’s very unlikely to work in a significant way.
0:18:43 I can think of the common ways that we would sort of go about behavior change and it would be,
0:18:48 you know, making good behaviors easier or negative behaviors harder.
0:18:55 And that’s the main, the main insight. You know, when you want to influence somebody’s behavior,
0:19:01 that’s a very big insight. I’ve always thought that this is the best psychological idea ever,
0:19:08 you know, so far as I’m concerned. But it’s that when you want somebody to move from A to B in terms of
0:19:15 their behavior, you can think of it that there are two ways of doing it. You can push them or you can ask
0:19:22 the question, “Why aren’t they doing B already?” Which is an unusual question, but you know, why?
0:19:32 So then when you ask, “Why, why not? Why aren’t they doing B?” as they ought to, as they think they ought to,
0:19:40 then you get a list. Kurt Lewin, that’s the psychologist who is my guru and my hero and
0:19:45 many people’s hero. He spoke of restraining forces. I mean, so there are reasons
0:19:52 why they’re not where you want them to be. So he spoke of behavior as an equilibrium. There are
0:19:58 forces that are pushing you one way, forces that are pushing you the other way. So how loud you speak,
0:20:06 how fast you drive. It’s easy to think of it as an equilibrium. And what we tend to do when we want to
0:20:17 move people from A to B is we push them. We add to the driving forces. And Kurt Lewin’s insight was that
0:20:23 this is not what you should do. You should actually work on the restraining forces and try to make them
0:20:31 weaker. And that’s a beautiful point. And he showed, he had that image that, you know, I’ve had since I was an
0:20:37 undergraduate. And I’m not sure actually whether it was his image or something that I drew from
0:20:44 reading him. But it’s like you have a plank and it’s being held by two sets of springs. You know,
0:20:50 you want it to move one direction. And so you could add another spring that would push it that way, or
0:20:56 could remove one of the springs that are holding it back. And the interesting thing, and that’s the
0:21:04 striking outcome, is when it moves, if it moves because of the driving force you’ve added to the
0:21:12 driving force, then at equilibrium, it will be in a higher state of tension than it was originally.
0:21:17 That is because you’ve compressed one spring and it’s pushing back harder. But if you remove
0:21:23 a restraining force, at equilibrium, there’ll be less tension in the system. I must have been 20 years
0:21:30 old. I thought that’s just so beautiful. What do you wish that everybody knew about psychology that you
0:21:34 don’t think that they do? If that was class one, what’s class two?
0:21:41 You know, class two, which is a development from class one, you know, it’s the same idea extended.
0:21:51 Class two is that behaviors don’t necessarily reflect the personality, but behaviors have a lot to do with
0:21:58 the situation. And so if people behave in strange ways, look at the situation they’re in and what are
0:22:05 the pressures in the situation that make them act as they do. So there is a bias that the social
0:22:11 psychologists, well-known social psychologists, call the fundamental attribution error. And that means
0:22:17 that when you see people acting in some way, you think that it’s because of their personality that they
0:22:23 do it. That may not be the case. It’s quite likely that the situation is making them do it.
0:22:31 I’d like people to know that motivation is complex and that people do good things for a mixture of
0:22:37 good and bad reasons. And they do bad things for a mixture of good and bad reasons. And I think that
0:22:43 the point of educating people in psychology is to make them less judgmental. Just have
0:22:50 more empathy and more patience and being judgmental doesn’t get you anywhere.
0:22:55 When you talk about situational, one of the things that comes to mind is it’s so easy for us to give
0:23:00 our friends advice. But if we were in that situation, we might not necessarily see it.
0:23:04 Why is that the case? Why is it so much easier to give other people advice?
0:23:11 I mean, feelings get in the way of clear thinking. There is a phenomenon that we call the endowment effect,
0:23:18 which is that when I’d ask for more money to sell you my sandwich than I’d pay to get it. I mean,
0:23:24 that’s essentially the endowment effect. And our explanation of it, there are many explanations,
0:23:30 but a story I like to tell about it is that it’s more painful to give something up than to get something.
0:23:37 But there is an interesting result that if you have an agent making decisions on somebody’s behalf,
0:23:43 that agent doesn’t have loss aversion. So that agent sells and buys at the same price,
0:23:48 which is the economically rational thing to do. Where this goes into policy and governments and
0:23:54 really important things is that governments are like agents or people who think about the good of
0:24:00 society. And agents, they take the economic view. They take the view of what things will be like at
0:24:00 the end. They don’t figure out that some people are going to be losing because of the reform
0:24:13 that they make. And it turns out that you can really expect losers, potential losers to fight a lot
0:24:13 harder than potential winners. And that’s the reason that reforms frequently fail. And that when they
0:24:27 succeed, they’re almost always way more expensive than anticipated. And they’re more expensive because
0:24:35 you have to compensate the losers. And that frequently is not anticipated. So that’s an example of a story
0:24:46 that incorporates behavior change. And the difference between perspective, between being, you know, in the
0:24:52 situation, feeling the pain of giving up the sandwich, and not feeling the pain of giving up the sandwich.
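As a rough illustration of the gap Kahneman is describing, the sketch below assumes a simple loss-aversion coefficient of about 2 (the same asymmetry mentioned in the introduction); the sandwich value and the coefficients are made-up numbers for illustration, not anything measured in the episode.

```python
# Illustrative sketch of the endowment effect: a loss-averse owner demands more to give
# the sandwich up than they would pay to acquire it; an agent with no loss aversion doesn't.
# The lambda values and the $10 valuation are assumptions.

def willingness_to_accept(value, loss_aversion):
    # Selling is framed as a loss of the good, so the asking price carries a loss-aversion premium.
    return value * loss_aversion

def willingness_to_pay(value, loss_aversion):
    # Buying is framed as a forgone gain, so no premium applies in this simple model.
    return value

value_of_sandwich = 10.0
owner_lambda = 2.0   # losses weigh roughly twice as much as gains
agent_lambda = 1.0   # an agent deciding on someone else's behalf

print(willingness_to_accept(value_of_sandwich, owner_lambda),
      willingness_to_pay(value_of_sandwich, owner_lambda))   # 20.0 vs 10.0: a gap
print(willingness_to_accept(value_of_sandwich, agent_lambda),
      willingness_to_pay(value_of_sandwich, agent_lambda))   # 10.0 vs 10.0: same price both ways
```

The second case is the agent Kahneman describes: with no loss aversion, the buying and selling prices coincide, which is the economically rational benchmark.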
0:24:59 That would have huge public policy sort of implications too, right, that we don’t tend to think about or discuss.
0:25:05 That’s a really interesting angle there. I want to come back to sort of situational decision-making based on
0:25:10 sort of like what we see is all there is. And we have these feelings that we can’t sort of disassociate
0:25:17 with. How does environment play a role like the physical environment in sort of what we decide or does it?
0:25:25 I mean, you know, there are sort of obvious things that we know. If people are hot and bothered and
0:25:34 distracted, and there is a lot of noise and so on, then they won’t think as well. That we know.
0:25:39 But even there, there are puzzles. I mean, many people think and work a lot better in cafes,
0:25:46 you know, where there is actually ambient noise and activity around them, and it helps them concentrate
0:25:52 better. So there isn’t a very simple story of the environment. But certainly, you can make the
0:25:58 environment tough enough so that people won’t be able to think properly. That’s feasible.
0:26:05 Are there things that we could do to, I guess, push the environment to be more conducive to
0:26:09 to clear thinking, the physical environment in this case?
0:26:16 Oh, there are all sorts of, you know, odd findings, you know, the color of the room.
0:26:22 Some colors are better than others. And you would expect that some colors are more calming than others.
0:26:25 So you wouldn’t want to be in a red room.
0:26:26 Making decisions.
0:26:32 Making decisions. But, you know, those are extreme and minor effects.
0:26:38 I want to come to intuition and noise later. Is there anything else that stands out that gets in
0:26:43 the way of clear thinking that we can sort of bring to the surface now?
0:26:51 Well, you know, what gets in the way of clear thinking is that we have intuitive views of
0:26:57 almost everything. So as soon as you present a problem to me, I have, you know, I have some
0:27:02 ready-made answer. And what gets in the way of clear thinking are those ready-made answers.
0:27:08 And we can’t help but have them. So that’s one thing that gets in the way. Emotions get in the
0:27:19 way. I would say that independent clear thinking is, to first approximation, impossible. In the sense
0:27:25 that, you know, we believe in things most of the time, not because we have good reasons to believe
0:27:31 them. If you ask me for reasons, I’ll explain you. I’ll always find a reason. But the reasons are not
0:27:38 the causes of our beliefs. We have beliefs because mostly we believe in some people and we
0:27:45 trust them and we adopt their beliefs. So we don’t reach our beliefs by clear thinking,
0:27:50 something, you know, unless you’re a scientist or doing something like that.
0:27:52 But even then, it’s probably a very narrow…
0:27:57 But that’s very narrow. And there is a fair amount of emotion in neuroscientists as well
0:28:02 that gets in the way of clear thinking. You know, commitments to your previous views,
0:28:08 being insulted that somebody thinks he’s smarter than you are. I mean, lots of things get in the way
0:28:14 even when you’re a neuroscientist. So I’d say there is less clear thinking than people like to think.
0:28:22 Is there anything that we can do at the belief formation stage? Like it sounds almost as though
0:28:29 when you say that we’re reading a newspaper, we read this op-ed, and it’s well constructed and fits
0:28:37 with our view of the world. Therefore, we adopt that opinion. And we forget the context that we didn’t
0:28:42 learn it through our own experience or reflection. We learned it sort of from somebody else. So we don’t
0:28:48 know when it’s sort of likely to work or not work. But we just proffer that as our opinion, is there?
0:28:54 That’s how I believe in climate change. You know, I believe in the people who tell me there is climate
0:28:59 change. And the people who don’t believe in climate change, they believe in other people.
0:29:04 So, but similarly, there’s like fake news and all this other stuff that we would have the same
0:29:12 reaction to. You know, but I’m much more likely to believe fake news on my side than the fake news
0:29:22 on the other side. I mean, it’s true that there is a huge degradation in public discourse in the recent
0:29:28 10, 15 years in the United States. I mean, there used to be an idea that facts matter.
0:29:33 What would be your hypothesis as to why that is playing out? Or are we getting into politics?
0:29:37 Because I don’t want to talk politics. But like, why is that? Well, I mean, it’s hard to,
0:29:44 it’s hard to answer that question without, without politics, because the general political
0:29:54 polarization has had a very big effect. And the fact that people can choose the sources of information.
0:30:01 Let’s switch gears a little bit and talk about intuition. I think one of the,
0:30:07 the things that strikes me the most about some of the work that you’ve done is the
0:30:15 cases where we’re likely to trust our intuition and when we’re not. And so if I’m, correct me if I’m
0:30:21 getting this wrong. So it’s sort of like a stable environment, repeated attempts and rapid feedback.
0:30:31 It strikes me that most decisions made in organizations do not fit that environment.
0:30:39 And yet we’re making a lot of these decisions on judgment or experience. What are the ways that we can
0:30:43 sort of make better decisions with that in the context?
0:30:48 Well, in the first place, I think, you know, you shouldn’t expect too much.
0:30:50 Back to low expectations.
0:30:53 Back to low expectations. You should have low expectations
0:30:59 about improving decisions. I mean, there is, you know, one basic rule is slow down,
0:31:05 especially if you, if you have that immediate conviction, slow down. There are procedures,
0:31:11 you know, there are ways of reaching better, better decisions, reaching better judgments,
0:31:13 and we can talk about them. I would love to hear.
0:31:19 If you really want to improve the quality of decision-making, use algorithms. I mean,
0:31:27 whenever, wherever you can, if you can replace judgments by, by rules and algorithms, they’ll do
0:31:36 better. And there are big social costs to trusting, to allowing algorithms to make decisions, but, but the
0:31:42 decisions are likely to be better. So that’s one thing. If you can’t use algorithms, then you slow
0:31:48 yourself down. And then there are things that you can do for certain types of problems. And there are
0:31:55 different types of problems. So one class of problems, like forecasting problems. A friend,
0:32:03 Phil Tetlock, you know, has that book on superforecasters, where he identifies, with people who are
0:32:10 good at forecasting the future, what they do that makes them good. And, you know, he tries to train
0:32:16 people and he can improve people. So that’s one class of problems. I’m interested specifically in
0:32:23 another kind of problem, judgment problems, where basically you’re considering options or you’re
0:32:30 evaluating a situation and you’re trying to give it a score. There, there, there is advice, I think,
0:32:39 on how to do it. For me, it goes back to something I did in the Israeli army when I was like 22 years
0:32:46 old. So that’s a long time ago, like 63 years ago. I was a psychologist in the Israeli army. And I was
0:32:54 assigned the job of setting up an interviewing system for, for the army. That’s ridiculous. But you know,
0:32:59 this was the beginning of the state of Israel. So people were improvising all over the place.
0:33:06 So I had a BA and I was, I think I was the best trained psychologist in the army. My, my boss was a
0:33:15 chemist. Brilliant. But anyway, and the, the existing system was one where people would interview and try to
0:33:23 form an intuitive global image of how well that recruits would do as a combat soldier, which was
0:33:29 the objective of the, the object of the interview. And because I had read a book, I told me, I took a
0:33:38 different tack. And the different tack was, I identified six traits that I sort of made up. And I had them
0:33:45 ask questions and evaluate each of these traits independently and score it and write down the score,
0:33:51 then go on to the next trait. And they had to do it for all six traits. And that was, that’s all I asked
0:33:58 them to do. And the interviewers, who were about one year younger than I, all recruits, but very,
0:34:04 very smart, selected for being good at it. They were furious with me. And they were furious with me
0:34:09 because they wanted to exercise their intuition. And I still remember that one of them said,
0:34:16 you’re turning us into robots. So I compromised with them. And I said, okay, you, you do it my way.
0:34:23 And I told them, you try to be reliable, not valid. You know, I’m in charge of validity. You be
0:34:30 reliable, which was pretty arrogant, but that’s, that’s how I presented it. But then when you’re done,
0:34:39 close your eyes and just put down a number of how good a soldier is that guy going to be. And when we
0:34:47 validated the results of the interview, it was a big improvement on what had gone on before. But the
0:35:03 other surprise was that
0:57:05 the final intuitive judgment, it was good. It was as good as the average of the six traits,
0:57:11 and not the same. It added information. So actually, we ended up with a score that was
0:57:18 half determined by the specific ratings, and the intuition got half the weight. And that,
0:57:22 by the way, stayed in the Israeli army for well over 50 years. I don’t know whether it’s,
0:57:28 I think it probably, some version of it is still in force, but around 15 years ago,
0:57:36 I visited my old base. And, and the commanding officer of the research unit was telling me how
0:57:43 they run the interview. And, and then she said, and then we tell them, “Close your eyes.” So that,
0:57:50 that had stayed for 50 years. Now, the “Close your eyes” and that whole idea is now the basis of the
0:57:57 book that I’m writing. So actually, I have the same idea really, that when you are making decisions,
0:58:03 you should think of options as if they were candidates. So you should break, break it up into
0:58:10 dimensions, evaluate each dimension separately, then look at the profile. And, and the key is,
0:58:19 delay your intuition. Don’t try to form an intuition quickly, which is what we normally do. Focus on the
0:58:25 separate points. And then when you have the whole profile, then you can have an intuition and it’s going to
0:58:32 be better, because people form intuitions too quickly. And the, the rapid intuitions are not
0:58:38 particularly good. So if you delay intuition until you have more information, it’s going to be better.
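As a concrete sketch of the procedure just described: score each dimension independently first, and only then add the delayed, holistic judgment. The six trait names and the 50/50 weighting below are illustrative stand-ins loosely modeled on the army-interview story, not the actual instrument.

```python
# Illustrative sketch: independent dimension scores first, delayed intuition last.

def structured_score(trait_scores, delayed_intuition, intuition_weight=0.5):
    """Combine the average of independently rated dimensions with a global
    judgment that is formed only after all dimensions have been scored."""
    mechanical = sum(trait_scores.values()) / len(trait_scores)
    return (1 - intuition_weight) * mechanical + intuition_weight * delayed_intuition

candidate = {
    # Hypothetical traits, each rated on its own before any overall impression is formed.
    "sociability": 7, "responsibility": 8, "stamina": 5,
    "punctuality": 6, "independence": 7, "pride": 6,
}

# The "close your eyes" step: a holistic 1-10 judgment made only after the ratings above.
print(structured_score(candidate, delayed_intuition=7))  # 6.75 with these made-up numbers
```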
0:58:40 I’m curious how we delay intuition.
0:58:49 You delay intuition by focusing on the separate problems. So our advice is that if you have, you know,
0:58:56 a board of directors making decisions about an investment, we tell them you do it that way. Take
0:59:04 the separate dimensions and really think about each dimension separately and independently and don’t
0:59:11 allow, you know, if you’re the chair, don’t allow people to give their final judgment. Say, we’ll wait
0:59:18 until we cover the whole thing. I mean, if you find a deal breaker, then you stop. But if you haven’t found
0:59:27 a deal breaker, wait to the end and look at the profile and then your decision is almost certainly going to be better.
0:59:33 Does that include weighting the different aspects of the problem differently or do you highlight that in advance or do you?
0:59:42 Yeah. I mean, it makes you see the trade-offs more clearly. Otherwise, when we don’t follow that discipline,
0:59:50 there is a way in which people form impressions. Very quickly you form an impression and then you spend
0:59:58 most of your time confirming it instead of collecting evidence. And so if accidentally your impression was
1:00:04 in the wrong direction, you’re going to confirm it and you don’t give yourself a chance to correct
1:00:14 it. Independence is the key because otherwise, when you don’t take those precautions, it’s like having a bunch of
1:00:23 witnesses to some crime and allowing those witnesses to talk to each other. They’re going to be less valuable if you’re
1:00:29 interested in the truth than keeping them rigidly separate and collecting what they have to say.
1:00:37 What have you seen work in a repeatable way? It may be a particular organization or across organizations
1:00:48 to not only reliably surface disconfirming evidence, but then place a value on what is surfaced instead of
1:00:52 being dismissive. Is there a framework for that? Is there?
1:00:58 Well, yeah. There are many, you know, there are many procedures like red team, blue team,
1:01:05 a devil’s advocate. I mean, there have been, you know, many attempts. In general, you know,
1:01:12 if you are the head of a group that makes decisions, one of your missions would be to protect the dissenters
1:01:21 because they’re very valuable and you should make it painless to dissent, or as painless as possible.
1:01:29 Well, it’s hard to dissent. It’s painful and costly. So protecting dissenters is important.
1:01:37 I’m curious about the distinction between intuition and judgment. You had mentioned intuition, judgment,
1:01:43 intuitive judgment. Can you walk me through some of like how those differ?
1:01:53 It’s a bit hard to separate them. Judgment is what you do when you integrate a lot of information informally
1:01:56 into a score of some kind.
1:02:03 I, we speak, we being my co-authors and I in the book we’re writing, we speak of judgment as
1:02:10 measurements. But it’s measurement where the measuring instrument is your mind. But you do it informally.
1:02:17 And because you do it informally, people are going to, are not necessarily going to agree. So wherever we
1:02:22 say it’s a matter for judgment, we’re allowing for differences, for variability.
1:02:32 Now, judgment can be more or less slow, more or less systematic. So at one end, you have pure intuition,
1:02:38 where you allow the judgment to go very quickly and so on. And at the other end, you try to delay
1:02:44 intuition. But ultimately, if you’re making it by judgment, you’re going to have a judgment and it’s
1:02:51 going to be like an intuition and you’re going to go with it. So however deliberate the judgment,
1:02:56 intuition is always involved at one point or another.
1:02:59 You’re either sort of like listening to it or fending it off?
1:03:03 Yeah. And our recommendation is fend it off.
1:03:07 Are there ways to judge the quality of somebody’s judgment?
1:03:12 Yeah, sure. I mean, some of them would be unique to the actual scenario,
1:03:15 but what are the sort of other ways that we could?
1:03:22 Well, I mean, you may require people to explain their judgments and evaluating the quality of the
1:03:29 explanation is, you know, whether it’s logical, whether it uses the evidence, whether it uses all
1:03:39 the evidence, whether it is strongly influenced by wishes, whether the conclusion was reached before
1:03:48 the judgment was supposedly made. You know, there are lots of ways for judgment to fail that can be
1:03:53 recognized. So it’s harder to recognize very good judgment, but it’s really easy to see, you know,
1:03:57 what goes wrong. And there are quite a few ways for judgment to go wrong.
1:04:04 And I think some of those ways are the cognitive biases, like overconfidence and sort of using
1:04:11 small or extrapolating from small sample sizes. And one of the interesting things that I’ve heard you
1:04:18 say in interviews before, so correct me if I’m off here, is that you’ve studied cognitive biases
1:04:23 effectively your whole life and you’re no better at avoiding them than anybody else.
1:04:27 Yeah, certainly. Not much better, no.
1:04:29 What hope do the rest of us have?
1:04:37 Not much. I mean, I think, you know, the quality of people’s judgment is affected
1:04:46 by education. So in general, you know, more educated people make better judgments, I think,
1:04:53 on average. But people deciding, “I’m going to make better judgments,” I don’t think that’s very hopeful.
1:05:01 I’m much more hopeful about organizations because organizations think more slowly and they have
1:05:08 procedures for thinking. And so you can control the procedures. Individual judgment is really hard to fix.
1:05:10 Not impossible.
1:05:16 One of the things that I see people do in response to cognitive biases and trying to account for them
1:05:23 is to sort of make a list of them, almost like a checklist, and then go through that checklist and
1:05:29 explain or rationalize why those things don’t apply in this situation. It also strikes me that the more
1:05:35 intelligent you are, the more stories you’d be able to conjure up about why you’re avoiding this.
1:05:42 I really think that’s not very hopeful because there are so many biases. And the biases work in
1:05:50 different directions anyway. So sometimes you can recognize a situation as one in which
1:05:59 you’re likely to be wrong in a particular way. So that’s like illusions. If you recognize a particular
1:06:05 pattern as something that gives rise to a visual illusion, then you don’t trust your eyes.
1:06:13 You know, you do something else. And the same thing happens when you recognize this is a situation where
1:06:22 I’m likely to make an error. So sometimes you can recognize the importance, for example, of what we’ve called
1:06:31 an anchor. So you’re going to negotiate a price with somebody. They start very high. And that has an effect.
1:06:39 So you know, or you should know, that the person who moves first in a negotiation has an advantage.
1:06:50 Because the first number changes everybody’s view of what is considered plausible. So it moves things
1:06:57 in that direction. That’s a phenomenon. People can learn that. And they can learn to resist it.
1:07:05 So when I was teaching negotiations, I would say, somebody does that to you, comes up with a number
1:07:14 that’s absurd. I would say, lose your temper. Make a scene. Say, I will not start the conversation from
1:07:20 that number. It’s an absurd number. I don’t want to. Let’s erase that number. So that’s something that,
1:07:27 you know, you know, you can improve if you recognize it. I think people are aware of the fact that you
1:07:34 shouldn’t make a decision about road safety within a short interval of a terrible accident.
1:07:42 So you should allow things to settle down and cool down. There is a more subtle error, and it’s harder
1:07:51 to fix. And that is that the best prediction, the best guess, is always less extreme than your impression,
1:07:56 your intuitive prediction. Intuitive prediction is, as we say, not regressive. It doesn’t recognize
1:08:04 regression to the mean. But statistics is statistics. And in statistics, things are less extreme.
1:08:11 Should I give you my favorite example of a bias? Yeah, please. Okay. I have been unable to think of a
1:08:19 better one. But the story is about Julie. That’s part of the story. That’s her name. She is a
1:08:25 graduating senior at university. And I’ll tell you one fact about her, that she read fluently when she
1:08:33 was four. And the question is, what is her GPA? And the interesting thing here is that everybody has a number. As soon as I
1:08:39 told you that thing, her number came to mind. Now, we know where that number came from. We really,
1:08:45 that’s one of the few things that I’m reasonably sure I understand perfectly. And this is that when you
1:08:53 hear she read fluently at age four, you get an impression of how smart she is, of how precocious
1:09:00 she was at age four. And you could put that in percentiles. You know, where did that put her on a
1:09:06 percentile for sort of aptitude, ability? And it’s high. It’s not, you know, if she had read fluently
1:09:13 at age two and a half, it would be more extreme. But age four is pretty high. So say at the 90th percentile.
1:09:23 And then the GPA that comes to your mind is around the 90th percentile in the distribution of GPA.
1:09:28 So you pick something, your prediction is as extreme as your impression.
1:09:40 And it’s idiotic statistically, completely stupid, because clearly the age at which a child learned
1:09:47 to read is not all that diagnostic with respect to GPA. So it’s better than nothing. If you didn’t
1:09:55 know anything, you would predict the mean GPA, whatever it is, 3.1, 3.2. Now, she’s bright,
1:10:04 so probably a little higher, but not 3.7. You don’t want to go that far. So that’s a bias. That’s
1:10:13 non-regressive prediction. And that’s very hard to resist. Sometimes I’m able to resist it,
1:10:19 but never when it’s important. You know, when I’m really involved in something, I don’t think about it,
1:10:25 but sometimes I will recognize, oh, you know, that’s the situation. I should moderate my prediction.
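As a worked illustration of the correction being described, here is a minimal sketch of a regressive prediction: start from the mean and move toward the impression-matched value only in proportion to how well the evidence predicts the outcome. The mean GPA and the correlation between early reading and GPA are invented numbers for illustration.

```python
# A minimal sketch of a regressive prediction, with made-up numbers.
# The intuitive (non-regressive) prediction matches the extremity of the
# impression; the regressive prediction shrinks it toward the mean in
# proportion to the (assumed) correlation between evidence and outcome.

MEAN_GPA = 3.2          # assumed mean GPA of graduating seniors
MATCHING_GPA = 3.7      # GPA as extreme as the "read fluently at four" impression
CORRELATION = 0.3       # assumed correlation between early reading and GPA

def regressive_prediction(matching: float, mean: float, correlation: float) -> float:
    """Move from the mean toward the impression-matched value,
    but only as far as the evidence justifies."""
    return mean + correlation * (matching - mean)

print(regressive_prediction(MATCHING_GPA, MEAN_GPA, CORRELATION))  # 3.35
```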
1:10:31 And if you’re conscious of it, that’s an example of one you can sort of talk yourself into.
1:10:38 Yeah. Yeah. You can talk yourself into. Although, you know, you usually will find a way to cheat
1:10:47 and end up with your intuition. It’s remarkable. You know, when you’ve been in academic life for a long
1:10:56 time, so you’ve been in many situations where people discuss a job candidate. And absurdities of that kind
1:11:05 are very common. So somebody, a job candidate gives a talk, and people evaluate the talk, and this is
1:11:11 something that happened, you know, at Berkeley when I was teaching there: somebody gave a talk. It wasn’t
1:11:20 a very good talk. Stammered a bit. Now, that person had teaching prizes, and yet what was said about him
1:11:28 in the discussion? He can’t teach. You know, we heard the talk. So that’s a mistake. But the funny
1:11:33 thing is you can point out to people that that’s a mistake. They still don’t want to hire him because
1:11:40 he gave a lousy talk. So it’s hard to resist. It’s interesting. I think one of the ways I probably
1:11:46 got my job is using psychology in the interview, which is asking why I was there, and then reinforcing
1:11:54 those beliefs throughout the interview. I want to come back just one second to the immediacy of sort
1:12:02 of having a stimulus and then making a decision. So we use the example of roads, and a tragic accident
1:12:08 happens, and you’re rethinking sort of policy or laws around the roads. How much of that do you think
1:12:16 is social pressure? And I’m wondering if we could extrapolate that a little more. We’re taught
1:12:22 to answer questions on a test right away, right? We see the question, then we answer it. And it’s
1:12:29 reinforced, taught is probably the wrong word, that politicians need to
1:12:35 have an immediate response, even if they know the best thing to do is, okay,
1:12:43 let this settle, take some time. Society writ large seems to demand it; the environment is
1:12:50 not conducive. I think it’s pretty clear that people prefer leaders who are intuitive and who are
1:13:00 overconfident. Leaders who deliberate too much are viewed with suspicion, you know. So I think Obama
1:13:05 was at a certain disadvantage relative to George Bush, you know.
1:13:07 Because he was seen as more deliberate.
1:13:12 Yeah, he was more deliberate. And then when you’re very deliberate, you look as if you don’t know what
1:13:20 you’re doing. But when you act with confidence, people want that. So people want leaders who are intuitive,
1:13:24 I think, very much so. Provided they agree with me.
1:13:28 I’m just working my way back through some of these rabbit holes that we’ve gone down. You taught
1:13:36 negotiations. I’m curious what would be in your sort of syllabus for negotiations that everybody should
1:13:41 learn about negotiations when it comes to your work in psychology.
1:13:49 Well, you know, that goes back to a theme that we started with, the essence of teaching
1:13:56 negotiations, that negotiations is not about trying to convince the other guy. It’s about trying to
1:14:01 understand them. So again, it’s slowing yourself down. It’s not doing what comes naturally,
1:14:11 because trying to convince them is applying pressure. Arguments, promises, and threats are all forms of
1:14:17 pressure. And what you really want is to understand, you know, what you can do to make it easy for
1:14:25 them to move your way. Very non-intuitive. That’s a surprising thing when you teach negotiation. It’s not
1:14:30 obvious. You know, we are taught to apply pressure. I mean, we’re socialized that way.
1:14:36 You mentioned that there were procedures for thinking in organizations. Are there any that
1:14:43 stand out in your mind that we could use to elevate thinking, and if not elevate,
1:14:49 but give feedback on the quality of thinking to improve it? Well, I think one of the ideas that
1:14:58 people like the most is an idea by Gary Klein, what he calls the premortem. And that’s the universal
1:15:03 winner. People really like that idea. And this is that when you’re about to make a decision,
1:15:10 or a group is about to, not quite, because if you’ve made it, it’s too late, but you’re approaching it. And then
1:15:19 you get people in a room, who can be the people who are making the decision. And you say, suppose it’s two
1:15:26 years from now. And we made the decision that we’re contemplating. And it turned out to be a disaster.
1:15:32 Now, you have a page in front of you, write the history of that disaster in bullets. That’s the
1:15:40 pre-mortem. And it’s beautiful as an idea. It’s beautiful because when people are coming close to a
1:15:49 decision, it becomes difficult to raise doubts or to raise questions. People who are slowing the group
1:15:56 down when the group is nearing a decision are, you know, annoying. You
1:16:03 know, you want to get rid of them. And the pre-mortem legitimizes that sort of dissent and that sort
1:16:11 of doubt. It not only legitimizes it, you know, it rewards it. And so that’s a very good idea. I don’t,
1:16:17 you know, I don’t think that it’s going to prevent people from making mistakes, big mistakes,
1:16:25 but it will certainly alert people to possible loopholes, to things that they ought to
1:16:30 do to make a safer decision. So that’s a good procedure. And there are many others.
1:16:37 What comes to mind is to make intelligence, I mean,
1:16:46 the collection of the information, independent of the decision-maker’s wishes. And you really want
1:16:54 to protect the independence of the people collecting the evidence. And I would add, you know, a procedure
1:17:01 that really people don’t like, but if it were possible to implement it, I think would be good.
1:17:09 And that’s, that when you’re going to be discussing a topic, and it’s known in advance and people are sent
1:17:16 material to think about the topic, you may want them to write down their decision, the decision they
1:17:24 are in favor of before the discussion starts. That has many advantages. It’s going to give you a broader
1:17:33 diversity of points of view, because people tend to converge very quickly in a, in a group discussion.
1:17:40 And it forces people to be better prepared. Except people don’t want this.
1:17:47 So I, I don’t know whether it’s even possible to implement it, but clearly, if you could,
1:17:49 it would be a good idea.
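Here is a minimal sketch of that pre-meeting procedure: each participant writes down the decision they favor before the discussion, and nobody sees the others’ positions until everyone has committed. The class and field names are illustrative, not an established tool.

```python
# A minimal sketch of the pre-meeting procedure described above: every
# participant writes down the decision they favor before the discussion,
# and nobody sees the others' positions until all have been submitted.
# The class and method names are illustrative, not an established tool.

class PreMeetingBallot:
    def __init__(self, participants: list[str]):
        self.participants = set(participants)
        self.positions: dict[str, str] = {}

    def submit(self, person: str, position: str) -> None:
        """Record one participant's independent written position."""
        if person not in self.participants:
            raise ValueError(f"{person} is not part of this decision")
        self.positions[person] = position

    def reveal(self) -> dict[str, str]:
        """Only show the positions once everyone has committed in writing."""
        missing = self.participants - set(self.positions)
        if missing:
            raise RuntimeError(f"waiting on written positions from: {sorted(missing)}")
        return dict(self.positions)

ballot = PreMeetingBallot(["chair", "cfo", "head_of_product"])
ballot.submit("chair", "invest, but cap it at a pilot")
ballot.submit("cfo", "decline; cash position too thin")
ballot.submit("head_of_product", "invest at full scale")
print(len(set(ballot.reveal().values())), "distinct positions entered the room")
```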
1:17:51 What are the reasons people don’t want it?
1:17:52 It’s too much work.
1:17:57 Right. It forces you to do a lot of work rather than the signaling you can sort of get away with.
1:18:02 Yeah. And then, you know, there’s somebody who is going to prepare the case, and everyone else just glances at
1:18:13 the material. And then, you know, a lot of meetings are a tremendous sink of wasted time.
1:18:17 And improving the quality of meetings would be a big thing.
1:18:19 Do you have any insights on how to do that?
1:18:26 Keeping them short. You know, I’m not a professional at fixing meetings. So I have,
1:18:35 I have a few ideas, but not a complete view. The question of structuring the meetings to
1:18:43 be discussing topics one at a time, that I think is, is really useful. I’ll give you an example. I mean,
1:18:48 it’s something that I suggested when I was consulting, but for some reason,
1:18:54 people didn’t buy that suggestion. So you, when an investment is being discussed,
1:19:01 say by an investment firm, some staff people, if it’s a big investment, staff people will prepare
1:19:11 a briefing book with chapters. Now, our recommendation would be that the staff should end each chapter with
1:19:19 a score. How does that chapter taken on its own independently of anything else affect the likely
1:19:26 decision? And then you could structure the meeting that discusses this, the meeting of the board,
1:19:33 say, to discuss these scores one at a time. That has the effect that I was talking about earlier,
1:19:40 making the decision, making the judgments about the dimensions. We call them mediating assessments,
1:19:47 in our jargon. The mediating assessments come first, and then you have the profile of them,
1:19:55 and then you make a global judgment. And you can structure it. So if the staff has presented a score,
1:20:03 and you discuss in the board, do we accept their score? You’re forcing people to have a look at
1:20:09 the evidence. And think about why they would accept or reject. And then they feel like they have to
1:20:16 construct an argument that might be less intuitive. That’s it. So, you know, there are ways of doing this,
1:20:24 but if you’re going to be too rigid about it, it won’t work either. I’m curious what other advice
1:20:29 you gave as a consultant that nobody followed. Oh, I mean, virtually all the advice I gave,
1:20:34 people don’t follow. I mean, you know,
1:20:40 you’re not going to be a consultant if you expect your advice to be taken. You have to give the best advice
1:20:46 you can. What would be other examples of something you think could be widely applicable that you would advise,
1:20:51 you would have advised people and you just sort of like saw them drop the ball?
1:20:58 Well, I mean, you know, I would advise people who make a lot of decisions to keep track of their
1:21:09 decisions and of how they turned out, so that later you can come back, evaluate your procedures, and see
1:21:15 whether there is anything in common among the decisions that turned out well and those that turned out
1:21:18 not so well, and so on. People hate doing this.
1:21:20 Why do you think people hate doing it?
1:21:27 Oh, because, because retrospectively, they may look foolish, some of them or all of them,
1:21:33 or in particular, the leader. So they really don’t like keeping track. I mean, there are exceptions.
1:21:43 Ray Dalio and his firm, where everything is explained. Bridgewater, yeah. Bridgewater. But in
1:21:51 general, in my experience, I haven’t consulted with Bridgewater, they don’t need me, but when I suggested
1:21:56 that, it never went anywhere. What are the variables that you would recommend people keep track of?
1:21:59 Like, what would your decision journal look like?
1:22:02 Oh, I mean, my decision journal would be a mess.
1:22:08 I don’t, I’m not putting myself as an example, but…
1:22:11 So obviously the outcome, but you’ve got to record that after the fact.
1:22:20 Yeah, but no, no. You, you would want to say, what were the main arguments, pro and con? What were
1:22:22 the alternatives that were considered?
1:22:29 You know, it doesn’t have to be very detailed, but it should be enough so that you can come later and
1:22:31 debrief yourself.
1:22:34 So do you have a calibration, like what degree of confidence you have?
1:22:40 That would be good. Then, you know, it would depend on something that you could evaluate later.
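A sketch of what a single entry in such a decision journal might contain, based on the elements mentioned here: the main arguments pro and con, the alternatives considered, a confidence level to calibrate against, and an outcome field filled in only at the later debrief. The field names and sample values are illustrative assumptions, not a prescribed template.

```python
# An illustrative decision-journal entry, assembled from the elements
# mentioned above: arguments pro and con, alternatives considered, a
# confidence level to calibrate against later, and an outcome field that
# is filled in only when you come back to debrief. Field names are assumptions.

from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class DecisionEntry:
    decided_on: date
    decision: str
    arguments_pro: list[str]
    arguments_con: list[str]
    alternatives_considered: list[str]
    confidence: float                     # e.g. 0.7 = "70% sure this works out"
    outcome: Optional[str] = None         # left empty; added at the debrief
    outcome_recorded_on: Optional[date] = None

    def debrief(self, outcome: str, when: date) -> None:
        """Record how it turned out, so the entry can be evaluated later."""
        self.outcome = outcome
        self.outcome_recorded_on = when

entry = DecisionEntry(
    decided_on=date(2019, 6, 1),
    decision="Acquire the smaller competitor",
    arguments_pro=["fills product gap", "their team is strong"],
    arguments_con=["integration risk", "stretches cash"],
    alternatives_considered=["build in-house", "partnership instead"],
    confidence=0.7,
)
entry.debrief("Integration took twice as long as planned", date(2021, 6, 1))
```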
1:22:49 It strikes me that decision journals and premortems are a way to identify people that are sort of,
1:22:55 perhaps, suppressed by their manager, where you have somebody who’s actually
1:23:01 better at exercising judgment than the person that, you know, they’re working for.
1:23:07 And this would be a pain-free sort of way to calibrate that score over time and identify
1:23:11 the quality of judgment in a consistent way.
1:23:16 Oh, yeah. I mean, that strikes me as worth a lot of money to an organization.
1:23:23 Yeah. But, but also very costly. And you, you will see that certainly anything that threatens the
1:23:30 leader is not going to be adopted. And leaders may not want something that threatens their
1:23:34 subordinates either. People are really very worried about embarrassment.
1:23:36 You’re writing a book now on noise.
1:23:37 Yeah.
1:23:40 Tell me about noise and decision-making. Can you explain the concept?
1:23:47 Yeah. I can really explain it by saying what, you know, was the beginning of it,
1:23:56 which was a consulting assignment in an insurance company, where I had the idea of running a test
1:24:05 to see whether people in a given role who were supposed to be interchangeable agreed with each other.
1:24:11 So, you know, when you come to an insurance company and an underwriter gives you a premium,
1:24:17 the underwriter speaks for the company. And so it’s, you expect that any underwriter,
1:24:23 that it doesn’t matter which underwriter you get to afford a premium. And the company has that
1:24:30 expectation. It shouldn’t make much difference. So we tested that. And they constructed some cases.
1:24:35 And then we had some like 50 underwriters assess a premium for the case.
1:24:37 With the same information.
1:24:37 Hmm?
1:24:43 Yeah. With a really very realistic case. We didn’t construct it. They constructed the case.
1:24:48 So, and they conducted the experiment. But now, the interesting question is,
1:24:56 how much variation do you expect there to be? So, we asked the executives the following question.
1:25:01 Suppose you take two underwriters at random. By what percentage do they differ? I mean,
1:25:06 you look at the difference between their premiums, divide that by the average premium. What number do you
1:25:14 get? And people expect 10%. By the way, it’s not only the executives in that company. For some reason,
1:25:22 people expect 10%. And it was roughly 50%, 5-0. So that’s, you know, that’s what made me curious about
1:25:28 noise. That and the fact that the company was completely unaware that it had the noise. It took
1:25:35 them completely by surprise. So now we’re writing a book, because there’s a lot of noise. Our
1:25:43 rule is that wherever there is judgment, there is noise. And more of it than you think. So that’s the
1:25:44 pattern.
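To make the arithmetic of the noise audit concrete, here is a minimal sketch of the measure described above: for two judgments of the same case, take the absolute difference divided by their average, and average that over all pairs of underwriters. The premiums below are invented; in the study the executives expected something near 10% and the observed figure was roughly 50%.

```python
# A minimal sketch of the noise measure described above: for two judgments
# of the same case, |a - b| / mean(a, b), averaged over all pairs of
# underwriters. The premium figures are invented for illustration.

from itertools import combinations
from statistics import mean

def pairwise_noise(premiums: list[float]) -> float:
    """Average relative difference between two randomly chosen underwriters."""
    diffs = [abs(a - b) / mean([a, b]) for a, b in combinations(premiums, 2)]
    return mean(diffs)

# Five (invented) quotes for the same case, from supposedly interchangeable underwriters.
quotes = [9_500, 13_000, 16_500, 21_000, 25_500]
print(f"{pairwise_noise(quotes):.0%}")  # roughly 50%, not the ~10% people expect
```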
1:25:50 Are there procedures to reduce noise? And conversely, it strikes me that the variation would be good,
1:25:52 but maybe only in an evolutionary context.
1:26:00 Well, we say that noise is useless variability. I mean, variability can be very useful if you have
1:26:07 a selection mechanism and some feedback. So evolution is built on variability, and of course,
1:26:13 there it’s useful. But noise among underwriters is useless. There’s nothing. Nothing
1:26:22 gets learned. There’s no feedback. It’s just noise. And it’s costly. The first advice, of course, would be
1:26:30 algorithms, as I said earlier. So algorithms are better than people, better than judgment. That’s not intuitive,
1:26:40 but it’s really true. And after that, then the procedure that I mentioned earlier for making
1:26:46 decisions in an orderly way by breaking it up into assessments. That’s the best that we can do.
1:26:54 And there is one very important aspect that I haven’t mentioned. And this is training people in what the
1:27:02 scale is. So there is one piece of advice that you’d have for underwriters, that they should always
1:27:12 compare the case to other cases. And if possible, if you can have them share the same frame of
1:27:17 reference with other underwriters, you’re going to cut down on the noise.
1:27:19 Oh, that’s a clever idea, yeah.
1:27:30 And that exists in human resources, where performance evaluation is one of the scandals of modern
1:27:36 commerce, it’s so difficult. But in performance evaluation, they have a thing that’s called
1:27:44 frame-of-reference training, which is teaching people how to use the scale. There’s a lot of variability in
1:27:52 how people use the scale. And part of what the superforecasters do, they make judgments in probability units,
1:28:00 and they are taught to use the probability scale. So learning the scale is a very important aspect of
1:28:01 reducing noise.
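One way to picture what a shared frame of reference could look like in practice is sketched below: rather than quoting a raw number, a judge places the new case among agreed reference cases and reads the quote off the shared anchors. The reference cases and the interpolation rule are assumptions for illustration, not an actual underwriting or HR procedure.

```python
# One illustrative way to give judges a shared frame of reference: instead
# of quoting a raw number, place the new case among agreed reference cases
# and read the quote off the shared anchors. The reference cases and the
# interpolation rule are assumptions, not an actual underwriting procedure.

from bisect import bisect_left

# Shared anchors: (agreed severity score, agreed premium) for reference cases.
REFERENCE_CASES = [(1, 5_000), (2, 9_000), (3, 15_000), (4, 24_000), (5, 40_000)]

def quote_from_reference(severity: float) -> float:
    """Interpolate a premium from the shared reference cases."""
    scores = [s for s, _ in REFERENCE_CASES]
    premiums = [p for _, p in REFERENCE_CASES]
    if severity <= scores[0]:
        return premiums[0]
    if severity >= scores[-1]:
        return premiums[-1]
    i = bisect_left(scores, severity)
    lo_s, hi_s = scores[i - 1], scores[i]
    lo_p, hi_p = premiums[i - 1], premiums[i]
    return lo_p + (hi_p - lo_p) * (severity - lo_s) / (hi_s - lo_s)

# Two underwriters who judge the same case a 3.2 and a 3.4 in severity now
# land on nearby premiums, instead of freely chosen numbers 50% apart.
print(quote_from_reference(3.2), quote_from_reference(3.4))  # 16800.0 18600.0
```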
1:28:07 I know we’re coming up to the end of our time here. What have you changed your mind on in the past 10 years?
1:28:08 Oh, a lot.
1:28:09 Anything big?
1:28:17 Yeah. There’s been a replication crisis in psychology. And some of the stuff that I really
1:28:24 believed in, when I wrote “Thinking, Fast and Slow,” some of the evidence has been discredited. So I’ve had to
1:28:25 change my mind.
1:28:27 What are the, what’s the biggest?
1:28:35 Some of the sexiest stuff, priming and unconscious priming, just hasn’t held up in replication.
1:28:43 And I believed it, and I wrote it as if it were true, because the evidence suggested it. And in
1:28:52 fact, I thought that you had to accept it, because that was published evidence. And I should have,
1:28:58 I blame myself for having been a bit gullible. That is, I should have known that you can publish
1:29:05 things, even if they’re not true. But I just didn’t think that through. So I changed my mind.
1:29:13 I’m now much more cautious about spectacular findings. I mean, very recently, I’ve come,
1:29:21 I think I have a theory about why psychologists, or social scientists generally, are prone
1:29:29 to exaggerate, to be overconfident about their hypotheses. So I’ve done quite a bit of learning.
1:29:30 What’s the theory?
1:29:39 Well, the theory, the one element of the theory is that all these hypotheses are true. In what sense?
1:29:46 That, you know, there’s a famous study where you mention wrinkles to people, and then
1:29:52 you measure the speed at which they walk, and they walk more slowly. Turns out that hasn’t held up in
1:29:59 replication, which is very painful. It’s one of the favorite studies. But actually, you know, that if
1:30:04 you mention wrinkles, and it has any effect on the speed of walking,
1:30:09 it’s not going to make people faster. If it has any influence, it’s going to make them slower. So
1:30:19 directionally, all these hypotheses are true. But what people don’t see is the
1:30:28 huge number of factors that determine the speed at which individuals walk, and the differences in the
1:30:36 speed of walking between individuals. And that’s noise. And people neglect noise. And then there is
1:30:43 something else, which touches on both philosophy and psychology. When you have intuitions about
1:30:51 things, there are clear intuitions, and there are strong intuitions. They’re not the same thing. So a clear intuition is if I
1:31:00 offer you a trip to Rome, or a trip to Rome and an ice cream cone, you know what you prefer. It’s easy. But it’s very
1:31:06 weak, of course. I mean, the extra amount of money you would pay to get a trip to Rome and an ice cream cone over just a trip to Rome,
1:31:15 nothing. But when you are a philosopher, and I should add one thing, to see the clear intuitions,
1:31:20 you have to be in this kind of situation that psychologists call within-subject. That you
1:31:28 have both. You have both with the ice cream cone and without the ice cream cone. So in a within-subject
1:31:34 situation, that’s an easy problem. In a between-subject situation, it’s an impossible problem. But now,
1:31:40 if you’re a philosopher, you’re always in a within-subject situation. But people live in a
1:31:46 between-subject situation. They live, you know, in one condition. And the same thing is true for
1:31:55 psychologists. So when psychologists cook up their hypotheses, they’re in a within-subject
1:32:01 situation. But then they make guesses about what will happen between subjects. And they’re completely
1:32:09 lost between clear intuitions and strong intuitions. We have no way of calibrating ourselves. So that makes
1:32:17 us wildly overconfident about what we know and reluctant to accept that we may be wrong.
1:32:23 That’s a great place to end this conversation, Danny. Thank you so much.
1:32:28 Thanks for listening and learning with us. Be sure to sign up for my free weekly newsletter at
1:32:34 fs.blog/newsletter. The Farnam Street website is also where you can get more info on our membership
1:32:40 program, which includes access to episode transcripts, my repository, ad-free episodes,
1:32:46 and more. Follow myself and Farnam Street on X, Instagram, and LinkedIn to stay in the loop. Plus,
1:32:50 you can watch full episodes on our YouTube channel. If you like what we’re doing here,
1:32:54 leaving a rating and review would mean the world. And if you really like us,
1:32:58 sharing with a friend is the best way to grow this community. Until next time.

Daniel Kahneman won the Nobel Prize for proving we’re not as rational as we think. In this timeless conversation we discuss how to think clearly in a world full of noise, the invisible forces that cloud our judgement, and why more information doesn’t equal better thinking. Kahneman also reveals the mental model he discovered at 22 that still guides elite teams today. 

Approximate timestamps: 

(00:36) – Episode Introduction  

(05:37) – Daniel Kahneman on Childhood and Early Psychology  

(12:44) – Influences and Career Path  

(15:32) – Working with Amos Tversky  

(17:20) – Happiness vs. Life Satisfaction  

(21:04) – Changing Behavior: Myths and Realities  

(24:38) – Psychological Forces Behind Behavior  

(28:02) – Understanding Motivation and Situational Forces  

(30:45) – Situational Awareness and Clear Thinking  

(34:11) – Intuition, Judgment, and Algorithms  

(39:33) – Improving Decision-Making with Structured Processes  

(43:26) – Organizational Thinking and Dissent  

(46:00) – Judgment Quality and Biases  

(50:12) – Teaching Negotiation Through Understanding  

(52:14) – Procedures That Elevate Group Thinking  

(55:30) – Recording and Reviewing Decisions  

(57:58) – The Concept of Noise in Decision-Making  

(01:01:14) – Reducing Noise and Improving Accuracy  

(01:04:09) – Replication Crisis and Changing Beliefs  

(01:08:21) – Why Psychologists Overestimate Their Hypotheses  

(01:12:20) – Closing Thoughts and Gratitude

Thanks to MINT MOBILE for sponsoring this episode: Get this new customer offer and your 3-month Unlimited wireless plan for just 15 bucks a month at MINTMOBILE.com/KNOWLEDGEPROJECT.

Newsletter – The Brain Food newsletter delivers actionable insights and thoughtful ideas every Sunday. It takes 5 minutes to read, and it’s completely free. Learn more and sign up at ⁠⁠⁠⁠⁠⁠fs.blog/newsletter⁠⁠⁠⁠⁠⁠

Upgrade — If you want to hear my thoughts and reflections at the end of the episode, join our membership: ⁠⁠⁠⁠⁠⁠⁠⁠⁠⁠⁠⁠⁠fs.blog/membership⁠⁠⁠⁠⁠⁠⁠⁠ and get your own private feed.

Watch on YouTube: ⁠⁠⁠⁠⁠⁠@tkppodcast

Photograph: Richard Saker/The Guardian

Learn more about your ad choices. Visit megaphone.fm/adchoices
