The Future of AI and How It Will Shape Our World — with Mo Gawdat

AI transcript
0:00:03 Support for Prop G comes from Vuori.
0:00:05 Oh my God, true story.
0:00:08 I am wearing totally coincidentally,
0:00:09 guess what, Vuori shorts.
0:00:11 Vuori’s high-quality gym clothes
0:00:14 are made to be versatile and stand the test of time.
0:00:17 They sent me some to try out and here I am.
0:00:20 For our listeners, Vuori is offering 20% off
0:00:23 your first purchase, plus free shipping
0:00:26 on any US orders over $75 and free returns.
0:00:27 Get yourself some of the most comfortable
0:00:30 and versatile clothing on the planet,
0:00:35 Vuori.com/PropG, that’s V-U-O-R-I.com/PropG.
0:00:38 Exclusions apply, visit the website
0:00:40 for full terms and conditions.
0:00:47 – This is a Reese’s Peanut Butter Cup sound experiment.
0:00:50 We’re looking to find the perfect way to hear Reese’s,
0:00:52 so you’ll buy more of them.
0:00:53 Here we go.
0:00:55 Reese’s, Reese’s.
0:00:57 – Reese’s.
0:00:59 – Reese’s.
0:01:00 – Reese’s.
0:01:02 – Get out of here, you little stinker.
0:01:03 – Reese’s.
0:01:06 – Reese’s.
0:01:09 – Reese’s.
0:01:11 – Peanut Butter Cups.
0:01:14 – That breathy one sounded very creepy, am I right?
0:01:17 – This isn’t your grandpa’s finance podcast.
0:01:19 It’s Vivian Tu, your rich BFF
0:01:21 and host of the “Networth and Chill” podcast.
0:01:23 This is money talk that’s actually fun,
0:01:26 actually relatable, and will actually make you money.
0:01:28 I’m breaking down investments, side hustles,
0:01:30 and wealth strategies, no boring spreadsheets,
0:01:32 just real talk that’ll have you leveling up
0:01:33 your financial game.
0:01:35 With amazing guests like Glenda Baker.
0:01:37 – There’s never been any house that I’ve sold
0:01:40 in the last 32 years that’s not worth more today
0:01:41 than it was the day that I sold it.
0:01:42 – This is a money podcast
0:01:44 that you’ll actually want to listen to.
0:01:47 Follow “Networth and Chill” wherever you listen to podcasts.
0:01:49 Your bank account will thank you later.
0:01:54 – Episode 335.
0:01:56 In 1935, the first canned beer was sold
0:01:57 in Richmond, Virginia,
0:01:59 and Alcoholics Anonymous was founded.
0:02:00 I really enjoy AA.
0:02:02 It’s the only place where I can pee in an Uber
0:02:06 and then show up and tell other guys who’ve peed in an Uber
0:02:07 that it gets better.
0:02:09 Go, go, go!
0:02:22 Welcome to the 335th episode of “The Prop G Pod.”
0:02:24 In today’s episode, we speak with Mo Gawdat,
0:02:27 the former Chief Business Officer of Google X,
0:02:29 bestselling author, founder of One Billion Happy
0:02:33 and host of “Slo Mo,” a podcast with Mo Gawdat.
0:02:35 We discuss with Mo how AI could shape our lives
0:02:37 in the coming decades, the opportunities it brings,
0:02:39 and the risks it poses to society,
0:02:41 ethics, and mental health.
0:02:42 We also get into his latest book,
0:02:46 “Unstressable, A Practical Guide to Stress-Free Living.”
0:02:47 Yeah, that’s gonna happen.
0:02:49 Yeah, I’m gonna read a book and all of a sudden,
0:02:52 Mr. Stress is gonna leave the neighborhood.
0:02:54 Call me, call me cynical.
0:02:55 Color me a bit skeptical.
0:02:57 What’s going on with the dog?
0:02:58 What’s going on with the dog?
0:03:03 So I am in New York after a stop in Orlando,
0:03:05 where I went for a speaking gig.
0:03:07 I have absolutely no sense of Orlando,
0:03:08 other than Disney World,
0:03:12 which is the seventh circle of hell for parents.
0:03:13 Essentially, I do almost no parenting,
0:03:17 364 days a year, and I compensate for all of it
0:03:19 by agreeing to take my boys and their
0:03:23 five or six closest friends to Walt Disney World,
0:03:25 which is just, I mean,
0:03:28 that is cruel and unusual punishment for a parent.
0:03:30 But anyways, not doing it this time,
0:03:32 just bombing in, speaking to a lovely group of people,
0:03:34 then getting back on a plane and going up to New York,
0:03:37 where I spent four days with the team and did a bunch of,
0:03:39 I find New York, I get so much done in New York.
0:03:40 There’s something about, I don’t know,
0:03:43 everyone just seems to be on high, if you will.
0:03:44 By the way, it’s fascinating.
0:03:46 All these members clubs are opening.
0:03:49 In the last couple of years, there’s been Zero Bond,
0:03:53 my favorite, Casa Cipriani, downtown, weird location.
0:03:55 They put a ton of money into it, has that Italian vibe.
0:03:57 I get the sense it’s trust-fund kids from New Jersey,
0:03:59 but that’s just me.
0:04:03 And then what else has opened up?
0:04:05 San Vicente Bungalows is opening up from Los Angeles,
0:04:07 so everyone assumes it’s gonna be cool,
0:04:08 and I’m excited about that.
0:04:12 The Crane Club, which is the guys from the Tao Group,
0:04:15 who are probably the most successful nightclub operators.
0:04:19 Pretty much a giant fucking red flag is when you find out
0:04:22 that your daughter’s dating a club promoter,
0:04:27 but these guys made good and made so much dough and cabbage
0:04:29 and really kind of professionalized the industry,
0:04:31 if you will, and they’re the folks,
0:04:33 or the power behind Crane Club,
0:04:34 so it should be interesting.
0:04:36 And then I went to another one last week,
0:04:39 and it’s my favorite so far based on my snap impressions.
0:04:44 Chez Margaux, ooh, hello, hello ladies.
0:04:46 I don’t know exactly how to describe it,
0:04:47 other than the thing that struck me
0:04:49 was it was super cool, super crowded,
0:04:53 and the thing I liked about it was it was intergenerational.
0:04:54 What do I mean by that?
0:04:56 There was a lot of young, hot people.
0:04:57 Oh, it was a good thing.
0:04:57 Oh, it was a good thing.
0:05:00 New York, by the way, is run on hot women,
0:05:02 hot young women, and rich men.
0:05:03 That’s it for everyone else.
0:05:04 It’s a soul-crushing experience.
0:05:07 Anyways, and then it had people my age,
0:05:10 and then it had parents eating and dining,
0:05:12 and I loved that whole sort of like,
0:05:13 we can be cool at any age,
0:05:15 which is becoming increasingly important to me
0:05:18 as I become a hundred fucking years old.
0:05:19 Anyways, I love being back in New York.
0:05:20 New York’s on fire.
0:05:23 Still think it’s the greatest city in the world,
0:05:24 and am excited to be here.
0:05:27 I’m also gonna talk to Mo about, specifically,
0:05:30 I think there’s a paradigm shift going on in AI.
0:05:33 Little bit of a teaser.
0:05:34 Little bit of a teaser.
0:05:36 I’m like those promos for all those YouTube videos
0:05:39 that say the secret to happiness is,
0:05:41 and then they cut out.
0:05:44 But we’re gonna talk to Mo about what I think is,
0:05:46 I think I had a realization around
0:05:49 how the whole AI economy might shift.
0:05:52 Anyways, with that, here’s our conversation with Mo Gawdat.
0:05:57 Mo, where does this podcast find you?
0:05:58 – I’m in Dubai today.
0:06:03 I am battling with the surprises of February so far.
0:06:08 And yeah, enjoying every bit of it.
0:06:09 – Well, let’s start there.
0:06:13 Surprises of February, what are surprises in February
0:06:16 from an individual such as yourself in Dubai?
0:06:20 – Dubai is wonderful in February,
0:06:22 but we occasionally remember last year
0:06:27 we had this incredible flood that was really, really quite,
0:06:30 and just a couple of days ago,
0:06:34 we had a bit of rain that sort of like triggered
0:06:38 the same fears, but of course the real surprises
0:06:40 were DeepSeek and the responses in the market
0:06:45 and how the world, I feel overreacted a bit
0:06:48 and then underreacted a bit and life.
0:06:51 – Life, there you go, I hear you.
0:06:53 So let’s press right into it.
0:06:56 The last time you were on, I think it was about a year ago,
0:06:58 and you’re sort of our go-to,
0:07:02 this mix of spirituality and deep technical domain expertise.
0:07:05 And we were talking, as you might guess,
0:07:07 our need to control the response to AI.
0:07:10 – Give us what you think the kind of current state of play
0:07:13 is in AI given some of the recent developments
0:07:16 and how that may have influenced or did it influence
0:07:20 your world view or your predictions around
0:07:22 or thoughts around the future of AI?
0:07:24 – You have to imagine that the short history
0:07:27 of what I normally refer to as the third era of computing,
0:07:31 basically the two years between the time
0:07:33 when ChatGPT came out and today.
0:07:37 That short history was a pace that humanity
0:07:41 has never, ever seen before, I think.
0:07:45 You’ve seen what I used to refer to as the first inevitable,
0:07:48 where basically everyone is in a prisoner’s dilemma,
0:07:49 don’t want the other side to win,
0:07:53 so everyone’s competing, throwing everything on it,
0:07:54 you know, at it.
0:07:57 And basically you’d get releases of new technology
0:08:00 that are sometimes separated by weeks,
0:08:03 if not months at most.
0:08:05 And I think what most people don’t recognize
0:08:09 is that at least within the areas where we invested,
0:08:12 we have made massive strides on tech.
0:08:17 So when it comes to the march to AGI, if you want,
0:08:23 which I think humanity will continue to disagree about
0:08:25 for a while because we don’t really have a definition,
0:08:29 an accurate definition of AGI, you know,
0:08:33 that march is still steady and very, very fast, right?
0:08:34 So we’re gonna get there,
0:08:38 my prediction is we almost have already gotten there.
0:08:40 And that, you know, when it comes to linguistic intelligence,
0:08:43 they’ve won, they’re winning in mathematics,
0:08:46 they’re winning in reasoning, you know,
0:08:49 and everything we will pour resources on,
0:08:52 they will get to become better than humans.
0:08:54 So it’s just a question of time, really.
0:08:57 The part that hasn’t changed in my mind, Scott,
0:09:00 which now I think is very, very firm
0:09:03 and much more accurate if you want,
0:09:08 is that the impact on humanity in the short term
0:09:10 is gonna be dystopian.
0:09:14 And that has nothing to do with the existential risk
0:09:16 that people speak about with AI.
0:09:19 It has a lot to do with the value set of humanity
0:09:22 at the age of the rise of the machines.
0:09:27 Basically, unelected, influential powers
0:09:30 making decisions on humanity’s behalf
0:09:34 in ways that completely determines how things happen,
0:09:39 leading to massive changes in the very fabric of society
0:09:42 and basically playing to an agenda
0:09:44 where I tend to believe we will end up
0:09:47 with very few big platform players
0:09:49 completely in bed with governments,
0:09:54 completely, you know, feeding on hunger for power,
0:09:58 hunger for wealth and sort of depriving the rest of us
0:10:01 of the freedom to live the life we live.
0:10:05 I can sort of, so I summarized this in an acronym
0:10:08 that I made, seven changes to our way of life.
0:10:10 I call them FACE RIPs,
0:10:13 and we can go into them in detail if you want.
0:10:16 But basically, I see this as inevitable.
0:10:20 I see that the short term dystopia
0:10:22 is going to be upon us very, very soon
0:10:25 just because the massive superpower
0:10:29 that is at the disposal of agendas
0:10:31 is going to be in play very, very quickly.
0:10:35 – You said unelected officials that are reshaping society.
0:10:37 Are you talking about Sam Altman, Elon Musk?
0:10:39 Who are you referring to?
0:10:41 – 100%, I mean, with all due respect,
0:10:44 why is my life being determined by Sam Altman?
0:10:48 We all had an accord, unwritten rule if you want,
0:10:51 that we won’t put AI out in the public sphere
0:10:56 until we feel that we’ve tackled safety or alignment
0:11:02 or ethics, if you want, all wonderful dreams to have.
0:11:04 Sam Altman, very soft-spoken,
0:11:05 comes out every now and then and says,
0:11:08 “This is the priority of what we believe in,
0:11:10 “but in reality, it’s a publicly traded company
0:11:14 “creating billionaires, everyone’s rushing very, very quickly.
0:11:16 “Yeah, it’s all about the money.”
0:11:19 And you and I have lived in the tech world long enough
0:11:24 to understand that all you need is a very clever PR manager
0:11:29 to craft a message that is almost exactly
0:11:32 the opposite of what you focus on every day.
0:11:35 But you say it over and over until you yourself believe it.
0:11:39 The truth is the world is not ready
0:11:40 for what is about to hit us.
0:11:44 Whether you take the simple things like the economics
0:11:47 of the world and how they will change as a result of AI
0:11:49 all the way to the change of the dynamics of power
0:11:54 and the resulting deprivation of freedom,
0:11:58 all the way to how the economics of the world
0:12:01 are gonna change and how the jobs are gonna change
0:12:03 and how the human connection is gonna change
0:12:06 and how our understanding of reality is gonna change.
0:12:09 And these are decisions that are not made by us anymore.
0:12:12 And think about it this way,
0:12:16 Spider-Man’s “with great power comes great responsibility.”
0:12:18 We’ve disconnected power from responsibility.
0:12:21 There is massive, massive power concentration
0:12:25 concentrated in hands that do not answer to anyone.
0:12:27 – So I 100% agree with you.
0:12:31 The idea that everything from which buildings
0:12:34 these targeted bombs bomb first,
0:12:38 to our perception of our government, to election strategies,
0:12:40 all of these things are now being decided
0:12:43 by algorithms programmed by a very small number of people.
0:12:45 That creates I think a lot of concern.
0:12:50 The steelman argument is that if we don’t iterate
0:12:52 around the public’s usage of these things,
0:12:56 that other entities will leap ahead of us
0:12:59 and their intentions are even more malicious than ours,
0:13:03 that while capitalism perverts things at its heart,
0:13:04 it’s not malicious.
0:13:06 It might be indifferent, but it’s not malicious.
0:13:11 And the fear is that if we let other entities
0:13:13 run unfettered with AI in the sense
0:13:17 that it becomes the wild west and the public provides feedback
0:13:20 and these models leap out ahead of ours,
0:13:23 that ultimately the trade-off between a capitalist motive
0:13:28 is worth it relative to letting other societies
0:13:31 get out ahead of AI, respond to that argument.
0:13:33 – I find that this is a very valid argument
0:13:35 if you think of the short term,
0:13:37 if you think of the long term,
0:13:39 it could lead to a very dystopian place.
0:13:41 So allow me to explain.
0:13:45 A competitive race, arms race,
0:13:49 that basically says if I don’t build a nuclear bomb first,
0:13:51 someone else will build it,
0:13:54 does not necessarily lead to a world
0:13:56 where you’re the only one that owns a nuclear bomb.
0:13:59 As a matter of fact, it leads to a world
0:14:01 that has more than one owner of nuclear bombs.
0:14:04 And I think what you saw from DeepSeek, for example,
0:14:09 is a very interesting result that comes out of,
0:14:12 okay, we’re gonna consider this a war,
0:14:14 we’re gonna compete against the other people,
0:14:16 we’re gonna apply sanctions,
0:14:20 we’re gonna try to limit their ability to progress
0:14:22 and what do they do as a result?
0:14:24 Necessity is the mother of invention,
0:14:27 and so basically they find ways to do things differently.
0:14:31 Now, when you really look at the idea
0:14:33 of testing things in public,
0:14:35 which is an argument that’s used very frequently
0:14:39 by OpenAI, I think the analogy almost sounds like,
0:14:44 let’s test Trinity in Manhattan, not in New Mexico,
0:14:48 just to see how it impacts humanity, right?
0:14:50 That’s not how you do things.
0:14:52 The way you do things is when you are uncertain
0:14:56 of the outcome, you normally can test it
0:14:58 in ways that are much more contained.
0:15:02 But this genie is out of the bottle long ago
0:15:04 because the truth of the matter is that everyone
0:15:06 is racing already.
0:15:08 The other outcome, believe it or not,
0:15:10 and I say that with a ton of respect,
0:15:16 is that, yes, the US might lead the arms race
0:15:19 or China, you’ll never really know.
0:15:22 It might be OpenAI or it might be Alphabet,
0:15:24 you’ll never really know.
0:15:29 But the problem with that is that a unipolar world
0:15:32 with such concentrated power is not a fair world,
0:15:33 either.
0:15:35 It’s not a fair world to the world,
0:15:38 but it’s also not a fair world to most Americans.
0:15:41 And I think that’s what most people don’t recognize
0:15:45 is that you eventually sooner or later,
0:15:47 as more and more power is concentrated
0:15:50 in the hands of very, very few,
0:15:54 which is the only way the US can beat China
0:15:56 if you want in this technology.
0:16:01 Those very few eventually will turn on the American citizen
0:16:03 and say, you know what, you’re not really bringing
0:16:06 any productivity, we care about maximizing the same target
0:16:09 we’ve been chasing so far, more power, more wealth,
0:16:10 and you’re standing in the way.
0:16:14 And I think you can see that in the American society
0:16:18 very, very clearly today before AI takes over.
0:16:21 The only answer in my view, believe it or not,
0:16:24 which I know sounds really idealistic if you want,
0:16:28 is a mutually assured destruction conviction
0:16:33 is that we both understand, by both I mean
0:16:36 every two arch enemies on both sides,
0:16:39 that we are shifting the mindset
0:16:44 and the existence of humanity from a world of scarcity,
0:16:47 where for me to feel safe,
0:16:49 I have to be stronger than the other person,
0:16:52 where for me to gain economically,
0:16:54 I have to compete with the other person
0:16:57 to a world of total abundance.
0:17:01 I mean, we spoke about this last time we met, Scott,
0:17:05 and my definition of the current age of AI
0:17:09 is what I call the intelligence augmentation.
0:17:12 So we’re now augmenting human intelligence
0:17:14 with machine intelligence in ways
0:17:19 where if I can lend you 250 IQ points more,
0:17:21 imagine what you can invent, right?
0:17:24 And I say that publicly all the time,
0:17:28 I dare the world, I say give me 400 IQ points more
0:17:30 and I will harness energy out of thin air.
0:17:33 So why are we competing if that’s the possibility
0:17:38 ahead of us when the competition drives us
0:17:41 to a point of absolute, mutually assured destruction?
0:17:45 – So when we talk about
0:17:47 mutually assured destruction,
0:17:48 the two entities
0:17:50 that would have to come to some sort of agreement
0:17:51 around regulation or a pause
0:17:54 would be the US and China.
0:17:55 And I’m sure there’s other entities,
0:17:58 but those are the lead dogs, right?
0:18:02 Do you think it’s realistic that the Chinese
0:18:04 would be sympathetic to this argument
0:18:05 and that there’s enough mutual trust to say,
0:18:08 “Look, we gotta, I don’t wanna say slow down,
0:18:12 but put some of this behind wraps, share with each other.”
0:18:14 I mean, this was sort of Oppenheimer’s,
0:18:15 was it Oppenheimer’s initial vision
0:18:18 that we share this technology and say,
0:18:21 “Okay, when one gets too far out ahead of the other,
0:18:25 that’s a problem, we need to control it together
0:18:28 and realize that if one gets too far out in front of the other,
0:18:30 the temptation to destroy the other is too great,
0:18:32 at which point that person will destroy it,
0:18:34 we’ll make sure they can strike back
0:18:35 in some limited fashion.”
0:18:37 Do you think it’s realistic?
0:18:40 And maybe realistic or not, it’s something we’ve got to do,
0:18:42 that we try and strike some sort of treaty
0:18:44 with the CCP on China?
0:18:47 – It’s not realistic in the current political environment.
0:18:50 Unfortunately, the current geopolitics of the world
0:18:52 is heating up more and more,
0:18:55 but it wasn’t realistic in the case of Russia
0:18:56 and nuclear weapons either.
0:19:01 By the way, I am not for slowing down at all.
0:19:03 I’m actually for speeding up all the way,
0:19:06 but speeding up in the direction that is not competitive,
0:19:09 but rather for the prosperity of the whole world.
0:19:12 I mean, at the end of the day, Scott, again,
0:19:14 give me 400 IQ points more
0:19:17 and I’ll solve every problem known to humankind.
0:19:21 And this is quite straightforward, really.
0:19:23 You and I have both worked with incredibly smart people
0:19:25 and you understand what the difference
0:19:28 of 100 IQ points means, right?
0:19:29 Give me better reasoning, better mathematics,
0:19:31 better understanding of physics,
0:19:34 and I can do things that humanity never dreamt of.
0:19:37 And this is a promised utopia
0:19:40 that is at our fingertips.
0:19:41 So I’m not saying slow down.
0:19:46 I’m simply saying there is no point to compete.
0:19:48 The issue that is facing our world
0:19:51 is not a problem of technology that’s moving too fast.
0:19:54 Technology has always been good for us, right?
0:19:56 It’s a problem of trust
0:19:58 that if the other guy gets the technology before me,
0:20:00 I’m in trouble.
0:20:03 And that trust is not established in the lab,
0:20:05 it’s not established in the data center.
0:20:08 It’s basically established with the realization
0:20:12 that we can create a world of absolute total abundance,
0:20:13 total abundance.
0:20:18 I could know every piece of knowledge that ever existed.
0:20:21 I know you well, Scott.
0:20:24 I know how big of a dream this is for people like you
0:20:26 and I all love to learn, right?
0:20:29 And I can use that knowledge in ways
0:20:31 that will make me richer.
0:20:34 But how many Ferraris does anyone need?
0:20:36 I think this is the challenge we have.
0:20:39 The challenge is, you know, the founders.
0:20:41 By the way, I don’t believe this is a question of money
0:20:43 for the founders of AI startups.
0:20:48 I think this is a question of ego rather than greed.
0:20:49 I’m the one that figured it out first.
0:20:52 I’m the one that, you know,
0:20:55 provided this amazing breakthrough to humanity.
0:20:59 But if you look back just 150 years
0:21:01 at the King or Queen of England,
0:21:06 they had a much worse life than what anyone today has.
0:21:09 You know, anyone in any reasonable city in the US today
0:21:11 has air conditioning, has transportation,
0:21:14 has clean water, has hot water, has sanitation.
0:21:17 So we’re getting to the point where more
0:21:20 doesn’t actually make any difference anymore.
0:21:22 It is a morality question of,
0:21:26 can we just shift the mindset to abundance instead of scarcity?
0:21:30 We’ll be right back.
0:21:31 Stay with us.
0:21:38 Support for Prop G comes from 1-800-FLOWERS.
0:21:40 Valentine’s Day is coming up,
0:21:41 and you can let that someone in your life
0:21:43 know just how special they are
0:21:46 with the help of 1-800-FLOWERS.com.
0:21:48 They offer beautiful, high quality bouquets,
0:21:51 and this year, you can get double the roses for free.
0:21:53 When you buy one dozen from 1-800-FLOWERS,
0:21:56 they’ll double your bouquet to two dozen roses.
0:21:58 Of course, roses are a classic sweet way to say,
0:22:02 I love you, and 1-800-FLOWERS lets you share that message
0:22:03 without breaking the bank.
0:22:05 All of their roses are picked at their peak,
0:22:07 cared for every step of the way,
0:22:09 and shipped fresh to ensure lasting beauty.
0:22:12 Our producer, Claire, ordered from 1-800-FLOWERS,
0:22:14 and she thought they were just wonderful.
0:22:17 Her partner was just so delighted,
0:22:20 so delighted, strengthening the relationship.
0:22:22 Their bouquets are selling fast,
0:22:24 and you can lock in your order today.
0:22:27 Win their heart this Valentine’s Day at 1-800-FLOWERS.com
0:22:29 to claim your double your roses offer.
0:22:32 Go to 1-800-FLOWERS.com/PROFG.
0:22:35 That’s 1-800-FLOWERS.com/PROFG.
0:22:41 Looking to simplify?
0:22:45 How about the simple sounds of Nütrl vodka soda
0:22:50 with zero gram sugar per can for the next 15 seconds?
0:22:52 [Squeak]
0:23:10 Nütrl. Refreshingly simple.
0:23:15 Health and Human Services Secretary nominee
0:23:17 Robert F. Kennedy Jr. went before the Senate today
0:23:20 in fiery confirmation hearings.
0:23:23 Did you say Lyme disease is highly likely
0:23:26 a militarily engineered bioweapon?
0:23:28 I probably did say that.
0:23:31 Kennedy makes two big arguments about our health,
0:23:33 and the first is deeply divisive.
0:23:35 He is skeptical of vaccines.
0:23:40 Well, I do believe that autism does come from vaccines.
0:23:42 Science disagrees.
0:23:44 The second argument is something that a lot of Americans,
0:23:47 regardless of their politics, have concluded.
0:23:50 He says our food system is serving us garbage,
0:23:53 and that garbage is making us sick.
0:23:55 Coming up on Today Explained, a confidant of Kennedy’s,
0:23:58 in fact, the man who helped facilitate his introduction
0:24:01 to Donald Trump on what the Make America Healthy Again
0:24:04 movement wants.
0:24:07 Today Explained, weekdays wherever you get your podcasts.
0:24:16 Do you think sequestering China from our most advanced chip
0:24:18 technology was a mistake?
0:24:21 100%.
0:24:22 It’s the biggest mistake ever.
0:24:26 I mean, since when did we–
0:24:30 I mean, strategically, as I said, of course,
0:24:34 the two big sanctions that America imposed in the last few years
0:24:38 backfired massively against America.
0:24:41 The move against Russia basically
0:24:44 got a lot of people to try and de-dollarize a little bit.
0:24:47 And the move against China drove China
0:24:49 to become more inventive.
0:24:51 It’s as simple as that.
0:24:56 But it is also a massive statement of, you know what?
0:25:00 I’m going to try everything I can to beat you.
0:25:04 And I don’t know how to say that in a polite way.
0:25:09 But I went to America for the first time in the ’70s.
0:25:11 And it blew me away.
0:25:15 It was a world apart from anywhere else in the world.
0:25:18 I get that feeling today when I land in Shanghai.
0:25:22 It’s not an easy fight.
0:25:25 It’s not a determined fight.
0:25:31 Let’s say ’70s, ’80s, ’90s, definitely post Berlin.
0:25:35 The US could do whatever the F they wanted in the world.
0:25:39 I don’t think it’s as easy a slam dunk as it
0:25:40 has been in the past anymore.
0:25:43 I think America needs to recognize
0:25:47 that when you win, it’s going to be through strategies
0:25:52 like what Trump is talking about by increasing defense,
0:25:55 spending even further than where it was,
0:26:02 loading the American debt clock even further than it is loaded.
0:26:06 And I had a very good boss of mine
0:26:09 that used to say when we’re under pressure,
0:26:12 we tend to do more of what we know how to do best.
0:26:14 But what we know how to do best is
0:26:16 what got us under pressure in the first place.
0:26:21 And I truly and honestly think: imagine a world where
0:26:25 there is an agreement that America adheres to, by the way,
0:26:28 that basically says, let’s just deliver that world
0:26:30 that everyone’s dreaming of.
0:26:34 Deliver a world where there is no need for you to attack me.
0:26:37 I think of it a little bit like this.
0:26:38 How I would couch some of your comments
0:26:41 is you think we’re entering into what I’ll call an age of equivalence.
0:26:44 I don’t know how my semantics might be up.
0:26:50 But I think of America was able to develop and sustain
0:26:52 certain competitive advantages.
0:26:56 Manufacturing, mostly because the German and Japanese
0:26:58 infrastructure had been leveled.
0:27:02 Then services infrastructure and then the technology,
0:27:05 whether it was because of IP, risk taking, multiculturalism.
0:27:10 And we were able to maintain one, two, three decade leads
0:27:13 and find the next thing and establish more prosperity
0:27:16 and create and consume a disproportionate amount
0:27:18 of the world’s spoils.
0:27:20 And tell me if I’m saying this correctly,
0:27:23 you now believe that our competitive advantage around
0:27:27 these things is shrinking from 30 years to 30 days.
0:27:33 So it should bring on this incredible age of cooperation.
0:27:35 And we should stop deluding ourselves
0:27:37 that we’re going to be able to get out ahead and win.
0:27:40 Is that an accurate summary of what you’re stating?
0:27:43 That is a very accurate summary that it’s still
0:27:44 possible for the US to win.
0:27:47 I think the most important competitive advantage
0:27:50 that you may have not mentioned here
0:27:53 is that money has always been free for the US.
0:27:57 You had the ability to print money to create amazing wealth
0:28:02 that got reinvested wisely and sometimes unwisely.
0:28:04 Unfortunately, I think we’re in a time
0:28:10 where $500 billion on Stargate sounds unwise.
0:28:14 But at a point in time, it was a no issue.
0:28:17 It’s like, OK, if this is what it takes to build
0:28:18 the infrastructure, we’ll do it.
0:28:21 What I’m attempting to say here is
0:28:25 it’s not that the US has lost the capability
0:28:29 to crush other nations on whatever full spectrum
0:28:34 dominance that the US has been attempting to achieve for years.
0:28:39 It’s that other nations have grown an ability to resist.
0:28:43 And that the more the US is becoming–
0:28:47 again, I worked my entire life in corporate America,
0:28:51 so don’t take that as an attack to the American approach at all.
0:28:56 I’m basically saying that the more America will bully the world,
0:29:01 the more you’ll get responses like DeepSeek across the world,
0:29:04 where people are simply going to say, you know what,
0:29:06 we don’t like this anymore.
0:29:09 I will openly say I don’t like the fact
0:29:13 that there is a small chunk of whatever money I made
0:29:17 anywhere in the world that was somehow handed to America
0:29:20 because of the US dollar dominance.
0:29:24 I don’t feel as a wealthy man.
0:29:27 I don’t feel that this tax, if you want,
0:29:29 on all the money made everywhere in the world,
0:29:33 that this export of inflation to everywhere in the world
0:29:36 is a just setup for all of us to succeed.
0:29:40 And so you can see across the world actions from Japan,
0:29:43 from China, from Russia for sure,
0:29:46 that is basically attacking the US
0:29:51 where it becomes painful, which is the US dollar dominance.
0:29:52 It’s not going to go away anytime soon,
0:29:55 but it makes things a little painful.
0:30:00 And in a typical environment, the US would say,
0:30:02 you know what, I’m going to crush you.
0:30:04 I’m strong enough, and you’re not strong enough.
0:30:08 I’m going to apply tariffs, as Trump would say,
0:30:12 and make sure that nobody has access to my wonderful market.
0:30:13 Makes sense.
0:30:17 It does make sense, but it also causes pain on the US side.
0:30:21 And it comes from a mindset of we’re still competing
0:30:24 for limited resources, where the world was made up
0:30:26 of metals and mirrors, you know,
0:30:30 and power was acquired by weapons.
0:30:33 I think we are on the cusp of a world
0:30:35 where everything is possible.
0:30:38 Just understand the difference
0:30:41 from a manufacturing point of view, right?
0:30:44 With enough understanding of nanophysics
0:30:49 and an understanding of a level of intelligence
0:30:53 that helps us bridge the remaining bits of nanophysics,
0:30:57 we could manufacture things out of thin air,
0:31:00 just reorganizing molecules of air, right?
0:31:03 Instead of competing for minerals and resources,
0:31:08 and this is at our fingertips, it’s years away.
0:31:11 There is a need for a mindset change.
0:31:15 – I always like to pause and double click on
0:31:17 or at least cement and highlight
0:31:18 what I think is real striking insight.
0:31:23 In the notion of an inability to sustain an advantage
0:31:26 and all it does is create fear and weakened relationships
0:31:28 and make one side more likely to strike
0:31:31 while they’re ahead and create workarounds
0:31:35 because there, you know, nothing creates innovation
0:31:38 like war and the threat of survival, right?
0:31:42 And what also really resonated was,
0:31:44 and I’ve been saying this,
0:31:47 I think Sam Altman is Sheryl Sandberg with hushed tones.
0:31:50 Sheryl Sandberg weaponized her femininity, her charm,
0:31:53 her maternal instincts,
0:31:55 the important conversation on gender,
0:31:56 to basically take what was a company
0:32:00 that was creating rage, making our discourse more coarse,
0:32:02 depressing our teens, and make it seem more palatable,
0:32:04 to basically put a Vaseline lens over
0:32:06 some pretty heinous behavior.
0:32:09 And I feel like Sam and his hushed tones
0:32:11 and his, “That’s a real concern.
0:32:14 You know, Senator, I’m worried too.”
0:32:16 Meanwhile, I’m about to raise $40 billion
0:32:18 at a $350 billion market cap.
0:32:21 I mean, it’s just, I have been to this fucking movie before
0:32:25 and we are falling for it again and again.
0:32:28 And so I wanna propose something
0:32:29 and have you respond to it.
0:32:33 And this is literally, you just inspired this thought.
0:32:36 Similar to the way we have the UN or NATO,
0:32:37 we have a new organization
0:32:41 and the two founding members are China and the US
0:32:45 and it’s totally open, there’s offices in DC,
0:32:49 Silicon Valley, Shanghai and Beijing.
0:32:51 Every room, every team has a mix of US
0:32:55 and Chinese scientists, regulators, such that
0:32:57 it’s almost impossible to hide anything.
0:32:59 We’re all working on the same damn thing
0:33:01 and we’re trying to solve the world’s most
0:33:07 difficult problems, food distribution, health, poverty.
0:33:10 We’re working together, but we’re also making sure
0:33:12 there’s a very, very thick layer
0:33:15 of supervision and enforcement such that
0:33:18 we are constantly testing, how would you make bio weapons?
0:33:20 And then we’re sending our crawlers out to see,
0:33:23 is anyone working on this, that we don’t want working on this?
0:33:26 And we together try and create, you know,
0:33:27 like what Interpol was doing,
0:33:29 where we had multilateral cooperation
0:33:31 around the drug trade and arm shipments.
0:33:34 But we have this multilateral organization that says,
0:33:36 there’s total transparency.
0:33:38 And our job is to dole it out
0:33:40 where we see the most opportunity
0:33:43 to increase stakeholder value.
0:33:45 And the stakeholders are all seven and a half billion
0:33:46 people on the planet.
0:33:49 And we’re there to ensure that there’s trust
0:33:52 and transparency and ensure that the bad guys
0:33:55 don’t get this and start doing.
0:33:58 And we’re gonna cooperate around either sequestering this
0:34:02 or ensuring that the development of it to make weapons
0:34:05 or create a new super virus that we are hip to these things
0:34:08 before anybody else and act against them.
0:34:10 With that type of organization,
0:34:12 do you think that’s possible?
0:34:15 And in your mind, do you think that that has merit?
0:34:17 – That would be a dream.
0:34:19 I mean, let me just double click
0:34:23 on a very important comment that you said there at the end.
0:34:27 What both parties are unable to recognize
0:34:30 while they are putting their heads down
0:34:34 and competing with each other is how many bad guys
0:34:36 are putting their heads down in silence
0:34:38 and working against both of them.
0:34:40 The thing about AI is a massive democracy
0:34:43 and a massive set of open source.
0:34:47 Once again, because of the speed of this thing,
0:34:51 you know, it took Linux tens of years to actually be,
0:34:54 I mean, at least around 10 years to be established.
0:34:59 It took massive open source models weeks to be established.
0:35:03 And so there is access that, you know,
0:35:06 anyone today can download a deep seek,
0:35:11 you know, a model to their computer
0:35:12 in the jungles of Colombia
0:35:17 and do something malicious without ever being detected.
0:35:22 Now, the dream here is that we work together to say,
0:35:24 look, again, mutually assured destruction,
0:35:27 if we are not both together against the bad guys,
0:35:30 there is harm that can come to all of us.
0:35:33 And I think it’s a beautiful dream, but believe it or not,
0:35:35 there is a bit of that dream that’s already happening.
0:35:37 I mean, I don’t know if you know the statistic,
0:35:42 but 38% of all top AI researchers in America are Chinese.
0:35:43 You know, it’s quite staggering
0:35:44 when you really think about it.
0:35:47 And if you count, if you count the Indians
0:35:50 and if you count, you know, some that have Russian origin
0:35:51 and so on and so forth.
0:35:54 – What percentage of that 38% are spies?
0:35:56 – Great question.
0:35:58 And in all honesty.
0:35:59 – In the world you’re defining,
0:36:01 those spies are assets to humanity.
0:36:02 – And it’s quite interesting
0:36:06 that if you do not have a reason to spy,
0:36:09 then they become more of an asset to humanity.
0:36:14 I think the truth here is there is no winning.
0:36:16 There’s truly no winning.
0:36:19 And of course I don’t want to be grumpy,
0:36:24 but you know, a massive advantage in AI
0:36:28 is not gonna trump the card of nuclear holocaust.
0:36:30 So we’re competing in the wrong arena,
0:36:31 if you think about it.
0:36:35 Because in a world where we have so many superpowers
0:36:40 of which almost four or five can completely wipe out
0:36:44 our planet in less than two hours, right?
0:36:48 The quest for more power, for a dream
0:36:52 that I can crush someone else is a very dangerous quest.
0:36:56 Nobody in this world today can crush anybody.
0:36:59 I think this message needs to become really, really clear.
0:37:01 What are we competing on?
0:37:03 What are we competing on?
0:37:07 And so of course, what you recommended by the way
0:37:08 can be done by governments,
0:37:11 which I think is an impossible dream.
0:37:12 But believe it or not,
0:37:14 if just a few billionaires got together
0:37:16 and built those things,
0:37:18 the creation of the world of abundance
0:37:22 will basically nullify the need to compete.
0:37:25 You see, the challenge we have in our minds
0:37:30 is we’re not in that world of abundance yet, right?
0:37:34 And so we’re still living in our capitalist way
0:37:38 of every one of us has to play,
0:37:41 to aggregate more wealth, which delivers more power.
0:37:43 And then I take that wealth and power
0:37:46 and protect my wealth and power and make more of it.
0:37:48 This is a world that’s about to end.
0:37:52 It is literally about to end for six billion of us
0:37:54 as soon as jobs go away.
0:37:58 Nobody’s talking about this.
0:38:01 So you and I both know, you probably more so,
0:38:04 but I would say I know personally or somewhat well,
0:38:06 I don’t know, a dozen or two dozen billionaires.
0:38:10 And what I have found is that the majority of them
0:38:14 have what I call their very expensive go-back.
0:38:15 And that is they have a plan,
0:38:20 whether it’s antisemitism or a nuclear war
0:38:25 or some side of AI catastrophe or revolution.
0:38:28 And they have their Gulfstream 650,
0:38:31 ready on a moment’s notice and pilots
0:38:33 and their bunker in New Zealand.
0:38:36 And what I’ve said when I’ve talked to a few people
0:38:38 about this is like, let me get this straight.
0:38:40 If things really get that bad,
0:38:42 you don’t think your pilots are gonna get you
0:38:44 to your destination and then kill you?
0:38:47 You think they’re gonna sacrifice themselves
0:38:48 to save your family?
0:38:51 You don’t think that everybody else is gonna figure out
0:38:53 where the billionaire bunkers are and come and take you?
0:38:55 I mean, it’s just, it’s such a ridiculous,
0:38:59 I feel like it’s not only a stupid thesis,
0:39:03 it’s an unhealthy one because they’re under the impression
0:39:08 that their money can buy them a ripcord, a way out.
0:39:09 And they can’t.
0:39:12 And so shouldn’t you be focusing all this energy
0:39:14 on making sure that we just don’t get to that point?
0:39:18 I don’t, colonizing Mars, well, here’s an idea.
0:39:19 Take your immense talent and capital
0:39:22 to make this place a little bit more fucking habitable
0:39:25 because you’re not gonna wanna live on Mars.
0:39:27 Mars is an awful place.
0:39:28 You don’t wanna be there.
0:39:30 That’s worse than death.
0:39:34 That’s not space exploration, it’s space execution.
0:39:39 Isn’t this, I mean, don’t we have a real virus?
0:39:42 It’s almost like capitalism collapsing on itself
0:39:45 where we get so caught up in our self-worth
0:39:48 and our masculinity and our power around the number.
0:39:52 And we see this way to a billion, 10 billion,
0:39:57 a trillion dollars, which will increase in the current age,
0:39:59 my worth as a human.
0:40:02 Doesn’t this require an entirely different zeitgeist?
0:40:05 – Entirely, you see,
0:40:08 both directions of this dilemma are quite interesting.
0:40:11 One of them is, you know, remember last time,
0:40:12 I don’t remember when it was,
0:40:14 when we had a drink after the event.
0:40:20 We spoke about the idea of what you can do with money,
0:40:25 you know, there is a specific, you know,
0:40:29 range of wealth where money makes a difference.
0:40:32 You know, if you’ve never driven a sports car before
0:40:34 and you manage to get yourself a sports car,
0:40:36 you go like, ah, I made it.
0:40:37 But then if you drive a real sports car
0:40:40 and you know how annoying and fucking broken they are
0:40:43 and you know how they, you just eventually go like,
0:40:45 I don’t need any more of this.
0:40:48 The problem is, the game of billionaire
0:40:52 or multi-multi-multi-millionaire is wonderful, okay?
0:40:57 It’s a nice game, but it has no significant impact
0:40:59 on gains that you can achieve as a human.
0:41:00 You’ll still sleep in one bed
0:41:02 and you can make it as fancy as you can,
0:41:03 but it’s still one bed.
0:41:05 You can still drive only one car.
0:41:07 There could be 600 other cars,
0:41:09 600,000 other cars in the garage,
0:41:11 but you’re still driving only one.
0:41:12 And by the way, when you’re a billionaire,
0:41:14 you’re not really driving it comfortably anyway,
0:41:16 because you’re targeted all the time.
0:41:19 The other way of this crazy dilemma
0:41:22 is even more, you know, worthy of discussion
0:41:27 because we remember the times when if you had an MBA,
0:41:30 you were like a highly educated post-grad
0:41:32 and you know, now everyone has an MBA
0:41:35 and then if you had a PhD, you know,
0:41:36 you became the special one
0:41:39 and now everyone has a PhD and you know, many have many.
0:41:43 And the idea here is there is an inflation
0:41:47 to the value of something that you acquire, right?
0:41:49 And what is happening with wealth today
0:41:51 with artificial intelligence
0:41:54 is if you just look at the current trajectory,
0:41:56 we’re gonna see our first trillionaire
0:41:58 within years for sure.
0:42:03 And that not only makes that person acquire more wealth
0:42:04 that is not necessary,
0:42:07 but it makes the price of every Rolls-Royce higher.
0:42:10 And then that makes the price of every Mercedes higher
0:42:12 and that makes the price of every Toyota higher
0:42:13 and so on and so forth,
0:42:17 which basically means that as more of those exist,
0:42:19 just in the single digits,
0:42:22 more of the millionaires become poor
0:42:24 and then a few years later,
0:42:28 more of the hundred-millionaires become poor
0:42:31 because they can no longer compete with that level of wealth
0:42:33 to which everyone is now appealing.
0:42:35 And so if you take that cycle
0:42:37 and continue to repeat it over and over,
0:42:39 eventually you’ll end up with a very few,
0:42:44 like way less than 0.01% of all humans
0:42:46 that have so much wealth,
0:42:48 but then the great equalizer
0:42:51 is that the rest of us have no wealth at all.
0:42:54 So once again, from an economics point of view,
0:42:56 we are getting to a point where money
0:42:58 will have very little value
0:43:02 as compared to a world where money has no value
0:43:05 because everything is becoming a lot cheaper,
0:43:08 which is a world we can create with AI.
0:43:10 So I buy it theoretically,
0:43:15 but what I’ve registered is that over the last 50 years,
0:43:17 money becomes an even greater arbiter
0:43:18 of the life you can lead.
0:43:19 When I was a kid,
0:43:23 the difference between my dad’s house and his boss’s house,
0:43:24 little bit nicer car, a little bit bigger house,
0:43:26 but we were in the same neighborhood,
0:43:28 golfed at the same country club.
0:43:30 The market in a capitalist society
0:43:32 always figures out a way
0:43:34 to offer you more with more money.
0:43:36 There’s coach, there’s premium economy,
0:43:38 there’s business class, there’s first class,
0:43:41 there’s chartering, there’s fractional jet ownership,
0:43:42 there’s ownership, there’s a Challenger,
0:43:45 there’s a Bombardier Global Express,
0:43:46 and there’s a Gulfstream 650,
0:43:48 then there’s going into space.
0:43:51 My sense is life has actually gone the other way
0:43:52 the last 50 years,
0:43:55 that the life that the 0.1% lead
0:43:57 is an entirely different life.
0:44:01 It’s like the delta between being middle class and rich
0:44:04 has gotten bigger and bigger and bigger.
0:44:07 And so the incentives are actually the other way,
0:44:09 that there really is a reason.
0:44:10 When you’re the richest man in the world,
0:44:13 you can show up and turn off foreign aid
0:44:14 without being elected.
0:44:16 – Correct, I think we’re saying the same thing.
0:44:20 What that means, however, is that the majority of us,
0:44:23 even the ones that are now millionaires,
0:44:25 are gonna become poor.
0:44:27 What you’re saying is exactly true.
0:44:31 It’s that the range in which we’re now talking
0:44:33 about the difference between what you can do
0:44:36 with a lot of money and what you cannot do
0:44:37 if you don’t have that money,
0:44:40 makes everyone almost equal at the bottom.
0:44:43 Everyone gets a reasonable car,
0:44:46 but not a massively fancy car.
0:44:49 Everyone becomes equal as compared
0:44:52 to those incredibly wealthy, if you know what I mean.
0:44:55 – We’ll be right back, stay with us.
0:45:02 – Hey, this is Peter Kafka.
0:45:04 I’m the host of Channels,
0:45:06 a podcast about technology and media.
0:45:09 And maybe you’ve noticed that a lot of people
0:45:10 are investing a lot of money
0:45:14 trying to encourage you to bet on sports,
0:45:16 right now, right from your phone.
0:45:19 That is a huge change and it’s happened so fast
0:45:20 that most of us haven’t spent much time
0:45:24 thinking about what it means and if it’s a good thing.
0:45:27 But Michael Lewis, that’s the guy who wrote Moneyball,
0:45:29 The Big Short and Liar’s Poker,
0:45:30 has been thinking a lot about it.
0:45:33 And he tells me that he’s pretty worried.
0:45:36 I mean, there was never a delivery mechanism for cigarettes
0:45:38 as efficient as the phone is
0:45:39 for delivering the gambling apps.
0:45:42 It’s like the world has created less and less friction
0:45:45 for the behavior when what it needs is more and more.
0:45:46 – You can hear my chat with Michael Lewis
0:45:49 right now on Channels, wherever you get your podcasts.
0:45:54 – This week on ProfG Markets,
0:45:56 we speak with Robert Armstrong,
0:45:58 the US financial commentator for the Financial Times.
0:46:00 We discussed Trump’s comments on interest rates
0:46:03 and who might emerge as the biggest winners
0:46:04 from the deep seek trade.
0:46:07 – In the world we lived in last Friday,
0:46:12 having a great AI model behind your applications
0:46:17 either involved building your own or going to ask open AI,
0:46:20 can I run my application on top
0:46:22 of your brilliantly good AI model?
0:46:24 Now, maybe this is great for Google, right?
0:46:28 Maybe this is great for Microsoft who were shoveling money
0:46:31 on the assumption that they had to build it themselves
0:46:33 at great expense.
0:46:34 – You can find that conversation
0:46:37 and many others exclusively on the ProfG Markets podcast.
0:46:46 – We’re back with more from Mo Gawdat.
0:46:49 – Mo, I wanna propose a thesis
0:46:51 and I’m gonna do what we’re supposed to do
0:46:53 and then I’ll talk about your book.
0:46:56 I was sort of blown away by this guy, Robert Armstrong.
0:47:01 He proposed or he talked about certain industries
0:47:04 where the innovation has resulted
0:47:06 in stakeholder value, not shareholder value.
0:47:09 So we have fallen under the notion
0:47:12 that if I can come up with a better search engine,
0:47:14 I’m gonna capture trillions of dollars
0:47:15 in shareholder value for me
0:47:17 and my investors and my employees.
0:47:21 Same way around social media, same way around e-commerce.
0:47:23 I came to Orlando last night for a speaking gig.
0:47:25 I skirt along the surface of the atmosphere
0:47:27 at eight-tenths the speed of sound.
0:47:31 I don’t have to eat my niece going over the Rockies
0:47:34 with scurvy, I don’t get seasick for 14 days
0:47:36 as my parents did coming on a steamship.
0:47:39 It has changed humanity, jet travel.
0:47:42 The PC changed humanity for better.
0:47:43 I mean, it’s just a super computer
0:47:48 that used to cost $10 billion, inflation-adjusted.
0:47:49 I can get it for 300 bucks, put it on my desk
0:47:52 and increase the productivity of everything.
0:47:53 I was on the board of Gateway Computer.
0:47:56 We were the second largest computer manufacturer in the world.
0:47:58 When I bought 17% of the company,
0:48:00 it was worth $130 million.
0:48:03 If you added up all the profits of the airline industry,
0:48:05 it’s negative.
0:48:07 They’ve lost more money than they’ve made.
0:48:10 There are certain industries and technology
0:48:13 where because of a lack of competitive moats,
0:48:18 the gains, the value seep to humanity and to stakeholders,
0:48:20 they’re not able to be captured
0:48:22 by a small number of shareholders.
0:48:26 And when deep seek came along, it sort of dawned on me,
0:48:29 maybe, and I think this is an optimistic vision,
0:48:33 maybe AI is more like the PC or the airline industry.
0:48:35 And that is many of the benefits
0:48:38 will accrete to stakeholders and citizens,
0:48:41 but no one small set of company or people
0:48:42 are gonna be able to capture all of the value.
0:48:45 Do you think that’s an optimistic view
0:48:47 of where AI might be headed?
0:48:48 In other words, do not participate
0:48:53 in the soft bank round at $350 billion in open AI?
0:48:55 – There is certainty in my mind
0:48:57 that there is going to be a democratization of power,
0:49:01 more access for everyone to more things, right?
0:49:02 You know, unfortunately,
0:49:06 if you take a power hungry scenario
0:49:11 in the recent wars of 2024 in the world,
0:49:14 you got the ultra powerful,
0:49:16 you got a concentration of power, some of it using AI,
0:49:19 by the way, in terms of weapons that have massive impacts,
0:49:22 but you also got access to drones
0:49:25 that can be flown from a very far away distance
0:49:28 and for $3,000 cause a lot of harm, right?
0:49:32 And I think that dichotomy, if you want that arbitrage
0:49:34 between a massive concentration of power at the top
0:49:38 and a democratization of power at the bottom
0:49:43 is going to drive a very, very high need for control.
0:49:47 Once again, I love the hypothesis or the ambition for AI
0:49:51 to become that net positive to the world
0:49:55 because it’s not really driving only profits to the top,
0:49:59 which it will, but I think that the opposite direction
0:50:03 of that is that when you have massive power at the top
0:50:06 and you sense that the bottom has a democratization of power
0:50:09 and that can threaten you at any point in time,
0:50:13 you’re going to have to oppress them.
0:50:15 And so that will take away the benefits
0:50:18 that the majority can get.
0:50:23 And I give a very stark and maybe a bit graphic example.
0:50:27 Think about a world Scott where a bullet could kill,
0:50:29 but if you’re a leader of a nation,
0:50:31 you can have protection around you
0:50:34 and can have everything to protect yourself.
0:50:38 We’ve seen examples in the 2024 wars
0:50:40 where a specific person is targeted
0:50:42 anywhere in the world and killed.
0:50:45 A tiny little drone carries that bullet,
0:50:48 seeks you with AI, finds where you are,
0:50:52 stands in front of your forehead and then shoots.
0:50:55 And these technologies are unfortunately under development.
0:50:59 Now think about what that does to democracy.
0:51:03 Think about those who own that weapon.
0:51:06 By the way, they don’t necessarily have to be governments
0:51:11 and how they can influence the distribution of power,
0:51:14 how they can ensure that whatever is created
0:51:17 is directed in a way that’s different
0:51:20 than what would benefit the majority.
0:51:24 – Yeah, in every war there’s a new weapon
0:51:25 that kind of changes the game.
0:51:26 And I think people don’t talk about this enough,
0:51:31 but I think drones are the new weapon that’s gonna come.
0:51:34 I mean, I think about millions of self-healing
0:51:39 swarms of drones, and AI
0:51:40 under the direction of some individual
0:51:42 puts together a list of people who are
0:51:45 in the way of my wealth and my power
0:51:46 and those drones can be released at
0:51:49 one of a thousand different,
0:51:52 I mean, you can really get very dark, very, very fast here.
0:51:53 So I’m gonna try and segue out of this
0:51:55 into something a little bit more positive.
0:51:56 – Is this the very first conversation
0:51:59 where I am grumpier than you?
0:52:03 – Yeah, yeah, we’re both, this is a,
0:52:04 yeah, it’s definitely grumpy old men.
0:52:07 It’s grumpy grumpier and grumpiest,
0:52:11 but I do find, whenever I speak to you,
0:52:14 I manage to distill something down
0:52:16 to something understandable and actionable for me.
0:52:18 I love the idea of this multilateral agency.
0:52:21 I was thinking in a zero-sum game philosophy
0:52:23 that we need to get out ahead, we need to develop AI,
0:52:25 we shouldn’t be shipping NVIDIA chips to China.
0:52:27 I was part of that crew.
0:52:29 And what you have taught me is,
0:52:34 okay, what if we cooperated around not only releasing it
0:52:37 for the betterment of humanity,
0:52:41 but also, quite frankly, policing the bad stuff together
0:52:45 and being 100% transparent with each other
0:52:48 and just saying, not only are there no secrets,
0:52:50 but it would be impossible to have secrets
0:52:52 amongst each other because we’re,
0:52:55 we’ve just decided we’re in the same office space.
0:52:57 I really love that idea.
0:52:59 And I think that as I think about candidates
0:53:03 that I want to support in 2028 or 2026,
0:53:05 I do have access mostly because I have money,
0:53:09 but I think this is a really interesting view.
0:53:10 Anyways, thank you for that.
0:53:12 Your latest book, “Unstressable: A Practical Guide
0:53:14 to Stress-Free Living,” addresses the pervasive issue
0:53:16 of chronic stress in modern life.
0:53:19 In an interview on “The Diary of a CEO” with, by the way,
0:53:21 Steven Bartlett, who I believe is gonna be
0:53:24 the next Joe Rogan, you describe stress as an addiction
0:53:26 and a badge of honor.
0:53:29 Say more, why are we so addicted to stress?
0:53:35 – Part of the fakeness that leads us to success
0:53:39 is I’m busy, I’m busy, I’m busy,
0:53:44 which I have to say I found almost always quite shocking
0:53:49 because if you go across the range of intelligence
0:53:54 if you want, I think most of us know that a good 80 to 90%
0:53:58 of all of the efficiency that you bring to any job
0:54:01 that you do is done within 20% of the time.
0:54:05 But yet, you know, part of your ego is I’m gonna fill
0:54:09 the other 80% of the time
0:54:13 with the remaining 20% of work that takes a big toll on me
0:54:15 because it basically means I’m driven.
0:54:19 It basically means, you know, that I am maximizing
0:54:22 my performance, maximizing my deliveries
0:54:24 between waking up in the morning at five a.m.
0:54:27 to run an Iron Man and then going in the evening
0:54:30 to attend, I don’t know what and flying all over the world
0:54:32 and so on and so forth.
0:54:37 The truth of the matter is this is a self perception,
0:54:42 a form of ego that says I am doing amazing, okay?
0:54:43 But it isn’t.
0:54:46 And I think the biggest challenge we have is that
0:54:48 we believe that the world stresses us.
0:54:49 The world does not stress us.
0:54:53 I mean, when I wrote Unstressable, I started from physics.
0:54:56 I basically said, look, the easiest way to understand physics
0:54:58 and to understand stress in humans
0:55:01 is to look at stress in objects.
0:55:04 And the stress in an object is the force applied to the object
0:55:08 divided by the cross-section of the object,
0:55:12 how much resources the object has to carry that force, right?
0:55:14 And so typical reality of our life,
0:55:16 especially the lives of busy executives
0:55:19 who live in busy cities and so on and so forth,
0:55:21 is that there will be multiple challenges
0:55:23 and forces applied to you every day
0:55:25 but that the cross section of you,
0:55:28 your capabilities, your skills, your connections,
0:55:30 your abilities and so on,
0:55:33 the more you have those and apply them properly,
0:55:35 the less stressed you feel.
0:55:37 There might be more force applied to you,
0:55:39 you might be carrying more challenges
0:55:41 but you don’t feel stressed.
0:55:42 Just like an object doesn’t break
0:55:44 when it has a bigger cross section.
0:55:46 And the reality of the matter is that
0:55:48 part of the badge of honor is not that I’m carrying
0:55:51 a lot of things, it’s that I’m busy and I’m angry
0:55:54 and I’m stressed and I’m this and I’m that.
0:55:56 And I find that honestly.
0:55:58 Yeah, and I worked with many people
0:56:01 who are very successful, who appear to be that way
0:56:05 and become very obnoxious and unloved by their people.
0:56:08 And I worked with a few that were totally chill.
0:56:11 I used to be the one that used to tell my sales team,
0:56:13 I really think this pipeline is too wide.
0:56:16 I really think you should focus on 30% of it
0:56:20 and close it, rather than waste your time
0:56:22 on things that you will not serve well.
0:56:25 And in a way you make more money that way,
0:56:27 you become more successful that way,
0:56:30 you get more customer satisfaction that way.
0:56:31 And the rest of the pipeline,
0:56:33 you hand over to a different channel
0:56:36 that does it in a way that’s suited for it
0:56:38 so that it doesn’t stress anyone.
0:56:40 – How do we deal with stress in a more sustainable way?
0:56:43 And as we wrap up here, are there any quick fixes?
0:56:46 – I feel that what we want to deal with is not stress,
0:56:48 what we want to deal with is break points.
0:56:49 So we wanna avoid break points.
0:56:51 And I think there are three break points
0:56:54 that happen to us in our world today.
0:56:56 One is of course burnout, okay?
0:56:58 And burnout algorithmically is the sigma
0:57:01 of all of the stressors that you’re under multiplied
0:57:04 by their duration, multiplied by their intensity.
0:57:08 And basically most of the time when you burn out,
0:57:12 you burn out not because one big stressor is in your life,
0:57:13 but it’s because of the aggregation
0:57:14 of all the little things,
0:57:17 the loud alarm in the morning, the commute, the this and that.
0:57:19 And then one little thing shows up on top of it
0:57:20 and you break down.
0:57:24 And so burnout to me is a question of a weekly review.
0:57:26 Literally every Saturday, you sit with yourself,
0:57:27 you write on a piece of paper,
0:57:29 everything that stressed you last week
0:57:31 and you scratch out the ones that you commit
0:57:34 that you will not allow in your life anymore.
0:57:37 You can either remove them from your life
0:57:39 or make them more enjoyable.
0:57:42 So if you have to be stuck in a commute or a long flight,
0:57:46 take some good music, be healthy and so on and so forth.
0:57:51 The other break point unfortunately is trauma, right?
0:57:54 So basically massive stress that happens
0:57:56 in a very short period of time
0:57:57 that exceeds your ability to deal with it,
0:58:00 the loss of a loved one, an accident,
0:58:03 being stuck in war or whatever and so on.
0:58:07 And this unfortunately is not within our hands,
0:58:08 but believe it or not,
0:58:09 it actually is not the reason
0:58:11 for the stress pandemic of the world.
0:58:15 So 91% of all of us would get
0:58:20 at least one PTSD-inducing event,
0:58:25 like the highest of all trauma,
0:58:26 once in their life,
0:58:30 but 93% of those would recover in three months
0:58:34 and 96.7% of those will recover in six months.
0:58:37 And all of those will enjoy post-traumatic growth.
0:58:41 So there is no worry about trauma if you want.
0:58:43 It’s not within your control to prevent,
0:58:47 but if you work on it, you’ll recover.
0:58:50 The third and the most interesting reason for stress,
0:58:52 especially in younger generations today
0:58:55 is what I call an anticipation of a threat, right?
0:58:58 And the challenge with it is that looking forward
0:59:02 with fear, worry, anxiety and panic
0:59:04 are probably the biggest stressors
0:59:05 for the younger generations.
0:59:10 And the funny bit is that fear is not a bad emotion.
0:59:13 Fear is actually alerting you to something
0:59:16 that you need to pay attention to, so that’s okay, right?
0:59:17 Worry, anxiety and panic
0:59:19 are actually of a very different fabric.
0:59:23 So worry is not about, I know there is a threat coming.
0:59:25 Worry is I can’t make up my mind
0:59:26 if there is a threat coming or not.
0:59:29 And so you keep flip-flopping and you don’t take the action
0:59:31 and you keep feeling the fear
0:59:33 but not doing anything about it.
0:59:34 When you’re worried,
0:59:37 you need to actually tell yourself openly,
0:59:41 look, I’m going to decide if I should chill or panic, right?
0:59:43 Chill or freak out.
0:59:45 If it’s freak out, then it’s fear, deal with it.
0:59:47 If it’s chill, then stop thinking about it.
0:59:49 Anxiety is not about the threat.
0:59:51 Anxiety is actually about your capability.
0:59:54 And most people, if they really visit themselves
0:59:57 when they feel anxious, when you’re anxious,
0:59:58 there is a threat approaching you
1:00:00 but you constantly think
1:00:02 that you’re not capable of dealing with it.
1:00:04 So the more you attempt to deal with the threat,
1:00:06 the more you feel incapable,
1:00:08 so the more anxious you become.
1:00:10 When you’re anxious, work on your capabilities,
1:00:11 not on the threat.
1:00:14 And then panic is a question of time, right?
1:00:16 And panic really is the stress,
1:00:17 you know, the threat is imminent.
1:00:19 It’s approaching me too quickly.
1:00:22 And so accordingly, when you feel panicked,
1:00:24 don’t work on solving the problem.
1:00:26 Don’t work on addressing the threat.
1:00:28 Work on giving yourself more time.
1:00:30 You know, find someone to help you
1:00:33 or delay the, you know, the presentation time
1:00:35 or, you know, or cancel a few meetings
1:00:37 so that you have more time for the,
1:00:39 for whatever it is that you need to focus on.
1:00:41 And what I mean by all of this,
1:00:43 this is a very, very quick summary of, you know,
1:00:45 of a lot of stuff that we discuss in Unstressable.
1:00:48 But what I mean by this is that it all,
1:00:50 it all goes back to you.
1:00:53 It all goes back to skills and choices that we make
1:00:56 so that the external stressors that come to us
1:00:59 from the world don’t kill us.
1:01:01 – One of my favorite Steven Spielberg movies
1:01:03 is this movie called “Bridge of Spies”
1:01:06 and this Russian spy who’s been unmasked
1:01:09 by the US government is in court,
1:01:12 he’s being tried for, you know, treason and spying
1:01:14 and he’s potentially facing life in prison
1:01:17 and his lawyer, I think it’s Tom Hanks, says,
1:01:18 “Aren’t you nervous?
1:01:19 “Aren’t you stressed?”
0:59:21 And he looks at him and says, “Would it help?”
1:01:22 (laughing)
1:01:23 – Exactly, exactly.
1:01:24 – Yeah.
1:01:28 Anyways, Mo Gawdat is the former chief business officer
1:01:30 for Google X, the founder of One Billion Happy Foundation
1:01:33 and co-founder of Unstressable.
1:01:34 He’s also a bestselling author of books,
1:01:37 including “Solve for Happy,” “Scary Smart”
1:01:38 and “That Little Voice in Your Head.”
1:01:39 Mo, I mean this sincerely,
1:01:43 you bring my stress down because I find you inspiring
1:01:46 and relaxing and you distill things
1:01:48 into kind of actionable solutions.
1:01:51 Really always enjoy speaking with you.
1:01:54 I think you’re really a profound thinker.
1:01:55 Thanks for your good work.
1:01:59 This episode was produced by Jennifer Sanchez.
1:02:01 Our intern is Dan Shalon.
1:02:03 Drew Burroughs is our technical director.
1:02:04 Thank you for listening to the Prof G Pod
1:02:06 from the Vox Media Podcast Network.
1:02:09 We will catch you on Saturday for “No Mercy / No Malice,”
1:02:11 as read by George Hahn.
1:02:13 And please follow our Prof G Markets pod
1:02:15 wherever you get your pods for new episodes
1:02:17 every Monday and Thursday.
1:02:20 (upbeat music)

Mo Gawdat, the former Chief Business Officer for Google X, bestselling author, the founder of ‘One Billion Happy’ foundation, and co-founder of ‘Unstressable,’ joins Scott to discuss the state of AI — where it stands today, how it’s evolving, and what that means for our future.

They also get into Mo’s latest book, Unstressable: A Practical Guide to Stress-Free Living.

Follow Mo, @mo_gawdat.

Subscribe to No Mercy / No Malice

Buy “The Algebra of Wealth,” out now.

Follow the podcast across socials @profgpod:

Learn more about your ad choices. Visit podcastchoices.com/adchoices
