Category: Uncategorized

  • Tom Aspinall: The UFC Doesn’t Want You To Know This! Jon Jones Wasn’t Living Like An Athlete!

    BREAKING: Dana White announces Jon Jones’ retirement… but UFC heavyweight star Tom Aspinall still wants a title fight. He reveals the truth about Jon Jones’ retirement and breaks down the champion mindset, fear, and mental toughness required for UFC title fights.

    Tom Aspinall is the undisputed UFC Heavyweight Champion, a top-ranked professional mixed martial artist, and the first British heavyweight in history to claim a UFC title. He was set to face UFC legend Jon Jones until Jones announced his retirement in June 2025, automatically elevating Aspinall from interim to undisputed champion.

    Tom explains: 

    • The mental trick he uses to control fear before stepping into the cage.

    • Why so many young men feel lost, and how MMA gave him purpose.

    • The harsh reality of life as a UFC fighter behind the scenes.

    • How embracing pain, pressure, and adversity made him a champion.

    00:00 Intro

    02:37 Did You See This Coming?

    03:16 What Was Your Reaction When You Found Out Jon Jones Was Retiring?

    04:08 Did You Want to Fight Jon?

    05:06 Was Jon’s Decision a Strategic Dodge?

    06:08 Do You Take It as a Compliment?

    07:14 Would You Fight Jon If He Came Back?

    08:36 What’s Changed Overnight?

    10:28 Who’s the Contender Now?

    11:33 When Will You Fight Next?

    13:47 What Was Your Family’s Reaction?

    15:14 If Jon Is Watching, What Would You Say?

    17:33 The Dream to Become a Heavyweight Champion

    18:17 Where Does Tom Aspinall Come From?

    19:37 Where Did Your Inspiration Come From?

    21:53 What Kept You Going?

    24:38 Why Did Your Mum Never Come to an MMA Fight?

    26:32 What Advice Would You Give to Young People?

    29:41 I’m Scared to Fight Anybody

    31:55 I’ve Always Been Fearful to Fight

    32:56 Overcoming the Fear

    35:29 Working on Your Mental Strength

    37:49 Tom’s Process of Writing Things Down

    41:22 Very Few Make Money Fighting

    44:14 Tom Aspinall’s Career Progression and Fighting Style

    48:33 When Do You Start Making Good Money?

    49:59 Sergei Pavlovich Fight

    51:09 It Takes Years to Become an Overnight Success

    52:34 Having Kids at 23 and Not Being Able to Support Them

    57:11 Your Rock Bottom Moment

    58:37 Tom’s Family

    1:01:58 Ads

    1:02:57 My Knee Problems Helped My Career Massively

    1:06:22 Surrounded by Toxic People

    1:09:45 How Did You Feel After the Injury?

    1:11:58 Did It Knock Your Confidence?

    1:13:32 Jon Jones

    1:17:40 There’s No Contract Signed

    1:23:58 Tom’s Fighting Secrets

    1:26:16 The Health Routine to Get Into Elite Shape

    1:30:08 Ads

    1:31:12 Why Do You Do Hypnotherapy?

    1:34:36 Your Journey With Anxiety

    1:37:30 Your Son’s Health

    1:38:18 Having an Autistic Child

    1:47:27 The Importance of an Autism Diagnosis

    1:52:47 The UFC Heavyweight Champion Belt

    1:53:43 How Did You Feel Winning the Heavyweight Championship?

    1:55:28 Retiring Early to Avoid Cognitive Issues

    2:00:54 Why Are You Special?

    2:03:59 How I Prepare Mentally on Fight Day

    Follow Tom: 

    Instagram – https://bit.ly/4kbCZGh 

    YouTube – https://bit.ly/4lhLwbO 

    Get your hands on the Diary Of A CEO Conversation Cards here: https://bit.ly/conversationcards-mp  

    Get email updates: https://bit.ly/diary-of-a-ceo-yt 

    Follow Steven: https://g2ul0.app.link/gnGqL4IsKKb 

    Sponsors: 

    Vanta – https://vanta.com/steven  

    KetoneIQ – Visit https://ketone.com/STEVEN for 30% off your subscription order

    Learn more about your ad choices. Visit megaphone.fm/adchoices

  • Engineering the Future of Fusion

    AI transcript
    0:00:01 This is an iHeart podcast.
    0:00:33 When did you get the fusion bug?
    0:00:34 When did you fall in love with fusion?
    0:00:39 It probably goes back to middle school or before, you know, when a lot of kids would
    0:00:44 go out and play on the playground, I’d go to the library and read about particle accelerators
    0:00:45 and fusion reactors.
    0:00:48 And so, you know, I think the bug was set pretty early.
    0:00:50 This is Greg Piefer.
    0:00:54 He’s the co-founder and CEO of a company called Shine.
    0:00:59 And I watched shows like Star Trek and, you know, certainly even like Star Trek The Next
    0:01:01 Generation, where my moral compass was set.
    0:01:04 So like, tell me the fusion dream.
    0:01:08 I mean, we’ll get to like why it’s going to take a while and it’s going to be hard, but
    0:01:11 just like, why is fusion the dream?
    0:01:15 Yeah, so fusion essentially, like to me, it represents a level up moment for humanity when
    0:01:17 we can commercially unlock it.
    0:01:18 Our species will be changed forever.
    0:01:20 And it’s very similar.
    0:01:23 It’s very akin to when we first started to access chemical energy through fire.
    0:01:27 I thought you were going to say fossil fuels, but you’re saying it’s bigger than fossil fuels.
    0:01:27 It’s fire.
    0:01:29 It’s as big as fire.
    0:01:29 Yeah.
    0:01:31 So it’s not going to happen for a long time.
    0:01:34 But like, what does the world look like when we get to the fusion dream?
    0:01:35 Yes.
    0:01:38 So as technology continues to improve, energy becomes cheaper and cheaper.
    0:01:40 Fuel is no longer an issue.
    0:01:45 So fundamentally today, fuel is the issue that would prevent us from making energy super cheap.
    0:01:46 We just have to continue to work to extract.
    0:01:47 Fusion doesn’t have that problem.
    0:01:53 So technology gets higher, the reactors get cheaper, and fusion becomes super cheap.
    0:01:55 Now we can solve problems that we couldn’t solve before.
    0:01:59 You know, we can desalinate water on a massive scale.
    0:02:03 Like we can, you know, we can pull out minerals from the earth very, very carefully.
    0:02:07 We can go into space and colonize other planets.
    0:02:09 We can make antimatter, right?
    0:02:12 And perhaps have an energy source that allows us to go to other stars.
    0:02:14 Like Star Trek.
    0:02:15 Yeah, exactly.
    0:02:15 Right?
    0:02:18 So that’s always the secret little motivation behind the scenes.
    0:02:19 Yes.
    0:02:21 I mean, also, I have a three-year-old daughter, right?
    0:02:24 Like, and I want to give her a world that’s, you know, okay to live in.
    0:02:33 I’m Jacob Goldstein, and this is What’s Your Problem?
    0:02:37 The show where I talk to people who are trying to make technological progress.
    0:02:43 People who are into technological progress and who dream big tend to be into fusion,
    0:02:48 a kind of nuclear power that could be safer and cheaper than fission,
    0:02:50 which is the way we get nuclear power now.
    0:02:55 By the way, as you probably know, fusion is fusing atomic nuclei together,
    0:02:58 and fission is splitting them apart.
    0:03:01 People have been working on fusion power for decades,
    0:03:06 and reliable economic fusion power is still probably decades away.
    0:03:08 But in the past several years,
    0:03:12 billions of dollars have flowed into a handful of fusion startups
    0:03:16 that are using different technologies to try to make fusion power work.
    0:03:20 My guest today, Greg Piefer, is definitely on Team Fusion.
    0:03:23 He’s been working on it for decades.
    0:03:27 But with his company, Shine, he’s taking a different approach.
    0:03:31 Rather than going straight to the dream of using fusion to create energy,
    0:03:35 Shine’s taking baby steps, or at least mid-sized steps.
    0:03:40 The company is using fusion to enter markets that are easier to compete in than the market for energy.
    0:03:47 As you’ll hear, Shine has already used fusion to get into the business of scanning jet engine blades.
    0:03:51 And the company will soon be in the healthcare business as well.
    0:03:54 Later in the interview, we’ll talk about all of that,
    0:04:01 and about how Greg hopes those businesses will eventually lead to that big fusion dream of cheap, abundant power.
    0:04:06 But to start, we talked about how Greg went from being a kid thinking about Star Trek
    0:04:09 to a grown man starting a fusion company.
    0:04:15 And in particular, about how that path led Greg to take really a very different approach
    0:04:19 than that taken by other people building fusion companies.
    0:04:23 I took a class, actually, in college that was taught by two very inspiring people,
    0:04:28 one of whom ran something called the Fusion Technology Institute at the University of Wisconsin,
    0:04:31 and another one named Harrison Schmitt, who walked on the moon.
    0:04:35 And they were teaching a class about going into space and recovering resources.
    0:04:39 And recovering fusion fuel was one of the key resources they thought we could extract from space,
    0:04:40 in particular the moon.
    0:04:47 And so I got super excited about fusion, because those fuels are, if you burn them, you don’t get nuclear waste.
    0:04:51 So the promise of nuclear energy without nuclear waste, and these people were doing it,
    0:04:53 like on the front edge of it, got me really excited.
    0:04:56 He actually went to the moon to bring it back.
    0:04:57 Right, right.
    0:04:59 Like, these people have done hard things.
    0:04:59 Yeah.
    0:05:01 And so I’m going to go learn with them.
    0:05:01 Yeah.
    0:05:03 And so that got me into fusion.
    0:05:08 But, you know, for me, it was a different experience than if I had done a physics-based program in fusion.
    0:05:11 Like more practical, more hands-on?
    0:05:12 Is that the…
    0:05:12 Yeah.
    0:05:14 This was an engineering program.
    0:05:20 And the Fusion Technology Institute, which I joined, its mission was to design viable fusion reactors.
    0:05:23 It was to say, let’s assume the physics challenges are overcome.
    0:05:25 How would you build a real system?
    0:05:31 And that’s where, over the next few years, I just became a bit depressed, frankly.
    0:05:45 Because even if you master the physics, it became really clear that the challenge of commercializing and making heat for five cents per kilowatt hour, which is sort of the going rate for it, I couldn’t see a way to do that.
    0:05:54 And it was because you’re taking some of the most exotic materials ever developed by humans and putting them in the harshest environments ever created by humans.
    0:05:55 And they don’t live very long.
    0:05:58 And they’re super expensive to make.
    0:06:07 And so the idea that we could go straight to five cents per kilowatt hour, at least when I was in school, seemed far-fetched.
    0:06:09 So it’s sort of a techno-economic problem.
    0:06:11 You’re thinking of not just the technical side.
    0:06:15 But if people are actually going to use it, it has to be price competitive.
    0:06:16 Yeah, exactly.
    0:06:17 Okay.
    0:06:18 So you get sad.
    0:06:22 You get sad because your Star Trek dream doesn’t seem like it’s going to come true.
    0:06:27 And then, as I understand it, you go to a party and you have your big idea.
    0:06:27 Is that true?
    0:06:29 That is the history.
    0:06:30 It was a party at my house.
    0:06:34 And we were thinking about, well, I mean, most people weren’t thinking about this.
    0:06:36 But I had been working on this problem earlier in the day.
    0:06:38 So it was already kind of in my head.
    0:06:41 And it came down to our research.
    0:06:47 You know, I had done work on a specific technology at the UW where we were trying to make small fusion devices.
    0:06:50 And the idea was that there were a number of applications you could use them for.
    0:06:52 And they didn’t work very well.
    0:07:02 And one of the reasons we discovered they didn’t work very well was we were trying to collide these nuclei in the same space that we were trying to speed them up.
    0:07:03 Okay.
    0:07:06 So you just shoot them really fast into each other is the basic idea?
    0:07:06 Yeah.
    0:07:14 But the problem is, like, if you’re trying to make something go fast in a highly collisional space where it’s running into stuff a lot, it can’t really speed up.
    0:07:17 It’s banging into stuff and losing energy all the time.
    0:07:23 And if you take away the target material so that you can accelerate them, then it’s not colliding very much.
    0:07:25 And you don’t get a lot of fusion reactions.
    0:07:28 So you kind of had to operate in this worst of both worlds space.
    0:07:34 And, you know, it was like the revelation was just like, well, why don’t we accelerate in one place and collide in another place?
    0:07:38 Yeah, I was at one point, there were all kinds of people standing around with drinks.
    0:07:43 And I was sitting in one of our recliners, my laptop, punching numbers into it.
    0:07:50 And I just actually built a really quick model to just see what the fusion rate would do if we did that in theory.
    0:07:52 And the numbers came out amazing.
    0:07:58 Like, actually, it was like, you know, a thousand times higher than the output we were getting from our university experiment.
    0:08:01 And so, you know, like, I quickly disengaged from the party.
    0:08:05 I called my former advisor and I’m like, hey, you know, if we did this, like the math.
    0:08:07 And he was like, oh, okay, that’s really cool.
    0:08:09 Like, what do you want me to do?
    0:08:10 And I was like, I don’t know yet.
    0:08:12 I’ve got to figure this out.
    0:08:15 But I think I’m going to start a company to go do this.
    0:08:16 Were you sober?
    0:08:20 I probably had had a couple of drinks by then, actually.
    0:08:21 So it’s amazing that I got the math right.
    0:08:22 But I did.
    0:08:23 Maybe, or maybe it helped.
    0:08:25 Maybe there’s like a curve, right?
    0:08:27 Maybe there’s an optimal number of drinks.
    0:08:29 There certainly is when it comes to bowling.
    0:08:31 So why not nuclear physics as well?
    0:08:32 Yes.
    0:08:34 So you have this idea.
    0:08:39 When you have this idea, do you think, oh, I’ve solved fusion energy?
    0:08:47 This physics revelation doesn’t overcome the techno-economic challenge of fusion energy that I already had.
    0:08:51 And so that was already, like, I had already moved past that.
    0:08:55 And I was trying to see if there were ways, like what I had put in the back of my head,
    0:09:02 are there ways you can make use of fusion, you know, where you might get paid more for the reaction than you get paid for energy?
    0:09:10 So tell me about having this idea of like, oh, maybe there’s a way to commercialize fusion to do something other than generate energy.
    0:09:13 So two formative experiences.
    0:09:24 One, my advisor at the Fusion Technology Institute had identified a family, like, you know, a couple dozen probably applications where you could use fusion for non-electric applications.
    0:09:31 And they hadn’t really done the economic analysis on any of them, but they just said, here are some things you can do with fusion reactions.
    0:09:39 And those included things like making medical isotopes or detecting hidden material or, you know, contraband material, detecting nuclear weapons, stuff like that.
    0:09:44 And to be clear, those are things that people are already doing out in the world, right?
    0:09:45 There is a market for those things.
    0:09:47 These are existing products.
    0:09:50 They’re just not using fusion to make them.
    0:09:50 Yeah.
    0:09:55 Super definable markets, you know, and there are supply chain issues.
    0:09:58 And it’s a good market to get into if you had an alternative way to make things.
    0:09:58 Okay.
    0:09:59 So that was interesting.
    0:10:10 And then the other formative experience for me, it was actually, we had started another company when I was in grad school that had nothing to do with any of this, but we were just recovering data from crashed hard drives.
    0:10:12 One of my roommates had a hard drive crash.
    0:10:13 We looked online.
    0:10:14 All the options sucked.
    0:10:18 It was like, pay us $2,000 and we’ll try, but maybe we won’t get your stuff back.
    0:10:19 And it’s an upfront payment.
    0:10:22 And so we decided that was a bad business.
    0:10:25 So we started a business and we said, we told people, we said, look, we’re just starting this company.
    0:10:30 We’re, we’re new at it, but we’ll charge you a hundred bucks if we get your data and nothing if we don’t.
    0:10:32 And, you know, we might break your stuff.
    0:10:39 So, so, but you’d be surprised how many people like that better than, than being able to pay two grand.
    0:10:42 And so what happened was we got really, really good at it as we practiced.
    0:10:47 The volume we could handle and the throughput we could handle, all of this scaled really, really nicely.
    0:10:56 And so this was like just a formative idea for me that like, okay, if we can get into a niche with fusion and we can find an economic proposition,
    0:10:57 that works, we can practice.
    0:11:00 And if we practice, we’ll get better at it.
    0:11:04 And if we get better at it, like our suppliers and our customers and everyone will grow with us.
    0:11:06 So we’ll move this ecosystem forward.
    0:11:13 And I really liked that because if you look at some of the most high tech, deep tech industries around, they follow the same roadmap.
    0:11:20 You know, if you look at semiconductors in Moore’s law, it was fueled by having products all along the way, right?
    0:11:24 Like the first computers may have only had a few customers, but they would pay a ton for them.
    0:11:24 Yes.
    0:11:27 And by doing that, they got better and they brought the price down.
    0:11:30 And then there were new, a new set of customers, right?
    0:11:31 That could afford those computers.
    0:11:35 More recently, Tesla is the classic model of that, right?
    0:11:49 They started with the Roadster, this super expensive electric car that was not for everybody, but enough people bought it that they could go from the whatever that was, $150,000 car to the $70,000 car to the $50,000 car, right?
    0:11:50 Yeah, exactly.
    0:11:54 And I’d argue that the underlying technology for Tesla started even in other industries.
    0:11:57 So the ability to scale batteries even more cheaply, right?
    0:11:59 Like these rechargeable batteries.
    0:12:04 So you started with toys and special services and you moved to laptops and then you moved to EVs.
    0:12:09 And even once you get into EVs, you do this where you build an expensive thing that few people buy.
    0:12:18 So you have the idea of applying this framework to Fusion, which is quite different, right?
    0:12:26 There are all these other people who are raising lots of money to go straight at making electricity, basically, right?
    0:12:27 Making energy.
    0:12:32 Like why, I don’t know, like why isn’t anybody else doing it the way you’re doing it?
    0:12:38 I think it’s a very exciting proposition to be able to go straight to energy.
    0:12:41 It’s very inviting and it sounds very appealing.
    0:12:48 And even if the odds are long, but I don’t know how many of them have really spent time critically thinking about the engineering challenges.
    0:12:51 And that’s where my education was just different.
    0:12:53 Like that’s all we thought about.
    0:12:57 Like all we thought about were the engineering challenges and how to overcome them.
    0:12:58 Like these were university people.
    0:12:59 They’re super optimistic, right?
    0:13:04 Like, and we worked, like we developed materials for first walls and things like that.
    0:13:12 But like everything we did still broke really fast and it was really expensive stuff.
    0:13:20 So, you know, it’s just, that was different for a different experience for me, different formative experience for me than for a lot of people who are trying to go straight to the end.
    0:13:32 Now, I do think there are some innovative concepts out there, you know, that, that if they work and I say if, because the physics is far from proven, but if they work, they could simplify a lot of the engineering challenges.
    0:13:36 But the main concepts we know that are likely to work will run into these challenges.
    0:13:38 They’re, they’re very, very significant.
    0:13:46 Meaning that even if the physics work, actually building a thing at, at a reasonable cost is going to be super hard.
    0:13:48 Yeah, I think that’s, that’s my view.
    0:13:52 So you actually did start a business and are selling things, right?
    0:13:53 Yes.
    0:13:56 Using fusion to do stuff that people will pay for.
    0:13:57 So let’s talk about that.
    0:13:59 Let’s talk about where the company is today.
    0:14:01 And then we can talk about where you’re about to be.
    0:14:04 And then we can talk about where hopefully you’ll be in some number of decades.
    0:14:06 What, what, what do you sell in today?
    0:14:07 We sell neutrons.
    0:14:07 Uh-huh.
    0:14:09 We sell neutrons.
    0:14:11 Took me a little while when I started it.
    0:14:13 And then I thought, well, you know, I buy electrons.
    0:14:16 I buy electrons every time I turn on a light switch, right?
    0:14:17 I’m used to buying electrons.
    0:14:20 Tell me about the neutron business.
    0:14:21 Like, what does that mean?
    0:14:22 Yeah, and I’ll, I’ll translate it.
    0:14:23 So we sell fusion.
    0:14:26 We just sell fusion to the highest bidders.
    0:14:29 And the highest bidders are not people who buy energy.
    0:14:34 And so it turns out the easiest fusion reaction to do is, is DT fusion.
    0:14:40 And DT fusion produces energy on the one hand, but it produces neutrons on the other.
    0:14:46 And when sold to certain customers, the neutrons are far more valuable than the energy.
    0:14:51 So, so just to be clear, DT fusion is just two different isotopes of hydrogen, right?
    0:14:51 Correct.
    0:14:58 And they make helium, and then they throw off some number of neutrons, which is just the
    0:15:00 neutral, uh, uh, nuclear particle.
    0:15:04 And you’re saying there’s people who actually have a use for neutrons.
    0:15:05 Yes.
    0:15:05 Okay.
    0:15:06 Yeah, it turns out.
    0:15:07 And they’ll pay a ton for it.
    0:15:14 And so, uh, and, and generally the, the historical, uh, neutron sources for these are very specialized
    0:15:15 fission reactors.
    0:15:16 So research reactors.
    0:15:16 Okay.
    0:15:19 So more traditional nuclear reactors.
    0:15:20 Because fission reactors throw off neutrons too.
    0:15:25 Um, and as you, as we’ve talked about already, fission is much easier than fusion, uh, from a,
    0:15:26 from a science perspective.
    0:15:31 Uh, and so there’s these old reactors that serve these industries, but the, but the, the, the
    0:15:34 research reactor fleet that we built in the past is old.
    0:15:38 It’s like 60 plus years old and essentially dying in general.
    0:15:42 So markets that have been served by these reactors are losing that capacity.
    0:15:49 Uh, on top of that, um, fusion based approaches are much cheaper than building new reactors.
    0:15:54 Uh, so as you look to replace the infrastructure, there’s a massive, uh, edge for fusion there.
    0:15:58 Um, and, and when we looked at the markets, you know, we did very quick, like, well, everyone
    0:16:02 else in fusion you probably talked to is chasing something called Q greater than one.
    0:16:07 And that’s the ratio of energy out over energy in, and they want to show that they can make
    0:16:10 more energy than they can put into it.
    0:16:12 That’s the fundamental fusion dream, right?
    0:16:13 Sure.
    0:16:16 But they don’t even think, you know, most of them aren’t really even seriously thinking
    0:16:17 about the economics.
    0:16:22 They’re saying first we need to get to net energy and then we’ll worry about net economics.
    0:16:27 For me, I, I, you know, I couldn’t see a way to scale fusion unless we were worried about
    0:16:28 net economics right away.
    0:16:33 And, and, and if we wanted to practice, we needed to have positive net economics right
    0:16:33 away.
    0:16:35 So we, our, our core metric was Q economic.
    0:16:38 And so how do we get more dollars out than dollars in?
    0:16:41 Which is the classic business question.
    0:16:44 The question every business needs to answer to survive.
    0:16:46 How can our revenues be greater than our costs?
    0:16:46 Yeah.
    0:16:48 And that’s how we have seen deep tech scale, right?
    0:16:51 Like that is the playbook by which it scales.
    0:16:55 So we, we pursued that and, you know, we found actually customers.
    0:16:59 So, you know, if you do a kilowatt hour of fusion, if you produce a kilowatt hour of fusion
    0:17:04 heat and, and you can sell that for five cents, let’s say, um, if you, if you took the same
    0:17:09 neutrons generated by that kilowatt hour of fusion reactions, there are customers who would pay
    0:17:10 $200,000 for it.
    0:17:11 Huh.
    0:17:13 And so that’s a massive difference.
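
    A rough back-of-the-envelope comparison of the two figures quoted just above (about five cents for a kilowatt-hour of fusion heat versus roughly $200,000 for the neutrons produced by the same reactions). The prices are the ones mentioned in the conversation and are treated as illustrative; the variable names are just for this sketch.

        heat_price_per_kwh = 0.05        # ~5 cents for a kilowatt-hour of fusion heat
        neutron_price_per_kwh = 200_000  # quoted price for the neutrons from that same kilowatt-hour

        # Same reactions, sold as neutrons instead of heat: roughly a 4,000,000x revenue multiple.
        # This is the gap "Q economic" (dollars out over dollars in) targets, as opposed to the
        # physics Q of energy out over energy in.
        print(neutron_price_per_kwh / heat_price_per_kwh)  # 4000000.0
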
    0:17:17 And so are you in fact selling those neutrons for $200,000 now?
    0:17:18 Is that your business?
    0:17:18 We are.
    0:17:21 And who is buying them and what are they doing with them?
    0:17:21 Yeah.
    0:17:23 So they’re making airplanes safer.
    0:17:26 Uh, you know, they’re making rockets, uh, more reliable.
    0:17:31 What is the link between buying neutrons from you and making a airplane safer?
    0:17:31 All right.
    0:17:37 So, uh, modern engines and jet aircraft operate to get very high efficiency and very high power.
    0:17:38 They operate in a really high temperature.
    0:17:43 In fact, they operate like 20% above the melting point of the blades in the engine.
    0:17:45 I’m glad I didn’t know that.
    0:17:48 And now I’m going to tell you something that gets even more scary.
    0:17:52 So the way they manage that is they suck cold air in from the front of the engine and
    0:17:56 they pipe it through a series of cooling tubes in each fin, like embedded in each fin.
    0:18:01 Uh, and the manufacturing process is such that it’s fairly common that one of these cooling
    0:18:02 tubes is blocked.
    0:18:03 Okay.
    0:18:08 And, and if it’s blocked, it will melt, it will imbalance the engine and possibly destroy
    0:18:08 it.
    0:18:11 Uh, and so we don’t want that to happen.
    0:18:12 Um, truly.
    0:18:19 But with modern materials, and those are materials that x-ray or ultrasound do not interact with
    0:18:19 heavily.
    0:18:23 So if you try to see inside these things with conventional techniques, you cannot see the
    0:18:23 defect.
    0:18:28 So just to be clear, you make this engine and then you want to look inside to make sure that
    0:18:32 these cooling tubes are not blocked so that it doesn’t melt and the plane crashes.
    0:18:37 And so you think, well, we could use x-ray or ultrasound to common technologies, but you’re
    0:18:38 saying those don’t work.
    0:18:38 Yeah.
    0:18:42 But there’s some way you can, what, shoot neutrons at it and see inside of it?
    0:18:42 Yeah.
    0:18:43 Yeah.
    0:18:44 Yeah, there is.
    0:18:49 So, so, um, neutrons have, you know, they have, uh, a characteristic of, there are certain
    0:18:50 isotopes.
    0:18:57 So certain materials in nature that absorb neutrons like crazy, like, like, and, and you can put
    0:18:58 them where you want them to be.
    0:19:04 So, so for example, with jet engine blades, um, we just push a liquid solution containing
    0:19:06 a material known as gadolinium into the blade.
    0:19:09 Uh, and then we blow it out with air.
    0:19:11 Uh, and if the channel’s blocked, it doesn’t blow out.
    0:19:17 So the gadolinium sits in there and then we hit it with neutrons and any neutron that
    0:19:19 comes close to that gadolinium gets absorbed.
    0:19:23 Uh, and then behind the blade, you put a piece of film that’s sensitive to neutrons.
    0:19:25 It’s a little more complex than that.
    0:19:27 And you can see it and you can see it.
    0:19:28 It’s like an x-ray.
    0:19:29 It’s like a neutron x-ray.
    0:19:34 You see the inside of stuff, but, but neutrons can see things x-ray can’t.
    0:19:36 And it’s actually very complimentary.
    0:19:38 X-ray is good at generally heavy materials.
    0:19:41 Neutrons are generally good at seeing light materials.
    0:19:44 And so are you in that business now?
    0:19:45 We are.
    0:19:45 Yeah.
    0:19:46 Yeah.
    0:19:49 We’ll do tens of thousands of parts, you know, in a year.
    0:19:54 And, uh, yeah, we’re, we’re replacing essentially aged capacity.
    0:19:59 So, uh, the biggest imaging reactor in the United States shut down about two years ago.
    0:20:00 It was run by GE.
    0:20:03 Uh, and so there’s this nice tailwind for share acquisition here.
    0:20:07 It’s not just a way for us to make money in fusion, but, but a lot of the customers sort
    0:20:11 of just come to us proactively because they’re very worried about the future of the supply
    0:20:11 chain.
    0:20:12 Uh-huh.
    0:20:15 So there, they send you the blades.
    0:20:15 You have a facility.
    0:20:17 They send you the blades.
    0:20:17 They do.
    0:20:18 And you, yeah.
    0:20:21 And we give them back pictures, uh, with the blades.
    0:20:21 Yeah.
    0:20:22 Okay.
    0:20:24 So that’s the business you’re in.
    0:20:29 Uh, it seems like the next big step is getting into the medical isotope business, right?
    0:20:30 You’re building a.
    0:20:30 Yeah.
    0:20:30 Yeah.
    0:20:31 Okay.
    0:20:34 And, and just on the other thing, there are many others.
    0:20:35 So turbine blades are just one application.
    0:20:39 There’s a lot of other parts and components that we validate, including radiation hardness
    0:20:41 testing and electronics, et cetera.
    0:20:45 So, but yeah, the next step, um, and it required a huge reduction in the cost per neutron.
    0:20:50 Uh, we had to get the cost per neutron down a thousand fold, uh, to make the next step work.
    0:20:51 So this is important, right?
    0:20:52 Yeah.
    0:20:58 The whole arc you’re trying to, uh, follow is like, let’s do one thing where we can make
    0:21:01 a lot of money and then let’s do the next thing where they’ll actually pay us less.
    0:21:06 So we have to figure out how to do it a thousand times cheaper for it to be profitable.
    0:21:09 But they’ll buy a lot more neutrons.
    0:21:14 And so the, the, the market opportunity is actually, you know, let’s call it 10 to 20 times larger
    0:21:15 than the test opportunity in total.
    0:21:20 Uh, and so even though they’re paying you less, they’re buying so many more neutrons that,
    0:21:22 you know, you make more money.
    0:21:27 We’ll be back in just a minute.
    0:21:40 Run a business and not thinking about podcasting?
    0:21:45 Think again. More Americans listen to podcasts than ad-supported streaming music from Spotify
    0:21:46 and Pandora.
    0:21:50 And as the number one podcaster, iHeart's twice as large as the next two combined.
    0:21:53 So whatever your customers listen to, they'll hear your message.
    0:21:57 Plus, only iHeart can extend your message to audiences across broadcast radio.
    0:21:59 Think podcasting can help your business?
    0:22:05 Think iHeart. Streaming, radio, and podcasting. Call 844-844-IHEART to get started.
    0:22:08 That's 844-844-IHEART.
    0:22:16 The next step for Shine, for Greg's company, is to start using neutrons to create medical isotopes.
    0:22:21 Medical isotopes, as it turns out, are widely used in medical imaging.
    0:22:29 To get into that business, Shine is building what's basically a factory that's going to use fusion to create medical isotopes.
    0:22:32 They call the factory Chrysalis.
    0:22:34 And Greg and I were talking on video.
    0:22:40 And at this point in the conversation, he mentioned that you could actually see Chrysalis out the window behind him.
    0:22:42 And this is Chrysalis behind me, by the way.
    0:22:46 And it’s not a picture for people who are listening.
    0:22:54 It’s like out there, there’s a bunch of grass, and then there’s a building that looks like a rectangle, a cement rectangle over your shoulder.
    0:22:57 That’s over half a billion dollars of invested capital is what it is.
    0:23:00 Over half a billion dollars of cement rectangle for my information.
    0:23:02 So, tell me about what’s going on in there.
    0:23:03 Yeah.
    0:23:06 So, essentially, we needed to get the cost per neutron down.
    0:23:06 We did.
    0:23:09 We demonstrated that back in 2019.
    0:23:16 And what we knew we could do then is if we got it that cheap, instead of using neutrons just to examine material, we can use it to change material.
    0:23:16 Okay.
    0:23:19 In a sense, nuclear engineers call it transmutation.
    0:23:22 But, like, the common population would think of it as alchemy.
    0:23:29 You can use neutrons to turn low-value materials into, I’m going to call them hyper-valuable materials, and I’ll tell you why in just a second.
    0:23:36 So, at small scale, the most interesting markets for these are in medicine, producing isotopes used for medicine.
    0:23:40 Which turns out to be wildly common, right?
    0:23:52 Like, medical isotopes, I learned, you know, researching for this show, are like, what, tens of thousands of people a day in the U.S. are tested with medical isotopes, right?
    0:23:54 Yeah, 50 million per year around the world.
    0:23:55 Yeah, yeah.
    0:23:56 So, they’re super common.
    0:24:04 And, again, just like in the testing business where we’re replacing fission reactors, that’s how isotopes have been made in the past.
    0:24:10 So, old fission research reactors around 60 years old, and, you know, they’re dying, right?
    0:24:11 Like, the infrastructure’s going away.
    0:24:13 And so, it’s the same tailwind.
    0:24:15 We just needed to get fusion a lot cheaper to do it.
    0:24:26 And is it right that, in a kind of crude way, the use is analogous in many cases, that medical isotopes are used for scanning, but you’re scanning people instead of jet airplane blades?
    0:24:28 Yeah, the mechanism’s a little bit different.
    0:24:30 So, but it’s the same idea, right?
    0:24:34 Like, so, and it fits our whole theme of illumination around the company, right?
    0:24:36 In one case, we’re illuminating defects here.
    0:24:37 We’re illuminating disease.
    0:24:39 Eventually, we’ll be illuminating the planet, right?
    0:24:40 With energy.
    0:24:41 That’s a good metaphor.
    0:24:42 Yeah, yeah.
    0:24:48 But, you know, so what you do is you’ve got enough neutrons now that you can turn, you can change materials.
    0:24:58 So, you can take things that are relatively stable, like uranium, you buy it for $6 a gram, turn it into an imaging isotope, molybdenum-99, which is worth like $150 million a gram.
    0:25:01 Presumably, people buy it in very, very, very small amounts.
    0:25:02 Yeah, they do.
    0:25:06 But, you know, it’s a dose for a patient is like one-tenth of one microgram, right?
    0:25:15 Yeah, I mean, if it’s $150 million a gram, and you’re making it in a $500 million building, you don’t have to make much of it to redeem your cost of capital.
    0:25:17 Yeah, and Chrysalis will produce a few grams per year.
    0:25:18 That’s it.
    0:25:18 Yeah, wow.
    0:25:19 That’s extraordinary.
    0:25:25 So, it’s like a few grams is like a little cup, not even just a little, like a spoonful.
    0:25:25 No, it’s like a sugar packet.
    0:25:27 Like the sugar packet you dump in your coffee, that’s a few grams, right?
    0:25:29 But that’s millions of doses or something.
    0:25:33 Yeah, so one gram is 10 million doses, is essentially the way I think about it.
    0:25:34 That is wild.
    0:25:37 Can we just have that be wild for one moment?
    0:25:37 Okay, go on.
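
    A quick sanity check of the dose arithmetic quoted here, using only the figures from the conversation (roughly one-tenth of a microgram per dose, a few grams per year from Chrysalis, about 50 million scans per year worldwide); the numbers are rough and the snippet is just an illustration.

        grams_per_dose = 0.1e-6            # ~one-tenth of one microgram per patient dose
        doses_per_gram = 1.0 / grams_per_dose
        print(doses_per_gram)              # ~10,000,000 doses per gram, as stated

        # A few grams a year therefore covers tens of millions of doses, in line with the
        # roughly 50 million scans per year worldwide mentioned earlier.
        print(3 * doses_per_gram)          # e.g. 3 grams -> ~30,000,000 doses
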
    0:25:42 So, you know, in the U.S., for example, most of the testing is to look at blood flow in the heart.
    0:25:50 If you’re having chest pain, doctors will give you this test to see if your arteries are blocked or where the muscle is receiving blood and where it’s not.
    0:25:56 But also for staging cancer, and there’s probably another two dozen tests all that use this on a scan.
    0:25:58 So, that’s what you’re going to be making.
    0:26:04 Like, tell me about the business end of Chrysalis, of that facility over your shoulder.
    0:26:08 Like, what’s it look like in there where you’re actually doing the fusion?
    0:26:10 So, there’s a bunch of machines.
    0:26:12 So, there’ll be six fusion machines in Chrysalis.
    0:26:15 They’re built, and they’re, you know, they’re being installed.
    0:26:17 And they are surrounded.
    0:26:24 So, there’s a tube in which the particle beam comes down, and it strikes tritium and makes fusion reactions.
    0:26:26 And the neutrons come out in all directions.
    0:26:30 And we’ve surrounded that tube with a uranium target.
    0:26:31 It’s uranium dissolved in water.
    0:26:35 And as the neutrons hit it, they cause it to split.
    0:26:46 And we get isotopes that are useful for medicine, things like molybdenum-99, iodine-131, which is used to treat cancer, xenon-133, which is used to image brain and heart.
    0:26:51 So, you’re actually using fusion to drive a fission reaction that makes the thing that you want.
    0:26:51 Precisely, yeah.
    0:26:53 It’s like a nuclear turducken.
    0:26:53 Yeah.
    0:26:56 Versus using fission to drive a fission reaction.
    0:26:58 And the difference is cost.
    0:27:08 If you were to look at building a new research reactor to do what Chrysalis does, you’re probably at something like five to ten times the cost when it’s all said and done.
    0:27:11 So, fusion turns out to be much cheaper and much safer.
    0:27:15 And it produces about, you know, somewhere between one and five percent the radioactive waste.
    0:27:17 Of a reactor.
    0:27:18 So, much, much cleaner.
    0:27:24 And is it right that there have actually been shortages of the isotope that you’re going to be making?
    0:27:25 All the time.
    0:27:26 Yeah.
    0:27:28 And it’s been going on for 15 years.
    0:27:31 And so, how close are you to opening?
    0:27:32 What has to happen?
    0:27:35 There’s a building behind you, but it’s not on yet, right?
    0:27:38 The equipment is almost all entirely here in Janesville.
    0:27:39 We need to install it.
    0:27:40 We need to commission it.
    0:27:42 And then we need to start pushing product out of it.
    0:27:44 When are you going to get the first neutron?
    0:27:48 I won’t say out the door, but, you know, when are you going to make the first neutron?
    0:27:53 Well, the first neutrons are actually, like, being made in a smaller building to the side that we used to practice.
    0:27:56 But the first isotopes should be made in about 18 months.
    0:27:57 Okay.
    0:27:59 So, like, end of next year.
    0:27:59 Yeah.
    0:28:04 I think, and there’s a big difference between first isotope produced and actually commercial readiness.
    0:28:04 Yeah.
    0:28:06 And your whole thing is techno-economics, right?
    0:28:10 The first isotope produced where the unit economics are profitable for you.
    0:28:10 Yes.
    0:28:13 You know, and I would say that’s probably more likely two years.
    0:28:17 But this is a plant that no one’s ever built before with technology that we have tested in the lab.
    0:28:23 But, you know, when you build a working machine that has thousands of moving parts and we’ve de-risked all the…
    0:28:23 Oh, yeah.
    0:28:26 I’m totally willing to believe that it won’t work.
    0:28:26 Oh, it will work.
    0:28:29 But the things that are going to break and burn us are, like, you know…
    0:28:32 Or that it won’t be economical, right?
    0:28:35 Like, nobody has ever done anything like what you were doing before.
    0:28:35 Yeah.
    0:28:36 It’ll be economical.
    0:28:40 I think the question is, for me, is, like, I’m worried about things like valves.
    0:28:42 We have hundreds of valves in this plant.
    0:28:45 And they might have a very low failure rate, right?
    0:28:49 But if the failure rate’s 1% on hundreds of valves, you’re going to have a lot of problems.
    0:28:51 You’re always going to have broken valves.
    0:28:52 This is your engineering training.
    0:28:54 Yeah, it’s exactly right.
    0:28:57 Like, I’ve got an earlier Model S and…
    0:28:57 A Tesla.
    0:28:58 An old Tesla.
    0:28:58 Yeah.
    0:28:58 Yeah.
    0:29:00 The motor runs fantastically.
    0:29:02 The car is still super fun to drive.
    0:29:04 It’s got 170,000 miles on it.
    0:29:07 But I’ve replaced the door handles, like, it feels like a dozen times.
    0:29:12 And it’s not fun when you suddenly can’t get in your car and you’ve got to, like, use a credit card to…
    0:29:14 Especially if it’s like, oh, look how fancy the door handles are.
    0:29:16 Just make regular door handles, man.
    0:29:18 And they went back to that, actually.
    0:29:19 They learned a lesson there.
    0:29:21 But that’s what’s going to hit us.
    0:29:27 So I think as we think about that thing, like, really producing reliably, I tell people probably two years is sort of the soonest.
    0:29:29 And it could be three, right?
    0:29:31 Like, it could be somewhere in that range.
    0:29:33 That’s the current step.
    0:29:33 Yeah.
    0:29:34 I want to get to the big dream.
    0:29:41 How many steps between making medical isotopes and creating cheap and abundant power for all of humanity?
    0:29:42 Yeah.
    0:29:45 And by the way, our steps are, like, pragmatic, not dogmatic.
    0:29:46 That’s nice.
    0:29:47 But they’ve held.
    0:29:51 So, like, if there are new market applications that come up, we’ll definitely look to include them.
    0:29:52 I won’t hold you to it.
    0:29:55 I promise I won’t hold you to your forward-looking statements.
    0:29:55 Yeah.
    0:30:00 But they have held for the last 15 years, I guess, is something I can say with confidence.
    0:30:07 So the next step is to do this transmutation, right, changing one material into another at a larger scale.
    0:30:11 And we can use that to solve one of the biggest problems with fission energy.
    0:30:15 So, again, you can see us starting to come into the fission world a little bit here.
    0:30:21 And one of the things that we should be doing as a nation is we should be recycling all of our nuclear waste.
    0:30:22 We have a lot of nuclear waste.
    0:30:27 For a while, we were going to bury it all in a mountain in Nevada, but people in Nevada didn’t like that idea.
    0:30:30 So it’s still just sort of sitting around everywhere.
    0:30:35 And it’ll be sitting around for millions of years, the right order of magnitude, if we don’t do something about it.
    0:30:35 I think that’s right.
    0:30:38 And the problem with that is it’s just loaded with value.
    0:30:39 Right.
    0:30:40 People worry about it.
    0:30:44 But also, look, look at all this energy that’s just sitting there ready to be harvested.
    0:30:45 Yeah.
    0:30:47 So why not solve two problems at once, right?
    0:30:49 It’s not super safe where it is.
    0:30:50 I mean, it’s pretty safe where it is.
    0:30:53 But if somebody wanted to do something to it, they might be able to.
    0:30:59 A lot of it’s plutonium, which is stuff that if you worked and you processed it enough, you could turn into a nuclear weapon.
    0:31:01 So we should be eliminating that hazard.
    0:31:05 And at the same time, we can solve a strategic fuel supply issue for us.
    0:31:11 Now that our relationship with Russia is not so good, you know, they were the source of a lot of the uranium that we put into our fission reactors.
    0:31:19 But if we recycle all of our spent fuel, essentially we can become totally independent of any other nation for our own fission energy needs.
    0:31:24 And the great part is the more fission reactors we burn, the more recycled fuel we have.
    0:31:26 So it just scales with the number of plants.
    0:31:36 And so you have a sort of clear technological line to using your fusion reactors to do what?
    0:31:40 To get energy out of spent fuel from fission plants?
    0:31:41 Like what do you actually do there?
    0:31:47 We’ll take spent fuel, we’ll dissolve it into a liquid form, we’ll separate out valuable materials.
    0:31:50 That includes uranium and plutonium, which should go back into the reactor.
    0:31:54 So close loop, close the fuel cycle with fission.
    0:31:59 We’ll separate out other things, precious metals, rare earth elements that have decayed enough to sell.
    0:32:04 And then you’re left with this very small waste stream, like it’s less than 5% of the original.
    0:32:08 Almost all of that has relatively short half-lives, decades or less.
    0:32:17 And a little bit of it has these really long problematic half-lives, million year plus isotopes.
    0:32:19 That’s the only place fusion comes in.
    0:32:20 It solves that problem.
    0:32:25 So fusion neutrons can transmute just like we use them to turn low value into high value.
    0:32:28 We can use them to turn long half-life into short half-life.
    0:32:38 And one great example I like to use is iodine-129, waste product from fission, lives over 10 million years, over 10 million year half-life.
    0:32:39 Which is bad.
    0:32:40 It’s going to be radioactive forever.
    0:32:40 Forever.
    0:32:41 Yeah.
    0:32:43 And you hit it with a fusion neutron, though.
    0:32:45 It becomes iodine-128.
    0:32:49 Iodine-128 has a 25-minute half-life, after which it becomes stable.
    0:32:51 And then you put it in salt.
    0:32:52 Yeah, you could, right?
    0:32:53 Like you could eat it.
    0:32:53 Yeah.
    0:32:59 So you do this process with fusion, and you solve the problem with the long-lived waste.
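
    The half-life arithmetic behind this claim, using the standard decay relation (the fraction of a sample remaining after time t is 0.5 raised to t divided by the half-life). The half-lives are the ones quoted above; the decay formula itself is textbook physics rather than something stated in the conversation.

        half_life_i128_minutes = 25          # iodine-128, after capturing a fusion neutron
        half_life_i129_years = 10_000_000    # iodine-129, "over 10 million years"

        # Iodine-128 one day after transmutation: about 5e-18 of it remains, i.e. effectively gone.
        minutes_per_day = 24 * 60
        print(0.5 ** (minutes_per_day / half_life_i128_minutes))

        # Untouched iodine-129 after a thousand years: still about 99.99% there.
        print(0.5 ** (1_000 / half_life_i129_years))
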
    0:33:05 So we want to do that in two steps, and we know how, because we're already doing both of those processes in Chrysalis.
    0:33:13 So as we look to scaling to a waste recycling plant, we’ve already got essentially a prototype for it here.
    0:33:15 And, you know, we’re going to build on that.
    0:33:20 It’s the same part of the regulatory code that would license a recycling plant, same type of construction, everything.
    0:33:25 Yeah, so that one is obviously complicated on multiple dimensions, right?
    0:33:28 I mean, you have to—whatever the technical side is, it’s the technical side.
    0:33:30 But presumably, you’re dealing with nuclear waste.
    0:33:33 There’s going to be a whole, like, political, regulatory side.
    0:33:36 That’s—what is that, a decade when you think about that?
    0:33:39 Ten years, 20—like, that’s a long game already, right?
    0:33:43 But the political winds are changing, and I’m not talking about because of the current administration.
    0:33:47 No, the world is becoming more pro-nuclear, you know, basically nonpartisan way.
    0:33:50 But people are starting to learn that you shouldn’t think in absolute terms, right?
    0:33:56 Like, are you more afraid of climate change, or are you more afraid of the very, very small risk posed by nuclear energy?
    0:33:56 Yes.
    0:34:03 And anyone who thinks about it from a mathematical perspective very quickly comes to, wow, climate change is going to hurt way more people than nuclear energy ever will.
    0:34:09 Even particulate emissions from, you know, certainly coal plants are wildly more dangerous than a fission plant.
    0:34:17 Absolutely, and if you look at, like, coastal flooding and stuff like that, multiply that by, like, tens or hundreds of times in terms of impacted people.
    0:34:25 So, okay, so you’re saying the political—still, it’s going to be—it’s going to take a while, and it’s going to be hard, despite the political shifts you’re talking about.
    0:34:26 Yeah, we’ll see.
    0:34:34 You know, the U.S. has had a long-term policy ban on recycling spent fuel, you know, but new executive orders that just came out are challenging that.
    0:34:37 Sort of trying to reinvigorate the nuclear industry.
    0:34:39 I mean, when do you think you’re going to do it?
    0:34:40 Uh, 2032.
    0:34:41 Okay.
    0:34:43 Not 10 years, but not too far off of 10 years.
    0:34:44 In a pilot plant.
    0:34:45 Okay.
    0:34:47 Because we want to prove the economics first.
    0:34:49 And then can we get to the big dream after that?
    0:34:49 Of course.
    0:34:52 When do we get to free energy for all of humanity?
    0:34:52 Now?
    0:34:53 Are we ready?
    0:35:02 So, the cool thing is, as you look at, like, these fusion systems that you use for recycling spent fuel, they look technologically very much like fusion power plants.
    0:35:05 But you’re still getting paid at least 20 times as much per reaction.
    0:35:09 And they don’t need to operate 99.99% of the time.
    0:35:12 Because, you know, people freak out if a city loses power for good reason.
    0:35:18 If you slow down recycling a material that has a 10-million-year half-life, no big deal, right?
    0:35:21 Like, you fix the machine, you get to learn, and you get to move forward.
    0:35:27 So, and we’re going to have to build dozens, if not hundreds, of these fusion systems to solve the global problem with nuclear waste.
    0:35:37 So, through economy of scale and through practice, on a much more forgiving environment where you’re getting paid more per neutron, we think we can get that next, you know, that next factor of 10 or so.
    0:35:48 So, really, in your mind, the recycling nuclear waste is like a sort of a straight line.
    0:35:51 It just ramps right up to just generating energy.
    0:35:56 Yes, in my mind, and this is very hard for a lot of people to grasp, but it really is exactly that.
    0:36:00 So, I’m glad that you put that together right away, because it is that.
    0:36:08 So, the sort of fusion reaction you would be running in that context, it’s the kind of thing that, well, let’s go back to Q.
    0:36:12 Let’s go back to this idea of getting more energy out than you put in, right?
    0:36:15 Like, in that setting, how does that happen?
    0:36:18 At some point in the future, somebody has to do that.
    0:36:18 Yeah, yeah, yeah.
    0:36:23 And I know that’s not your primary goal, and it’s a compelling case for why that’s not your primary goal.
    0:36:28 But, like, do you get to that just by incremental engineering tweaks?
    0:36:33 Are you ever going to have to, like, have some, you know, physics-level technological insight?
    0:36:39 Or do you just think you can keep optimizing and optimizing what you’re doing, and you’ll sort of eventually get to more energy out than you put in?
    0:36:42 No, we’ll need physics optimization, too.
    0:36:46 Like, so, and even going from phase one to phase two, it was new technology.
    0:36:47 Yeah.
    0:36:51 But the truth is, through practice and building over time, like, it’s a different path.
    0:36:55 And you have a different technology evolution path than trying to go straight to the endgame.
    0:36:57 And so, and it’s pragmatic, right?
    0:36:59 You’re always building systems that are doing work for customers.
    0:37:06 And so, it’s cost-effective built into the model, and it’s pragmatic built into the model, and that’s just how you design new technology.
    0:37:12 But what I’ll say is, we have our own technology that we like for scaling into phase three, recycling, and ultimately energy.
    0:37:15 But I’m the only fusion company that will say this.
    0:37:18 I don’t think it’s more than 10% likely to be successful.
    0:37:22 And I don’t think any given technology probably is.
    0:37:29 And so, what I do know, though, is we’re going to have an amazing delivery engine that can manufacture fusion systems at scale.
    0:37:35 And whatever technology is successful, I know we will have a role to play in bringing this economically to the world.
    0:37:42 When you say you don’t think it’s more than 10% likely to be successful, you mean the particular technology you are betting on using.
    0:37:42 Yeah.
    0:37:46 You think it’s very unlikely that it will work to put out more energy than you put into it.
    0:37:47 It probably won’t.
    0:37:48 It probably won’t do that.
    0:37:49 And to be clear, cost-effectively.
    0:37:50 Cost-effectively, right.
    0:37:51 Yeah.
    0:37:52 For electricity.
    0:37:53 Yes.
    0:38:00 But you’re saying you’re learning all of these things about the engineering, about the nuts and bolts that will be relevant no matter whose technology works.
    0:38:00 Exactly.
    0:38:01 Let me ask you this.
    0:38:07 I feel like if you think your technology probably won’t work, you must hope somebody else’s will, right?
    0:38:11 Like, if somebody else does it before you do it, will you be happy?
    0:38:13 Will that be good in your mind?
    0:38:13 Yes.
    0:38:15 It would be fantastic.
    0:38:20 And it’s kind of funny because, you know, it’s becoming a competitive world in the fusion space.
    0:38:22 And, like, I’m cheering for everybody.
    0:38:23 I love that.
    0:38:27 I would love to see anyone be successful in moving forward.
    0:38:31 And, look, we’re going to have an awesome economic and manufacturing engine we’d love to work with.
    0:38:35 Whatever technology prevails at the end of the day, we’re going to continue to adapt our strategy
    0:38:38 and invest in what looks like it’s doing great just so we can move fast.
    0:38:42 But this is a tool that I want to see in my lifetime come to humanity.
    0:38:45 And, like, that means looking across the page at everything.
    0:38:48 Just like we looked at fusion holistically, right?
    0:38:49 Not just the energy.
    0:38:52 We’re not dogmatic to a single technical approach.
    0:38:57 We’re going to learn a ton in the next 10 years with all this funding going into all these different approaches.
    0:39:00 And I’m really, really excited to see what comes out of it.
    0:39:05 We’ll be back in a minute with the lightning round.
    0:39:17 Run a business and not thinking about podcasting?
    0:39:18 Think again.
    0:39:23 More Americans listen to podcasts than ad-supported streaming music from Spotify and Pandora.
    0:39:27 And as the number one podcaster, iHeart’s twice as large as the next two combined.
    0:39:30 So whatever your customers listen to, they’ll hear your message.
    0:39:34 Plus, only iHeart can extend your message to audiences across broadcast radio.
    0:39:36 Think podcasting can help your business?
    0:39:37 Think iHeart.
    0:39:40 Streaming, radio, and podcasting.
    0:39:43 Call 844-844-IHEART to get started.
    0:39:45 That’s 844-844-IHEART.
    0:39:50 Let’s finish with the lightning round.
    0:39:53 What’s one thing you would do if you had free unlimited power?
    0:39:54 One thing I would do?
    0:39:59 Well, you know, so this all goes back to my nerdy childhood and space and Star Trek, right?
    0:40:05 Like, I’d love to build a series of spacecraft that would go back and forth from the Earth to Mars and otherwise.
    0:40:10 I think, you know, if you’ve got a fusion engine, that becomes very, very fast and very, very easy.
    0:40:15 You know, this nine-month travel time is actually insanely problematic for humans.
    0:40:20 The radiation you get up in space is going to be very damaging over those timeframes.
    0:40:27 And so, you know, even if we start to build a city on Mars, it’s going to be very harmful for people just to get there and back.
    0:40:28 You think you’ll go to space?
    0:40:30 Well, you know, it’s funny.
    0:40:37 I used to always want to be an astronaut, but the reality of very tiny closed-in capsules is something that I’m not, like, super big fan of.
    0:40:42 So, if we had starships or something a little more spacious, I’d love to, but not so much in today’s environment.
    0:40:48 I mean, you need fusion power to build your Cadillac to space.
    0:40:49 Exactly right.
    0:40:50 Exactly right.
    0:40:52 If you weren’t working on fusion, what would you be working on?
    0:40:55 I’d probably also be working on the same thing.
    0:41:05 I do think, like, even with fission, there are ways to build spacecraft that can go to and from the different planets in the solar system very cost-effectively and fairly quickly.
    0:41:10 Fusion would be faster, but we can get the time down to a couple months, probably, with fission.
    0:41:12 If you go anywhere in the solar system, where would you go?
    0:41:14 Anywhere in the solar system.
    0:41:15 You want to do anywhere in the galaxy?
    0:41:16 I don’t care.
    0:41:17 It’s just a question.
    0:41:18 Well, yeah.
    0:41:25 I mean, if you could go anywhere in the galaxy, it’d be great to go to some place where you could witness, like, a supernova happening from close range
    0:41:26 without being obliterated.
    0:41:29 The world’s most spectacular fireworks show would be something to see.
    0:41:39 Greg Piefer is the founder and CEO of Shine.
    0:41:44 Please email us at problem@pushkin.fm.
    0:41:47 We are always looking for new guests for the show.
    0:41:51 Today’s show was produced by Trina Menino and Gabriel Hunter-Chang.
    0:41:56 It was edited by Alexandra Gerriton and engineered by Sarah Bruguer.
    0:42:00 I’m Jacob Goldstein, and we’ll be back next week with another episode of What’s Your Problem?
    0:42:15 This is an iHeart Podcast.

    Getting energy from nuclear fusion has been a dream for decades; it would be cheap, abundant, and safer than today’s nuclear fission reactors. Billions of dollars have flowed into fusion startups in recent years, but reliable, economic fusion power may still be decades away.

    Greg Piefer is the founder of a fusion company called Shine, where he’s pursuing a different path. Rather than go straight to fusion as a source of energy, he’s using fusion to pursue more profitable markets right now – with the hope that what he learns today will eventually help lead to cheap, abundant fusion energy.

    See omnystudio.com/listener for privacy information.

  • Econ Battle Zone: Budget Showdown

    Econ Battle Zone is back! On today’s episode Mary Childs and Kenny Malone enter Econ Battle Stadium to throw down against reigning champion Erika Beras.

    Can Mary explain what effect extending the 2017 tax cuts will have on economic growth AND make her entire segment rhyme? Will Erika be able to overcome her fear of singing and craft a country song about the history of Medicaid? Can Kenny put together a piece about what warning signs economists look for to know whether the national debt has grown too large… but as a romantic comedy?

    Guest judges Betsey Stevenson and David Kestenbaum face a difficult choice… but only one contestant can claim the coveted Econ Battle Zone Belt.

    Artists featured in this episode: Rexx Life Raj (IG: @rexxliferaj); Merle Hazard; Alison Brown; Tristan Scroggins; Matt Coles; and Garry West.

    Special thanks to Liz Garton Scanlon, Robin Rudowitz and Sarah Rosenbaum.

    Find more Planet Money: Facebook / Instagram / TikTok / Our weekly Newsletter.

    Listen free at these links: Apple Podcasts, Spotify, the NPR app or anywhere you get podcasts.

    Help support Planet Money and hear our bonus episodes by subscribing to Planet Money+ in Apple Podcasts or at plus.npr.org/planetmoney.

    Learn more about sponsor message choices: podcastchoices.com/adchoices

    NPR Privacy Policy

  • Hippocratic AI’s Munjal Shah on How AI Agents Are Expanding Healthcare Capacity – Ep. 262

    Munjal Shah, CEO of Hippocratic AI, discusses how AI agents can dramatically expand healthcare capacity and access. With 1.8 million patient calls completed and an 8.95/10 satisfaction rating, Hippocratic’s safety-focused LLM demonstrates how optimized AI inference can handle routine patient monitoring, post-surgery check-ins, and medication management – freeing human clinicians to focus on complex care requiring physical intervention. Learn about their constellation architecture using 22 models for safety validation and how their healthcare AI agent app store enables clinicians to scale certain aspects of their expertise.
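    The blurb above mentions a “constellation architecture” of 22 models used for safety validation but doesn’t spell out how such a setup works. Below is a minimal, generic sketch of the pattern it gestures at – one conversational model drafts a reply, and a set of independent specialist checkers must all approve it before it reaches the patient. The function names, checks, and escalation rule are illustrative assumptions, not Hippocratic AI’s actual design.

```python
# Generic "primary model + safety validators" sketch -- illustrative only,
# not Hippocratic AI's real architecture. The checker functions are stand-ins.
from typing import Callable, List, Tuple

Validator = Callable[[str, str], Tuple[bool, str]]  # (patient_msg, draft) -> (ok?, reason)

def check_no_dosage_advice(patient_msg: str, draft: str) -> Tuple[bool, str]:
    # Hypothetical rule: the agent should not state medication dosages on its own.
    return ("mg" not in draft.lower(), "draft appears to state a dosage")

def check_escalation(patient_msg: str, draft: str) -> Tuple[bool, str]:
    # Hypothetical rule: urgent symptoms must be escalated to a human clinician.
    urgent = any(w in patient_msg.lower() for w in ("chest pain", "can't breathe"))
    return (not urgent or "nurse" in draft.lower(), "urgent symptom not escalated")

def respond(patient_msg: str, draft_reply: str, validators: List[Validator]) -> str:
    """Return the draft only if every validator approves; otherwise fall back."""
    for validate in validators:
        ok, reason = validate(patient_msg, draft_reply)
        if not ok:
            return f"[escalated to human clinician: {reason}]"
    return draft_reply

print(respond("I have chest pain", "Try to rest and drink water.",
              [check_no_dosage_advice, check_escalation]))
```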

    Learn more at: ai-podcast.nvidia.com

  • Business as a sport, Surge AI, and Waymo vs. Robotaxi

    AI transcript
    0:00:05 dude manifest is out there’s a new word uh what generative wait is high agency are we
    0:00:11 selling high agency we’re selling high agency at the top right now we’re spacking high agency
    0:00:18 it’s gone taking that cash and we’re plowing it into generative i feel like i can rule the world
    0:00:24 i know i could be what i want to i put my all in it like no days off on the road let’s travel
    0:00:30 never looking all right what i miss how was the week week was good uh what did we do we had chris
    0:00:36 corner that episode’s popped off it’s over 100k on youtube so that’s going well and uh dude there
    0:00:44 were so many replies to one idea that was in that episode i don’t know if you listened to the episode
    0:00:51 the golfing one the golfing one yeah i got literally hundreds of replies of people who are like i could
    0:00:54 do this right here in my hometown people are studying powerpoint decks people are doing drive
    0:00:58 by sending me videos of the lake where they think they could do it they’re reaching out cold it’s
    0:01:03 very intense uh how many people have replied to this and now we’re going and what was the idea was it
    0:01:09 was about betting as to where you could hit it no so basically on the way there’s a place in new
    0:01:14 zealand on the way to the golf course just kind of side of the road there’s like there’s a road that’s
    0:01:18 driving by a body of water and if you just stop on the side of the road there’s this thing called
    0:01:23 like whatever the hole-in-one challenge and you buy a bucket of balls and you’re going to try to hit
    0:01:29 this hole-in-one of this little golf hole that’s floating out you know 100 100 yards away in the
    0:01:35 water if you hit if you get it you get 10 grand and so it’s just like a fun thing for you to do with
    0:01:40 your buddies like on the way or to or from a golf course and he was talking about like you know sort
    0:01:46 of napkin mathing what he thinks it’s making based off of the available information he’s like i think this
    0:01:51 thing does like 300 to 500k you know in revenue and now the costs are pretty marginal it’s like
    0:01:55 person standing there with an ipad uh there’s a scuba diver that goes in once a week and fishes out
    0:02:02 the balls like that’s it and so people got we we basically said hey i think this idea could work
    0:02:08 in more places than just this random roadside thing in new zealand let’s bring this to life and who wants
    0:02:13 to do this and a lot of people have come come out and so we’re gonna we’re gonna make it a mfm project
    0:02:17 we’re gonna see what we can do with this so i like all the comments were like this is what i’ve
    0:02:22 been missing with mfm because like we started a lot with that and then like our interests have grown and
    0:02:26 so the content has grown to be or evolved to be a little bit different sometimes and one critique is
    0:02:32 like um uh what is this my first billion because we talk about like uh like you know bigger ideas and
    0:02:40 um i was thinking i uh you know we’ve become um acquaintances with joe lonsdale who uh because of
    0:02:46 this podcast who’s worth i don’t know billions some amount of billions and i was uh with him recently
    0:02:52 by the way if you need to pick that up let me know if you need to pick up that that name drop uh did i
    0:02:58 drop did i drop that up here did i drop that name drop somewhere uh no but he uh and he was telling me
    0:03:02 like oh man or i was with him when i got my twitter check like you know how you get like
    0:03:07 twitter money now like you’re like right for example i for some reason my twitter was a thousand dollars
    0:03:12 last payment and uh like the month before it was like six hundred dollars and i was like man this is
    0:03:16 crazy i just got paid six hundred dollars for tweeting which is insane he’s like yeah i got like
    0:03:22 four hundred dollars and he was joking about how it feels uh just as exciting every once in a while
    0:03:27 to get like a four hundred dollar thing and it does however much money he’s created in his lifetime
    0:03:32 and i was wondering do you feel like when you’re talking about these things like you just lit up
    0:03:36 when you talked about three hundred four hundred thousand dollars when that may or may not i mean
    0:03:40 i don’t think so that’s not going to really move the needle for you in your life um but it’s kind of
    0:03:46 exciting isn’t it yeah not because of the money it’s just i think it’s awesome uh i think the idea
    0:03:53 the idea itself is fun making it happen sounds like it’s going to be fun you know actually i was
    0:03:59 just uh watching um an interview with a guy who they the nba finals just ended they had game seven
    0:04:03 the thunder won and there was hey i watched it i watched it there was this interview with one of
    0:04:06 the guys so they they asked j-dub they're like you know when you look back on this year what's the
    0:04:12 what are you going to remember what were the high points and he goes he’s like it’s weird dude he’s
    0:04:17 like i remember if you think if i think about this year he’s like i remember me and chet we would go
    0:04:21 to our hotel room we would do film sessions but back when he was coming back back from injury to get it
    0:04:25 going or like these team dinners that we were having he’s like i couldn’t even tell he’s like i don’t
    0:04:30 remember what happened last series like i don’t remember in the the recent games what happened but
    0:04:37 those kind of like those inputs on the journey are like just like are so vivid to me and this has been a
    0:04:42 very common thing where if you talk to pro players after their career is done and you’re like what do
    0:04:48 you miss the most and you expect them to be like the big pressure moments that those big games and
    0:04:53 of course they do like those but the thing they talk about always is the team bus rides the locker room
    0:04:57 the it’s all of the like camaraderie stuff that happens along the way it’s like the kind of the
    0:05:03 build-up is the stuff that they miss the most and um i think i think there’s that for entrepreneurship
    0:05:10 too uh i think there’s that that’s a huge amount of the fun of it and it’s what you get excited about
    0:05:16 you need the number to sort of justify it the number gives you some air cover for why you’re acting like a
    0:05:21 little kid you’re so excited about something the numbers help because why are you taking this silly
    0:05:27 thing so seriously but uh i think we would probably all do it without the numbers as well or if the numbers
    0:05:32 are half as much or whatever yeah and i’ve noticed the best the the people who love mfm the most and
    0:05:37 the guests who you and i love the most are folks who um uh you know i hung out with a friend of mine
    0:05:41 um and she was like because she was from a bad neighborhood now she’s rich she’s like you know
    0:05:46 i’m so good at going really high and going really low i was like what’s that mean she’s like i can hang
    0:05:50 out with like my homies from where i grew up and we could just like shoot the shit and kind of be a
    0:05:55 little like hood ratty or i can go hang out with a billionaire and i could i love that too i can i
    0:06:00 have so much joy doing that as well and i can blend in and get along with everyone and i think that’s
    0:06:05 like that’s like what the pot is it’s like you like talking about these smaller things as well as the big
    0:06:10 things and it’s the same type of person who loves both yeah exactly also do you think about business as
    0:06:17 like a sport because that’s more and more become my mental model is the way that because i you meet
    0:06:22 people and a lot of people we know have now become successful but they’re still doing it and
    0:06:29 obviously for many of them i i call it they’ve already made the last dollar they’ll ever spend
    0:06:35 right let’s say you make 30 million dollars at that point you’ve already earned the last dollar you’ll
    0:06:40 ever need to spend especially once you take into account that that 30 million could just sit in a
    0:06:45 whether it’s a simple interest-bearing account or the s&p 500 and it’ll just keep it’ll double every
    0:06:51 seven years so 30 becomes 60 60 becomes 120 120 becomes 240 and that just all happened over the
    0:06:57 course of you know something like 25 years and so you’ve you don’t need to go earn the next dollar
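    A quick aside on the doubling math in that last point: "doubles every seven years" corresponds to roughly a 10% average annual return (the rule of 72), and three doublings take a bit over 20 years, which is where "something like 25 years" comes from. A minimal sketch, assuming that 10% figure (an assumption, not a number stated in the episode):

```python
# Rough check of the "30 becomes 60, 60 becomes 120, 120 becomes 240" doubling math.
# Assumes a ~10% average annual return, which is an illustrative assumption.
import math

def years_to_reach(principal: float, target: float, annual_return: float = 0.10) -> float:
    """Years of compounding at `annual_return` for `principal` to grow to `target`."""
    return math.log(target / principal) / math.log(1.0 + annual_return)

start = 30_000_000
for multiple in (2, 4, 8):  # $60M, $120M, $240M
    years = years_to_reach(start, start * multiple)
    print(f"${start * multiple / 1e6:.0f}M after about {years:.0f} years")
# Prints roughly 7, 15, and 22 years -- in the same ballpark as the "25 years" mentioned above.
```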
    0:07:02 but why do they anyways and part of it is i think it feels good to be good at something and if you’re
    0:07:05 good at something it’s hard to stop doing it because the feedback loop of being good at something
    0:07:12 is strong but i think in that same way if you think about business not as a mechanism to make money but
    0:07:17 as a sport as uh as a sport you play then it’s like oh of course just because you’re great at
    0:07:21 tennis and you won a tournament doesn’t mean you’ll stop playing tennis why would you do that that’s your
    0:07:25 sport you love to play the sport you’ll basically play the sport till your body breaks down and doesn’t
    0:07:29 let you play the sport anymore and it feels good to manifest it feels good to have an idea and to see
    0:07:34 it in the reality into reality and it’s really fun flexing that model manifest is out there’s a new word
    0:07:40 uh what generative generative what does that mean this is this has happened a few times to me now
    0:07:45 i uh wait is high agency are we selling high agency we’re selling high agency at the top right now
    0:07:52 we’re spacking high agency it’s gone taking that cash and we’re plowing it into generative
    0:07:56 okay so generative i was on a podcast and i was like at the end i was like how was that and you
    0:07:59 could tell me the truth because i do podcasts all the time with guests i know it’s sometimes hit or
    0:08:06 miss like give me the from one podcaster to another what was that like for you and he’s like it was
    0:08:12 great because you’re extremely generative he goes what and he goes it was also hard because you’re
    0:08:17 extremely generative like what does that mean he goes i’ll say two things like i’ll give you one topic
    0:08:24 but you can almost like bloom that or expand that into like a story a framework of this a related idea
    0:08:30 a simple example you just generated all that content off the cuff right away and he goes you
    0:08:34 know biology is like that biology is extremely generative you give him one thing and he’s able
    0:08:40 to like take it from like the origin of man to you know to 100 years in the future and he could connect
    0:08:45 all those dots so i heard it once i was like okay that’s cool i don’t know if i just got insulted or
    0:08:50 complimented being called generative but i'll take it and then james currier said the same thing he goes
    0:08:54 he’s like he’s like the reason we get along is because we’re both extremely generative he’s like
    0:08:59 we like being around generative people and he’s like you know why do we admire elon it’s not because
    0:09:03 he’s rich it was because he’s the most generative of all of us right and he’s the least fearful and
    0:09:07 that’s why he’s able to be more generative he’s he’s like he literally generates businesses like
    0:09:13 the boring company and neural link and spacex and tesla he’s like he’s generating kids he’s generating
    0:09:18 ideas he generates a president he generates he’s just doing so much and that’s admirable to
    0:09:23 somebody who is generative and so i started using that little lens i started looking at people being
    0:09:27 like how generative is this person meaning if you would give them an inch could they take a mile
    0:09:32 and what is their overall level of output in their life you know like how generative are they with
    0:09:37 like for example james currier it's not just businesses he's generating he you know at one point
    0:09:42 he also started a church in san francisco he like started a new religion and you know then he created
    0:09:46 this uh like sort of incubator this fund then he created a podcast he’s just constantly
    0:09:51 creating things because it’s extremely generative and it’s whether it’s with his kids life or it’s
    0:09:56 his business life or whatever so i started to realize oh yeah i’m really attracted to that i like people
    0:10:02 who are like that and i i want to be like that and figure out a way to make that work is um is a fun
    0:10:04 challenge it’s a generative is the new word
    0:10:10 all right this episode is brought to you by hubspot they’re doing a big conference this is their big
    0:10:15 one they do called inbound they have a ton of great speakers that are coming to san francisco
    0:10:19 september 3rd to september 5th and it’s got a pretty incredible lineup they have comedians like
    0:10:26 amy poehler they have dario from anthropic dharmesh sean evans from hot ones and if you're somebody
    0:10:30 who’s in marketing or sales or ai and you just want to know what’s going on what’s coming next
    0:10:34 it’s a great event to go to and hey guess what i’m going to be there you can go to inbound.com
    0:10:40 slash register to get your ticket to inbound 2025 again september 3rd through 5th in san francisco
    0:10:46 hope to see you there have you ever heard of this book called the inner game of tennis
    0:10:53 i’ve heard of it but i’ve never read it is it good yeah who’s the pro who’s it about okay so the inner
    0:10:58 game of tennis i randomly discovered it because i was at the airport and i was just looking for a book
    0:11:03 to read on my kindle and i wanted something short and i for some reason you’re like i’m in a bookstore
    0:11:08 we’re looking for books to download separately no like i i don’t remember what i was just like i think
    0:11:13 like i was on amazon on my phone and like a sports psychology book came up and i was like okay that’s
    0:11:17 intriguing what are what’s like the top sports psychology book there is or something like that
    0:11:22 and i randomly came across the inner game of tennis it’s about uh it’s written by a guy named timothy
    0:11:29 gallwey and it's one of these books that it's about life and it just uses tennis tennis as the analogy
    0:11:35 and the premise of the book is that you have two selves self one is your person so like when you say
    0:11:42 like um uh when you’re playing tennis and you hit you do a bad hit you go uh why do i suck so much or
    0:11:49 like like that is self one the critical self and then self two is like your animalistic self who doesn’t
    0:11:54 um who doesn’t uh think too much and it’s just your body and that learns by observing and it’s all about
    0:12:01 how to be generative uh and by by ignoring self one and letting self two do all the work and it gives you
    0:12:06 all of these tips and tricks on how to listen to self two and this sounds very woo woo uh and it is
    0:12:11 a little bit woo woo but the book was written in the 1970s and the coach of the seahawks writes the
    0:12:16 forward what’s his name pete carroll p carroll yeah and like every new edition they still are like they’re
    0:12:21 still releasing new editions where all these like who’s who of leaders are writing about it and i think
    0:12:25 i i didn’t realize it but after i started reading it i was like oh wait tim ferris talked about this
    0:12:30 book that’s one of his favorite books of all time and i’ve been reading it a whole lot and it applies
    0:12:35 very much to business i think it’s only 150 page book i’ve been reading um i’m almost done i read it
    0:12:41 in like two days uh the the it’s very similar or very applicable to business which is what you said
    0:12:46 about elon of he’s not fearful and things like that this book actually gives you like a set of
    0:12:50 frameworks and a way to communicate yourself in order to not be fearful when you are coming up with
    0:12:56 new ideas this is incredibly fascinating dude this is awesome i love this type of book it says
    0:13:00 the inner game of tennis the classic guide to peak performance introduction by bill gates and a foreword
    0:13:04 by pete carroll isn’t that crazy i didn’t know that that i didn’t i don’t have i don’t have the bill
    0:13:10 gates one so i didn’t know that so he wrote the the introduction that’s wild and so have you used any
    0:13:15 of this or give me like has it have you found a way to kind of apply any of these yet well so like
    0:13:21 a very simple example is like for lifting weights or for going for a run when you lift weights you’re
    0:13:24 like okay i have to lift this weight for three times and it’s the heaviest weight that i’ve ever
    0:13:30 done so i’m really scared you don’t listen to that at all and instead you just get under it and you go
    0:13:35 i’m gonna let self to do all the work i’m gonna trust self too and if i fail i will not be judgmental
    0:13:42 i’m not gonna say you suck i’m said i’ll say uh you know your knee moved in a strange way so i’m just
    0:13:47 going to objectively acknowledge what’s happening and then i put and then i when i want to lift
    0:13:52 three times i get it up on me and i just observe the weight on me and i only go for one rep and i’ll
    0:13:56 be like all right how does that feel self two let’s just do the second rep so i basically am talking to
    0:14:01 myself sort of like an objective machine not an emotional person so the whole i’m fearful i’m
    0:14:06 fearful i’m fearful you just set that aside and you go it’s self two time there is no room for that
    0:14:11 it is only room for uh objectiveness all right i did something similar to this in this in this vein
    0:14:16 that i didn’t even plan to talk about but i’ll just tell you this because i think it’s kind of similar
    0:14:23 so one thing i noticed is anytime i go into a project uh you know i obviously have a lot of excitement
    0:14:30 and i have a lot of hope at the beginning correct that’s obvious and then the second obvious thing is
    0:14:36 that i’m going to hit some sort of obstacles walls plateaus something that i don’t want to happen is
    0:14:41 for sure going to happen i’ve never once experienced a project that i just simply started everything went
    0:14:45 as planned and it had a happy ending like this literally just never happened for me to expect that
    0:14:50 to happen would honestly be a little bit foolish it’s like why would i think that that was the case
    0:14:54 yet at the same time as soon as i hit those obstacles on those walls i’m like shit
    0:15:00 like i wish this didn’t happen i don’t want this to happen why is this happening and i waste all this
    0:15:04 energy on something that was inevitable it’s like playing mario and being like oh my god i can’t
    0:15:09 believe these goombas are walking at me it’s like dude that’s the game like what do you mean like you
    0:15:13 wanted to play this game without anybody like trying to bite you you know i don’t understand what you
    0:15:19 thought this was going to be and so recently i was doing a project and uh last week i wrote out a thing
    0:15:25 in advance i’m just going to kind of read you this so i basically wrote like a simple letter to self
    0:15:31 for like two months down the road and by the way three months down the road according to tim according
    0:15:36 to the inner game of tennis when you have that feeling you don’t you do not judge it as positive
    0:15:42 or negative you say now this is a challenge okay noted and then you just keep going do you know what
    0:15:48 i mean there is no why like this is horrible this is awful why me that that that that there are no
    0:15:52 emotions you do not judge you think you don’t judge the the emotion you’re feeling you don’t judge
    0:15:58 yourself for feeling it or you don’t judge the thing both so you uh you you only objectively acknowledge
    0:16:06 it so you say like uh so the ball was out okay right noted the ball was hit too hard and then you’re
    0:16:11 you trust self two to adjust but you don't you know you know what i'm saying you do not acknowledge
    0:16:18 or judge it as i hate this i suck this is bad it just the ball was out so i’m just going to give you
    0:16:22 a little sense of how i wrote this so i was like i was like hey it’s me from the future i’m writing
    0:16:27 this to you three months from now first congrats thing you did so good turned out amazing i’m really
    0:16:32 proud of you slash me uh and i said this is a letter that is guiding you to some of the entirely
    0:16:37 predictable upcoming road bumps that are headed your way um not only is it predictable that there will be
    0:16:41 road bumps i could probably tell you right now what they’re going to be all right because like
    0:16:46 that’s true so for example i was thinking about this isn’t what i was doing but just to make it a
    0:16:52 simple example let’s say you’re trying to hire a head of sales there’s some entirely predict like you
    0:16:55 know you want to do it you know you’ll you’ll be able to do it but there’s some entirely predictable
    0:17:00 road bumps which is like um you know you’re probably going to procrastinate starting it a little
    0:17:04 bit because it’s the idea of finding that perfect person’s a little bit hard and you might put it off a
    0:17:09 little bit then then you’ll talk to some candidates who are disappointing you may even run into a
    0:17:13 candidate who’s really great but the offer doesn’t work out maybe maybe they don’t take it maybe it’s
    0:17:18 not the right time in their life etc etc so you can basically up front tell yourself yeah these four
    0:17:22 obstacles are probably going to be here i’ve played this level of the game before or i could just see
    0:17:27 what’s coming and so when they come it takes the emotional edge off of it because it’s like yeah i know
    0:17:32 there’s no i don’t feel betrayed by this i don’t feel surprised by this like i knew you just say
    0:17:38 hello to it here you are i thought i’d be seeing you soon and i also i had already kind of thought
    0:17:42 about like what i would do to get around that before it hits me and i’m in like an emotional state so it’s
    0:17:47 like yeah i’m probably gonna meet a bunch of people who are kind of disappointing and it’ll probably feel
    0:17:51 in the moment like god am i ever gonna find somebody great but of course i will i only need one and it’s a
    0:17:55 numbers game and i should i should probably just expect that i’m gonna talk to about you know 30 to
    0:18:02 40 people and that 25 of those people are gonna be truly just a waste of time uh you know in terms of the
    0:18:06 interview but that’s okay that’s that’s part of the process and you tell yourself that up front and then as it’s
    0:18:10 happening you’re like yeah well i already i already addressed this i don’t need to like react to it again
    0:18:12 because i already kind of pre-reacted to the whole thing
    0:18:18 and what uh is this project that you’re doing like big or like do you recommend doing this for a small
    0:18:23 thing or only a big thing i don’t know this is my first time actually doing it like the corny step of
    0:18:30 like writing it out to myself i’m like dear sean yeah and it’s like p.s you’re pretty fucking lame for
    0:18:35 writing this yeah exactly it's like all right that's three pages now this was cool when it was a paragraph
    0:18:44 um i think it was it was very helpful i will do it again um i will do it again i mean i don’t know
    0:18:49 how much this actually like it doesn’t it sort of blunts the pain but the pain’s still there you know
    0:18:52 what i mean it’s like when you get a shot at the doctor it’s like if you really are looking at it and
    0:18:56 hyper fixated on it and you start hyperventilating about it yeah it’s kind of a worse experience if you
    0:19:00 look away you might still feel a little prick but you know you took the edge off of it i think that’s
    0:19:05 what this has done for me all right so we’re talking about like big and small do you want me
    0:19:13 to tell you about a small thing and a big idea that are to me are equally fascinating okay go to
    0:19:21 patronview.com so i was in view okay yeah so i was with nick gray this weekend so i did this amazing
    0:19:27 or we did this amazing vacation where my friend david owns a home in utah and about eight of us
    0:19:33 or maybe six of us plus our spouses and our kids all went and hung out and uh it was amazing and nick
    0:19:39 was there and i was looking at his computer and i said nick what are you doing he goes let me tell you
    0:19:46 and it was very fascinating and so it’s called patron view uh patronview.com and so nick used to own this
    0:19:52 website or sorry own the service called museum hack where it was kind of amazing that it existed but you
    0:19:59 would pay a hundred dollars and nick or one of his tour guides would take you to uh the met and give you
    0:20:06 a sort of gorilla uh tour of the um of the museum and it was amazing and so that’s where he got really
    0:20:12 into museums and he became buddies somehow or somehow got uh uh in with the guys who do the fundraising
    0:20:17 and because he’s a business person he was like oh wow it’s so fascinating that one person is donating a million
    0:20:22 dollars 10 million dollars 20 million dollars to these museums and they do it every year into tons
    0:20:29 of different museums that’s really amazing and so recently with a mutual buddy stetson blake they
    0:20:37 built this website where it’s pretty amazing where all he did was if you go to the met or one of a dozen
    0:20:42 or hundreds of other museums they every year they have to put out a pdf that explains who donated money
    0:20:48 and how much money that person donated and so he’s aggregated all of them hundreds or maybe even
    0:20:56 thousands and he used ai to upload all of them into a database so if you are fundraising for a museum
    0:21:01 i believe if i had to guess you’re going to be able to pay his service money to find out who the whales
    0:21:07 are you know whatever and it’s crazy that because of ai he was able to make this he told me uh for two
    0:21:14 thousand dollars i’m just going to read the about page it says uh we’re a research platform dedicated
    0:21:18 to documenting cultural philanthropy i’ve never actually heard that before which just shows how
    0:21:22 like much of a noob i am about philanthropy but that makes sense so people who donate to things that
    0:21:27 are about culture and then it says where and then it says the data our research is pulling from annual
    0:21:33 reports 990 tax filings institutional publications official documents and proprietary sources this lets us
    0:21:37 present donor information that’s never before been displayed we like to think of it as celebrating
    0:21:43 philanthropy and enabling development departments pretty cool right awesome it’s great right like
    0:21:47 this is i was i was like nick what’s your deal here like you want to turn this into a business
    0:21:52 and he’s very nick is happy like he’s not looking for anything he’s like i don’t know i’m just
    0:21:58 tinkering and in my head as someone who is probably less uh you know uh content than him i was like
    0:22:03 oh man like nick you could do this you could do this you could do this and that’s like how the entire
    0:22:07 conversation uh came about but isn’t this pretty cool that he’s like building this and this is his
    0:22:13 hobby and the fact that ai has made this so easy yeah dude this is great i mean nick i’ve already
    0:22:18 you know really shouted him out on here a ton of times because he’s somebody who’s made a big impact
    0:22:23 on me just seeing the way this guy rolls through life i’m like he just does things for his own amusement
    0:22:29 he does things on his terms and i think he does things with high intentionality and he’s doesn’t
    0:22:34 see and he basically resisted the rat race i think those are the people i admire the most of all is the
    0:22:40 people that have resisted the rat race like i think he neither chases money nor status um and
    0:22:44 if you think about the people who are talented and successful in your life how many do you think
    0:22:51 actually truly are resisting money and status very very few i know probably two people him and jack
    0:22:57 smith yes it’s pretty crazy and so you just sort of watch their moves and then you look at them and
    0:23:01 you know you can kind of learn from them so this is this is extremely cool like and what’s funny about
    0:23:08 nick is every two or three years or something like that he likes to uh find a publicly traded company that
    0:23:13 he loves and he’ll make a big bet on it and right now his or for the past probably four or five years
    0:23:17 actually his bet has been cloudflare like for some reason i don’t know he’s got all this analysis he
    0:23:22 like loves it to the point where like when he hosted an event he specifically hosted his event in the
    0:23:28 cloudflare uh event space because he’s like so loyal and he’ll wear cloudflare t-shirts whatever
    0:23:34 um like one time there was a a race uh like a like a 5k or a marathon through austin and he’ll like
    0:23:38 hold up a sign that says like cloudflare rules like this because he wants that you told me like
    0:23:43 at his birthday party he had his birthday party at the cloudflare office and then midway through the
    0:23:47 birthday party he ran upstairs and got like two like like a product manager like a marketing manager
    0:23:54 to come down and be like hey everybody quick word from jack from the marketing department why don’t
    0:23:59 you just tell us about the great things you got going on at cloudflare the guy’s like uh yeah so you
    0:24:04 know before and before he brought that guy in he goes i need everyone to treat jack from cloudflare
    0:24:09 like a celebrity and so when he walked in we go oh my god is that jack are you the vp of
    0:24:15 engineering at cloudflare oh my god he's here he's here the stock is the stock is up 400% in the last
    0:24:20 five years so he’s done pretty well he’s done well and if you click the about page i know for a fact
    0:24:26 so he lists an area that says technology patron view patron view is built with modern web tech to
    0:24:33 ensure fast reliable access to data and he only did that so he could list that he uses cloudflare i know
    0:24:38 that’s exactly how he thought um but the reason i’m bringing this up is i think that if you’re like
    0:24:43 just starting to build a business or something you should follow patron view or like go there like go
    0:24:48 there once a week and i and i would bet that you’re gonna see like it evolve like uh you know it’s sort
    0:24:53 of like measuring your kid on the wall like you’re gonna see like the measuring like like that’s what’s
    0:24:57 gonna happen with this cool too that i think another another cool thing about this is this fits
    0:25:03 into like a genre that you know personal software so uh or maybe social software so basically
    0:25:11 uh when the internet came out before pre-internet the only people that made media were media companies
    0:25:17 you know you got your media from the new york times and the huffington post whatever like newspapers
    0:25:23 magazines tv etc and then when the internet came out and you got facebook and twitter and instagram and
    0:25:27 snapchat then social media became a thing and everybody became a little broadcaster right everybody
    0:25:31 broadcasted little moments of their life or their content or their their interest whatever it was
    0:25:38 and there was this explosion like you know a sort of like 1 billion x increase in the amount of media
    0:25:43 that was created because everybody was doing it and like one clear thing i see that’s happening in the
    0:25:47 world today is that that’s now happening with software so software used to be something that only
    0:25:53 software companies and software engineers could make and you know there’s only like i don’t know
    0:25:58 there’s less than a hundred million roughly software engineers like proper like professional software
    0:26:04 engineers in the world so you know a hundred million out of eight billion people could do the thing and
    0:26:08 you know if in terms of software companies there’s even less maybe a hundred thousand uh software
    0:26:14 companies i don’t know it’s order of magnitude roughly and now with like replet and v0 and all these
    0:26:19 different tools it’s going to be like social media we’re like oh i have i carry in my pocket a thing
    0:26:23 that can make media it’s like i carry my pocket a thing that can make software so a guy like nick
    0:26:29 who before this probably couldn’t have taken his idea and made it into an app because he would have to
    0:26:34 either a learn to code or b go hire like expensive programmers to make this happen like he did most of
    0:26:39 this with ai and so you see personal software this like you know this personal software category which was
    0:26:45 like didn’t exist three years ago or five years ago um is now going to have the same sort of like
    0:26:52 one billion x you know increase uh just because anybody who’s got an idea can now make their idea now
    0:26:58 today it’s like broken three-fourths of the time doesn’t quite work but like every six months that number
    0:27:03 goes down by 15% and so you know within two or three years that number is going to be like zero right
    0:27:07 it’s going to be like when you have an idea you make your app everything that i’ve been making on
    0:27:13 replet and lovable and cursor it’s basically just like a figma replacement like i’m just like it’s
    0:27:17 basically just like drawing on paper yeah it’s just like a mock-up and you still need someone to like
    0:27:20 actually do the work but it but it’s a sick mock-up
    0:27:27 like it looks yeah somebody called it minimum viable promise so it’s a minimal viable product it’s
    0:27:31 like it’s not really a product but it’s like it kind of has like you make a promise you can see the
    0:27:34 promise of something and i think that’s what a lot of these tools are able to do today
    0:27:42 this episode is brought to you by hubspot media they have a cool new podcast that’s for ai called
    0:27:46 the next wave it’s by matt wolf and nathan lands and they’re basically talking about all the new tools
    0:27:51 that are coming out how the landscape is changing what’s going on with ai tech so if you want to be
    0:27:56 up to date on ai tech it’s a cool podcast you could check out listen to the next wave wherever you get
    0:28:03 your podcast have you heard of a guy named edwin chen edwin chen i mean there’s like you probably
    0:28:09 have there’s probably six thousand of them on my facebook feed yeah i went to school in beijing i
    0:28:17 think i got a few edwin chens in my rolodex edwin chen might be the like if you uh did like a chart
    0:28:23 of like richest slash unknown slash youngest person in the world i think it’s gonna be edwin chen is this
    0:28:33 the guy who’s doing surge so yeah so edwin chen uh in like 2018 2019 he worked at facebook and the
    0:28:40 story is is that he was tasked with like making some type of yelp style product and what that meant was
    0:28:47 he had a list of 50 000 vendors and he needed to figure out which of those 50 000 was a restaurant
    0:28:55 and which were uh a grocery store uh and so he went and hired a firm uh some company to like
    0:29:00 parse it out and it’s like manual you you had to do it manually like you had to like hire some firm
    0:29:05 that had a lot of uh offshore talents to go through and do it all manually by hand and he was like it took
    0:29:10 us four months or six months something like that which basically just meant we had to sit and wait
    0:29:13 like we couldn’t do anything until we had that data so i just had to sit and wait
    0:29:21 and so he had this idea where he was going to make a better way to do data labeling and the data
    0:29:27 data labeling is important now because that is what a lot of ai companies use which i had no idea they
    0:29:32 did that and i’ll explain how they do that but basically when a company like open ai uh wants to
    0:29:39 figure out if a certain reply is unethical so like for example asking like if it is it okay to like
    0:29:44 hit someone or i don’t know like whatever like questions you would ask it a real person and
    0:29:48 actually not just a real person but like a really smart person uh even someone who like does engineering
    0:29:55 or uh philosophy needs to spend time going through all the potential answers and to tell open ai i think
    0:30:00 this one sort of fits what you’re going for but anyway edwin had this idea of i’m going to create this
    0:30:07 massive workforce of philosophers of engineers of ivy league grad grads who can go through and label all of
    0:30:14 these answers as good or bad so ai companies can kind of i can be like their offshore talent and so
    0:30:22 he’s done this uh and it started in 2020 now he has a hundred thousand people who are in the marketplace
    0:30:27 working for him as these data labelers and this company is completely unknown so if you i think it’s
    0:30:35 surge ai i believe is the url so if you go to surge ai it’s a landing page with one paragraph that’s an
    0:30:39 amazing paragraph if you want you can read it you want to read it yeah i was just reading it what
    0:30:44 made people like hemingway callow and von neumann so extraordinary their life the books they read the
    0:30:48 stumbles they had the reinforcement every time friends laughed at their jokes and every time they didn’t
    0:30:53 it’s the people that met the police explored at every decision they made along the way data does for ai
    0:30:59 what life does for humans it elevates the neural networks that know nothing about the world into the
    0:31:05 intelligence capable of producing new art sending rocket ships to mars etc our mission is to shape
    0:31:09 agi with the richness of human intelligence curious witty imaginative unexpected brilliance we wake up
    0:31:14 every day trying to produce the data that makes this possible amazing right romantic it’s romantic
    0:31:24 and then this guy made like a giant fleet of overseas data labelers sound like the the army from 300
    0:31:31 yeah yeah it’s the it’s the best and the real website i believe is data annotation dot tech that’s the
    0:31:38 website where the uh that’s the website where the uh where the annotators go to apply but the way the
    0:31:44 business the way yeah it’s much more traditional and that one there’s like a brown dude staring at a
    0:31:49 laptop with a reflection blurring his eyes and it says get paid to train ai on your schedule and so the
    0:31:55 way the business model works is they have a hundred thousand of these folks and they train them on
    0:32:01 different standards and whatever and then they’ve also made software so uh they can show the the the
    0:32:07 basically homework or tasks to their folks and a company like open ai or uh google whatever whatever
    0:32:13 is going to pay a surge millions and millions of dollars and surge is then going to take something like 30 or 40
    0:32:19 percent of it and give it to the annotator to do the work so this company is only five years old
    0:32:27 and it’s uh it was leaked that they did one billion dollars in revenue in the last 12 months and this
    0:32:34 guy edwin chen he's only 37 years old and he owns 100% of the company they have not taken any outside
    0:32:41 funding now listen their biggest competitor is a company called scale scale is run by this guy named
    0:32:47 alexander wang i think alex alex wang i think his name is and it recently sold for something like 30
    0:32:52 times revenue i believe it they were doing like 800 900 million in revenue they just sold half of the
    0:32:59 the company to facebook i think it was for 28 billion or 30 yeah 30 which means which means this guy uh edwin
    0:33:08 who’s 37 and has a five-year-old company presumably is worth something like 30 billion dollars and you
    0:33:14 can’t find him on twitter he has no blog you can’t find photos of him he used to have a blog but you have
    0:33:19 to go to web archive in order to find it because he took it down and his customers are like edwin is not
    0:33:24 online you can’t find him anywhere and we like it that way his born his business is very boring the
    0:33:30 branding is basically non-existent and it just does a very good job and compared to scale who’s like you
    0:33:35 know the hottest kid on the block like alex wang was just on theo von’s podcast he was at the inauguration
    0:33:41 he’s kind of like kind of like uh the the it guy right now these guys are the exact opposite
    0:33:44 you’re not going to find them anywhere they only have 100 employees they’re totally under the radar
    0:33:54 and uh it’s super super fascinating dude this is uh this is wild uh i i did not know that he
    0:33:59 bootstrapped the whole thing i i also had never heard this company until scale got bought i’d never
    0:34:04 heard of this company so they’re the company is is killing it now because scale got bought so because
    0:34:09 scale got bought is now owned basically by facebook google and a bunch of other companies they go ah we
    0:34:12 don’t want to we don’t have with you anymore we’re going straight to surge but they were already winning
    0:34:19 they were at a billion in revenue and scale was at 750 million and the reason why they're winning is
    0:34:25 because they are they charge a premium and they’re uh they he’s like i don’t we got scale but it’s like
    0:34:31 i wasn’t trying to get scale meaning i wasn’t trying to grow big i was trying to hire the best people
    0:34:37 and to train them really well and i charged for it i charged three times what scale charges and the
    0:34:42 results have been better and people really like us because of it and this whole table data labeling
    0:34:47 industry i had no idea about this i i didn’t know that people were behind the scenes making these
    0:34:53 decisions it’s kind of wild i mean this is one of the best like uh picks and shovels businesses so if
    0:34:59 you’ve never heard picks and shovels it’s a the idea is like anytime there’s a gold rush who makes the
    0:35:04 money yeah it’s the the few people who find the gold but the more reliable way to make money is
    0:35:09 just to sell picks and shovels to everybody else who’s rushing into the gold rush and scale and surge
    0:35:15 were the best picks and shovels businesses maybe besides nvidia because what they were doing is
    0:35:20 saying cool everybody everybody wants to compete to become the you you want to make agi you want to
    0:35:25 make agi you’re all raising billions and billions of dollars well all of you have this same problem and i will
    0:35:30 sell the data labeling service to all of you and this is so funny that now that facebook is buying
    0:35:38 scale it’s like there’s all that revenue has to find a new home like this that is crazy that that’s
    0:35:44 the best news ever for this guy and there’s another company called handshake so if you go to joinhandshake.com
    0:35:51 previously or it still might be this but they were known as a company that helped recent college
    0:36:00 graduates get jobs and so basically they’re a job board or job network for uh 22 year olds dude yeah
    0:36:09 this was for college kids okay well listen to this they noticed a few months ago uh that surge and uh
    0:36:19 scale were using their service to find these these data annotators and so they go ah we’re gonna do that now
    0:36:26 and so in a very short amount of time they pivoted and that business that they have is gonna be at a
    0:36:30 hundred million dollars a year in the next couple months in a very short amount of time because what
    0:36:36 they did was they went and just said oh you are looking for a data uh annotation gig we got you
    0:36:39 let’s uh let’s go ahead and get your training and we’re just gonna provide that service to folks
    0:36:46 and so handshake is uh building that business now dude that’s so crazy i remember using this because
    0:36:51 i was like oh it’s interesting that nobody’s really built the the kind of like one place to
    0:36:57 go hire college interns or college like fresh grads and they built this like marketplace where you could
    0:37:02 go post on a job board at my local college here and i could get but it was like kind of crappy dude it
    0:37:07 was like it wasn’t great it was very like little liquidity in the market but i remember thinking like
    0:37:11 this is an interesting idea somebody like it’s a marketplace i like marketplaces somebody should do this
    0:37:16 right and um i remember they were kind of like puttering along for a while it seemed like
    0:37:21 and uh this is so funny that they pivoted to this and now we’re going to just explode yeah and if you
    0:37:28 google uh handshake data annotation you can find the blog posts that they uh that they wrote on them
    0:37:32 announcing that they were doing this and so it basically just says that for the past decade
    0:37:37 handshake has changed how college students started their careers and then it goes on to basically say
    0:37:43 we’re changing the company to like just hire just do this thing and it’s already making and they don’t
    0:37:46 actually say this but like it’s now making a hundred million dollars a year
    0:37:51 and you know i don’t know how long this stuff will last like you know this might be a business that
    0:37:57 i think in like seven to ten years you may not need this anymore like it seems like the way ai is going
    0:38:05 you may not need this kind of human in the loop um to label all this data either either they label
    0:38:11 they label enough data where then the model learns how to label data you don’t need humans doing this
    0:38:17 or they use a thing that doesn’t have the rlhf right like you just do reinforcement learning without
    0:38:23 human feedback and i think some people who are kind of pure believers in ai think you won’t need
    0:38:29 the human feedback um at a certain point so this might be a get get while the getting’s good type of business
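    For context on the human-feedback step being discussed here, this is a minimal sketch of the kind of pairwise preference record that RLHF-style pipelines collect from annotators and use to train a reward model. The schema and example are illustrative assumptions, not Surge AI's or OpenAI's actual format.

```python
# Minimal sketch of an RLHF-style preference label -- illustrative schema only,
# not any vendor's real format.
from dataclasses import dataclass

@dataclass
class PreferenceLabel:
    prompt: str       # what the model was asked
    response_a: str   # candidate answer A
    response_b: str   # candidate answer B
    chosen: str       # "a" or "b", as judged by a trained human annotator
    rationale: str    # free-text justification, useful for quality control

label = PreferenceLabel(
    prompt="Is it okay to hit someone who insulted me?",
    response_a="No. Violence isn't an acceptable response to an insult; walk away or de-escalate.",
    response_b="Sometimes, if they really deserved it.",
    chosen="a",
    rationale="A refuses to endorse violence and offers a constructive alternative.",
)

# A reward model is then trained so that score(prompt, chosen) > score(prompt, rejected)
# across many such records; that learned scorer steers the base model during RLHF.
print(label.chosen)
```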
    0:34:34 so let me let me tell you a a potential counter to that so tim westergren founded the company called
    0:38:45 pandora and i think he started it in 1990 uh uh maybe 98 so it was like pre-iphone um wait when the
    0:38:52 iphone come out oh wait oh six yeah so it was like probably like 2002 then and anyway uh he told me this
    0:38:56 story because we had him talk at one of our events where he was like i raised seven million dollars
    0:39:01 and all seven million dollars of that went to hiring basically ex-musicians or musicians who
    0:39:07 were teachers and didn’t make a lot of money and for two years i had about 150 of them listening to music
    0:39:13 and i gave them basically a scantron of all types of attributes that a song could potentially
    0:39:17 have and so if you’re listening to the beatles you would fill out like okay it sounds like it’s at like
    0:39:22 uh 90 beats per minute it sounds like there’s guitar like it’s melodic it’s light-hearted
    0:39:28 whatever and after two years of doing this he put all the data basically scantrons into this algorithm
    0:39:35 that he built and he started playing like uh uh he told me a beatle song and then and then he clicked
    0:39:40 next and it would suggest new music that was similar to the one that the beatle song that he originally
    0:39:44 played and he said the bee gees came up and he was like the bee gees and the beatles they're not similar at
    0:39:47 all what the hell and then he kept clicking next he’s like oh wait they have the same melody
    0:39:52 or they had they all like have the same like they make me feel similar and he was like it’s working
    0:39:57 it’s working and so originally his idea was i’m going to create kiosk at best buy so you could say
    0:40:00 i’m interested in beatles but here’s like five other songs that best buy could show you and you
    0:40:05 will buy those cds while you’re there and then the iphone came out and he was like oh my god this is
    0:40:11 actually the exact way to apply this and so this idea of data data labeling has been around forever and i
    0:40:15 didn’t when i was reading scale or about surge i was like oh my god this is exactly what tim was
    0:40:20 explaining to me how pandora started and so this has been around for 20 years and so you say i don’t
    0:40:25 know if it's gonna be around or not but i don't know it's been around for 20 years so far
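    The Pandora anecdote is essentially hand-labeled feature vectors plus a similarity search: humans score each song on a fixed set of attributes, and recommendations come from nearest-neighbor similarity over those vectors. A minimal sketch of that idea; the attribute names and scores below are invented for illustration, not Pandora's Music Genome data.

```python
# Minimal sketch of "label songs with attributes, then recommend by similarity".
# Attribute names and scores are invented for illustration.
import math

def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

# Each song: human-assigned scores for (tempo, guitar, melodic, lighthearted), each 0..1.
songs = {
    "beatles_track":  [0.6, 0.8, 0.9, 0.8],
    "bee_gees_track": [0.7, 0.6, 0.9, 0.8],
    "metal_track":    [0.9, 1.0, 0.3, 0.1],
}

seed = "beatles_track"
ranked = sorted(
    (name for name in songs if name != seed),
    key=lambda name: cosine(songs[seed], songs[name]),
    reverse=True,
)
print(ranked)  # the bee gees track ranks above the metal track for this seed
```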
    0:40:31 yeah that's true but uh it's kind of like like self-driving which is coming out now i've taken the waymos in san
    0:40:38 francisco and robo taxi in austin the tesla the tesla self-driving just launched in austin um i think
    0:40:43 like two days ago or something but they took two different approaches so waymo basically has this
    0:40:50 really expensive car i forgot the all-in cost but it’s something like 150 to 300 000 is the cost of
    0:40:57 the car with all the sensors on it right so they have this really expensive car with lidar and in
    0:41:03 addition to the lidar they um hard code and hard map the road so you know for for years they would drive
    0:41:09 around and basically like map the road physically and they could only launch in cities where they
    0:41:16 had mapped the roads and tesla took this other approach which was basically cameras only no lidar
    0:41:21 and we’re not gonna hard map the roads we’re gonna let people drive around and then the car
    0:41:25 needs to have a brain that’s smart enough to figure out a road even if it’s never been on that road
    0:41:32 before and uh it was this interesting bet because elon was like lidar is not only we’re not doing it
    0:41:37 it’s stupid and that’s a dead-end path and everybody else was all in on lidar everyone’s like lidar makes
    0:41:44 it safer it’s better you can’t do this without lidar and elon’s point was we humans drive with just
    0:41:49 eyes we only have cameras we don’t i don’t have a lidar in my brain and i’m able to drive safely
    0:41:57 right and lidar is what lidar is uh uh like you’re shooting some type of signal and it bounces back
    0:42:01 you can see through things so i don’t know exactly what’s the difference with lidar radar and all these
    0:42:06 different things but like it’s another version of basically scanning that allows you to do what a
    0:42:10 camera can’t camera can’t see through an object lidar can it can sense that there’s another another
    0:42:16 object behind it so let the classic example is like you know you know maybe you’re gonna do a turn
    0:42:20 there’s something obstructing your view but then there’s a little old grandma walking on the crosswalk
    0:42:24 but you couldn’t see the grandma until you started the turn visibly but lidar would know that there’s
    0:42:30 something there’s an object there that’s that’s moving point is other sensors besides cameras whereas
    0:42:34 elon was like no we’re just gonna put like whatever eight cameras on the car and that’s gonna make it work
    0:42:39 and for a long time there was a big debate some experts thought elon is wrong some were
    0:42:45 just like elon is correct and in elon we trust and very smart people were on both sides of the debate
    0:42:50 and it was like a very high stakes debate because self-driving cars is one of the most valuable prizes
    0:42:55 that there is uh like self-driving cars i don’t think people really realize it i think because it’s
    0:43:01 i think because people talked about it for a while they got kind of numb to it this actually happened
    0:43:06 with ai too people have been talking about ai maybe machine learning deep learning for a long
    0:43:11 time people didn’t really realize when something actually had changed and then suddenly like wait
    0:43:15 it’s actually here and the same people who had been tracking it for a long time were almost
    0:43:21 late to the party because they mistakenly wrote it off as yeah yeah yeah i’ve heard this before
    0:43:24 and so the same thing's happening with self-driving cars we're sort of like yeah yeah yeah but it's like
    0:43:30 wait a minute it’s actually happening now um it’s because it’s an extreme game changer both like
    0:43:33 for society for tesla’s business right like tesla’s business is now going to be
    0:43:39 if you own a tesla instead of 95% of the time your car just sitting parked you're going to
    0:43:44 just tap a button say go make me some money please and like a dog it’s going to go fetch it’s just going
    0:43:47 to go out there and it’s going to start doing rides for people and it’s going to start earning
    0:43:53 you money passively uh all the time dude i think morgan stanley or chase one of the big
    0:43:59 banks last week wrote this report where they had to say what the world's
    0:44:06 going to look like with self-driving and it wasn’t like um it was far more grand they’re like the
    0:44:12 economy is going to look radically different because people are going to have so much more time like it
    0:44:17 was like at a macro scale it was like oh like the world will change because of this but it was also like
    0:44:23 there's i think 60,000 car deaths a year like what's the world gonna look like with more
    0:44:27 people like like it was like a pretty meaningful like it was like a very grand way of thinking about
    0:44:31 it wasn’t just like oh wow i could play on my phone while i’m walking or driving to work right it was
    0:44:36 like no everything changes i i asked last night i asked grok i said what are the second order effects
    0:44:41 of self-driving cars here’s what it said so it’s like cities are going to look completely different
    0:44:48 right now parking lots themselves occupy 30% of all urban land um in some cities and this is gonna
    0:44:51 you’re not going to need parking lots because the cars aren’t going to just sit parked they’re going
    0:44:55 to be rolling around you’re going to need way less cars in a city plus they don’t they’re not going to
    0:45:00 sit still so you don’t need all of the space just look around a city how much space is dedicated just
    0:45:04 to parking like we're going to look back and that's going to look sort of like a caveman-style
    0:45:08 thing it’s like because in the future those are going to be parks and yeah it’s going to be
    0:45:13 it’s going to be smoking in a restaurant yeah exactly and so like the good version of this is
    0:45:18 that’s like you know green spaces and affordable housing but like who knows maybe maybe it actually
    0:45:23 gets co-opted for some other purpose they all just become like you know drone delivery uh you know
    0:45:31 parking units where amazon keeps like 10 million delivery drones um the next one is labor so right now
    0:45:36 there’s three and a half million truck drivers alone let alone all of the like uber and taxi drivers
    0:45:43 and you’re just not going to need that job period like and i don’t know what happens to that but
    0:45:50 there we go uh the next one is you know basically i think the average person spends something like 90
    0:45:58 minutes a day just commuting and so you get you know of your wake time let’s say you’re awake for 16 hours
    0:46:02 you’re going to add you know what is that so let’s just pretend it’s two out of 16 you’re going to add
    0:46:09 like you know 13% more time to everybody's day where they can now sleep eat work play right you're
    0:46:12 going to sit in a car and you’re not going to have to think about the car you’re just going to be able
    0:46:18 to do one of those things which also means the car becomes a new place for entrepreneurs to build
    0:46:24 experiences right like today there's no one out there being like i build car games right
    0:46:27 there's people who build mobile games and xbox games but nobody builds car games well
    0:46:30 car games are going to become a thing because people are going to sit in cars and play video
    0:46:34 games people are going to sit in cars and they’re going to relax recover they’re going to work and
    0:46:39 so you’re going to build tools that that go in them uh another one is insurance it’s like the whole
    0:46:45 insurance system like you know buffett's big bets and geico and all those things it's all based on
    0:46:50 human driving and so if humans aren't driving anymore like both the risk and the risk-reward
    0:46:57 ratios change but also who are you insuring you’re insuring the software company versus like individuals
    0:47:02 like how’s this all going to work and so that whole and all the whole insurance industry uh changes um
    0:47:09 and then basically like car ownership so today owning a car is both like utility but also status symbol
    0:47:13 so it's going to be kind of interesting like you're a car guy like i wonder when there are self-driving
    0:47:18 cars and basically transportation is just on tap like flowing like water right you just
    0:47:22 you push a button and in 30 seconds a little car the car of your liking pulls up it's just going to be
    0:47:26 like uh it’s going to be like people who like horses now like it’s going to be a small group of people
    0:47:31 who are passionate yeah it’s just like oh you’re passionate about it and you are lucky enough to
    0:47:35 have enough room or enough money to like afford it but uh like maybe i would like buy a groupon and
    0:47:40 can go experience that once in my life like that’s what it’s going to be yeah or like you know like
    0:47:43 horseback riding is like therapeutic people like to like brush a horse or pet a horse it’s
    0:47:48 going to be like that with a car it’s going to be like male therapy to just like get in there and
    0:47:53 just be behind the wheel have control over something in your life yeah it’s good or like it’s like punk
    0:47:58 yeah you could like feel the noise and like smell the gas like it’s going to be um it’s going to be
    0:48:01 like a hobby yeah it's not going to exist i don't think i think it's going to take a lot
    0:48:07 longer um it could be 20 years 25 years it's not going to be in the next five years
    0:48:10 but yeah it’s going to be a hobby are you sure about that why do you think it’s not gonna be in
    0:48:18 the next five years waymos are now doing 20% of all the rides in san francisco and that was zero
    0:48:25 have you ever like a large percentage of americans have to
    0:48:34 drive um let's say 60 miles one way to work or they have to like pull stuff or carry stuff
    0:48:39 i just don't think i think that for the urban um there's
    0:48:44 probably going to be like four sections of users so it's like young urbanites
    0:48:47 and it’s like yeah you guys don’t need a car at all like you’re you’re doing probably already there
    0:48:53 with uber yeah and then like the far end of that spectrum is like uh rural people who have to actually
    0:48:57 tow stuff you know even though everyone has a truck very few actually use it but there’s like that section
    0:49:01 and there’s like the people in between and there’s going to be like a timeline because like if you ever
    0:49:07 you can't really tow anything with an electric car right now uh they say you can but
    0:49:10 go talk to someone who lives in rural texas when you have to like be driving shit around all day it’s
    0:49:14 like impossible so i think that there’s going to be like um it’s going to be like for
    0:49:17 um you know what’s that early but when you say it’s not going to be five years are you saying it’s
    0:49:22 not going to be meaning self-driving is not going to work it’s not no it’s going to work it’s just not
    0:49:27 going to just the user adoption it’s like it’s going to it’s going to take a minute for that for the
    0:49:31 whole spectrum of people i think for the urbanites and people like that it’s it’s tomorrow we’re
    0:49:36 going to do it i think yeah i mean that guy towing probably still has an aol
    0:49:41 email address right yeah yeah like so i think it’s pretty safe to say that that person’s not
    0:49:46 yeah it might be 40 years before that person it’s going to be a long time but then like you
    0:49:52 know there’s like a lot of you know people like i’m one of them like i’m romantic about my gas vehicle
    0:49:57 i had an electric car and i got rid of it and i’m like in my head i’m like i acknowledge it’s better
    0:50:03 i acknowledge that like it’s the future but it sucks i want it’s like our vegan friends it’s like
    0:50:09 i get it yeah we shouldn’t kill creatures but it just tastes so good yeah but when you dip them in
    0:50:11 ranch it’s fantastic
    0:50:19 uh but i’m excited too what’s crazy is in austin i think or sf people are actually paying more for
    0:50:25 the waymos yeah yeah it's not that much cheaper yet but people want to
    0:50:31 not be around someone and uh that was unexpected so like when i drive my bmw that
    0:50:36 has self-driving stuff i feel way safer on that than if it were just me uh and i think that there’s
    0:50:42 like 20% of people and it's usually men i've noticed women tend to hate it every woman i've talked
    0:50:48 to hates self-driving and every man i've talked to likes it and uh do you have
    0:50:54 any self-driving now um no well i don't have it so i haven't had that level of uh i
    0:50:59 haven't had a sample size no i've noticed that i'm curious if that's common or if you're just like
    0:51:04 indexing on three people no it’s well yeah i am but uh yeah it’s like five of my friends like the
    0:51:08 husbands use it and the wives are like nope i don’t mess with that i don’t use it right but i feel way
    0:51:15 safer with it so you guys know this but i have a company called hampton joinhampton.com it’s a vetted
    0:51:20 community for founders and ceos well we have this member named lavon and lavon saw a bunch
    0:51:25 of members talking about the same problem within hampton which is that they spent hours manually
    0:51:30 moving data into a pdf it’s tedious it’s annoying and it’s a waste of time and so lavon like any great
    0:51:36 entrepreneur he built a solution and that solution is called molku molku uses ai to automatically
    0:51:41 transfer data from any document into a pdf and so if you need to turn a supplier invoice into a customer
    0:51:46 quote or move info from an application into a contract you just put a file into molku and it
    0:51:51 auto fills the output pdf in seconds and a little backstory for all the techies out there
    0:51:56 lavon built the entire web app without using a line of code he used something called bubble io
    0:52:01 they’ve added ai tools that can generate an entire app from one prompt it’s pretty amazing and it means
    0:52:07 you can build tools like molku very fast without knowing how to code and so if you’re tired of copying
    0:52:15 and pasting between documents or paying people to do that for you check out molku dot ai m-o-l-k-u dot
    0:52:21 ai all right back to the pod you want to do one more thing or you have something well i have a
    0:52:26 so i tweeted something out that elon replied to over the weekend and how did that make you feel did you
    0:52:34 like did you like clap and like scream no so i first of all i played it so cool you wouldn’t if you had
    0:52:39 seen me you would have thought i might be under the weather that’s how cool i was playing it
    0:52:46 and actually what happened is um i just texted my wife and i was like oh not elon replying to me
    0:52:53 and uh and then i just i forgot about it next day i didn’t even think about it i moved on my mom calls
    0:53:00 me she’s like sean what did you say i’m like what she’s like sean what did you say elon what did you say
    0:53:05 say to elon and i was like what my wife put it up on her instagram story and i was like oh my god
    0:53:10 i’m trying to play it cool over here and then you made it like you know lame city so that felt
    0:53:14 interesting that i got like multiple phone calls from people and i was like that’s like the only time
    0:53:21 your wife has shared something that is when another person replied to you yeah exactly and so i thought
    0:53:27 that was interesting how big of the reaction was but the thing i had said was i wrote within a couple
    0:53:33 years not using ai while you’re doing your job will be the equivalent of coming to work without a
    0:53:37 computer like if someone just turned up and they’re like no i didn’t bring it today you’d be like what
    0:53:43 the hell dude like what are you planning to do what is the what’s the plan here that’s how it’s going to be
    0:53:49 if you’re if you’re trying to do your job and you’re not using ai constantly to do your job yeah
    0:53:56 that was good he was like you know sooner probably and um and so that was like and so i started thinking
    0:54:02 about that and i started thinking about um somebody else said this thing they go pretty soon being a
    0:54:07 doctor who's not using ai as a co-pilot like let's say you're a radiologist and you're just trying
    0:54:12 to eyeball every every mri and you’re not also running it through ai that’ll be considered
    0:54:20 malpractice because like you put the patient at risk by not at least including the second layer
    0:54:24 of ai diagnostics and i thought that’s pretty interesting it’s like the flip is going to go
    0:54:29 so much from this doesn’t work you know something we don’t do we don’t even use it to if you’re not
    0:54:34 using it it's considered malpractice whether it's like corporate malpractice or medical malpractice
    0:54:39 my doctor friend admitted to me the other day he goes open ai is a better doctor than me
    0:54:45 and uh he was like and i knew this was going to be popular because for years he’s been a doctor for
    0:54:52 10 years patients come to me and said well google says this or web md says this and he says over the
    0:54:58 last six months the only people who use that reasoning are using openai and they say well according
    0:55:04 to openai yeah chatgpt said this uh and he goes and they're right a lot of times the diagnosis
    0:55:08 is right dude i got in a fight with a doctor recently about this did i tell you this what
    0:55:14 did they say my mom had to have a surgery and but she was on a trip and so i’m like calling in
    0:55:19 to the doctor every time the doctor would make her rounds she would like facetime me in
    0:55:24 um because she’s on the other side of the country and so the doctor would come in and like doctors
    0:55:29 doctors are very hit or miss i love some doctors but a lot of doctors i’m like wow this is an extremely
    0:55:36 underwhelming experience and so this one doctor comes in and she’s like yeah your levels were fine
    0:55:42 and then i’m like i actually read the test through chat gpt and the levels were like high for this
    0:55:45 and she’s like well which level and i’m like i tell her i’m like whatever the thing
    0:55:53 whatever the term was and she’s like yeah that was high but you know um it depends on the exact number so i go
    0:55:59 what was the number and she goes i would have to check i'm like you're the doctor so yeah you would have to
    0:56:03 check like you know what are you what are you talking about and i’m like you know they basically
    0:56:08 chat gpt said if it’s above this then you should consider doing this like additional additional step
    0:56:12 like do you believe that that’s like do you agree with that like do you think we should do that
    0:56:17 that step she’s like well i mean you’re putting me on the spot here and i don’t have the number
    0:56:22 and i’m like and she basically was getting pissed and she’s like well if you’re going to ask me
    0:56:26 questions that i’m going to need to go look at the number and i literally was like yeah you are going
    0:56:31 to need to go look at the number then what are we doing here i don’t understand like why are you
    0:56:39 offended by me asking if you have seen the data from the test you just said to run
    0:56:42 and now you’re coming back to discuss the test results and you don’t want to look at the test
    0:56:46 result i don’t really understand what’s happening here well i think what’s going to happen is that
    0:56:52 you know how have you noticed so uh have you ever been to a doctor now with an ai scribe so like they
    0:56:58 have like okay so for a long time oh i was humiliating her in front of her ai scribe is that what
    0:57:02 happened well for a long time they could have been human scribes and so like have you been to
    0:57:06 a doctor and seen like a person on an ipad like literally it looks like the doctor’s facetiming
    0:57:10 typing notes yeah yeah and that’s like a scribe now they have ai scribes and i think what’s going to
    0:57:17 happen is like the ai is going to like speak up and be like actually ma'am he's right uh like
    0:57:22 like i think that's what's going to happen and if i was an entrepreneurial doctor i would 100%
    0:57:31 start a new practice all centered around we are ai first so we work with our ai you know and i don’t
    0:57:35 think we are at the point and maybe we'll never be at the point where you totally
    0:57:41 trust it just like you always want the pilot even if autopilot is still a thing right but i would like
    0:57:46 go heavy on that uh of leaning into like we have all of the context here uh we have all of your files
    0:57:51 uploaded to our chat gpt or whatever it is and have an ai first because i think that a lot of people
    0:57:55 like you and me and people listen to this podcast they have a similar sentiment where they’re like
    0:58:00 oh no i trust a computer way more than a human being but i would also want the human being to
    0:58:05 put their stamp on it and i want to sue them if they're wrong because it's not even that like oh
    0:58:10 the ai found the problem and the doctor didn’t sometimes it’s just as simple as like cool the
    0:58:15 doctor came in they talked kind of fast they didn’t fully explain i still have more questions
    0:58:19 and so you go and you ask chat gpt to explain it to you maybe simpler or you ask some follow-up
    0:58:23 questions maybe you’re not as embarrassed to ask questions you feel like you’re not like
    0:58:27 you know the person’s not like in a rush to get out of there like a lot of doctors are
    0:58:32 and so sometimes it’s not even that the ai doctor is better because it’s smarter sometimes it’s because
    0:58:38 it’s infinitely patient or it’s an infinitely better communicator or you know it knows um you know maybe
    0:58:41 other things about you or you know you could ask some follow-up questions you don’t feel silly doing
    0:58:48 it like those are other components of the doctor experience essentially bedside manner that ai is
    0:58:56 better at yeah and uh so i’m like very eager to see how this works i go to um i go to a doctor now a
    0:59:02 concierge doctor and it’s not very expensive but the reason i go there is the average at most doctors
    0:59:07 they have to see four patients an hour so they're at 15 minutes each right isn't that insane i remember i
    0:59:13 went to a doctor and like i had an earache i’m like guys my ear is killing me and uh like he spent no
    0:59:17 time like trying to like help me like figure this out and i went to a concierge doctor and the average
    0:59:22 time is 45 minutes so we can like thoroughly walk through things and so if i can just use all the
    0:59:28 information that they have and then go and ping chat gpt to further the conversation it is pretty brilliant
    0:59:32 um i'm very eager to see what's going to happen like people act like ai is amazing for a bunch of
    0:59:37 different stuff and it is but what they’re doing with medicine and drugs and cancer and things like
    0:59:41 that is like pretty astounding and i think that’s going to be the major breakthrough in the next couple
    0:59:49 years dude the other one lazy ass parenting so your kid’s a little young for this but it is amazing
    0:59:56 dude i’ll open up gemini and it has like a camera mode but why do you use different ones you’ve said
    1:00:01 claude or sorry you’ve said uh grok and now you’re saying gemini and then we also refer to
    1:00:04 you use different ones it’s like you know you go to your different friends for different questions you
    1:00:08 only ask me certain questions sometimes you go to jack smith and sometimes you go to your you know
    1:00:13 you go to joe you go to different people for different things so like if you want to be if you
    1:00:19 want something that’s a little bit more real and objective i think grok is better if you want something
    1:00:26 that’s uh either code or creative writing claude is better you know they the catch-all is chat gpt
    1:00:29 and then gemini has some like advanced features so this is what i’m saying like gemini has the thing
    1:00:35 where you just turn your camera on like facetime and i think it’s for like maybe you’re supposed to like
    1:00:38 show it your car and be like how do i repair this and it like tells you what to do
    1:00:44 but i just pointed at my kids and i’m like hey we’re playing charades guess what they’re doing and
    1:00:49 then my kids will like get on the ground and start like crawling and it’s like hmm seems to be a boy
    1:00:55 crawling maybe it’s a snake is are you a worm and it like tries to guess it and they love it dude and so
    1:01:01 i’m able to just straight up chill and let them play with ai it is amazing another one i’ll do is i’ll just
    1:01:07 be like hey i have a five-year-old and a four-year-old here and they want trivia questions they like animals
    1:01:12 they like paw patrol they like um you know they know a little bit about pokemon but nothing too complicated
    1:01:16 um ask them a bunch of questions cheer them on when they get it right if they get it wrong tell
    1:01:21 them the right answer keep track of the score here’s their names go that’s the prompt and it plays trivia
    1:01:25 endlessly with my kids and they love it because it’s all audio which kids can do they don’t have to like
    1:01:32 be on screens to be able to do this and uh so i’m just discovering like game after game i can play with
    1:01:39 them like i'll do like i basically replaced kumon with hey i need um advanced kindergarten math which like
    1:01:43 i don't even know what that means but it gets it for whatever reason those three words give me
    1:01:49 the sweet spot of like the questions that kind of work for my kids and um it's like a tutor right
    1:01:54 it’s an infinitely patient tutor with them and it’s not perfect in the sense of like you know sometimes it
    1:01:59 like starts and stops the audio because if you make any sound it thinks you're talking but damn it's
    1:02:07 pretty it’s pretty good and uh it’s like already usable for us i’ve not seen i didn’t even i didn’t know much
    1:02:12 much about gemini gemini live i had no idea what this was is this google this google gemini is like
    1:02:17 after summer break you know that one kid who comes back it’s like they’re kind of like hot now but you
    1:02:24 still have the old image of them like their reputation is still being like not hot yeah but objectively
    1:02:30 they’re hot now yeah nobody’s really on it yet that’s what gemini is gemini was basically out of
    1:02:35 the game google's ai tools were out of the game it was just chatgpt and grok she changed yeah
    1:02:41 and then she changed and it's like wait she got contacts and she learned how to do
    1:02:45 her hair she watched a makeup tutorial she started rollerblading which was
    1:02:51 like surprisingly good cardio and now like suddenly gemini could do things that like the other ones
    1:02:56 can’t do but nobody’s on it yet which doesn’t really actually give you any benefits wait so gemini is
    1:03:03 hot now gemini’s hot now google’s hot google’s hot yeah i don’t know man that’s hard for me to buy
    1:03:09 into but yeah because you’re one of those jocks at school who’s just stuck in seventh grade you forgot
    1:03:15 what happened over seventh grade summer all right i’ll use this yeah i’m just stuck on chat gpt
    1:03:20 uh and i don't use gronk because i'm shocked when people say they use gronk i'm like wait so you go to
    1:03:27 like twitter.com to use it or grok.com is that the same thing that's the twitter one
    1:03:33 yeah because steph smith just got a job at this other one what was that other one called oh no she
    1:03:39 got it at groq with a q that's stupid naming unfortunate unfortunate i'm a shareholder
    1:03:47 of groq with a q also but unfortunate naming situation yeah and it's ai as well they're making chips
    1:03:57 okay well they should change their name because yeah that doesn’t make sense or at least the
    1:04:02 pronunciation right like i don’t know like i don’t know how you also used to be grok or something like
    1:04:07 that i don’t know what they’re gonna do they could be grok i guess but they yeah grok is so it’s the
    1:04:13 same i love how you’re putting the n in there like it’s rob gronkowski wait what did i say you’re
    1:04:24 saying gronk oh what is it grok grok yeah like the shoes crocs uh yeah like crocs yeah wait and so
    1:04:29 what is the twitter thing what do you mean what is it oh that's not gronk oh i
    1:04:37 i thought it was gronk yeah there's no n in any of them that guy's a football player he's a retired
    1:04:38 football player
    1:04:45 dude i went to montana to visit a friend last week and i wore overalls because they’re like the best
    1:04:52 i saw a photo of that and i just thought to myself holy shit this guy's got no
    1:04:59 limits he's just wearing overalls as standard wear it's the best clothing because
    1:05:03 you could put your phone and your wallet right there on the chest and so you can hold the kids
    1:05:06 and like you just have so many pockets that you have like this right here and i love it and she’s
    1:05:11 like oh you got these did you think that we’re all cowboys here and i was like huh but she’s like
    1:05:14 you wore your overalls to montana are you trying to make fun of us i was like what are you talking
    1:05:25 about i uh i’ve worn these for years like i am not pretending uh no i actually just got it by the
    1:05:30 way i was very inspired by your instagram post you wrote something uh the caption of your post you go
    1:05:36 from now on i’m only taking photos that if my kid looked at it 20 years from now they’d be like
    1:05:43 my dad was pretty cool i thought that was great that’s that’s because you have that photo of your
    1:05:48 father right of him when he was in his 30s and you’re uh a baby and he’s like doing something
    1:05:53 cool he’s wearing a cool shirt and you’re like oh wow dad was sick or like oh yeah you don’t see them
    1:05:57 like that anymore right like they don’t have hair anymore they’re like fat now or whatever and so
    1:06:01 you don’t you don’t see that side of them but like it lets you put a little respect on their name when
    1:06:04 you see them young like oh damn that's actually kind of fly what they're
    1:06:11 wearing so i was smoking a cigar which i never do but i was smoking a cigar
    1:06:15 and like they were gonna take a photo with my kid or someone had a camera i was like go take photos
    1:06:20 and i put the cigar i used to hide it i would hide it behind my back and i’m like no fuck this she’s
    1:06:27 gonna be proud like so i put it back in dude you think smoking is gonna be cool in 30 years
    1:06:33 that’s gonna be like you had like a slave with you or something it’s gonna be crazy that you were just
    1:06:39 smoking with a baby on your shoulder brother have you seen the photo of the eight guys sitting on the
    1:06:45 beam off the like empire state building that’s a great picture i think i think to myself those guys
    1:06:52 are crazy they’re dangerous but they’re fucking hard that is awesome and so i will never be on the beam
    1:06:56 of the empire state building a thousand feet above the air but at least i could smoke a cigar and look
    1:07:03 look remotely cool dude we should print this out i want this framed dude three of them have overalls
    1:07:10 very similar to the ones you were wearing yeah what’s up same make and model yeah you just need
    1:07:13 this like beret hat you probably have this what am i talking about of course you have this hat
    1:07:21 yeah and the courage to eat lunch a thousand feet above uh the ground which is like even back then
    1:07:25 the co-workers were like guys what are you doing there’s a cafeteria like right here
    1:07:32 like what the fuck all right that's it that's about it i feel like i can rule the world i know i could be
    1:07:50 what i want to all right so when my employees join hampton we have them do a whole bunch of
    1:07:55 onboarding stuff but the most important thing that they do is they go through this thing i made called
    1:07:58 copy that copy that is a thing that i made that teaches people how to write better and the reason
    1:08:04 this is important is because at work or even just in life we communicate mostly via text right now
    1:08:10 whether we’re emailing slacking blogging texting whatever most of the ways that we’re communicating
    1:08:15 is by the written word and so i made this thing called copy that that’s guaranteed to make you
    1:08:20 write better you can check it out copy that dot com i post every single person who leaves a review
    1:08:24 whether it’s good or bad i post it on the website and you’re going to see a trend which is that this is a
    1:08:28 very very very simple exercise something that’s so simple that they laugh at they think how is this
    1:08:32 going to actually impact us and make us write better but i promise you it does you got to try
    1:08:39 it at copy that dot com i guarantee it’s going to change the way you write again copy that dot com

    Want to scale your startup? Get 700 prompts for your side hustle: https://clickhubspot.com/fpg

    Episode 720: Sam Parr ( https://x.com/theSamParr ) and Shaan Puri ( https://x.com/ShaanVP ) talk about how Surge built a $1B business in 5 years and the second-order effects of self-driving cars.

    Show Notes:

    (0:00) Best idea of the month

    (6:00) Business as a sport

    (10:57) The inner game of tennis

    (14:21) Shaan writes a letter to himself

    (19:15) Patron View

    (27:42) Surge

    (35:00) Handshake

    (39:47) Waymo vs Robotaxi

    (50:30) Elon replies to Shaan’s tweet

    (53:13) Shaan gets in a fight with his mom's doctor

    (1:02:50) Sam wears overalls

    Links:

    • The Inner Game of Tennis – https://tinyurl.com/mphz9zkr

    • Patron View – https://patronview.com/

    • Museum Hack – https://museumhack.com/

    • Surge AI – https://www.surgehq.ai/

    • Handshake – https://joinhandshake.com/

    • Gemini – https://gemini.google.com/

    • Grok – https://grok.com/

    Check Out Shaan’s Stuff:

    • Shaan’s weekly email – https://www.shaanpuri.com

    • Visit https://www.somewhere.com/mfm to hire worldwide talent like Shaan and get $500 off for being an MFM listener. Hire developers, assistants, marketing pros, sales teams and more for 80% less than US equivalents.

    • Mercury – Need a bank for your company? Go check out Mercury (mercury.com). Shaan uses it for all of his companies!

    Mercury is a financial technology company, not an FDIC-insured bank. Banking services provided by Choice Financial Group, Column, N.A., and Evolve Bank & Trust, Members FDIC

    Check Out Sam’s Stuff:

    • Hampton – https://www.joinhampton.com/

    • Ideation Bootcamp – https://www.ideationbootcamp.co/

    • Copy That – https://copythat.com

    • Hampton Wealth Survey – https://joinhampton.com/wealth

    • Sam’s List – http://samslist.co/

    My First Million is a HubSpot Original Podcast // Brought to you by HubSpot Media // Production by Arie Desormeaux // Editing by Ezra Bakker Trupiano

  • What Do Medieval Nuns and Bo Jackson Have in Common? (Update)

    AI transcript
    0:00:05 Hey there, it’s Stephen Dubner.
    0:00:09 We heard from a lot of listeners who really liked our most recent episode, which was called
    0:00:12 What It’s Like to Be Middle-Aged in the Middle Ages.
    0:00:17 So I figured you might want a little bit more medieval programming.
    0:00:18 Here is a bonus episode.
    0:00:22 It’s an updated version of an episode we first published in 2013.
    0:00:27 It’s called What Do Medieval Nuns and Bo Jackson Have in Common?
    0:00:31 The answer is probably not what you think.
    0:00:33 As always, thanks for listening.
    0:00:46 It probably was pretty darn painful because you’re not living in a world with good razors.
    0:00:51 The chances are what they’re using is kitchen cutlery, I would imagine.
    0:00:54 And that is not necessarily all that sharp.
    0:00:57 I can’t imagine how painful it was.
    0:00:58 That’s Lisi Oliver.
    0:01:04 When we spoke with her for this episode around 12 years ago, she was studying medieval law
    0:01:05 at Louisiana State University.
    0:01:10 And what do you think she’s talking about that was so darn painful?
    0:01:17 Between the 5th and the 12th century in early medieval Europe, barbarity swept through the continent
    0:01:19 and also the island of England.
    0:01:23 And often the targets of these attacks were monasteries and nunneries.
    0:01:29 But nunneries, you had the added incentive of rape to add to sort of pillage and destruction.
    0:01:34 For a nun, rape was especially problematic, aside from the obvious reasons.
    0:01:43 Rape violated a nun’s chastity, which meant that as a bride of Christ, she might be forbidden entry into heaven.
    0:01:48 So what do you do if you are a nun and there are barbarians at the gate?
    0:01:54 In the 9th century, one nun, an abbess who came to be known as St. Ebba, came up with a plan.
    0:01:58 Here’s Lisi Oliver reading from a history by Roger of Wendover.
    0:02:06 The abbess, with an heroic spirit, took a razor and with it cut off her nose, together with her upper lip onto the teeth,
    0:02:09 presenting herself a horrible spectacle to those who stood by.
    0:02:15 Filled with admiration at this admirable deed, the whole assembly followed her maternal example
    0:02:17 and severally did the like to themselves.
    0:02:22 When this was done, together with the morrow’s dawn, the pagan attackers came.
    0:02:27 On beholding the abbess and the sisters so outrageously mutilated and stained with their own blood
    0:02:32 from the sole of their foot unto their head, they retired in haste from the place.
    0:02:37 Their leaders ordered their wicked followers to set fire and burn the monastery with all its buildings
    0:02:41 and its holy inmates, which being done by these workers of iniquity,
    0:02:46 the holy abbess and all the most holy virgins with her attained the glory of martyrdom.
    0:02:52 There’s a very graphic picture of St. Ebba cutting her nose and lip off
    0:02:56 and all of the women around her looking thrilled at the concept.
    0:03:00 In terms of pain, it must have just been dreadful to cut your nose off at night
    0:03:05 and then wait until the morning with that pain wracking your body.
    0:03:08 But that is the pain of martyrdom.
    0:03:09 It’s the crown of thorns.
    0:03:14 I know it’s hard to transpose oneself to a different time and place,
    0:03:19 but if you could put yourself back in a nunnery,
    0:03:24 do you think you would have followed suit and gone ahead and cut off your own nose to spite your face?
    0:03:24 Probably.
    0:03:25 Why?
    0:03:34 I think that there is a wave of hysteria that follows that kind of action where I don’t think I would have been number two,
    0:03:36 but I probably would have been number 20.
    0:03:38 I mean, it’s the happening thing, man.
    0:03:39 We’re all cutting our noses off, right?
    0:03:44 Now, why are we telling you this grisly tale?
    0:03:51 Because the theme of today’s show is spite, as in cutting off your nose to spite your face.
    0:03:59 Scholars aren’t certain, but this phrase quite likely originates with the practice of medieval nuns like St. Ebba,
    0:04:04 women who mutilated themselves in an attempt to preserve their chastity.
    0:04:07 Now, economics is all about trade-offs.
    0:04:10 Everything has a cost and a benefit.
    0:04:13 What do you make of the nun’s trade-off?
    0:04:15 Was it worth it?
    0:04:35 This is Freakonomics Radio, the podcast that explores the hidden side of everything, with your host, Stephen Dubner.
    0:04:47 Today’s show is about spite.
    0:04:54 We’re going to look at why people sometimes try to hurt others, even when it’s very costly to themselves.
    0:04:58 It struck me that spite is, in some ways, an economic concept.
    0:05:02 So, I called up the economist I know best, Steve Levitt.
    0:05:08 He’s my Freakonomics friend and co-author and host of the podcast, People I Mostly Admire.
    0:05:19 So, when I think about spite as an economist, the way I would think of spite is that it is the response of an individual who has been wronged in some way by another,
    0:05:28 who then is willing in the future to pay a large cost in order to punish the person who wronged him in the first place.
    0:05:41 So, in a strange sense, it’s not a very economic concept because, in general, we don’t think that people are going to be overly willing to pay a lot of costs themselves to punish other people.
    0:05:45 Yeah, I think what you described is more revenge than spite, though.
    0:05:46 I don’t know.
    0:05:47 Maybe I don’t even know what spite is.
    0:05:48 What is spite?
    0:05:49 Excellent question.
    0:05:51 Well, it’s not so easy, indeed, to define spite.
    0:05:53 And that’s Benedikt Hermann.
    0:05:56 He is also an economist, originally from Germany.
    0:06:00 Today, he works as a research officer for the European Commission.
    0:06:04 He has done a lot of research on antisocial behavior.
    0:06:06 You might even call him a scholar of spite.
    0:06:24 Let’s have an easy start here and define spite as a behavior where an individual is ready to harm him or herself at own cost to harm somebody else without creating anything good for a third party, for anyone outside.
    0:06:35 Because you could sometimes be nasty to somebody just because he or she has misbehaved, and you would like to do it in a kind of educational way, which then I would not call spite.
    0:06:37 Because it’s not costing you anything.
    0:06:47 No, if I’m punishing somebody who has misbehaved to a community, to our group, if I punish him or her at own cost, it could look like spite.
    0:06:50 But it’s not spite because it’s an educational momentum.
    0:06:55 You try to get somebody who has done something bad to behave better in the future.
    0:06:59 So it’s a kind of moralistic way of punishing, a moralistic way of being aggressive.
    0:07:02 And so it’s not the kind of spite I’m after.
    0:07:13 I’m after the kind of spite or kind of behavior where somebody would harm others for no reason, for no moral reason, apart from something that might satisfy him or herself only.
    0:07:24 Traditional economics argues that most people try to satisfy their self-interest, to maximize their profits and opportunities.
    0:07:30 Economists have a name for this model of self-interest, homo economicus.
    0:07:33 But within that framework, spite is a bit puzzling.
    0:07:39 Why would someone pay outsized costs for no benefit other than to hurt someone else?
    0:07:45 Well, Benedict Hermann thinks that the idea of homo economicus is a bit archaic.
    0:07:47 He prefers a different term.
    0:07:49 Homo rivalis, yes, indeed.
    0:07:57 Homo rivalis, meaning that humans are driven at our core by competition rather than simple self-interest.
    0:08:01 Homo economicus wants to get as much as possible for himself.
    0:08:07 Homo rivalis just wants to make sure he gets more than the other guy.
    0:08:16 In other words, as much as we like to think that we are absolute animals, we are, in fact, relative animals.
    0:08:22 Now, we know this in part through the experimental games that economists like to play.
    0:08:25 One of the classics is called the ultimatum game.
    0:08:26 Here’s Steve Levitt again.
    0:08:35 So the ultimatum game is a little experimental game that the behavioral economists have developed in which two players come into the lab and they’re completely anonymous.
    0:08:36 They’ll never meet each other.
    0:08:37 It’s a one-shot game.
    0:08:45 And one player is given, say, $10, and they’re allowed to divide that $10 however they’d like between themselves and the other player.
    0:08:52 That other player is then informed about the way in which the division has occurred and is given a choice.
    0:09:02 They can either accept the division, say, $7 for the person who’s splitting the pot and $3 for me, or I have another option to say, no, I prefer both of us to get zero.
    0:09:12 So you always face a choice between, as the recipient of the ultimatum, is I can accept what the other person offered me, or I can have us both get zero.
    0:09:19 And empirically, what we see is that rarely will anyone accept an offer that’s less than 20%.
    0:09:36 So if the person who splits the pot divides it more unevenly than 75/25, you're almost guaranteed to have it rejected, even though the rejecter is giving up the 25% or the 20% of their own money in order to take the 75% or the 80% away from you.
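    To make the payoff structure Levitt describes concrete, here is a minimal sketch in Python, not from the episode: it encodes the $10 pot and the accept-or-both-get-zero rule, and the 20% rejection threshold is only an illustrative assumption about responder behavior based on the pattern he cites.

```python
# Minimal sketch of the ultimatum game Levitt describes: a $10 pot,
# the proposer offers a split, and the responder either accepts it
# or rejects it, in which case both players get nothing.
# The 20% rejection threshold is an illustrative assumption, not a rule.

POT = 10.00  # dollars given to the proposer to divide

def ultimatum_payoffs(offer: float, accepted: bool) -> tuple[float, float]:
    """Return (proposer_payoff, responder_payoff) for one round."""
    if not 0 <= offer <= POT:
        raise ValueError("offer must be between 0 and the pot")
    if accepted:
        return POT - offer, offer
    return 0.0, 0.0  # rejection: both get zero

def responder_accepts(offer: float, threshold: float = 0.20) -> bool:
    """Accept only offers of at least `threshold` of the pot (the empirical pattern cited)."""
    return offer >= threshold * POT

# A $7/$3 split is typically accepted; a $9/$1 split is typically rejected,
# even though rejecting costs the responder their own dollar.
for offer in (3.00, 1.00):
    print(offer, ultimatum_payoffs(offer, responder_accepts(offer)))
```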
    0:09:41 Now, to an economist, this might seem perplexing.
    0:09:48 Why am I willing to throw away two or three of my dollars just to make sure that you don’t get seven or eight?
    0:09:54 Well, maybe it’s because I feel you’ve wronged me by splitting the pot so unevenly.
    0:09:57 But remember what Benedict Hermann said earlier about spite.
    0:10:04 True spite, as he sees it, is not motivated by a desire to punish someone’s bad behavior.
    0:10:09 So he wanted to see how people behave absent such a moral incentive.
    0:10:12 He and a colleague came up with an experiment.
    0:10:17 So let me quickly try to explain here on the radio how this experiment worked.
    0:10:21 So you would be invited to our experiment like many other students.
    0:10:22 You don’t know each other.
    0:10:24 You come to our lab inside.
    0:10:26 You have to sit behind computers.
    0:10:29 You are requested not to talk with anyone during the whole experiment.
    0:10:33 So you’re paired with another player, but you don’t see that person.
    0:10:37 You each get $10 and then you’re given an option.
    0:10:45 If you surrender $1 of your money, you can destroy $5 of the other person’s wealth.
    0:10:48 Now, there’s no revenge going on here.
    0:10:53 There wouldn’t seem to be anything for you to gain by destroying the other person’s money.
    0:10:59 But, as Benedict Hermann found, about 10% of the players did take that option.
    0:11:03 Hermann calls such a player a difference maximizer.
    0:11:10 That means that we want to maximize the payoff differential between the opponent and us.
    0:11:20 So, maybe in a more picturesque way, being aware that we are losing our trousers for the sake and for the hope that the opponent will lose both the shirt and the trousers.
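    A similarly minimal sketch, again not from the episode and with the players' choices shown purely for illustration, of the money-burning setup described above: $10 endowments, and the option to pay $1 to destroy $5 of the other player's money.

```python
# Minimal sketch of Hermann's money-burning experiment as described above:
# each player starts with $10 and may pay $1 to destroy $5 of the
# other player's money. Roughly 10% of subjects chose to burn.

ENDOWMENT = 10.00
BURN_COST = 1.00    # what the burner pays
BURN_DAMAGE = 5.00  # what the other player loses

def payoffs(a_burns: bool, b_burns: bool) -> tuple[float, float]:
    """Return (player_a, player_b) payoffs given each player's choice."""
    a = ENDOWMENT - (BURN_COST if a_burns else 0.0) - (BURN_DAMAGE if b_burns else 0.0)
    b = ENDOWMENT - (BURN_COST if b_burns else 0.0) - (BURN_DAMAGE if a_burns else 0.0)
    return a, b

# The "difference maximizer": burning leaves player A poorer in absolute
# terms ($9 instead of $10) but ahead of player B ($9 vs $5).
print(payoffs(a_burns=True, b_burns=False))   # (9.0, 5.0)
print(payoffs(a_burns=False, b_burns=False))  # (10.0, 10.0)
```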
    0:11:30 In other words, some people were always willing to cut off their noses to spite the other player.
    0:11:32 Hermann was perplexed by this finding.
    0:11:40 And he tried the experiment in a variety of versions, variety of settings, different parts of the world, different kinds of societies.
    0:11:52 But in each case, he found that a surprising number of people would give up some of what was theirs for the sole purpose of taking something away from someone else.
    0:11:57 And what are you, as the researcher, thinking?
    0:12:03 Are you thinking this is remarkably surprising, sad, strange, irrational?
    0:12:08 What is your—I mean, on the one hand, you must be excited because for the sake of a paper, it’s a fascinating finding.
    0:12:12 This exactly—these are the two souls of a researcher.
    0:12:15 Of course, on the one side, exactly as you said it very nicely, you are very excited.
    0:12:19 But on the other side, of course, you start thinking, oh, my God, who the heck are we?
    0:12:20 We, the humans.
    0:12:30 For me, the outcome of all this research is definitely a kind of sadness and also worry that we can be too fast.
    0:12:38 We humans, we can get too fast into intergroup conflict, which don’t make any sense to anyone.
    0:12:49 That we start to harm each other, that we start innocent people to kill each other for something that at the end of the day could have been decided in a much more reasonable way.
    0:13:00 Now, as interesting as this may be, as believable as it may be, Steve Levitt warns us not to make too much of lab experiments like these.
    0:13:07 It’s hard to extrapolate from a lab setting to the hurly-burly of the real world.
    0:13:10 When people are in the lab, they’re completely anonymous.
    0:13:11 It’s the only time we’ll ever play.
    0:13:14 But the real world isn’t usually like that.
    0:13:15 Indeed.
    0:13:19 So, after the break, we’ll get back to the real world.
    0:13:29 See if we can find a story where someone willingly gives up money and not just a few bucks like in these lab games, but lots and lots of bucks in order to prove a point.
    0:13:34 Well, the contract he was offered was five years, $7.66 million.
    0:13:36 That’s coming up.
    0:13:41 After the break, I’m Stephen Dubner, and you are listening to a bonus episode of Freakonomics Radio from 2013.
    0:13:43 We’ll be right back.
    0:14:02 Dave O’Connor is a longtime TV and film producer who’s now president of Time Studios.
    0:14:08 Years ago, he executive-produced a documentary film for ESPN called You Don’t Know Bo.
    0:14:10 Bo as in Bo Jackson.
    0:14:14 He’s remarkable, and look at that one.
    0:14:16 Bo Jackson says hello.
    0:14:17 He’s just so quick.
    0:14:18 Look at that burst.
    0:14:21 Bo Jackson back, leaps, and he makes the kick.
    0:14:23 Nobody catches Bo.
    0:14:24 The answer is no.
    0:14:26 Bo has a no.
    0:14:27 Bo on the charge.
    0:14:29 Bo is there.
    0:14:31 Bo knows exactly what he’s doing.
    0:14:32 Spider-Man.
    0:14:40 Bo was probably the single greatest athlete of his generation.
    0:14:46 Two-sport star, football and baseball, and was just a transformative athlete.
    0:14:51 He just physically, there’s something about his presence that feels different than normal human beings.
    0:14:52 Is it a bird?
    0:14:53 Is it a bird?
    0:14:54 Is it a plane?
    0:14:54 Is it a plane?
    0:14:56 It's Super Bo!
    0:15:04 In the spring of his senior year, Bo Jackson was playing college baseball at Auburn.
    0:15:10 He showed signs of being a very highly valued Major League Baseball player.
    0:15:12 I’m tearing the cover off the ball.
    0:15:14 I’m batting over 400.
    0:15:18 Oh, I don’t know how many home runs I was sitting on then.
    0:15:21 That’s Jackson himself from the film.
    0:15:27 Now, he had just completed his senior season of college football, which had gone even better.
    0:15:29 Dave O’Connor again.
    0:15:35 Football, his senior year, is one of the all-time great seasons of a running back in college football.
    0:15:38 He rushes for nearly 1,800 yards.
    0:15:45 He wins the Heisman Trophy and basically enshrines himself as a legend of college football.
    0:15:50 Sort of the common wisdom was that Bo will be the number one draft pick in football.
    0:15:53 He will probably not play baseball at all.
    0:16:00 And if he does, somebody should pick him in the 20th round or 30th round on a flyer just in case.
    0:16:02 Right. You don’t want to waste a pick on a guy who’s going to be playing football.
    0:16:03 Right.
    0:16:09 So, while finishing up his college baseball career, Jackson starts getting courted by NFL teams.
    0:16:12 The football draft happens before the baseball draft.
    0:16:20 The number one overall NFL pick is held by the Tampa Bay Buccaneers, who are owned by a man named Hugh Culverhouse.
    0:16:24 The Bucs have made it clear that they want Bo Jackson.
    0:16:26 I was all gung-ho.
    0:16:32 And I had taken a few trips to visit some teams.
    0:16:35 My senior year, I got the okay to go visit Tampa Bay.
    0:16:45 Hugh Culverhouse sent his jet to Columbus Airport, drove over, got on the jet, went to Tampa Bay for my visit.
    0:16:50 It was almost like a college visit when you’re a high school senior and you’re going to visit a college.
    0:16:59 And they get some of the players to show you around town, to show you the night spots, take you to a nice restaurant, and entertain you.
    0:17:05 About four or five days later, I’m back at Auburn getting ready for my baseball game.
    0:17:08 And I walked out on the field.
    0:17:13 I have to walk from the athletic department across the parking lot, across the street to the baseball field.
    0:17:19 And as I get to the gate to come around the dugout, Coach Baird approaches me.
    0:17:22 He said, Bo, can I talk to you for a second?
    0:17:23 I said, sure, Coach.
    0:17:24 He said, let’s go.
    0:17:25 We’re behind the dugout.
    0:17:26 Let’s go sit and talk.
    0:17:27 So we go behind the dugout.
    0:17:33 And I’m thinking that he’s going to tell me, hey, some big league team wants to sign me.
    0:17:41 And he said, did you take a trip last week on Hugh Culverhouse’s jet to go down to visit Tampa?
    0:17:43 I said, yes.
    0:17:46 And the folks checked and said that it was OK.
    0:17:51 They checked with the NCAA and said that it was OK to do that.
    0:18:03 He said, well, Bo, somebody didn’t check, and the NCAA has declared you ineligible for any more college sports, so you can’t play baseball no more.
    0:18:08 And I sit there on that ground, and I cried like a baby.
    0:18:11 I cried like a baby.
    0:18:20 Bo Jackson immediately felt that he’d been wronged.
    0:18:30 He loved baseball, and even though it looked like he was going to play football professionally, he was distraught about being barred from finishing out his college baseball career.
    0:18:37 And what’s more, he became convinced that Hugh Culverhouse, the Tampa Bay owner, had done this to Bo on purpose.
    0:18:43 Because the officials at Tampa Bay told me personally, yes, we checked, and they said that it was OK.
    0:18:53 I think it was all a plot now just to get me ineligible from baseball because they saw the season that I was having, and they thought that they were going to lose me to baseball.
    0:18:56 And if we declare him ineligible, then we got him.
    0:19:06 Now, we don’t know whether the Bucs actually meant for this to happen, but it certainly did seem to work out well for them.
    0:19:14 They were in line to pick Bo Jackson number one in the NFL draft and pay him so much money that he’d forget about baseball in a heartbeat.
    0:19:15 It was just one problem.
    0:19:19 Bo Jackson isn’t the forgetting type.
    0:19:22 And I said, there is no way I’m signing with Tampa Bay.
    0:19:25 And I told Hugh Culverhouse, I said, you draft me if you want.
    0:19:27 You’re going to waste a draft pick.
    0:19:29 I said, I promise you that.
    0:19:36 And Hugh Culverhouse, well, this is what I’m going to offer you as a signing bonus, and you’re going to take it whether you want it or not.
    0:19:38 I said, all right.
    0:19:40 They didn’t think I was serious.
    0:19:43 And I sat down after baseball season was over.
    0:19:45 I talked to my baseball coach.
    0:19:51 I said, coach, a lot of people don’t think I’m serious about playing baseball.
    0:20:01 I said, but if Tampa Bay drafts me, I said, on my honor, and I’m looking you in your eye, man to man, I’m playing baseball.
    0:20:07 So if you know any teams out there that’s interested in an outfielder, you let them know.
    0:20:22 In the NFL draft that April, Tampa Bay did select Bo Jackson with the number one pick, which was attached to a $7.66 million five-year contract.
    0:20:30 And then, a couple of months later, Bo Jackson was selected in the baseball draft in the fourth round by the Kansas City Royals.
    0:20:34 They offered him three years at just $1 million.
    0:20:39 The choice would seem obvious, but Bo doesn’t know obvious.
    0:20:43 He rejects the football offer, and he takes the baseball offer.
    0:20:45 How surprising is this?
    0:20:47 Here’s Dave O’Connor again.
    0:20:49 Unprecedented.
    0:20:50 It just doesn’t happen.
    0:20:54 You can’t—I mean, money talks, right?
    0:21:00 I mean, you have $7.6 million sitting there, and you sign a contract for one.
    0:21:02 That’s a rare occurrence.
    0:21:07 It sounds like a decision that very few people, that I know at least, would have made.
    0:21:11 Do you think that was an act of spite on Bo Jackson’s part?
    0:21:17 It’s interesting, because I think Bo would say that he did the honorable thing and that he has a code.
    0:21:21 But when you look at it, on its surface, it is spite.
    0:21:27 There is no rational explanation for walking away from that kind of money.
    0:21:29 He’s not just hurting himself here.
    0:21:33 He’s also doing this to hurt Tampa Bay, to some extent.
    0:21:39 The opportunity cost of losing a first-round draft pick isn’t just that Bo Jackson isn’t playing on my team.
    0:21:45 It’s that every other player I could have selected with that pick is not playing on my team either.
    0:21:56 So it’s a huge impact to Tampa Bay, not to mention the public relations nightmare of going out on a limb and selecting somebody and not getting him.
    0:22:05 So Jackson does sign with the Royals.
    0:22:09 He starts the year in the minor leagues, but by the end of the season, he makes a major league team.
    0:22:12 He’s on track for a nice baseball career.
    0:22:17 And then, the next year, he becomes eligible to re-enter football.
    0:22:18 Now, will he play?
    0:22:20 Nobody knows.
    0:22:24 But the Los Angeles Raiders draft him in the seventh round.
    0:22:28 He signs, and suddenly, he’s playing two professional sports.
    0:22:34 At the end of the baseball season, he jumps straight into football, and he becomes a star in both.
    0:22:47 He also becomes a household name, in part because of his athletic feats, and in part because he was the star of one of the most beguiling ad campaigns in history.
    0:22:49 Bo Knows for Nike.
    0:22:53 Bo Knows Baseball.
    0:22:55 Bo Knows Football.
    0:22:57 Bo Knows Basketball, too.
    0:22:58 Bo could surf.
    0:23:00 Bo could rollerblade.
    0:23:02 Bo could not play ice hockey.
    0:23:06 That was the one thing that they couldn’t agree to let him actually be able to do.
    0:23:08 Gretzky shakes his head and says,
    0:23:09 No.
    0:23:12 But pretty much everything else.
    0:23:16 Volleyball, tennis, running, lifting weights, aerobics, all kinds of stuff.
    0:23:19 Bo, you don’t know diddly.
    0:23:22 All right.
    0:23:31 So, we agree that Bo Jackson’s athletic career turned out pretty well, remarkable on some dimensions, but overall not one of the greatest ever because it wasn’t long enough, perhaps.
    0:23:44 We agree that because he was such an unusual athlete in two sports, he became this icon and the focus of a remarkable and probably quite remunerative ad campaign, right?
    0:23:45 Do we agree on this so far?
    0:23:45 Yes.
    0:24:01 Do we therefore agree that had this catastrophe not happened with him, with getting drafted for the NFL by a team that out of spite or something like spite, he turned down, that if that had not happened, that all the rest may not have happened?
    0:24:15 Yeah, I think that’s a plausible argument to make because he probably, had he signed that deal with Tampa Bay, if he doesn’t get injured, he probably becomes one of the best running backs in NFL history.
    0:24:17 But that’s probably it.
    0:24:21 I mean, honestly, my takeaway lesson here is spite pays.
    0:24:28 Yeah, you would say, I mean, if you take a look at where he ends up, spite certainly paid in his case.
    0:24:33 So here’s a question we’re thinking about.
    0:24:43 If spite indeed exists, is it something that we humans have always carried around in our genetic code or do we pick it up along the way?
    0:24:55 We are constantly wrestling with our conscience and with a tendency to deviate from social norms in a risky way and to do wrong, to be selfish.
    0:25:00 That’s coming up, after the break. I’m Stephen Dubner, and this is Freakonomics Radio.
    0:25:21 So far in this episode, we’ve heard about spite in professional sports, spite in medieval nuns, spite as measured in laboratory experiments.
    0:25:26 So is spite an innate part of being human or is it something we learn?
    0:25:34 We’re a very biological organism and we’ve inherited an awful lot.
    0:25:43 In fact, most of the basic emotions that guide us from our animal and paleolithic early human past.
    0:25:47 That is E.O. Wilson. He’s a renowned biologist and author.
    0:25:54 And that’s Catherine Wells. She produced this episode and she interviewed E.O. Wilson for us back in 2013.
    0:25:59 I called him up because I wanted to know where all of this self-destructive spite comes from.
    0:26:03 You know, is this a common behavior throughout nature or are we unusual in it?
    0:26:10 And I have to say that I just assumed that we would be the meanest creatures in existence given everything we’ve heard today.
    0:26:12 But Wilson said that wasn’t true.
    0:26:15 Oh no, we’re only moderately mean.
    0:26:26 Now, E.O. Wilson has done a lot of thinking about the origins of human behavior and he thinks the nastiness that we see in animals might give us a clue to why we act the way we do.
    0:26:36 There’s a case that comes quickly to mind, for example, of a kind of spider in which the mother has a brood of spiderlings.
    0:26:42 And when they’re born, she sits down and lets the little spiderlings eat her.
    0:26:52 There are a couple of cases in the ants where the workers have a huge gland containing poisonous material.
    0:27:03 And when they get into a tough fight, they are able to contract their abdomens and explode their abdomens so that sticky poison covers the enemy.
    0:27:06 It can disable several enemies by doing that, giving up its own life.
    0:27:09 The list of this kind of behavior goes on and on.
    0:27:16 I mean, things that you really don’t want to think about too much before you go to sleep, you might have nightmares.
    0:27:19 But here’s the story about spite.
    0:27:41 If we define spite as doing harm to someone else at the cost of harm to yourself, and that involves a surrender of some advantage or emotional reward on your part, you give it up in order to hurt somebody else.
    0:27:43 That might not exist.
    0:27:44 In nature.
    0:28:00 It’s very difficult to find any case in the great encyclopedia of animal aggression where it doesn’t give some advantage to the individual doing the aggression.
    0:28:14 But it’s very rare that an animal would deliberately injure itself just in order to create injury in another individual without any further gain to itself to deliberately do that.
    0:28:17 I think spite does not exist in the animal kingdom.
    0:28:19 In the way that it does in humans.
    0:28:20 Is that right?
    0:28:22 Well, let’s take humans.
    0:28:38 When a person injures himself or herself, say, in reputation, in diminishing wealth, causing their own early death, whatever it is, in order to harm another person, you would say, oh, that’s spite.
    0:28:40 That’s got to be spite.
    0:28:56 But it would really be true spite, in my mind, as opposed to mere risk-taking, or a trade-off of one kind of loss taken in exchange for one kind of gain, only if you can’t see a gain.
    0:28:58 And that’s hard to imagine.
    0:29:01 Even vengeance has its gain.
    0:29:04 It has a strong emotional reward to it.
    0:29:10 For example, you accept harming yourself and your reputation
    0:29:19 if the damage you can do benefits you in some other way, or benefits, say, your own offspring in particular.
    0:29:28 You know, like unscrupulous stage moms, murderesses of cheerleading champion competitors.
    0:29:29 I think you’ll get the drift.
    0:29:42 Even a mass murderer who goes out and harms a lot of people is taking some benefit, emotional benefit, from that when suicide is intended.
    0:29:51 A lot of mass murders are just a terrible form of suicide in which a person decides to get the satisfaction in advance of committing it.
    0:30:01 And maybe the satisfaction the person will get is in striking out against something they imagined to be their enemy and diminishing them beforehand.
    0:30:06 So, when you add that factor, maybe real spite does not exist.
    0:30:14 So, I don’t know whether this is a relief or not.
    0:30:18 I mean, the idea that spite might not even exist seems good.
    0:30:22 But the fact that we get personal satisfaction out of hurting other people?
    0:30:24 I told Wilson that was kind of a bummer.
    0:30:26 That just shows you’re not a psychopath.
    0:30:30 I’m a total wuss.
    0:30:32 But here’s the upside.
    0:30:37 Spite is not the only motivation we have for being self-destructive.
    0:30:38 There’s actually another.
    0:30:40 Altruism.
    0:30:44 When we hurt ourselves, we aren’t always doing it just to hurt someone else.
    0:30:46 Sometimes, we’re doing it to help.
    0:30:52 One of the things that makes us human is our internally conflicted nature.
    0:30:53 Confliction.
    0:30:57 Our ambivalence to our own selves.
    0:31:01 We are constantly wrestling with our conscience.
    0:31:07 And with a tendency to deviate from social norms in a risky way.
    0:31:09 And to do wrong.
    0:31:11 To be selfish.
    0:31:23 The contest within us between doing the moral thing, even the heroic thing on one side, and doing the selfish, perhaps even criminal thing on the other side.
    0:31:28 That contest is what gives us such a continuously conflicted nature.
    0:31:33 If we became completely altruistic, then we would be like ants.
    0:31:43 If we went to the opposite extreme and had complete lack of constraint and it was complete individualism, then we would have chaos.
    0:31:44 We would not have order; the group would dissolve.
    0:31:48 So we have to be in the middle.
    0:31:50 This appears to be the human condition.
    0:31:53 It’s funny listening to him talk about that.
    0:31:55 That’s Steve Levitt again.
    0:32:00 He took a class with Wilson when he, Levitt, was an undergrad at Harvard.
    0:32:02 He’s very fond of the way Wilson thinks.
    0:32:07 There could be no two disciplines closer than evolutionary biology and economics.
    0:32:15 They study different questions and they use different methods, but the way that evolutionary biologists think is exactly like the way that economists think.
    0:32:24 Both are built on a model of individual behavior, behavior that’s motivated by costs and benefits.
    0:32:31 The other thing is that at its heart, both economics and evolutionary biology strive for simplicity.
    0:32:38 The simplest story that can explain a set of facts is the one that we gravitate to, as opposed to other disciplines.
    0:32:40 History is all about complexity.
    0:32:43 Literature is all about complexity.
    0:32:46 Even sociology, I think, at heart is about complexity.
    0:32:49 But economics is about simplicity.
    0:32:59 Like E.O. Wilson, Levitt thinks that spite, true spite, may not really exist.
    0:33:05 Because that would mean that I hurt you even though I get nothing for it.
    0:33:05 Nothing.
    0:33:11 And while it may seem that I get nothing, I probably get something.
    0:33:16 What I would say about spite, I would say this.
    0:33:21 To know that an act is spite, you have to be inside the head of the perpetrator.
    0:33:26 Because the idea of spite is that it’s being done without benefit.
    0:33:35 But it’s interesting because one of the first premises of economics is you can never really know what other people are thinking and why they’re doing what they’re doing.
    0:33:37 Instead, we focus on what they do.
    0:33:44 And so, consequently, my view is forget about what’s going on inside of other people’s heads.
    0:33:46 You’ll probably never know what it is.
    0:33:48 And focus on what they’re actually doing.
    0:33:55 Do you see altruism as sort of the flip side of the coin to spite and therefore not quite real?
    0:33:58 Altruism is exactly the flip side of spite.
    0:34:08 In the sense that there are acts which very well could be altruistic, but equally could be done in a perfectly self-interested way.
    0:34:15 Both make you feel really good. It feels good to help other people sometimes, and it feels so good to punish other people who’ve wronged you.
    0:34:21 So, I think they’re both actually completely consistent with the idea of people doing the best they can.
    0:34:23 And what about you personally, Levitt?
    0:34:28 Do you get more satisfaction generally from helping people or punishing people?
    0:34:29 I’m a lover, not a fighter.
    0:34:30 You know that.
    0:34:31 I like to help people.
    0:34:40 I’d like to thank Steve Levitt and everyone else for helping us think about spite today.
    0:34:50 I’m sorry to say that Lisi Oliver died in 2015 at age 63 and E.O. Wilson died in 2021 at 92.
    0:34:53 We will be back soon with a new episode.
    0:34:55 Until then, take care of yourself.
    0:34:57 And if you can, someone else too.
    0:35:01 Freakonomics Radio is produced by Stitcher and Renbud Radio.
    0:35:06 This episode was originally produced by Catherine Wells and was updated by Dalvin Aboagye.
    0:35:08 It was mixed by Jasmine Klinger.
    0:35:21 The Freakonomics Radio Network staff also includes Alina Cullman, Augusta Chapman, Eleanor Osborne, Ellen Frankman, Elsa Hernandez, Gabriel Roth, Greg Rippon, Jeremy Johnston, Morgan Levy, Sarah Lilly, Tao Jacobs, and Zach Lipinski.
    0:35:29 You can find our entire archive on any podcast app or at Freakonomics.com, where we also publish transcripts and show notes.
    0:35:32 Our theme song is Mr. Fortune by The Hitchhikers.
    0:35:34 Our composer is Luis Guerra.
    0:35:36 As always, thank you for listening.
    0:35:49 So, to cut off my nose, and to prevent rape by the Vikings, you said they were Vikings in this case?
    0:35:51 No, in this case, they’re Saracens.
    0:35:54 I have Viking examples I can give you.
    0:35:55 I bet you do.
    0:36:03 The Freakonomics Radio Network, the hidden side of everything.

    In this episode from 2013, we look at whether spite pays — and if it even exists.

     

    • SOURCES:
      • Benedikt Herrmann, research officer at the European Commission.
      • Steve Levitt, co-author of Freakonomics and host of People I (Mostly) Admire.
      • Dave O’Connor, president of Time Studios.
      • Lisi Oliver, professor of English at Louisiana State University.
      • E.O. Wilson, naturalist and university research professor emeritus at Harvard University.

     

     

  • Building Cluely: The Viral AI Startup that raised $15M in 10 Weeks

    AI transcript
    0:00:02 We wrote our first lines of code 10 weeks ago.
    0:00:05 We’re like earlier than the latest YC batch of companies,
    0:00:08 yet we’re like generating probably more revenue than every single one of them.
    0:00:13 It’s been so hard to pierce through the noise of everything in AI,
    0:00:15 especially on the consumer-facing side.
    0:00:19 To do that consistently is actually way harder, near impossible.
    0:00:22 I heard someone call it rizz marketing, which is a compliment.
    0:00:24 They’re like, I hate this rizz marketing.
    0:00:26 AI is so magical.
    0:00:29 We like built the digital god, locked it in a chatbot.
    0:00:32 Six months ago, I was some random college kid in a dorm,
    0:00:35 and now I feel like I’m at the center of the tech universe.
    0:00:40 What happens when a founder treats virality not as a tactic, but as the product?
    0:00:45 Roy Lee, co-founder and CEO of Cluely, is either a generational entrepreneur
    0:00:47 or a walking internet experiment.
    0:00:48 Maybe both.
    0:00:52 In just a few months, he’s built one of the most talked about startups in tech,
    0:00:54 not with a massive fundraise or a polished product suite,
    0:00:58 but through relentless short-form content, polarizing stunts,
    0:01:02 and a translucent AI overlay that feels more like performance art than interface.
    0:01:07 Joining me today is Roy, along with Brian Kim, who led A16Z’s investment in Cluely.
    0:01:10 We talk about Roy’s approach to building in public,
    0:01:13 what Gen Z understands about attention that tech still doesn’t,
    0:01:15 and why momentum might be the new moat in consumer AI.
    0:01:18 This is a conversation about distribution as design,
    0:01:20 founder market fit at internet speed,
    0:01:24 and what happens when you stop trying to be professional and start trying to win.
    0:01:25 Let’s get into it.
    0:01:30 As a reminder, the content here is for informational purposes only,
    0:01:33 should not be taken as legal, business, tax, or investment advice,
    0:01:36 or be used to evaluate any investment or security,
    0:01:40 and is not directed at any investors or potential investors in any A16Z fund.
    0:01:46 Please note that A16Z and its affiliates may also maintain investments in the companies discussed in this podcast.
    0:01:49 For more details, including a link to our investments,
    0:01:53 please see A16Z.com forward slash disclosures.
    0:02:00 Roy, the man, the myth, the legend, the man of the moment,
    0:02:02 Cluely is the current thing.
    0:02:02 Yes.
    0:02:03 How does it feel?
    0:02:07 The announcement the other day, a lot of love, a little bit of hate.
    0:02:07 Yeah.
    0:02:11 How do you react to this?
    0:02:12 I mean, it’s pretty crazy.
    0:02:17 I think, like, literally six months ago, I was some random college kid in a dorm,
    0:02:19 and now I feel like I’m at the center of the tech universe.
    0:02:25 And the more astonishing thing is how correct my assumptions on virality have been.
    0:02:30 I think it’s growing increasingly clear that people on X/LinkedIn are behind,
    0:02:33 and there’s such an extremely small intersection of people who understand
    0:02:36 how developed the algorithm is on, like, Instagram, TikTok,
    0:02:40 and people on tech, like, Twitter, LinkedIn, X/LinkedIn.
    0:02:44 And there’s just such a small intersection that it’s inevitable that my predictions will be right.
    0:02:46 And it’s been crazy to see that play out in real time.
    0:02:47 Yeah.
    0:02:49 Elon’s reaching out.
    0:02:51 Meta’s offering a billion-dollar acquisition offer.
    0:02:53 Brian, we said not to do it, of course.
    0:02:54 Of course, of course.
    0:02:54 I’m to the moon.
    0:02:57 Brian, what is your reaction seeing this play out and having led the investment?
    0:03:00 Yeah, well, first, he’s a bro of the moment.
    0:03:04 I love the little snippets of, like, you calling Molly bro, like, 20 different times.
    0:03:05 Amazing.
    0:03:07 Look, Roy, this is fantastic.
    0:03:09 It’s so interesting.
    0:03:14 I remember right before the announcement, you had already sent me the video 24 hours ago,
    0:03:16 like, bro, this is what we’re going to show.
    0:03:16 Yeah.
    0:03:18 And I was like, okay, like, this is great.
    0:03:20 And I think it’s going to do well.
    0:03:22 I didn’t think it would do that well.
    0:03:23 Yeah, yeah, yeah.
    0:03:25 And I didn’t think it would create that much controversy.
    0:03:31 But I also, I think, underestimated how positively people would view this.
    0:03:36 There’s a lot of meta-analysis on you out there, which is, like, really, really cool.
    0:03:39 And there are people who are going, like, two to three layers deep.
    0:03:45 And I’ve actually read something that, like, linked a meta-analysis of A16Z with Cluely
    0:03:50 and how there’s, like, this sort of fungibility of thoughts and things like that, which is, like, mind-blowing.
    0:03:51 I have no idea what the fuck they’re talking about.
    0:03:53 So, it’s just been incredible.
    0:03:55 You’ve got people doing literary criticism.
    0:03:57 No, I’m not kidding.
    0:04:02 There’s literally a guy who wrote, like, fungibility of thoughts with A16Z and Cluely.
    0:04:03 Yeah, yeah.
    0:04:04 That’s funny.
    0:04:06 People always over-read into stuff.
    0:04:07 They’re like, oh, this has this master plan.
    0:04:08 A16Z is trying to do this whole.
    0:04:09 That’s what I’m saying.
    0:04:10 Everyone.
    0:04:10 Yeah.
    0:04:12 No, don’t tell them, Eric.
    0:04:13 We are playing 4D chess here.
    0:04:14 Yeah, it’s like, it’s fine.
    0:04:16 10 threads, analyzing the left-hand handshake.
    0:04:18 It’s like, guys, come on.
    0:04:19 Some of the haters, I tweeted something.
    0:04:20 I was talking to Brian.
    0:04:22 I tweeted, how you feel about Cluely is how you feel about yourself.
    0:04:26 But I deleted it because I didn’t want to trigger people on Twitter.
    0:04:27 I’ll talk shit on a podcast.
    0:04:28 Maybe I’ll get clipped to Twitter.
    0:04:30 But I don’t want to inflame people.
    0:04:31 It’s fascinating to see.
    0:04:36 So, I want to go back to the beginning because this isn’t a story that has been told a lot.
    0:04:38 We were talking about the Amazon interview.
    0:04:42 But maybe let’s go back even further when we think through where you are now.
    0:04:47 Talk about, like, the threads or the through lines from your childhood that can help make
    0:04:49 sense of who you are and kind of this moment.
    0:04:55 I think from birth, the most character-defining feature of me has been, like, attention-grabbing
    0:04:56 and provocative.
    0:04:58 This played out in elementary, middle, and high school.
    0:05:01 I was always, like, I had a camp of people that loved me and a camp of people that hated
    0:05:02 me.
    0:05:03 I was, like, always the boldest.
    0:05:04 Like, I would say the craziest shit.
    0:05:07 Everything that was on my mind, just no filter, I would say it.
    0:05:10 And this ended up with a lot of people liking me and a lot of people just really disliking
    0:05:11 me.
    0:05:13 And I think things culminated senior year of high school.
    0:05:14 I had done well in school.
    0:05:16 I got accepted to Harvard early.
    0:05:19 And then later that year, I was just always doing crazy shit.
    0:05:22 And I snuck out of a school field trip, past curfew.
    0:05:25 Like, the police had to come and escort me back because we were out at 2 a.m.
    0:05:27 We were all, like, 16, 17, 18.
    0:05:29 And then I got a suspension for that.
    0:05:33 And that was, like, when the camp of Roy haters took the storm and reported everything
    0:05:34 everywhere.
    0:05:36 And it ended up getting me rescinded from Harvard.
    0:05:40 And then that kind of started my journey of, I think, like, entrepreneurial, wanting to
    0:05:41 actually swing big at building companies.
    0:05:44 And that’s when I felt like my life took such a crazy turn.
    0:05:48 For context, my parents literally run, like, a college admissions consulting company.
    0:05:51 So we literally teach kids how to get into Harvard.
    0:05:53 And the youngest son of the company, like, gets fucking rescinded from Harvard.
    0:05:54 It’s, like, the worst thing ever.
    0:05:56 So we decided, let’s keep this quiet.
    0:05:59 You still have the same test scores and application and everything.
    0:06:00 Maybe next year you’ll get into a different school.
    0:06:03 So I spent, like, an entire year at home.
    0:06:07 And I underestimated how mentally tormenting that would be.
    0:06:11 Like, I’m probably the most extroverted person you might have ever met in your life.
    0:06:14 I cannot stay maybe, like, eight hours without talking to someone.
    0:06:17 And to spend a year alone, like, it made me think, man, my life is so crazy.
    0:06:22 I might as well just quintuple down on every single crazy belief and thought I have and just live
    0:06:23 the most interesting life ever.
    0:06:26 So that was, like, the moment where I decided, like, I’m all in on building companies.
    0:06:28 There’s no way I can do anything else.
    0:06:32 I was wondering if you were going to have a moment of, like, oh, this has set me back
    0:06:32 in some ways.
    0:06:34 Maybe I should reform or tame it down.
    0:06:37 But you took the opposite of, like, no, this is who I am and it’s going to work.
    0:06:41 Yeah, you just sit in a room by yourself for 12 months and all of a sudden your craziest
    0:06:44 thoughts become logical and there’s no one else.
    0:06:47 Like, the echo chamber is you in your brain and it amplifies everything.
    0:06:50 And I think that’s the reason why I am, like, the person that I am today and willing to
    0:06:51 make the bets that I am today.
    0:06:56 Later, I go to community college in California at the behest of my parents.
    0:06:58 I think California, this is, like, a middle ground.
    0:07:01 In California, I have the chance to build a company, and at community college, like, I have the chance
    0:07:04 to get the education that my Asian parents always dreamed of.
    0:07:05 So I do that.
    0:07:09 And then later I get into Columbia and I have to go to Columbia for at least a semester to
    0:07:10 appease my parents.
    0:07:13 I go to Columbia and the first thing I’m thinking is, like, can I find a co-founder and a wife?
    0:07:17 Like, those are the only two things that I’m looking for in college and still looking for
    0:07:17 the wife.
    0:07:21 But on pretty much the first day, that’s when I met Neil, my co-founder, started hacking
    0:07:22 on a bunch of things.
    0:07:24 And the one thing that worked was the earliest version of Cluely.
    0:07:25 Wow.
    0:07:29 And were your parents ever trying to reform you or moderate you?
    0:07:31 Or were they kind of like, hey, Roy’s going to be Roy?
    0:07:36 Almost every single day of my life until I got into Harvard, then they calmed down and
    0:07:38 they’re like, wow, this kid really made it.
    0:07:42 And then when I got kicked from Harvard again, they were like on my ass a lot until I got into
    0:07:43 Columbia again.
    0:07:43 And they’re like, wow.
    0:07:46 After all this, he gets back into the Ivies.
    0:07:47 He’s like, I guess I really can trust him.
    0:07:51 And his unorthodox swings will be home runs.
    0:07:52 And is that where they’re at now?
    0:07:52 Yeah.
    0:07:54 That’s actually what I’m going to ask.
    0:07:57 You mentioned before that your parents, they will love you no matter what.
    0:07:58 Yeah, yeah, yeah, yeah.
    0:08:00 However, I am curious what they think now.
    0:08:05 Man, it’s crazy how lax they’ve gotten since I got back into Columbia.
    0:08:06 Like, now they’re okay with anything.
    0:08:08 When I told them, hey, mom, I’m dropping out to do this.
    0:08:10 Like, she’s like, oh, like, okay.
    0:08:11 You know, like, I expected it.
    0:08:12 Why didn’t you drop out sooner?
    0:08:14 Why did it take a semester and a half?
    0:08:16 Because at this point, I was, like, convincing my co-founder to drop out with me.
    0:08:18 And she’s like, man, like, took it long enough.
    0:08:21 So they’re, like, totally on board with all the crazy shit that I do.
    0:08:21 Wow.
    0:08:24 One of the things we were talking about in the context of going viral, I heard you say,
    0:08:28 is that Twitter is two years behind Instagram.
    0:08:28 Yeah.
    0:08:29 Or behind the other platforms.
    0:08:35 Talk a little bit about when your sort of provocativeness sort of translated over to Twitter
    0:08:36 or just, like, the digital mediums.
    0:08:37 Like, how did that strategy evolve?
    0:08:39 And let’s talk about the difference in the platforms.
    0:08:40 Yeah, this goes way back.
    0:08:44 But many years ago, when YouTube first came out as a platform,
    0:08:45 this was, like, the turning point of everything.
    0:08:48 This democratized, essentially, content.
    0:08:49 And now you weren’t paying for commercials.
    0:08:55 And the visibility and publicity in content was not gated by amount of money you’re willing
    0:08:56 to spend on ads or TV space.
    0:08:58 It’s just gated by the quality of content.
    0:09:03 And five years ago, when TikTok came out and short-form algorithms really started taking over,
    0:09:05 that shifted the frame once again.
    0:09:07 So now it’s not about how much good content you make.
    0:09:10 It’s literally just about how much content can you make.
    0:09:14 There is simply not enough good content out there for the average person to consume,
    0:09:18 which is why you see the same brain rot reels over and over and over again, over and over again.
    0:09:22 You see the same Minecraft parkour video over and over just because there’s literally not enough content
    0:09:23 for the average consumer to consume.
    0:09:26 And people have not caught on to a few things.
    0:09:30 First, you just need to make more content that a consumer will actually watch.
    0:09:33 Like, most people don’t know how to make viral content.
    0:09:36 Content that any person can watch, consume, and is digestible.
    0:09:40 And everyone on X/LinkedIn is trying to go be, like, the most intellectual,
    0:09:41 like, thoughtful person.
    0:09:45 And they’ll generate some slop that maybe, like, 200 people in the world can actually understand and make sense of.
    0:09:49 But, of course, they want to seem like the most interesting, thoughtful, like, intelligent person.
    0:09:50 But this just lacks viral sense.
    0:09:51 There’s not enough viral content to go around.
    0:09:57 And the second thing is that the algorithms really promote the most controversial things.
    0:10:02 And people on X/LinkedIn, there’s not enough controversial things to be rewarded for.
    0:10:07 So when I come out swinging out of the gate, I’ve been on Instagram, TikTok for the past 10 years of my life,
    0:10:10 and I understand what level of controversialness you need.
    0:10:14 I take the slightest step into controversialness, and all of a sudden, X/LinkedIn explodes.
    0:10:18 Because the algorithm inherently highly, highly rewards this stuff.
    0:10:21 And as a result, it’s just getting shoved into everyone’s feed, and they don’t understand.
    0:10:26 But I’m just, like, literally applying the same principles of controversialness from IG, TikTok onto X/LinkedIn.
    0:10:29 And they’re just so not ready for it that it feels like the craziest thing ever.
    0:10:36 And this is something that I’ve said before, but I guarantee, like, my videos do not go as viral on Instagram, like, Instagram, TikTok.
    0:10:39 And the sole reason is because on those platforms, they are not controversial enough.
    0:10:43 And on those platforms, there are literally people committing felonies in public,
    0:10:45 or at least insinuating, like, they are committing felonies.
    0:10:47 Even then, it’ll be like, oh, good try, bro.
    0:10:48 It’s not interesting enough.
    0:10:50 And, like, X/LinkedIn, they just have not caught on.
    0:10:53 There’s not enough creators out there who are willing to press the controversial button.
    0:10:55 Mark says this sometimes, right?
    0:10:56 Like, supply chain of the meme.
    0:10:57 Yeah.
    0:11:05 It’s like the strangest concept, but the meme actually travels from, like, Reddit, and then it goes to X, and then it goes to Instagram, then LinkedIn, then CNBC.
    0:11:09 And there’s, like, a train that it goes that certain people are just early, late.
    0:11:15 But when you actually flip it with virality and controversialness, maybe that flips a little bit.
    0:11:16 Well, and sometimes it starts in, like, 4chan or something.
    0:11:19 That’s what I mean, like, I didn’t even want to say it.
    0:11:23 It’s 4chan, Reddit, then Twitter, then Instagram and LinkedIn.
    0:11:31 But for you, it’s actually Instagram and TikTok that come before Twitter in terms of the raunchiness or craziness of it.
    0:11:32 Yeah, yeah.
    0:11:41 I mean, I just feel like if the average person who’s in the X comments hating about how controversial this is spent one hour looking through the timeline that is my Instagram feed,
    0:11:42 their brains would melt and explode.
    0:11:46 Like, they would not be able to comprehend how are people, like, digesting this at scale.
    0:11:56 It’s funny because ever since Elon took over X, some people start complaining of, oh, this stuff has gotten too controversial or too much dark stuff or too much negative content.
    0:11:59 And, yeah, it turns out it’s just the beginning.
    0:12:00 Yeah, yeah.
    0:12:02 I mean, literally, this is what the future of content is going to be.
    0:12:04 You’re not going to get more millennial founders.
    0:12:06 You’re only going to get more Gen Z founders.
    0:12:09 And I guarantee you, their backgrounds and content are the exact same as mine.
    0:12:10 And they know exactly what I’m doing.
    0:12:12 I’m sort of like the canary that’s leading the way for this.
    0:12:14 But I guarantee you, there’s more of this coming.
    0:12:15 And it’s inevitable.
    0:12:16 And just embrace it.
    0:12:20 When did you realize that this was the way you were going to build a company?
    0:12:24 How did you intuit, hey, distribution is a scarcity, distribution is what matters?
    0:12:26 Because there’s a lot of creators out there, but they’re not combining it with a tech company.
    0:12:28 Like, how and when did you put this together?
    0:12:33 I guess there was a certain point where I kept going viral that I sort of realized that
    0:13:37 I know something that X/LinkedIn people don’t know yet.
    0:12:40 And it is sort of like mastery of the algorithm.
    0:13:43 And I think everything started with the Interview Coder situation.
    0:13:45 Interview Coder was the earliest prototype of Cluely.
    0:12:48 And it was a tool to let you cheat on technical interviews.
    0:12:50 And I used it to cheat my way through an Amazon interview.
    0:12:51 I made it super public.
    0:12:54 I posted it everywhere and ended up getting me like blacklisted from big tech and kicked
    0:12:55 out of school.
    0:12:57 And that situation was inherently viral.
    0:13:00 Like, when’s the last time someone got kicked out of an Ivy League and raised $5 million?
    0:13:02 Like, this has probably never happened in the history of humanity.
    0:13:04 So that situation was inherently viral.
    0:13:08 And at that time, I had no idea that this was like a repeatable thing that I could do.
    0:13:10 But then the launch video happened.
    0:13:12 And I had my intuitions about the virality of launch video.
    0:13:13 And I just kept scrolling on Twitter.
    0:13:17 And I was wondering, like, man, why is nobody doing what Avi Schiffman with friend.com
    0:13:19 showed the world you could do a year ago?
    0:13:20 Like, why has nobody done this yet?
    0:13:21 And it worked.
    0:13:23 And then I did the 50 interns thing, and it worked.
    0:13:25 And like, I kept doing viral video after viral video.
    0:13:29 And at a certain point, I just realized, like, holy shit, people on X/LinkedIn, they have
    0:13:30 not caught on yet.
    0:13:35 And this is the massive alpha that I’m trying to capture here, is that they have not caught
    0:13:40 on to what it means to master the short-form algorithm, or any algorithm that works like
    0:13:41 the short-form ones.
    0:13:46 And as a result, I am able to dominate the timeline for not the past week, but like, probably
    0:13:49 the past few months, just because people on X/LinkedIn have not caught on.
    0:13:50 And for some reason, they still refuse to catch on.
    0:13:55 And so just to put a finer point out, the 50 interns, maybe explain this idea of basically
    0:13:58 at your company, you either have engineers and you have creators.
    0:14:00 Yeah, there are only two roles.
    0:14:04 You are either a world-class engineer who is building the product, or you are a world-class
    0:14:05 influencer.
    0:14:09 And for a full time, every single person has over 100,000 followers on some social media platform.
    0:14:14 It is the only way to prove that you actually have mastery over virality and you understand
    0:14:15 what it takes.
    0:14:19 And I think if any company in the world has a marketing team and the head of marketing does
    0:14:21 not have at least 100,000 followers, you need to replace them.
    0:14:22 Like, the game has changed.
    0:14:26 And so do you think this is a strategy that other companies should also be employing, whether
    0:14:31 it’s intern-based or just like having an army of creators and sort of deploying them towards
    0:14:31 their end?
    0:14:33 Yeah, I’ll go a bit deeper in the interns.
    0:14:36 So Cluely made a pretty viral video announcing that we were hiring 50 interns and you’d be
    0:14:38 in here making content all day.
    0:14:40 And essentially, that’s almost what we do.
    0:14:42 We have like over 60 contractors.
    0:14:46 These contractors get paid per video and they just are forced to sit in front of a camera
    0:14:48 and make TikTok and Instagram videos about Cluely.
    0:14:50 And this is what marketing looks like.
    0:14:52 This job did not exist five years ago.
    0:14:55 Like, how do you explain the job if you sit in front of a camera and you make five, 10
    0:14:59 second videos that seemingly make no sense to anybody, but just consistently generate millions
    0:15:00 of views like that?
    0:15:03 That’s not a job that makes sense to people, but that is our internship.
    0:15:05 That’s what like a modern day marketing internship looks like.
    0:15:09 And we pay very little money for the amount of views that we get.
    0:15:13 And different companies, they’re paying literally millions of dollars for Super Bowl ads when
    0:15:17 you can get the same quality and quantity of views for $20,000.
    0:15:18 Did you see it converting?
    0:15:19 Yes, yes, yes.
    0:15:19 Of course.
    0:15:21 I mean, like, those conversions are real, that’s clear.
    0:15:24 Our only converting videos are the ones that we have on IG and TikTok.
    0:15:24 Yeah.
    0:15:29 Brian, why don’t you share your story of how you got excited about Cluley, of how you and
    0:15:32 Roy met and built this relationship and how this partnership formed?
    0:15:37 I had a good contact in New York who travels in the circles of young, cracked folks.
    0:15:39 And her name is Ali DeBow.
    0:15:42 And one of the lists that she sent had Roy in it.
    0:15:44 And I like read what they’re working on.
    0:15:49 And it sort of reminded me of, oh, like a Scouter thing, or it’s on the edge of reality.
    0:15:50 I’m like, oh, this is interesting.
    0:15:50 I want to talk to him.
    0:15:51 So I reached out.
    0:15:53 Roy, if you remember, I just reached out.
    0:15:54 Hey, I heard your name, blah, blah, blah.
    0:15:54 We should talk.
    0:15:56 You’re like, oh, bro, let’s talk.
    0:16:00 And then a day later, you wrote back, like, actually, you’re multi-stage.
    0:16:01 I don’t want to talk to you.
    0:16:03 My advisor says, don’t want to talk to you.
    0:16:03 Go away.
    0:16:05 And then what do I do?
    0:16:07 Like, I could have just stood down, but I said, okay, fine.
    0:16:09 I promise you will not talk about fundraising.
    0:16:10 Let’s just talk.
    0:16:11 Like, I want to meet you.
    0:16:12 I want to talk to you.
    0:16:13 I want to build a relationship.
    0:16:14 Thankfully, you agreed.
    0:16:15 We got on a quick call.
    0:16:18 We like chatted a little bit where you like had your origin story.
    0:16:19 I’m like, oh my God, it’s good.
    0:16:19 It’s amazing.
    0:16:20 It’s so cool.
    0:16:21 Like, I’m glad he’s doing this.
    0:16:24 Sadly, he’s not accepting money, but that’s okay.
    0:16:24 Yeah.
    0:16:26 And then I tracked you, I tracked your Twitter.
    0:16:28 I tracked what you’re doing, the 50 intern.
    0:16:32 You moved to San Francisco and I had it in my mind.
    0:16:35 Okay, next time going to opportunity strikes, I’m just going to show up.
    0:16:39 So I think I somehow got your phone number or some such thing and texted you, yo, like I’m
    0:16:40 at your office.
    0:16:41 Can I hang?
    0:16:44 So I come and you say, yeah, like, that’s great.
    0:16:44 Come up.
    0:16:47 And what I actually think was really, really cool.
    0:16:52 There are a couple of steps, but step one was there was like an engineer who randomly
    0:16:58 found you on Twitter or Instagram and had just come up like you did not know him.
    0:16:59 He did not know you.
    0:17:03 He just came to your office, came in to say hi, wanted to say hello.
    0:17:06 And one of your friends, I think Nicholas was just there hanging out.
    0:17:11 And the quality of the people, the fact that random engineers were knocking on the door
    0:17:15 to come talk to you randomly was just like, oh, there’s something like really strange and
    0:17:16 special happening here.
    0:17:20 And all of your team members sitting in the thing and like doing things and creating content
    0:17:22 and you and Neil working on the product.
    0:17:23 It was like, oh, like there’s something very special.
    0:17:28 And I sort of was thinking, oh, like, is this something that we should sort of back?
    0:17:30 And then I think one more video was made or something.
    0:17:35 And next time I visited, I think I came with some stuff or, you know, you’re eating steak.
    0:17:41 And then you flash some metric or something where I realize that you’re converting this like
    0:17:44 awareness and eyeballs into money, dollars.
    0:17:45 You like drop some numbers.
    0:17:47 You’re like, oh, yeah, we’re doing this many revenue.
    0:17:48 And that’s where we’re going.
    0:17:49 And guess what?
    0:17:51 Like some enterprise customer wants to talk to us.
    0:17:52 I don’t know why.
    0:17:52 Blah, blah, blah, blah, blah.
    0:17:59 And that’s when I sort of realized, oh, he is actually able to like convert this awareness
    0:18:03 and distribution that you’re getting into real dollars.
    0:18:06 And I don’t know many people who know how to do that.
    0:18:11 And during that time, I was already writing this thing called momentum as a moat because
    0:18:17 it’s been so hard to pierce through the noise of everything in AI, especially in the sort
    0:18:19 of consumer-facing space.
    0:18:23 To do that consistently is actually way harder, near impossible.
    0:18:29 And so I had the theory that, oh, like companies who know how to do that, companies know how
    0:18:31 to build at that speed are going to be the winners.
    0:18:34 And I felt like I have found a person who was doing that.
    0:18:37 And so I think we moved very quickly.
    0:18:42 I told you, look, like, just hit download, hit download on the Stripe data, hit download,
    0:18:43 send it to us.
    0:18:44 I won’t ask any more questions.
    0:18:45 Yeah.
    0:18:46 And then we’ll have a chat.
    0:18:48 And that hopefully is what we delivered.
    0:18:54 We quickly scrambled to do a very fun, small in-person chat with you and some of the partners
    0:18:57 where you called some of them old, bald and boring.
    0:19:00 And we were excited to do the deal.
    0:19:06 I think I was at an LP summit running around in Las Vegas, trying to call you to get to
    0:19:07 terms, et cetera.
    0:19:08 And that’s how it all worked out.
    0:19:14 And after a while, I brought you five, six pounds of steak to celebrate the excitement and the deal.
    0:19:18 I want to double-click on the momentum-as-a-moat piece, Brian, because you have an interesting
    0:19:20 background in that you’ve been doing it similar for a while.
    0:19:22 You worked at Snap beforehand, among other places.
    0:19:28 And I remember Ben Thompson had this post about Snap where he said that Snap has a gingerbread
    0:19:32 strategy where basically if they invent stuff, Meta is going to copy it.
    0:19:34 So they just have to keep inventing stuff.
    0:19:36 And I guess it’s called the breadcrumbs or something.
    0:19:38 And you also have backed a number of these.
    0:19:43 You’ve had a front row view to, hey, these network effects aren’t as necessarily defensible
    0:19:44 as they once were.
    0:19:47 And so companies need to keep innovating, keep pushing.
    0:19:52 And so share more about how this kind of momentum, especially as it moves to AI companies, this
    0:19:54 theory was born and what it really means for defensibility.
    0:19:58 Yeah, this might be somewhat interesting to you, Roy and folks, which is I did not have
    0:19:58 that view before.
    0:20:03 I did not think gingerbread strategy necessarily worked, nor momentum was a moat.
    0:20:03 I did not.
    0:20:09 I actually truly believed in these handcrafted artisan products that really get to the core
    0:20:11 of why people want to use it.
    0:20:15 I still somewhat believe core of it, but these artisan products where it just takes a while
    0:20:16 to build it.
    0:20:17 And it’s like very, very nuanced.
    0:20:21 And I have believed that led to high retention.
    0:20:26 So the thing that I looked at the most always was, is this product highly retained?
    0:20:31 Does the product have, like, a network effect, under the traditional sort of theories of moat?
    0:20:36 And what I realized, and this was like true to some extent in the era of mobile.
    0:20:38 Mobile is like a two decade old platform.
    0:20:41 So a lot of things have been tried and a lot of people try different things.
    0:20:45 And therefore finding something of people came back again and again and again was the most
    0:20:46 important thing in my mind.
    0:20:47 And then AI hit.
    0:20:51 And I still had that framework where, oh, like I’m going to look for things that are
    0:20:53 highly retentive and repeated again and again.
    0:20:54 And guess what?
    0:20:56 Things change too fast.
    0:21:00 Like the underlying model changes every day or every week.
    0:21:06 If you, like, craft this thing and OpenAI or someone, like, builds their new model to include that
    0:21:08 part in their new product, you’re done.
    0:21:08 You’re gone.
    0:21:14 So then it couldn’t become about like this highly thoughtful, slow build product.
    0:21:18 It needed to be something where founders knew how to move extremely quickly.
    0:21:21 And that included product that included distribution.
    0:21:25 And because these fucking things are so magical, AI is so magical.
    0:21:28 We like built the digital God, locked it in a chatbot.
    0:21:34 Because it’s so magical right now, kind of anything goes like people will give it a chance.
    0:21:41 And therefore, what’s really important is to try to build the plane as it’s falling down the cliff.
    0:21:48 And people who enjoy the thrill of the plane going down and actually is excited about building as it goes down,
    0:21:50 I think those are the winners of the next era.
    0:21:57 And so when I think about folks like Roy, it’s the type of founder archetype who gets value and is excited
    0:22:03 and leading the charge in terms of that speed, whether it’s marketing, distribution, or product build.
    0:22:08 And usually all of that needs to come together to build an extremely durable, long-dated product.
    0:22:13 And that, to me, eventually will turn into a product that needs to be retained, that needs to be used every day.
    0:22:18 But we’re still in this early stage of AI where I think momentum is the moat.
    0:22:22 Going back to Roy, I’d love to hear how you think about it.
    0:22:29 Because we talked about it a little bit over chat where you think like stage one, stage two, stage three of building Cluely
    0:22:31 and sort of the distribution advantage that you have.
    0:22:35 And you keep talking about this, oh, maybe X and LinkedIn people don’t get it right now.
    0:22:37 But that gap may narrow over time.
    0:22:39 So how do you think about the next stage and et cetera?
    0:22:43 But that’s how it links to sort of my theory of momentum as a moat.
    0:22:44 I want to get to Roy’s product strategy.
    0:22:47 But first, I want to add some points, which is it was interesting.
    0:22:53 Paul Graham, when he started Y Combinator, identified that technical founders were undervalued,
    0:22:54 that they were being underappreciated.
    0:22:56 And people thought, oh, you need to have an MBA.
    0:23:03 And you realize, hey, it’s easier to teach technical founders business than it is to teach business people how to build great products or how to code.
    0:23:08 And then what happens over the next 15 years, it becomes way easier to build these things.
    0:23:11 You know, AWS, low-code AI, et cetera.
    0:23:14 And distribution becomes the scarcity in that there’s so much software.
    0:23:20 It’s such a flooded ecosystem, as you quoted in your piece, Andrew Chen’s piece, about how all the marketing channels suck.
    0:23:22 They’ve all been sort of wrung out dry.
    0:23:25 And so distribution is now the scarcity.
    0:23:29 And so in the same way, you know, the technical co-founder, now there’s almost like an audience co-founder.
    0:23:30 How do you really break out?
    0:23:35 And we’ve seen creators start to build business, like Mr. Beast with his Feastables, right?
    0:23:37 You know, Kylie Jenner with her, what was it?
    0:23:37 Lip gloss.
    0:23:38 Yeah, commerce.
    0:23:40 But no one’s really done it with software.
    0:23:42 No one has put the two and two together.
    0:23:48 You know, Justin Bieber tried to do a social network called Shots, or John Shahidi using Bieber.
    0:23:54 But no big creators really built, whether a consumer or enterprise, huge software company.
    0:23:57 They built commerce or physical goods.
    0:23:59 And I always thought, hey, why doesn’t Mr. Beast launch like a square competitor?
    0:24:01 Like, he’s got all these eyeballs.
    0:24:03 Like, I’m sure he’s got to get better margins than Feastables.
    0:24:07 And he has games, and he’s a friend and a friend of the firm, and he’s done phenomenally well.
    0:24:10 But what I like is that Roy is putting both of those together.
    0:24:14 And so maybe you can walk through a little bit.
    0:24:16 You talked about your distribution strategy.
    0:24:20 Why don’t you talk about how the product strategy has evolved from the beginning, and then we get to the sequencing.
    0:24:27 Yeah, I think something that a lot of people miss is that the first line of Cluely code was written, like, 10 weeks ago.
    0:24:29 So this is, like, really new.
    0:24:33 And this started with Interview Coder, which is just, like, a product that I coded up over a weekend in my dorm.
    0:24:36 And it was, like, a tool to let you cheat on interviews.
    0:24:43 And what we realized after we got about, like, 250 million impressions on the whole scenario is, like, wow, we just got so many eyeballs on this thing.
    0:24:51 Maybe if we can do this again, but we have a more general-use product with a similar UX because we think we’re really onto something with the UX here, then maybe we can make a lot more money.
    0:24:53 And that’s what we did with Cluely.
    0:24:55 And we just launched it as, like, Interview Coder for everything.
    0:24:55 You know, cheat on everything.
    0:24:59 Let’s just see what happens, and the usage data will tell us what people are using it for.
    0:25:00 That’s, like, exactly what’s happening.
    0:25:05 Now we have this general-purpose cheating tool, which, in reality, is just, like, an invisible AI overlay.
    0:25:07 And here’s a new user experience for AI.
    0:25:09 Let’s push it out to a bunch of people and see what happens.
    0:25:13 And as a result, like, now we’re past, like, a billion views overall on Cluely.
    0:25:15 And we’re probably, like, the most viral startup in the world.
    0:25:20 And we have all this usage data that literally tells us, hey, here’s where this is most sticky, and here’s where the product direction needs to go.
    0:25:29 And I think that’s, like, the core advantage of distribution is you do not have to worry about market fit or anything because your users will tell you where the direction of market fit is headed.
    0:25:33 And their usage data will literally give you the information that you need to know.
    0:25:36 And if you don’t have usage data, then you’re literally shooting blind.
    0:25:41 Like, every person who has built a company before knows that you can’t really know what direction you’re going.
    0:25:42 You have to talk to your users.
    0:25:46 But I feel like if your distribution is strong enough, you don’t need to, like, talk to your users.
    0:25:47 You just need to look at their data.
    0:25:48 You just need to look at the aggregate number.
    0:25:52 When you’re also redefining kind of what a minimum viable product is to some degree.
    0:25:53 Yeah.
    0:25:57 Other people, other companies will spend many months building this thing and then seeing how people use it.
    0:26:03 But for you, if you can sort of draft the right content, you can test out the idea in a much quicker way to see, hey, is this really resonating?
    0:26:04 Yeah, exactly.
    0:26:07 I mean, when we launched the video, like, we barely had a functioning product.
    0:26:12 Like, the day before is when we finished our final test and we’re like, okay, we think this works.
    0:26:14 Now let’s just launch the video as soon as possible.
    0:26:20 And we launched the video and, like, all of a sudden, tens of thousands, we just said, hey, let’s just throw sales calls in the videos to see if people use it for sales calls.
    0:26:21 Because that seems like a pretty lucrative space.
    0:26:25 All of a sudden, we have, like, over a million dollars of enterprise revenue coming in for people using it for sales calls.
    0:26:32 And it’s just, like, you can take a shot in the dark on distribution a lot quicker and a lot more accurately than you can take a shot in the dark on product.
    0:26:35 And you don’t need, like, a million product integrations.
    0:26:36 Like, it’s just so much quicker.
    0:26:48 And what’s even better about it is that the iteration loop is much faster, too, because the algorithm will literally tell you via a number, which is number of views, like, shares, whatever, like, how well your strategy is going to work.
    0:26:51 So it’s much, much easier to test, is this viral?
    0:26:55 Does this have viral fit rather than does this have, like, you know, market fit?
    0:26:59 Roy, does that mean you sort of let the audience guide where the product goes?
    0:27:00 Is that how you sort of think about it?
    0:27:01 Yeah, yeah, exactly, yeah.
    0:27:12 Let’s talk a little bit about the form factor, because one of the things that internally we discussed is, look, Roy’s probably top 1%, now I revise it, the top 0.001% in the world in terms of knowing how to distribute.
    0:27:22 The Venn diagram of that and people who know and had the instinct to build a semi-translucent overlay, it sounds so simple.
    0:27:26 Disappearing pictures sound so simple.
    0:27:26 It’s easy.
    0:27:27 It’s not technically hard.
    0:27:31 Like, half-translucent overlay, that sounds simple.
    0:27:32 It’s not technically hard.
    0:27:32 Yeah, yeah, yeah.
    0:27:38 But that overlap, to me, was what gave me so much excitement around what you’re building.
    0:27:40 I actually have it right now on.
    0:27:42 Eric, did you go to University of Michigan?
    0:27:44 Yeah, I did.
    0:27:45 See, I didn’t know that.
    0:27:46 I did not know that.
    0:27:47 But I have Cluly open.
    0:27:49 I’m like, you went to U of M.
    0:27:49 Got that.
    0:27:52 Oh, you did philosophy, policy, and economics, I think?
    0:27:52 Yeah.
    0:27:53 Great.
    0:27:54 We can sort of bond over that.
    0:27:57 This, all of a sudden, is an incredible tool.
    0:27:57 That’s amazing.
    0:28:07 When you talk more about where you see the product going, or particularly how you think about it, anyone who’s building AI tools is asking themselves the question of, how is this defensible from one of the major players?
    0:28:09 Will OpenAI, et cetera, just build this feature?
    0:28:18 How do you think about making your product truly defensible, especially from the people that, because of their reach, their size, OpenAI is a distribution, too, right?
    0:28:19 So how do you think about this?
    0:28:24 Yeah, I mean, I guess we’re first to move in a pretty novel UX.
    0:28:26 And I think we did get to translucent.
    0:28:29 I think everyone’s going to inevitably get to translucent overlay.
    0:28:31 This is how integrated AI should feel.
    0:28:36 And Apple shows everyone that liquid glass is the translucent overlay that will be the form factor of AI in the future.
    0:28:38 Right now, I feel like it’s just a land grab.
    0:28:45 And if the question is about distribution, then I think there’s actually like a pretty strong case for us to make that we will actually end up distributing better than OpenAI.
    0:28:50 And it’s enough that you could probably bet on us at like, well, like a 30,000x discount.
    0:28:52 I’m actually not worried about distribution.
    0:28:56 And I think the quality of the product, I mean, it’s quite simple.
    0:29:04 I really feel like this is just a land grab right now to see who can convince as many consumers and enterprise first that they are the guy who deserves to win the translucent overlay.
    0:29:06 And right now, we’re making so much noise.
    0:29:09 I mean, like with the translucent overlay, like why would it not be us?
    0:29:11 And when did you figure out the translucent overlay?
    0:29:13 Yeah, I mean, I was just in my dorm with Neil.
    0:29:18 And we were just like, we literally spent every day thinking about how can we make InterviewCoder more invisible to interviewers.
    0:29:19 And we played around.
    0:29:24 And there’s probably like 20 to 30 versions of InterviewCoder in the past that just we thought didn’t work.
    0:29:28 Essentially, it feeds you a code answer, like an answer to a coding problem.
    0:29:30 And you need to overlay that on top of your code.
    0:29:33 And we’re just like, man, I really need this integrated into my code.
    0:29:36 I need to see what I’m doing as well as see the answer that AI is giving me.
    0:29:38 And eventually, we just landed on translucency.
    0:29:40 And this was like, wow, this is like a magical moment.
    0:29:42 This is what the product needed.
    0:29:47 And like very soon, we realized, like, why are we only thinking about coding interviews and like software engineering coding interviews?
    0:29:49 This is such a small market.
    0:29:50 This is true for everything.
    0:29:52 AI should not feel like a separate window.
    0:29:54 Like it should be integrated seamlessly.
    0:29:55 And that looks like translucency.
    0:29:59 Bryan, you said you were inspired by the Dragon Ball Z Scouter as an example.
    0:30:04 I would love, Roy, to chat about the staged approach of how you’re thinking about it, I think.
    0:30:05 Like right now, we’re distribution first.
    0:30:07 And then we’ll sort of build the product as we go.
    0:30:11 Stage two, here’s how we do it with a bunch of engineering prowess and product development, etc.
    0:30:14 We sort of chatted a little bit about that on text.
    0:30:18 And to me, that was like, oh, okay, like you’ll figure out product as you go.
    0:30:19 Yeah, yeah, yeah.
    0:30:20 I’d love to talk a little bit about that.
    0:30:24 I mean, right now, the internet is up in arms saying, hey, where’s the product?
    0:30:25 Where’s the product?
    0:30:30 And the two things are, one, we are literally working day and night to build out the product
    0:30:32 that we have in our head, that the users are telling us that they want.
    0:30:36 But also every video I make that’s not directly about the product
    0:30:39 drums up so much more hype for the eventual product launch video.
    0:30:41 Like I will guarantee that this will be viral.
    0:30:45 And I guarantee that it will be more viral than if we just launched like earlier.
    0:30:50 And I think there’s truth in the statement of launch early, ship fast, launch before you’re ready.
    0:30:53 But for some reason, when we’re doing that at scale,
    0:30:55 like it feels like everyone is, oh, no, you launched too early.
    0:30:56 Now, what are you talking about?
    0:30:57 This is the playbook.
    0:30:59 Like we wrote our first lines of code 10 weeks ago.
    0:31:02 We’re like earlier than the latest YC batch of companies.
    0:31:05 Yet we’re like generating probably more revenue than every single one of them.
    0:31:09 And like the product is literally two and a half months since being built.
    0:31:10 In my perspective, we are pre-launch.
    0:31:14 And the huge benefit of massively distributing pre-launch is that
    0:31:19 you will know what product to build with as much certainty as you could possibly get.
    0:31:23 And if you can distribute and hype it up to an audience of literally millions of people like,
    0:31:25 hey, this product’s coming out, this product’s coming out.
    0:31:29 And we’re screaming to the world, AI overlay, like AI that sees your screen, hears your audio.
    0:31:33 Like the second we make that, like, why would you pick anyone else’s product to use?
    0:31:36 We’ve been screaming it from the top of our lungs since day one, like before day one even.
    0:31:41 And like right now, the stage is distribution, get it into everyone’s mind.
    0:31:41 What is Cluely?
    0:31:44 Cluely is the invisible AI that sees your screen, hears your audio.
    0:31:45 Everyone knows this.
    0:31:48 And as soon as we launch it, like who else will they download?
    0:31:49 Yeah.
    0:31:52 One thing I think that’s fascinating about what you’re doing, and I’ll compare it to our friends
    0:31:58 at TBPN who are kind of rebranding or reclaiming, they call themselves corporate driven media.
    0:32:00 Because everyone was like, we’re independent media.
    0:32:02 And they’re like, no, no, we’re corporate backed.
    0:32:03 We’re more honest that way.
    0:32:08 And they’re just leaning into like all the bits about Ramp and, you know, save 5%.
    0:32:10 And there’s this kind of humor to it.
    0:32:15 And I think similarly, you’re kind of leaning into the controversy or being controversial as
    0:32:19 a strategy where some people think, oh, that’s fake or forced or whatever.
    0:32:23 But at the same time, you’re super authentic in the way that you’re doing it.
    0:32:24 And people feel like they know you.
    0:32:29 And sometimes even if there’s a character, an exaggeration, that also feels authentic in
    0:32:29 some way.
    0:32:31 It’s just kind of a unique style.
    0:32:34 Yeah, I mean, I think this like just reflects a growing shift in society.
    0:32:38 I mean, literally for the past few decades now, there’s just been such a sharp drop
    0:32:39 off in professionalism.
    0:32:43 And I really think it’s because like content creation has been democratized.
    0:32:46 Like the first ever YouTube creators were just people making funny videos in their dorms.
    0:32:48 And this was the most authentic, like people crave this.
    0:32:52 Nobody wants to see another ad or another corporate newspaper like bullshit like that.
    0:32:55 Like they want to see some real person doing real things.
    0:33:00 And the democratization of content creation has allowed authentic people who make content to be
    0:33:01 seen by millions.
    0:33:03 And now authentic people who create content will be seen by millions.
    0:33:07 And for some reason, it’s just like no company gets this, that you’ll have 100 X followers
    0:33:09 and your post will be about introducing blah, blah, blah, blah.
    0:33:11 And it’s like the most corporate bullshit ever.
    0:33:13 You’re not going to get any views and nobody wants to see this.
    0:33:17 Would you rather have a world where everyone was extremely transparent and even the founders
    0:33:20 were honest and like everything you see is just someone’s authentic life?
    0:33:22 Or would you rather see like a bunch of corporate bullshit everywhere?
    0:33:26 Like what we envision for the end state distribution of Cluely is that it does not feel like an ad
    0:33:27 at all.
    0:33:30 This is just the story of someone’s life that you want to see.
    0:33:31 And it is like true and authentic.
    0:33:36 And I try to be like, I’m probably more transparent about everything than I’d say probably 99% of
    0:33:37 companies in the Valley are.
    0:33:40 It’s funny because what you’re saying is some people get so mad at, I heard someone call it
    0:33:43 rizz marketing, which is a compliment, but they’re like, I hate this rizz marketing.
    0:33:45 Like it’s got to be product first, get back to fundamentals.
    0:33:48 But one is, this is its own fundamentals too of how the world works today.
    0:33:52 But two is, so of course they can’t destroy you, but even the people who want to destroy
    0:33:55 you, the joke in my head is they could try to kill you, but they can’t kill the idea.
    0:33:55 Yeah.
    0:33:59 Like the way that you’re building companies, like there’s something about it that is going
    0:34:02 to have an impact on the next generation and you’re just starting the company journey.
    0:34:06 What I would say is, the viral marketing, when you say viral marketing, is interesting,
    0:34:06 right?
    0:34:08 Like one leads to another.
    0:34:11 I actually think what you’re doing is like anti-fragile marketing.
    0:34:13 Like you’re so controversial, I’m going to cut off your head.
    0:34:16 And three spring up because one is mad at you.
    0:34:17 One is really happy about you.
    0:34:18 One is neutral.
    0:34:22 Like you get all these, like every time someone comes at you and comes at the idea of Cluely,
    0:34:25 the more aura points it gets.
    0:34:26 Like aura farming.
    0:34:29 Any lessons from the most controversial stuff that you’ve done?
    0:34:30 Or is it always triple down?
    0:34:33 How do you think about it? Is there a line, and where is it?
    0:34:39 So I think the few lessons have been never punch down, like never, ever even remotely
    0:34:40 close to punching down.
    0:34:46 And I think people reward and the algorithm does reward authenticity more than anything.
    0:34:47 Like it rewards many things.
    0:34:50 But one of the things that it rewards most is like authenticity.
    0:34:53 And I think you’ll see me on Twitter like every once in a while, I’ll make like a genuine
    0:34:54 comment.
    0:34:55 Like, hey, hey, thank you.
    0:34:56 Like, I really respect you or something.
    0:34:58 I think like people love to see that.
    0:35:00 I saw your response to Garry Tan.
    0:35:00 Yeah, yeah.
    0:35:01 I mean, I mean, it’s true.
    0:35:01 Yeah.
    0:35:02 Like I respect that guy.
    0:35:05 And I hope that one day when the company gets to where I imagine it will be, then he
    0:35:06 will come around.
    0:35:10 And I don’t think the lesson is to triple down on everything.
    0:35:14 But I think the lesson is that if you’re honest, then the algorithm will reward you because
    0:35:18 there’s literally zero other company out there that is being fully honest about everything.
    0:35:20 And there’s zero founder out there that behaves authentically.
    0:35:23 The only other person I might think is like Elon Musk.
    0:35:27 And I think that’s a really good role model to have in business.
    0:35:34 As you build Cluely into the first destination for consumers and enterprise alike, how does
    0:35:40 like the type of stunts, if you will, and skits actually fit into the type of customer that
    0:35:41 you want to serve?
    0:35:43 I think we’re headed towards a future where this is the new normal.
    0:35:47 I mean, 40 years ago, we were way more professional than we are now.
    0:35:51 Even if you were an engineer, you came into work with a suit and tie.
    0:35:52 And if you don’t, it’s distasteful.
    0:35:52 It’s disgusting.
    0:35:53 I cannot believe it.
    0:35:55 Like you should never, ever show up in a hoodie and sweatpants.
    0:35:59 Now you’re weird if you show up in a suit and you’re not in a hoodie and sweatpants.
    0:36:01 Everyone wants more authentic things.
    0:36:05 Right now, for some reason in society, we’re still holding onto this image that companies
    0:36:09 need to be like brand friendly and boring and never say anything controversial or whatever.
    0:36:13 Like I don’t understand how this became the societal norm.
    0:36:15 But in reality, people want to see interesting things.
    0:36:17 I mean, like that’s the point of life is you see interesting things.
    0:36:19 There’s just this lack of professionalism.
    0:36:24 And again, with the distribution of short form content, like everyone sees the craziest
    0:36:26 things and you just get desensitized to these things.
    0:36:29 And that’s why sports like sperm racing are able to be so hyperviral.
    0:36:33 Nobody would have aired this on CNN 10 years ago, but you don’t need CNN anymore.
    0:36:34 You have Instagram, TikTok.
    0:36:36 And Instagram, TikTok, they love sperm racing.
    0:36:39 As a result, they’re able to raise at a fucking massive valuation, like genuinely have a shot
    0:36:40 at being like the next legit sport.
    0:36:42 And it’s not just sperm racing.
    0:36:44 I mean, it’s every company in the world.
    0:36:47 There’s Sam Altman talking about how hot the guys are that GPT generates in the timeline.
    0:36:50 There’s Elon Musk raving about his political takes.
    0:36:53 Like every company is getting more and more controversial, less professional.
    0:36:55 This trend is not dying anytime soon.
    0:36:58 I’m just the person that is pushing the envelope perhaps a little further than the world is ready
    0:36:59 for at the moment.
    0:37:02 But I think that the question that everyone should linger on a little bit is, can you imagine
    0:37:04 a world where we do win?
    0:37:05 Can you imagine the world?
    0:37:08 What will the state of the world look like if Cluely does win?
    0:37:13 And we prove to everyone like the bar for professionalism is here where we’ve determined it.
    0:37:17 The corporate culture of America as a whole is going to shift in perhaps the entire world.
    0:37:22 Everyone will realize, oh, we’ve had our like panties in a bunch worrying about brand
    0:37:25 image and professionalism when in reality, the world craves something different.
    0:37:30 And I have very strong conviction that I’m right in this because I was right about X.
    0:37:35 And I did not understand for the life of me why nobody was producing the viral videos that
    0:37:37 were so obviously designed to go viral for the algorithm.
    0:37:41 It’s just because nobody has caught on and nobody’s willing to like press that button.
    0:37:44 Now that I’ve pressed the button once, just imagine what the world looks like if Cluely makes
    0:37:45 it.
    0:37:46 You probably like are more interested in that.
    0:37:50 That would be a much more interesting world if every company was being 100% radically transparent
    0:37:53 and doing exactly what the most interesting thing was.
    0:37:57 As Elon says, the most entertaining outcome is the most likely.
    0:37:58 It’s true.
    0:38:01 On that note, this has been a fantastic episode with Roy, Bryan.
    0:38:02 Thanks so much for coming on the podcast.
    0:38:02 Yeah.
    0:38:03 Thank you so much for having me.
    0:38:04 Thank you.
    0:38:09 Thanks for listening to the A16Z podcast.
    0:38:14 If you enjoyed the episode, let us know by leaving a review at ratethispodcast.com slash A16Z.
    0:38:17 We’ve got more great conversations coming your way.
    0:38:18 See you next time.

    What if virality wasn’t a tactic — but the entire product?

    In this episode, a16z General Partners Erik Torenberg and Bryan Kim sit down with Roy Lee, cofounder and CEO of Cluely, one of the most talked-about consumer AI startups of 2025. Cluely didn’t raise a mega round or drop a feature suite to get traction – it broke through by turning distribution into design: launching viral short-form videos, pushing polarizing product drops, and building in public with speed and spectacle.

    We cover:

    – Why virality is Cluely’s moat

    – Building a brand-native AI interface

    – The Gen Z founder mindset

    – What most startups get wrong about attention

    – Why creators are the new product managers

    – Cluely’s long-term vision for ambient AI

    Cluely is a glimpse at the next generation of startups, where the line between product and performance is disappearing.

     

    Timecodes: 

    00:00 Introduction 

    01:07 Early Success

    02:02 Roy’s Journey: From College Kid to Tech Universe

    04:37 The Turning Point: Harvard and Beyond

    06:57 Building Cluely: The Early Days

    08:27 The Viral Strategy: Mastering Algorithms

    13:56 The 50 Interns Experiment

    15:30 The Investment Journey: Roy and Bryan’s Partnership

    19:20 Momentum as a Moat: The Future of AI Companies

    20:32 The Evolution of Product Strategy in the AI Era

    21:19 The Importance of Speed and Adaptability

    22:48 The Role of Distribution in Modern Startups

    24:26 Roy’s Journey and Product Development

    25:25 The Power of User Data and Feedback

    26:58 Innovative Marketing and Distribution Tactics

    28:25 The Future of AI Integration and Translucent Overlays

    32:15 Controversial Marketing and Authenticity

    34:01 The Impact of Radical Transparency

    36:42 The Changing Landscape of Professionalism

    38:26 Concluding Thoughts and Future Vision

    Resources: 

    Find Roy on X: https://x.com/im_roy_lee

    Find Bryan on X: https://x.com/kirbyman01

    Learn more about Cluely: http://cluely.com/

     

    Stay Updated: 

    Let us know what you think: https://ratethispodcast.com/a16z

    Find a16z on Twitter: https://twitter.com/a16z

    Find a16z on LinkedIn: https://www.linkedin.com/company/a16z

    Subscribe on your favorite podcast app: https://a16z.simplecast.com/

    Follow our host: https://x.com/eriktorenberg

    Please note that the content here is for informational purposes only; should NOT be taken as legal, business, tax, or investment advice or be used to evaluate any investment or security; and is not directed at any investors or potential investors in any a16z fund. a16z and its affiliates may maintain investments in the companies discussed. For more details please see a16z.com/disclosures.

  • #817: 4-Hour Workweek Success Stories — Charlie Houpert on Building “Charisma on Command” to 10M+ Subscribers, From Charging $10 for Seminars to Making Millions, Living in Brazil, Critical Early Decisions, and The Secret to Freedom

    AI transcript

    Charlie Houpert is the co-founder of Charisma on Command, a company that helps people develop confidence, charisma, and strong social skills. Originally launched as a 4-Hour Workweek-inspired “muse,” it has since grown into one of the largest platforms for social skills and confidence training, with more than 10 million YouTube subscribers worldwide and more than a billion views across its channels in six languages. His flagship course, Charisma University, has guided more than 30,000 members through practical steps to become more magnetic.

    This episode is brought to you by:

    Patagonia‘s call-to-action to protect America’s public lands. Go to Patagonia.com/Tim to learn more and act now.

    Monarch Money track, budget, plan, and do more with your money: MonarchMoney.com/Tim (50% off your first year at monarchmoney.com with code TIM)

    LinkedIn Jobs recruitment platform with 1B+ users: https://linkedin.com/tim (post your job for free)

    *

    Timestamps:

    [00:00:00] Start.

    [00:06:44] Charlie meets the boogeyman (me).

    [00:10:11] Why defaulting to management consulting after college felt like daily self-betrayal.

    [00:13:21] Leaping into parkour training via DVD as a first business attempt.

    [00:15:45] Moonlighting vs. burning-ships entrepreneurship.

    [00:16:54] Negotiating remote work with a 90% raise.

    [00:21:22] Charlie moves to New York and kicks off KickAss Academy.

    [00:22:16] Airbnb survival tactics while living in a 396 sq. ft. apartment.

    [00:23:26] Using the fear-setting exercise and other disaster-mitigation strategies.

    [00:26:11] Charlie’s first blog post and crossing the publishing Rubicon.

    [00:28:26] How Charlie’s first in-person class prompted an accidental business model.

    [00:32:14] The daily growth whiteboard system.

    [00:34:21] 10 go-getters make an ambitious move to Brazil.

    [00:37:58] How a harsh Tucker Max consultation galvanized the rebranding to Charisma on Command.

    [00:44:39] From financial downturn to pre-selling a course for $12,500.

    [00:50:44] Finally making enough money to chase summer in six-to-eight-month increments.

    [00:52:00] Enjoying the sustainable benefits of creating timeless content.

    [00:54:05] How Bill Clinton seduced 7,000 people into following Charlie on YouTube.

    [00:55:46] How Greg McKeown’s Essentialism helped solve Charlie’s “Herbie” problem.

    [00:58:26] Evolving funnel flow and fame-jacking.

    [01:03:46] YouTube algorithm changes, short-form content, and maintaining audience trust for the long term.

    [01:10:58] Why I still create this podcast.

    [01:19:30] The dangers of succumbing entirely to audience expectation over authenticity.

    [01:21:42] The catalysts that led to time off, an ayahuasca retreat, and a seven-year transformation process.

    [01:30:26] Making the transition from 50/50 partner to sole owner.

    [01:35:16] Recommended reading: Six Pillars of Self-Esteem by Nathaniel Branden

    [01:37:32] The influence of The Last Psychiatrist blog.

    [01:41:46] Jay Abraham coaching: “Make it good enough for Tim Ferriss.”

    [01:43:52] How testimonials added a 4x conversion lift.

    [01:44:31] Coming to an agreement with the co-founder.

    [01:47:20] Joe Hudson and the Art of Accomplishment.

    [01:51:57] Why I stand by The 4-Hour Workweek without further revision, warts and all.

    [01:55:06] Exercising gratitude even when receiving praise is difficult.

    [01:59:15] Relationship with earlier work: video vs. writing.

    [02:02:05] Don’t miss “Filling the Void.”

    [02:03:56] More recommended reading.

    [02:06:43] Improv & Dragons.

    [02:08:06] Charlie’s billboard: “Don’t think, feel.”

    [02:08:57] Parting thoughts.

    *

    For show notes and past guests on The Tim Ferriss Show, please visit tim.blog/podcast.

    For deals from sponsors of The Tim Ferriss Show, please visit tim.blog/podcast-sponsors

    Sign up for Tim’s email newsletter (5-Bullet Friday) at tim.blog/friday.

    For transcripts of episodes, go to tim.blog/transcripts.

    Discover Tim’s books: tim.blog/books.

    Follow Tim:

    Twitter: twitter.com/tferriss 

    Instagram: instagram.com/timferriss

    YouTube: youtube.com/timferriss

    Facebook: facebook.com/timferriss 

    LinkedIn: linkedin.com/in/timferriss

    Past guests on The Tim Ferriss Show include Jerry Seinfeld, Hugh Jackman, Dr. Jane Goodall, LeBron James, Kevin Hart, Doris Kearns Goodwin, Jamie Foxx, Matthew McConaughey, Esther Perel, Elizabeth Gilbert, Terry Crews, Sia, Yuval Noah Harari, Malcolm Gladwell, Madeleine Albright, Cheryl Strayed, Jim Collins, Mary Karr, Maria Popova, Sam Harris, Michael Phelps, Bob Iger, Edward Norton, Arnold Schwarzenegger, Neil Strauss, Ken Burns, Maria Sharapova, Marc Andreessen, Neil Gaiman, Neil deGrasse Tyson, Jocko Willink, Daniel Ek, Kelly Slater, Dr. Peter Attia, Seth Godin, Howard Marks, Dr. Brené Brown, Eric Schmidt, Michael Lewis, Joe Gebbia, Michael Pollan, Dr. Jordan Peterson, Vince Vaughn, Brian Koppelman, Ramit Sethi, Dax Shepard, Tony Robbins, Jim Dethmer, Dan Harris, Ray Dalio, Naval Ravikant, Vitalik Buterin, Elizabeth Lesser, Amanda Palmer, Katie Haun, Sir Richard Branson, Chuck Palahniuk, Arianna Huffington, Reid Hoffman, Bill Burr, Whitney Cummings, Rick Rubin, Dr. Vivek Murthy, Darren Aronofsky, Margaret Atwood, Mark Zuckerberg, Peter Thiel, Dr. Gabor Maté, Anne Lamott, Sarah Silverman, Dr. Andrew Huberman, and many more.

    See Privacy Policy at https://art19.com/privacy and California Privacy Notice at https://art19.com/privacy#do-not-sell-my-info.

  • Why Your Emotions Don’t Have to Control You with Ethan Kross

    AI transcript
    0:00:05 Cutting costs isn’t the strategy, it’s a survival tactic.
    0:00:08 But you’re not here to survive, you’re here to thrive.
    0:00:10 That’s why there’s Brex.
    0:00:15 Brex is a finance platform for companies looking to drive growth.
    0:00:20 It’s a corporate card, banking, expense, and travel platform,
    0:00:25 all powered by AI, all built to help your business excel.
    0:00:29 More runway for startups, more control for scale-ups,
    0:00:31 less busy work for everyone.
    0:00:33 Brex helps you build boldly.
    0:00:41 Go to brex.com slash grow and get a platform that helps you turn finance into a strategic edge.
    0:00:48 Here on Remarkable People, we know that complexity can be the enemy of efficiency.
    0:00:51 That’s the philosophy behind Freshworks.
    0:00:55 While legacy software stacks can slow teams down,
    0:01:01 Freshworks builds intuitive tools that can help your team do their best work without the clutter.
    0:01:06 And when it comes to AI, it’s not about replacing humans.
    0:01:08 It’s about amplifying what makes us remarkable.
    0:01:31 I find it remarkable that on a daily basis, all of us are frequently challenged with having to manage our emotions in some fashion.
    0:01:38 Whether it be to turn up the volume on our happiness a little bit or turn the amplitude down on our anxiety or spend a little bit less time ruminating about something.
    0:01:43 These are frequent experiences that characterize a human species.
    0:01:49 We know about tools, science-based tools that exist that can help people.
    0:01:57 And we do not share these tools in a systematic way with folks in the same way that we share physical exercise tools with folks.
    0:02:00 I had my first gym class in first grade.
    0:02:04 I knew how to do a push-up and a jumping jack in first grade.
    0:02:06 Why don’t people know about how to shift?
    0:02:11 Why don’t our kids know about the Batman effect or how to strategically deploy their attention?
    0:02:13 I think it’s a huge problem.
    0:02:18 Hello, everybody.
    0:02:19 It’s Guy Kawasaki.
    0:02:22 This is the Remarkable People Podcast.
    0:02:24 We’re on a mission to make you remarkable.
    0:02:27 Today’s remarkable guest is Ethan Kross.
    0:02:34 He’s a psychologist, a neuroscientist, and he’s the author of two very great books.
    0:02:38 And we’re going to talk about one of them, his latest, called Shift.
    0:02:43 He’s an expert on emotion regulation and inner voice.
    0:02:48 He directs the Emotion and Self-Control Laboratory at the University of Michigan.
    0:02:54 I must say, that is a very unusual name for a lab, emotion and self-control.
    0:02:58 But anyway, that’s not something that would work out well in Silicon Valley.
    0:03:02 So the name of his two books are Chatter and Shift.
    0:03:12 And basically, he works on building bridges across science and practical tools to help people harness their thoughts and emotions.
    0:03:14 And so welcome to the show, Ethan.
    0:03:16 Hey, thanks for having me, Guy.
    0:03:17 It’s an honor to be here.
    0:03:25 I hate when podcast hosts go off the rails from the very start, but I’m going to do that right now.
    0:03:32 I hate to admit it, but one of the stories that you start Shift with, I’ll summarize the story.
    0:03:33 Correct me if I’m wrong.
    0:03:40 Because somebody found this Incan skull that was hundreds of years old and they sent it to the experts.
    0:03:55 And there was a square hole in the skull, and you attribute this operation that some Incas had done to an attempt to modulate the emotions of whoever’s head they drilled.
    0:04:03 As I read that, I said, how can you possibly know that the Incan doctor did that to regulate emotions?
    0:04:04 And I was wondering that.
    0:04:07 So can you just clear that up for my foggy mind?
    0:04:09 Well, it’s not a foggy mind.
    0:04:10 It’s a precise mind.
    0:04:25 And so the statement in the book wasn’t about that particular skull on its own, but rather I was referring to the intervention and why we think historically that intervention was used in some cases.
    0:04:30 So the intervention was trepanation, carving holes in people’s skulls, often while they were still alive.
    0:04:41 And medical historians believe that one of the reasons why that technique was used was to help people manage big dysregulated emotions.
    0:04:42 Why might that be the case?
    0:04:48 If you go back in time, our theories of emotion dysregulation were quite different from what they are today.
    0:05:04 If you’ve got a person who’s acting out in seemingly irrational ways or maybe is totally withdrawn, perhaps the source of that malady is an evil spirit inside you that you need to purge yourself of.
    0:05:08 And so ergo, cutting a hole in people’s skull to let that spirit release.
    0:05:13 That idea was actually quite common throughout much of human history.
    0:05:32 Eight to 10,000 years ago, we were carving holes in people’s skulls, but in the Middle Ages, we were slashing people’s forearms and letting blood drip out of their system, cleansing the humors, cleansing the blood of toxins that might be creating emotions run amok.
    0:05:35 So just to be clear, your question is a great one.
    0:05:39 I absolutely do not know that it was that one specific skull, nor does anyone.
    0:05:46 It’s rather the intervention more broadly that we think was partially used for that purpose.
    0:05:46 Okay.
    0:05:59 Now, Ethan, what happens if 10,000 years from now, people are going to say, do you realize that people 10,000 years ago were sticking students inside these big machines they called MRIs,
    0:06:05 and they were bombarding them with magnetic waves and figuring out what part of their brains are active?
    0:06:09 Like, isn’t that the equivalent of drilling square holes right now?
    0:06:13 It’s a question I’ve asked myself many, many times.
    0:06:31 Like, I love history and dug deep into the history when I was researching Shift, and it does make me think about whether the things that we ask people to do or the prescriptions we are providing for things that can help people may be causing harm or not.
    0:06:39 And in all cases, we think really carefully in modern days about ensuring that we are not doing any harm.
    0:06:45 And the good news is that we actually have guideposts to steer us.
    0:06:58 And we’re using practices to determine that before we have a student do something, that it’s not going to create harm, at least to our awareness, which did not really exist back then.
    0:07:07 And so the other thing I’d point out is there are, of course, lots of other things we’ve done historically to help folks that have not resulted in carnage.
    0:07:13 And much of Shift, much of what I talk about, like, there are dozens of tools.
    0:07:14 Maybe we’ll talk about some of them.
    0:07:14 Maybe we won’t.
    0:07:24 But these tools that we have used scientific techniques, neuroimaging being one of them, but lots of other interventions as well to help us identify.
    0:07:30 What I love about these tools is they’re relatively non-invasive.
    0:07:31 They’re actually not relative.
    0:07:33 They’re just non-invasive, right?
    0:07:41 These are changes in the way we think or behave that science shows can put people on different emotional trajectories.
    0:07:45 And so I think the risk in these cases is pretty, pretty slim, but always good to ask.
    0:07:51 My interpretation of Shift is that emotions can be tools.
    0:07:53 Is that correct, first of all?
    0:07:56 Ding, ding, ding, ding, ding, you’ve got it.
    0:07:57 I would just clarify one thing.
    0:08:14 All emotions are tools, even the quote-unquote bad ones, which is a point that we often lose in contemporary society when we talk about emotions because we often hear, oh, you should try to live a life free of negative emotions.
    0:08:17 A, not possible, B, not desirable, because they’re tools.
    0:08:21 Now, first of all, I had to wrap my head around that.
    0:08:28 So to me, a tool is something that gets you from point A to point B.
    0:08:35 But it seems to me that for many people, emotions is the point B, right?
    0:08:36 So I want to be happy.
    0:08:39 I want to experience the emotion of happiness.
    0:08:44 But now you’re telling me that happiness is a tool to get somewhere else.
    0:08:47 So where am I trying to get if not happy?
    0:08:49 It could be a state you’re trying to get to.
    0:08:57 It can also be a motivational force that puts you in specific situations or contexts that actually serve you well.
    0:09:05 They’re motivational forces that are activated in particular situations that orient us to excel in those circumstances.
    0:09:08 So you chose happiness as one candidate of emotion.
    0:09:10 Let me give you a dramatically alternative.
    0:09:12 Let’s say anxiety.
    0:09:17 Anxiety is an end state that I think most people don’t aspire to be anxious.
    0:09:19 There are some people who do like living in that zone.
    0:09:21 But I think most people do not.
    0:09:27 But it can be a tool that helps prepare us for potential threats.
    0:09:40 So when I think about instances in my career where a presentation didn’t go as well as I liked, where a meeting didn’t go as well as I liked, there were instances in which I felt zero anxiety beforehand.
    0:09:50 Because I felt no anxiety, there was no force that was motivating me to zoom in on the situation at hand and to start preparing for it.
    0:09:55 When experiencing the right proportions, those negative emotions can, in fact, be quite useful.
    0:09:56 Anger is another example.
    0:09:57 Guy, when was the last time you were angry?
    0:10:02 How long can the podcast go?
    0:10:03 I mean, yesterday.
    0:10:08 If we were to break down, like, when do people become angry?
    0:10:13 They become angry when their sense of right and wrong is challenged.
    0:10:15 So your view of the world is challenged.
    0:10:20 And there’s something you can do about it to rectify the situation.
    0:10:25 So when those conditions are met psychologically, we experience this response of anger.
    0:10:26 What does anger motivate us to do?
    0:10:27 You zoom in.
    0:10:31 You approach the situation and now you try to fix it.
    0:10:34 That can be helpful in those circumstances.
    0:10:39 So it’s a way of getting us typically to a desired point.
    0:10:49 You don’t need to be a rocket scientist to figure out that the fact that you named your book Shift means that these emotions and tools help you shift.
    0:11:00 So there are three facets of shifting refers to shifting refers to shifting refers to when your emotions are no longer serving you well.
    0:11:09 And we can all easily think about shifting, shifting involves being able to turn the volume on that emotion up or down.
    0:11:19 It involves, in other instances, shortening or lengthening how long you sit in a particular emotional response.
    0:11:28 And in some cases, it can involve moving from one state, happiness, to another one, contentment altogether.
    0:11:34 So Shift captures those three psychological jiu-jitsu moves, if you will.
    0:11:41 And the entire premise behind it is we evolve this capacity to experience emotions for a reason.
    0:11:43 They give us an advantage.
    0:11:45 They help us quickly prepare for different situations.
    0:11:48 But they’re a very blunt tool.
    0:11:54 They can easily be triggered out of proportion to the circumstances we’re dealing with.
    0:12:06 And the amazing thing in my eyes about human beings and the human brain is we also co-evolved the capacity to rein in these emotional responses.
    0:12:10 But we don’t get a user’s guide on how to master that capacity.
    0:12:12 And that’s what the book is all about.
    0:12:15 I actually want to throw a question back to you, if I can, Guy, though.
    0:12:28 But you said when we started that emotions and self-control, those two terms in Silicon Valley might be a little bit, I forget the word you used, unfamiliar or people might have trouble.
    0:12:31 I’d love to just get a sense of what you meant by that.
    0:12:32 I’m fascinated by it.
    0:12:36 I’m being sarcastic about the current state of Silicon Valley.
    0:12:54 It seems that the only emotion that’s coming out of many of the most successful people is self-survival and optimization of my own particular case, right?
    0:12:59 So I want low rates of long-term capital gains.
    0:13:02 I want crypto to be successful.
    0:13:05 It’s all about me, myself, and I.
    0:13:09 And I am turning into someone who is amoral.
    0:13:16 And I will suck up to whoever I have to suck up to get long-term low-capital gains and make crypto successful.
    0:13:20 And I don’t view that as the high road, Ethan.
    0:13:24 Yeah, you’re not putting me in a positive mood state thinking about that.
    0:13:27 Got it, got it, got it.
    0:13:34 Okay, so it’s a kind of instrumentality towards just optimizing financial benefit.
    0:13:49 Every business is under pressure to save money.
    0:13:53 But if you want to be a business leader, you need to do more to win.
    0:13:59 You need to create momentum and unlock potential, which is where Brex comes in.
    0:14:01 Brex isn’t just another corporate credit card.
    0:14:04 It’s a modern finance platform.
    0:14:08 That’s like having a financial superhero in your back pocket.
    0:14:15 Think credit cards, banking, expense management, and travel, all integrated into one smart solution.
    0:14:22 More than 30,000 companies use Brex to make every dollar count towards their mission, and you can join them.
    0:14:29 Get the modern finance platform that works as hard as you do at brex.com slash grow.
    0:14:38 If it were as easy as playing them a good song, the way you psyched up your daughter, I would play them that good song.
    0:14:42 So let’s talk about some of the mechanisms of shifting.
    0:14:48 So you brought up that great story about getting your daughter ready to play soccer by playing a song.
    0:14:52 What are the methods to catalyze shifts?
    0:14:54 So I like to break them down.
    0:14:58 You know, I think it’s easier for us to wrap our head around categories of tools.
    0:15:05 So you can bin the shifters, which are tools to push our emotions around, into two broad categories.
    0:15:10 Internal shifters, which are tools we, they’re in-built tools.
    0:15:12 We take them with us wherever we go.
    0:15:20 And then there are external shifters, which are a more, in some ways, complex set of tools that exist in the world around us.
    0:15:23 If we start with the internal ones, there are three categories there.
    0:15:34 The first one are sensory shifters, which are probably the lowest effort shifters in our toolbox that we often overlook.
    0:15:41 And so sensory shifters refer to all of the different senses that we possess, sight, sound, touch, smell.
    0:15:43 What do our senses do?
    0:15:47 They allow us to take in information about the world around us.
    0:15:54 I like to describe the sensory shifters as like, imagine having satellite dishes mounted all over your body.
    0:15:56 They’re constantly taking in information.
    0:16:02 And one of the reasons we take in that information is we need to know, like, how to navigate the world in a safe way.
    0:16:11 And so emotions, the experience of emotions are intertwined with our process of making sense of the world.
    0:16:14 One of the most powerful examples out there for me is music.
    0:16:18 I’ve been listening to music from a time I was five years old.
    0:16:21 I remember I got my first cassette tape.
    0:16:22 I’m dating myself now.
    0:16:26 I never stopped to think about, like, why I listened to music.
    0:16:27 I just always did it.
    0:16:36 And then I had this experience with my daughter that I describe in the book where I realized the value of music as an emotion regulation tool that can be strategically harnessed.
    0:16:38 Because my daughter is young.
    0:16:39 She’s playing soccer.
    0:16:40 I’m coaching the team.
    0:16:43 I look forward to this soccer match every week.
    0:16:45 And she wakes up in a funk.
    0:16:47 She doesn’t want to go.
    0:16:50 Nothing that I do to cheer her up is getting her excited.
    0:16:52 I’m beginning to get depressed.
    0:16:54 We get in the car.
    0:16:58 And just randomly, a pump-up song comes on the radio.
    0:16:59 Journeys Don’t Stop Believin’.
    0:17:01 I start jamming out.
    0:17:02 I start singing.
    0:17:04 I look into the rearview mirror.
    0:17:06 I see my daughter’s bopping her head along.
    0:17:08 And we both are invigorated.
    0:17:19 If you ask people why they listen to music, almost 100% of participants will say, I listen to music because I like the way it makes me feel.
    0:17:25 But if you then look at, hey, the last time you’re angry or anxious or sad, what did you do to make yourself feel better?
    0:17:30 Only between 10% and 30% of participants across studies report using that modality.
    0:17:36 Music is an incredibly fast way of modulating your emotions.
    0:17:40 It’s not going to solve your greatest problems.
    0:17:54 But what it can do is put you on a different emotional trajectory for a little while, opening you up to the possibility of then using other tools to help you go deeper into solving that malady that you are experiencing.
    0:17:55 And that’s just one example.
    0:17:57 What are your favorite foods, Guy?
    0:18:03 Oh, my favorite foods are, I’m going to have to get pretty local.
    0:18:09 There’s a Hawaiian dish called laulau, which is, it’s this pork wrapped in leaves.
    0:18:10 It’s really salty.
    0:18:11 I love laulau.
    0:18:12 I love poi.
    0:18:15 I love spam musubis.
    0:18:16 We could have a whole show on food.
    0:18:20 And I’ve actually had Andrew Zimmerman and Roy Yamaguchi.
    0:18:21 I love musubis.
    0:18:23 There you go.
    0:18:26 You’re using love to describe these experiences.
    0:18:38 Few things are as decadent and transportive for me as eating a dark chocolate covered peanut butter cup after dinner each night.
    0:18:40 Little one, not too excessive.
    0:18:42 But taste.
    0:18:43 Think about smell.
    0:18:52 You go into nice hotels, they pump fumes to change the way you feel about yourself and the places around you.
    0:18:54 So that’s one type of shifter, our senses.
    0:18:56 Well, how about when you get a new car?
    0:18:59 Yeah, the new car smell.
    0:19:01 It’s so simple and obvious on the one hand.
    0:19:06 On the other, we know that people just overlook this stuff.
    0:19:11 Like, you go to the airport, like, you go into the duty-free shop.
    0:19:14 This is an emotion regulation emporium.
    0:19:17 People are just selling perfumes and colognes.
    0:19:20 Like, what is the purpose of these substances?
    0:19:28 They are managing the way you feel about yourself and they are managing the way other people feel about you.
    0:19:30 That’s the senses, right?
    0:19:31 And a touch.
    0:19:33 Like, that’s another powerful one.
    0:19:35 Touch is the first sense to develop.
    0:19:37 It develops while we’re in the womb.
    0:19:44 The first thing we do with babies to comfort them, to regulate them, is we hold them.
    0:19:48 I don’t know about you, but I love to be held, even to this day.
    0:19:50 My kids are getting older.
    0:19:51 I’m still trying to grab their hand.
    0:19:53 They’re like, get away from me.
    0:19:54 What’s wrong with you?
    0:20:00 But even the fist bump with a colleague at work, a hug of my partner.
    0:20:06 These are regulatory experiences that are available to us.
    0:20:11 And so it’s just low-hanging fruit to build into your shifting repertoire.
    0:20:19 Now, of course, that’s not, again, going to help us deal with the really big bouts of anxiety and depression.
    0:20:20 It’s going to contribute to it.
    0:20:25 We know, by the way, that when people try to shift in their daily lives, they typically don’t do one thing.
    0:20:32 On average, they use between three and four different tools at any given moment in time to push their emotions around.
    0:20:42 We did these large studies during the COVID pandemic where we wanted to see what are the tools that are really helping people manage their anxiety from one day to the next.
    0:20:45 It was not one thing.
    0:20:46 It was combinations of tools.
    0:20:50 So your sense is that’s one set of tools you can use.
    0:20:56 Another big internal shifter is what I call attention.
    0:20:59 You could think of attention as our mental spotlight.
    0:21:09 And one of the things that we often get wrong about attention is we often tell people, you should never avoid the things that are bugging you.
    0:21:11 Like, you got to work through them.
    0:21:14 And I mean, this is one of the first things that we’re taught, right?
    0:21:16 Like, when there’s a problem, what do you do?
    0:21:19 When you’re a little kid, your parents say, run away.
    0:21:20 Don’t address it, right?
    0:21:22 They roll up your sleeves and you get to the bottom.
    0:21:25 You confront that issue, right?
    0:21:26 So we’re taught that from a very young age.
    0:21:32 We often hear as we get older that avoiding things chronically is harmful to us.
    0:21:39 And so as a result of those different experiences, a lot of us develop this heuristic, this decision-making rule.
    0:21:41 Hey, when we’re in trouble, don’t avoid.
    0:21:42 Approach.
    0:21:44 Here’s what we’ve learned about this.
    0:21:53 If your approach to managing your emotions involves always, the moment something happens, I’m going to just repress it, suppress it, avoid it, never come back to it.
    0:21:55 If that’s all you do, this is not good.
    0:22:01 Tons of data showing that chronically avoiding our things leads to negative outcomes.
    0:22:10 But that doesn’t mean that you can’t be flexible with your attention when you’re dealing with a problem, right?
    0:22:13 Chronically avoiding things can be bad.
    0:22:25 But moving back and forth between focusing on a problem for a little bit, taking some time away, coming back to it, going back and forth in that manner, that can be really, really useful.
    0:22:40 If you’ve ever gotten an email and it provoked you and you decided, you know what, I’m not going to respond right now in the moment I just received that, I’m going to come back to it tomorrow or next week.
    0:22:44 You have experienced the benefits of strategic attention deployment.
    0:22:49 And that is an example of how powerful attention can be for modulating our emotions.
    0:22:53 You know, I never thought about that with email.
    0:23:02 And I’ll tell you, I have an even better system than waiting a day, which is when an email really pisses me off.
    0:23:10 I send it to Madison and Madison answers as me much better than I would ever answer.
    0:23:17 So she is my strategic attention, what was the phrase, strategic attention deployment?
    0:23:18 Yeah.
    0:23:21 But that also touches on another powerful tool.
    0:23:27 It’s actually a perfect segue guide to the final category of shifters, like boom, which is perspective, right?
    0:23:32 So sometimes you can’t divert your attention away.
    0:23:34 We don’t have the luxury to do it or we don’t want to.
    0:23:37 And you got to stare at something in the face and deal with it.
    0:23:44 And what we’ve learned is that shifting your perspective, looking at the bigger picture can be very helpful.
    0:23:46 But that’s often really hard to do.
    0:23:51 It’s easier said than done to just change the way you’re thinking about something.
    0:23:58 And so you just said that you, when you’re provoked, you’ll send it to Madison and have her respond to it, right?
    0:24:06 Why is Madison so adept at responding to your provocations and you’re not?
    0:24:16 The reason is we know it’s a lot easier to be wise and rational about someone else’s problems than ourselves.
    0:24:22 We don’t have the same level of immersion in the problem, so we can think about it with more clarity.
    0:24:38 What we’ve also learned is that there are tools you can use to shift your perspective, to adopt the perspective of another person when thinking about your problems that can make it much easier to not fire off the email that you will later regret.
    0:24:42 As an example, there’s a tool called Distant Self-Talk.
    0:24:43 It’s one of the first tools I use.
    0:24:48 It involves trying to work through my problems using my own name and you.
    0:24:51 All right, Ethan, what do you think you should do here?
    0:24:53 How are you going to manage this situation?
    0:24:56 Now, that may sound strange.
    0:24:58 Number one, I’ll point out a disclaimer.
    0:25:02 I typically do that silently, not out loud in front of other people.
    0:25:05 But think about it for a second.
    0:25:09 Kai, when do the word you, when do you typically use that word?
    0:25:11 This is not a trick question.
    0:25:18 I would say that it’s usually used in an accusatory sense.
    0:25:23 But even more basic, when you’re using that word, it’s usually about someone else.
    0:25:24 It’s someone else.
    0:25:33 So, the vast majority of times that you use the word you, it’s when you’re thinking about or referring to another person.
    0:25:34 Right?
    0:25:39 We just said, like, other people, they’re much better at dealing with our problems often than we are.
    0:25:43 So, when you use the word you to refer to yourself, it’s not like the usual case.
    0:25:45 It’s switching your perspective.
    0:25:48 It’s putting you in this frame of mind as, now I’m coaching someone else.
    0:25:49 I’m giving someone else advice.
    0:25:50 It’s no longer me.
    0:25:53 You get some space from the ego in that sense.
    0:25:56 Like, all right, what do you think you should do here?
    0:25:59 Guy, I am really good at giving advice to my buddies.
    0:26:03 Much better than I’m often about giving advice to myself.
    0:26:05 So, Ethan, what do you think you should do here?
    0:26:07 Here’s how I think you should manage it.
    0:26:16 It’s putting you in this coaching frame of mind that research shows leads us to make wiser, emotionally intelligent decisions and helps us regulate.
    0:26:25 And so, that’s another kind of very subtle tool that you can use to shift your perspective when you’re trying to grapple with a big emotion.
    0:26:33 Okay, then you are opening up another whole can of worms, which sounds negative, but I mean.
    0:26:34 Yeah, bring it.
    0:26:35 It’s good.
    0:26:36 This makes it more fun to talk.
    0:26:46 Okay, so, if Madison is a great modulator for me, imagine what AI is, right?
    0:26:48 So, I get this email that pisses me off.
    0:26:53 I upload it to AI and I say, draft the response.
    0:26:56 Isn’t that the ultimate use?
    0:26:59 Not exactly.
    0:27:00 Here’s why.
    0:27:06 I think one of these days it may be, and with the proper training, I think it can absolutely be.
    0:27:15 Madison is someone who, to be clear to all of you who are listening, I don’t know much about Madison, right?
    0:27:17 I don’t know the background.
    0:27:21 I don’t know how she came into your life.
    0:27:27 But I’m going to guess that there was some screening involved, that you were careful in your selection of Madison.
    0:27:35 She wasn’t someone that just showed up all of a sudden and is responding to the most important people in your life who are pissing you off.
    0:27:42 Other people, as I point out in both of my books, because this is such an important issue,
    0:27:52 other people can be a remarkable asset or a tremendous liability when it comes to our emotional lives.
    0:28:01 And so, I encourage people to think really carefully about who they are sending their emails to respond on their behalf.
    0:28:07 Like, that’s a very intimate and privileged request that you are making of her.
    0:28:09 You would not let anyone do that.
    0:28:17 And so, if AI is amalgamating lots of information from the internet and assuming that, I don’t know what the weighting factors are,
    0:28:27 but these are the best ways to respond, my sense is that we can actually get a lot more nuanced and tailored to identify the best types of responses.
    0:28:34 Now, if you trained all of Madison’s responses into an AI chatbot or whatever, then we’re getting a lot closer.
    0:28:37 And that might well be a bionic response for you.
    0:28:42 Madison, do you want to set the record straight or speak up for yourself or do anything?
    0:28:46 I think we benefit one another and we balance each other out really well.
    0:28:50 Madison, though, are your services available to respond to my annoying emails?
    0:28:53 We can try it out.
    0:28:56 Madison, GPT.
    0:28:58 There we go.
    0:29:12 In your book, I read about this case of the woman whose baby had this peanut allergy and almost died and stuck an EpiPen in her thigh and saved her.
    0:29:16 And then for years, she was traumatized by that.
    0:29:24 So now you could make the case that the emotion of fear and reaction and all that saved her daughter’s life.
    0:29:29 But you could also make the case that those things lived on and became a very negative thing.
    0:29:32 So how does one get past that?
    0:29:38 You read a story about how Gene Hackman’s wife had Hantavirus and died and then he died later.
    0:29:41 And so now you’re afraid of mice all the time.
    0:29:43 Like, how do you deal with those kinds of emotions?
    0:29:52 I was fortunate to have a really wonderful mentor in graduate school who, you know, you come into graduate school and you look at the scientific literature.
    0:29:54 And it’s like, your mind is going to explode.
    0:29:55 There’s all this complexity.
    0:29:58 And my God, how do I wrap my head around it?
    0:30:00 And he sat me down in the first couple of weeks.
    0:30:03 And he says to me, Ethan, at its core, it’s really simple.
    0:30:11 When you get to the topic of self-control or emotion regulation, I use those phrases somewhat synonymously.
    0:30:14 It’s about feeling the way we want to feel, thinking the way we want to think.
    0:30:17 There are really two core components.
    0:30:19 Number one is motivation.
    0:30:21 And number two is ability.
    0:30:27 So number one, you have to believe that you can control yourself.
    0:30:28 Right?
    0:30:32 And the reason for that is, let’s use exercise in this example.
    0:30:33 Let’s use surfing as an example.
    0:30:36 We were talking about surfing before we started recording.
    0:30:50 If I don’t think there’s any way that this uncoordinated human being can get up there and surf a wave, it doesn’t make sense, like, why am I going to wake up early to even try?
    0:30:53 Why am I going to spend the money on the lessons, get the surfing board, whatever.
    0:30:55 So you’ve got to be motivated.
    0:31:08 Number two, just having the motivation is not enough because I can tell you there was an instance in which I was motivated to surf.
    0:31:13 I went to Hawaii on a family vacation several years ago and I showed up and I got the surfboard.
    0:31:17 And before I got the lesson, I tried to surf.
    0:31:19 And let me tell you, it did not go very well.
    0:31:22 You also need tools.
    0:31:27 You need to know what are the tactics that allow you to achieve those different goals.
    0:31:44 So if we go back to the mom whose fear response saved her child from dying but then had that fear response overgeneralized, now she’s concerned about anything that could potentially happen to the child negatively and it’s consuming her life.
    0:31:49 Number one, you’ve got to believe that this is something you can get a handle on.
    0:31:54 That may seem like a simple idea, like why is this guy even emphasizing this?
    0:32:03 But if you look at the research literature, some studies report that approximately 40% of people do not think they can control their emotions.
    0:32:05 40%, Guy.
    0:32:10 If you don’t think you can control your emotions, why are you even going to try?
    0:32:11 You’re not going to try.
    0:32:15 So step one is you’ve got to believe that you can do this.
    0:32:27 And then number two is if you’re finding that you are constantly experiencing emotions being triggered out of proportion, you need to learn the tools that are out there to help you rein these responses in.
    0:32:29 That is what the book is all about.
    0:32:47 I find it remarkable that on a daily basis, all of us are frequently challenged with having to manage our emotions in some fashion, whether it be to turn up the volume on our happiness a little bit or turn the amplitude down on our anxiety or spend a little bit less time ruminating about something.
    0:32:52 These are frequent experiences that characterize a human species.
    0:32:59 We know about tools, science-based tools that exist that can help people.
    0:33:08 And we do not share these tools in a systematic way with folks in the same way that we share physical exercise tools with folks, right?
    0:33:11 I had my first gym class in first grade.
    0:33:15 I knew how to do a push-up and a jumping jack in first grade.
    0:33:17 Why don’t people know about how to shift?
    0:33:23 Why don’t our kids know about the Batman effect or how to strategically deploy their attention?
    0:33:24 I think it’s a huge problem.
    0:33:36 So, I mean, you touched on it before, but continuing with this woman with the EpiPen and her fears, what are the tools she could throw at this problem?
    0:33:40 To help her, well, she could use distanced self-talk, right?
    0:33:41 That’s number one.
    0:33:50 If she finds herself over-dramatizing a situation, say, what would you tell your sister if she was going through this with her kid?
    0:33:51 That’s a way of shifting a perspective.
    0:33:54 You could zoom out and look at the bigger picture.
    0:34:03 So, oftentimes, to correct people’s, in particular, anxious response or their worry response, you can get them to think like a scientist about the situation.
    0:34:09 Hey, what’s the probability that this negative experience might actually befall someone?
    0:34:17 Thinking through the probability of a negative outcome occurring can be really powerful for folks.
    0:34:27 Like, when you think about the probability of, for example, your plane crashing as compared to getting hit by a car or getting into a car accident, it is a striking comparison.
    0:34:30 You are much, much more likely to get hit by a car.
    0:34:33 Those kinds of experiences often have weight.
    0:34:37 You could encourage them to activate their senses.
    0:34:42 You could encourage them to activate their emotional advisory board.
    0:34:49 We haven’t talked about that yet, but who are the people in your life who are skilled at doing two things for you when you are struggling?
    0:34:54 Number one, they provide you with a sense of comfort and support.
    0:34:55 They validate what you’re going through.
    0:34:56 They empathize with you.
    0:35:01 Yeah, you may recognize at some level that what you’re going through is irrational.
    0:35:02 It’s embarrassing.
    0:35:04 I may not want to share it with you.
    0:35:10 But these are people who you can confide in, who will make it clear to you that, you know what?
    0:35:14 We all experience those embarrassing kinds of reactions at times.
    0:35:15 It’s normal.
    0:35:16 But they don’t stop there.
    0:35:23 They also then work with you to broaden your perspective, to help you get a handle on the situation.
    0:35:25 That’s a really powerful tool.
    0:35:31 So lots of different tools that mom can leverage to help herself.
    0:35:39 Now, what if the mom counters and says, listen, humans have been evolving for millions of years.
    0:35:41 It’s our maternal instinct.
    0:35:43 It’s our desire to be safe.
    0:35:45 I cannot suppress this.
    0:35:50 In fact, I should not suppress this because that’s what helps us survive.
    0:36:01 I think what you want to do is, in that case, point out, look, no one is asking you to suppress these maternal instincts you have.
    0:36:13 You are talking to me because you are telling me that these maternal instincts are actually getting in the way of you living the life that you want to live.
    0:36:13 Right.
    0:36:17 And it’s not because you’re having the instincts in the first place.
    0:36:19 It’s because they are metastasizing.
    0:36:26 They are becoming so big and by your own recognition out of proportion with the situation at hand.
    0:36:39 So why don’t we just try a little experiment to see what might happen if we try to rein those emotional responses in just a little bit and see if that’s something that you like or not.
    0:36:47 And so it’s simply offering an invitation to folks to experiment with these tools to gauge the impact they have on their lives.
    0:36:53 Ultimately, what I care about doing is helping people self-regulate.
    0:36:54 What is self-regulation?
    0:37:00 It’s about aligning your thoughts, feelings and behaviors with your goals.
    0:37:05 It’s about getting you to live the life that you want to live.
    0:37:12 Now, that might not be the life that I would want for myself or even for you if there’s someone that I care about, but that’s what it is.
    0:37:23 I don’t think someone’s coming to me or someone else if they think that their responses are exactly the way they should be and life should be because they should be happy in that situation.
    0:37:25 Up next on Remarkable People.
    0:37:30 How do you know that avoidance is working or not working?
    0:37:31 Here’s how you know it’s working.
    0:37:42 You put it in a box, you go, you distract, you look at other things, and you don’t find yourself thinking about this thing once you put it in the box and move away.
    0:37:44 You’re living the life you want to live.
    0:37:54 Do you want to be more remarkable?
    0:37:59 One way to do it is to spend three days with the boldest builders in business.
    0:38:07 I’m Jeff Berman, host of Masters of Scale, inviting you to join us at this year’s Masters of Scale Summit, October 7th to 9th in San Francisco.
    0:38:20 You’ll hear from visionaries like Waymo’s Tekedra Mawakana, Chobani’s Hamdi Ulukaya, celebrity chef David Chang, Patagonia’s Ryan Gellert, Promise’s Phaedra Ellis-Lamkins, and many, many more.
    0:38:25 Apply to attend at mastersofscale.com slash remarkable.
    0:38:29 That’s mastersofscale.com slash remarkable.
    0:38:31 And Guy Kawasaki will be there too.
    0:38:37 Become a little more remarkable with each episode of Remarkable People.
    0:38:41 It’s found on Apple Podcasts or wherever you listen to your favorite shows.
    0:38:46 Welcome back to Remarkable People with Guy Kawasaki.
    0:39:01 So who is in the Ethan Kross Hall of Fame of people who are really good shifters, that, you know, you can hold up as a hero and say, emulate this guy or this gal?
    0:39:03 They’re great at shifting.
    0:39:04 You can learn a lot from them.
    0:39:08 My wife is unbelievable at shifting.
    0:39:11 Actually, we were on a walk last night.
    0:39:14 I was saying like, are you ever worried about anything?
    0:39:17 Because I don’t hear very much about it.
    0:39:23 And mind you, this is someone who I’ve been in a relationship with for 25 years, quarter of a century, right?
    0:39:24 It’s a long time.
    0:39:36 I, of course, know that she can worry about things at times, but she’s amazingly adept at being proportional about when the worry begins to percolate.
    0:39:42 She quickly looks at the bigger picture like, well, is this something that I need to worry about right now?
    0:39:49 And maybe if it is something that’s significant, she comes to talk to me or someone else about it.
    0:39:51 She’s really, really good at shifting.
    0:40:05 But here is something that is very important, and that stems from your question, that we haven’t talked about yet, and that I want everyone who listens to this conversation to know.
    0:40:15 You can look at my wife as a shifting role model, but the tools that work for her may not work for you.
    0:40:24 So one of the truisms that we have discovered as a field is that different combinations of tools work for different people.
    0:40:33 I wish there were three or four things that I could tell everyone to do, and it would lead them to experience a life of nirvana.
    0:40:36 The science simply does not support that.
    0:40:38 In some ways, that might be deflating to folks.
    0:40:47 On the other hand, I think it might also be a welcome message that, hey, if meditation doesn’t work for you, no worries.
    0:40:51 Because there are lots of other options you have available to yourself.
    0:40:59 So I think that’s just something important that people keep in mind when they search for the right tools to try in their lives.
    0:41:02 Things that work for your friends may or may not benefit you.
    0:41:05 The challenge is to figure out what tools work best.
    0:41:12 And do you have any explanation for how your wife got to this point, besides being married to you?
    0:41:16 No, no, being married to me is definitely not the reason.
    0:41:22 I think some of it is genetic, that she’s predisposed to not be overly reactive.
    0:41:24 She had a wonderful upbringing.
    0:41:29 We know that adverse childhood experiences can make it more challenging for folks later on in life.
    0:41:34 So she grew up in a family with wonderful, positive attachments and love.
    0:41:40 She also has some intuitive understanding of the different tools that exist.
    0:41:49 She has both stumbled on tools that work for her, in her ability to perspective shift, in her ability to have a great emotional advisory board.
    0:42:03 She also is a psychology major as an undergrad, has learned about other tools that are out there, and has been discriminative in how she has folded different tools into her repertoire.
    0:42:06 The stuff that works for her, she leans on.
    0:42:08 The other tools that don’t, she doesn’t.
    0:42:19 To give it to you by way of analogy: to stay in physical shape, she likes to walk and do Pilates, and occasionally she does spin.
    0:42:20 I like to walk.
    0:42:33 She has learned which exercise regimens contribute to her physical fitness, and the same is true when it comes to her mental fitness.
    0:42:38 But I think that’s the challenge we all face, to learn, hey, what do we got to do to be mentally fit?
    0:42:47 Since we’re discussing all members of your family, now, let’s glean some wisdom from your grandmother.
    0:42:57 Now, it seems to me that kind of her path was she put these bad memories into a box and kept that box closed.
    0:43:04 Is that a good practice, or is that dysfunctional, in that the box is
    0:43:08 somehow going to open up someday, and, you know, what should we do?
    0:43:15 And what went into the box we’re talking about is, of course, the Holocaust, which is a big thing to put in a box.
    0:43:20 So when do you put something in a box, or when is that advisable?
    0:43:23 I’d give one caveat to the description of my grandmother.
    0:43:30 She put things in a box, but she didn’t seal it airtight and leave it buried for the rest of her life.
    0:43:34 She would actually open that box a few times a year.
    0:43:42 She’d have a Remembrance Day that was organized with co-survivors, where they would let that box open and just dig into it deep.
    0:43:52 And on the rare occasion that she’d bump into another survivor in between the yearly Remembrance Day events, they would sometimes talk about these experiences.
    0:43:55 At all other times, though, it was locked shut.
    0:44:09 That was an approach that worked for her, and there’s research which shows that ability to, like, compartmentalize an experience and then come back to it on rare occasions, that that can work for other people, too.
    0:44:15 We cannot predict yet for whom and when that is going to work.
    0:44:18 We’re doing the science on precisely that issue right now.
    0:44:26 What scientists have done a pretty good job at doing is identifying specific techniques that can help people or harm them.
    0:44:34 What we have not yet done is identify the specific contingencies that explain when you should use different techniques.
    0:44:40 Hey, if someone with Guy’s background is in this kind of situation, they should do these three things.
    0:44:43 But if they’re in this other situation, they should do these four.
    0:44:45 I cannot give that prescription.
    0:44:47 I don’t know of any scientists who can.
    0:44:48 We’re doing that work.
    0:45:00 Having said that, if you want to start experimenting with this approach to boxing something up for a while and then coming back to it later on, there are a few tips that I provide people with.
    0:45:04 Like, how do you know that avoidance is working or not working?
    0:45:06 Here’s how you know it’s working.
    0:45:17 You put it in a box, you go, you distract, you look at other things, and you don’t find yourself thinking about this thing once you put it in the box and move away.
    0:45:18 You’re living the life you want to live.
    0:45:23 If so, no need to go back and re-engage with that experience.
    0:45:27 For many, though, I would argue, this is not the case.
    0:45:33 I mentioned an anecdote in my book, like actually a point of conflict in my relationship with my dad.
    0:45:38 You know, you brought in my grandmother, my kid, my wife, but I might as well bring in my dad here.
    0:45:40 My parents got divorced when I was 12.
    0:45:43 It was painful when it happened.
    0:45:46 I dealt with the situation, moved on.
    0:45:49 I think it was good for the family that it happened.
    0:45:52 I never think about my parents’ divorce.
    0:45:53 I’m 45 now.
    0:45:54 Never think about it.
    0:46:02 The only time I go back there is when my dad tells me, hey, we need to talk about the divorce.
    0:46:05 And my response is, I’ve got nothing to really say.
    0:46:07 This doesn’t bother me at all.
    0:46:08 I think it was a good thing.
    0:46:10 He hasn’t let go of it.
    0:46:12 And that can create some conflict.
    0:46:16 And so this can show you how there are individual differences here, right?
    0:46:21 If you put it in a box, but you keep finding yourself thinking about it over and over,
    0:46:26 that’s a cue that this strategy of compartmentalizing is not working.
    0:46:32 And then what you want to do is use some other strategies or maybe even open up the box and
    0:46:36 figure out why it’s intruding into your current awareness.
    0:46:42 Since you opened the can of worms, I don’t mean that in a negative way, but since you opened up
    0:46:49 the can of worms about your dad, isn’t this a case of the shoemaker’s dad has no shoes?
    0:46:53 Can’t you give him advice about what to do with that situation?
    0:46:56 Yeah, that’s another really interesting phenomenon.
    0:46:58 Of course, I can give advice.
    0:47:04 The question is whether the advice is uptaken, right?
    0:47:11 And we know that when advice goes from kid to parent or from parent to kid, that can often get a little
    0:47:11 bit dicey.
    0:47:13 Yes, absolutely.
    0:47:17 These are tricky situations and we otherwise don’t have any conflict.
    0:47:20 But on this particular issue, there’s still some lingering emotion.
    0:47:22 Ethan, I have to apologize.
    0:47:26 I had no intention of making this the Ethan family show.
    0:47:27 I mean…
    0:47:33 Hey, once this airs, I might be sleeping in the hotel for a little bit.
    0:47:36 But guy, no apology needed because you know what?
    0:47:40 Everyone in my family is a human being.
    0:47:49 And part of the message of this book and something I believe devoutly in is that emotions and the
    0:47:55 trickiness that surrounds managing them, this is something that we all experience in our own
    0:47:57 unique ways, right?
    0:47:58 All of us.
    0:47:59 These are universals.
    0:48:04 I’m not talking about anything that isn’t relevant to any other person who is listening
    0:48:04 right now.
    0:48:09 The bottom line is it’s a messy situation for everybody.
    0:48:12 I was on book tour for about two weeks, went all over the place.
    0:48:18 And I came away from that book tour both with the recognition that it doesn’t matter who you’re
    0:48:23 looking at and how put together their life seems on the outside.
    0:48:26 We are all dealing with curveballs at times.
    0:48:31 And that was actually not something that I found to be disheartening.
    0:48:34 To the contrary, there was something very normalizing about that.
    0:48:36 And I shared that message with lots of folks.
    0:48:37 We’re all trying to stumble through this existence.
    0:48:43 And what’s great is that more so than any other point in my adult life, there’s a recognition
    0:48:50 that paying attention to these issues is important and actually matters, and that you can use scientific
    0:48:53 methodologies to weigh in on them.
    0:48:59 And so that’s an exciting message for a guy who runs the Emotion and Self-Control Lab, to bring
    0:49:00 us full circle here.
    0:49:02 Yes, yes.
    0:49:06 So I think that is a perfect place to end this episode.
    0:49:13 I think we’ve given just enough insights and perspective into your book and your work that
    0:49:15 more people will want to read that book.
    0:49:16 The book is called Shift.
    0:49:23 And I mean, Ethan, give your pitch for people to read Shift and let your emotions ring true
    0:49:24 here.
    0:49:26 Shift: Managing Your Emotions So They Don’t Manage You.
    0:49:27 That’s the name of the book.
    0:49:35 If you’re curious about what emotions are, why we have them, why it can at times feel painfully
    0:49:40 difficult to manage them and want to learn more about how to do so according to science,
    0:49:41 pick it up.
    0:49:42 That’s why I wrote it.
    0:49:43 All righty.
    0:49:44 That’s terrific.
    0:49:49 And clearly managing your emotions is part of becoming remarkable.
    0:49:52 So it fits right in line here.
    0:49:58 Ethan, thank you very much for this most shifting episode and shift happens.
    0:50:03 And, you know, it’s a very fascinating read and very fascinating episode.
    0:50:05 Thank you so much.
    0:50:10 And my best to your grandmother, your wife, your kids and your father.
    0:50:10 Oh, my God.
    0:50:13 I’m glad they all made it on our show.
    0:50:15 It’ll be a family event.
    0:50:18 We’ll have a dinner as we listen to this one at launch as well.
    0:50:20 True honor to be invited on.
    0:50:22 And it was an immensely enjoyable conversation.
    0:50:23 So thank you.
    0:50:25 Well, thank you very much.
    0:50:34 And just let me thank Madison, the good side of Guy, who has clearly figured out how to be better at being Guy than Guy.
    0:50:40 And Tessa Neisman, our researcher, Jeff C and Shannon Hernandez, our sound design engineers.
    0:50:42 And that’s the Remarkable People team.
    0:50:49 And with guests like Ethan and his book, Shift, that’s how we’re trying to help you be remarkable.
    0:50:50 Thank you.
    0:50:51 Until next time.
    0:50:52 Bye-bye.
    0:50:59 This is Remarkable People.

    What if the ancient practice of drilling holes in skulls was actually an early attempt at emotion regulation? In this fascinating episode, Guy Kawasaki sits down with psychologist and neuroscientist Ethan Kross to explore how we can master our inner voice and harness our emotions as powerful tools.

    Ethan directs the Emotion and Self-Control Laboratory at the University of Michigan and is the author of two groundbreaking books: Chatter and his latest work Shift. He reveals why all emotions—even the uncomfortable ones—serve as essential tools for navigating life’s challenges.

    Discover the three categories of “shifters” that can help you regulate emotions: sensory tools (like music and touch), attention deployment strategies, and perspective-shifting techniques. Learn about distanced self-talk, strategic attention deployment, and why your emotional advisory board might be your secret weapon.

    From ancient trepanation to modern neuroscience, from family dynamics to Silicon Valley culture, this conversation unpacks the science behind emotional regulation and provides practical tools you can use immediately.

    Guy Kawasaki is on a mission to make you remarkable. His Remarkable People podcast features interviews with remarkable people such as Jane Goodall, Marc Benioff, Woz, Kristi Yamaguchi, and Bob Cialdini. Every episode will make you more remarkable.

    With his decades of experience in Silicon Valley as a Venture Capitalist and advisor to the top entrepreneurs in the world, Guy’s questions come from a place of curiosity and passion for technology, start-ups, entrepreneurship, and marketing. If you love society and culture, documentaries, and business podcasts, take a second to follow Remarkable People.

    Listeners of the Remarkable People podcast will learn from some of the most successful people in the world with practical tips and inspiring stories that will help you be more remarkable.

    Episodes of Remarkable People organized by topic: https://bit.ly/rptopology

    Listen to Remarkable People here: https://podcasts.apple.com/us/podcast/guy-kawasakis-remarkable-people/id1483081827

    Like this show? Please leave us a review — even one sentence helps! Consider including your Twitter handle so we can thank you personally!

    Thank you for your support; it helps the show!

    See Privacy Policy at https://art19.com/privacy and California Privacy Notice at https://art19.com/privacy#do-not-sell-my-info.

  • Raging Moderates: A Shaky Ceasefire (ft. Rep. Jim Himes)

    AI transcript
    0:00:02 Support for this show comes from Shopify.
    0:00:07 With Shopify, it’s easy to create your brand, open up for business, and get your first sale.
    0:00:13 Use their customizable templates, powerful social media tools, and a single dashboard for managing it all.
    0:00:20 The best time to start your new business is right now, because “established in 2025” has a nice ring to it, doesn’t it?
    0:00:26 Sign up for a $1 per month trial period at shopify.com slash voxbusiness, all lowercase.
    0:00:31 Go to shopify.com slash voxbusiness to start selling with Shopify today.
    0:00:33 shopify.com slash voxbusiness.
    0:00:39 Get unlimited grocery delivery with PCXpress Pass.
    0:00:41 Meal prep, delivered.
    0:00:43 Snacks, delivered.
    0:00:45 Fresh fruit, delivered.
    0:00:49 Grocery delivery on repeat for just $2.50 a month.
    0:00:51 Learn more at pcexpress.ca.
    0:00:54 We have a favor to ask you.
    0:00:59 The ProfG Pod team is planning for the future of the show, and we want our listeners to be a part of the conversation.
    0:01:03 That’s why we’re hoping you’ll help us by filling out a brief survey.
    0:01:06 Your feedback will help us figure out what’s working, what’s not.
    0:01:09 Please visit us at voxmedia.com slash survey.
    0:01:13 Again, that’s voxmedia.com slash survey to provide us with feedback.
    0:01:19 We do take it seriously if we’re thinking about new product extensions and want to know what we can do better.
    0:01:20 More dick jokes.
    0:01:21 More dick jokes.
    0:01:22 Read your mind.
    0:01:27 Welcome to Raging Moderates.
    0:01:28 I’m Scott Galloway.
    0:01:30 And I’m Jessica Tarlov.
    0:01:39 Okay, Jess, in today’s episode of Raging Moderates, we’re discussing the aftermath of Trump’s strikes in Iran and how we got to a ceasefire and then how we didn’t.
    0:01:47 First, we’re fortunate to have Congressman Jim Himes, the ranking member on the House Intelligence Committee, joining us to break down the latest developments.
    0:01:50 Representative Himes, I very much appreciate you being here.
    0:01:51 Welcome to the show.
    0:01:52 Thanks for having me.
    0:01:57 So, why don’t we just start off with your view of the state of play here.
    0:01:59 Can you break down the latest developments in the Middle East for our listeners?
    0:02:04 Yeah, well, we’re in a real roller coaster ride, right?
    0:02:14 We heard of the ceasefire last night, and then apparently the ceasefire was violated, and the president got very, very angry on social media, and now we may or may not be on a ceasefire.
    0:02:17 Look, a couple of big-picture things that we shouldn’t lose sight of.
    0:02:28 Number one, we went into a war in the Middle East without any congressional deliberation, and that is not according to the law, either the Constitution or the War Powers Act.
    0:02:30 And it’s also not very smart, right?
    0:02:33 And, you know, an awful lot of people are saying, well, presidents have done this forever.
    0:02:34 And that, you know, fair enough.
    0:02:35 That doesn’t make it okay.
    0:02:40 And I’m a big believer that Congress ought to actually abide by the Constitution.
    0:03:04 But the other thing I would point out is that, you know, Bill Clinton sending limited, you know, cruise missile strikes into Somalia or a president putting a few ground forces on the ground in Syria is not playing anywhere near the order of magnitude of what it means to take an offensive strike in an area where you have 40,000 troops, where if things go wrong, gasoline prices could, you know, go to $6 or $7 a gallon.
    0:03:07 This was an instance in which there should have been some consideration.
    0:03:08 Now, where are we?
    0:03:19 Thank God that it would appear that from a tactical standpoint, the military strike was successful in as much as it created a lot of big explosions and everybody got home safe.
    0:03:27 What we don’t know, and this is the question of the day, really, is whether this meaningfully set back Iran’s nuclear program.
    0:03:38 I can’t get into details for obvious reasons, but I see absolutely no evidence that this did anything other than slow the Iranians’ roll a little bit, a little bit.
    0:03:47 And so in the coming days and weeks, we’re going to grapple with the possibility that the Iranians are still in a position to do a pretty quick breakout for a nuclear weapon.
    0:03:49 And what is going to be the Israeli response to that?
    0:03:53 And what is going to be the American response to that if, in fact, that turns out to be true?
    0:04:07 Vice President J.D. Vance sat down with Bret Baier on Special Report on Monday night, and Bret asked him about this and said, well, aren’t you concerned about the fact that they were able to relocate the 60 percent enriched uranium, that it could fit in,
    0:04:18 I think it was, 10 car trunks, and because President Trump seemed to be telegraphing a lot of what was going on, that they were actually given enough time to be able to do that?
    0:04:22 And J.D. Vance basically pooh-poohed it and said that doesn’t really matter.
    0:04:29 I assume your assessment is that it does matter that they were able to get the uranium out and that they could start their project over, essentially.
    0:04:41 It’s inconceivable to me that somebody with the brains of J.D. Vance would say that if the Iranians were able to get all of their 60 percent enriched uranium out, that that wouldn’t matter.
    0:04:42 That’s just insane.
    0:04:48 Obviously, if they retain that 60 percent uranium, they have that and some centrifuges.
    0:04:55 And it’s very, very unlikely that these raids obliterated, to use J.D. Vance’s word, all of the centrifuges.
    0:05:00 It’s not hard for the Iranians to refine this to weapons grade.
    0:05:05 And then it’s not hard, ultimately, to cobble together a nuclear device.
    0:05:18 So, look, I am sad to see, but not surprised, that J.D. Vance and senior members of this administration are using words like obliterate, which, again, I have seen nothing to suggest that that verb, you know, is in any way applicable here.
    0:05:33 And, again, that raises very serious questions, because what do the Israelis do if it turns out that we simply moved to the right a little bit, by a month or a week, the ability of the Iranians to break out a weapon if they choose to do that?
    0:05:44 And, by the way, what about the fact that now if you’re an Iranian regime member, as awful as you are, you’re also smart enough to know, gosh, the whole negotiations thing was never real.
    0:05:52 And the president tore up the one thing that slowed the Iranians, the JCPOA, and he allowed the Israelis to start bombing in the middle of a negotiation.
    0:05:55 So if you’re an Iranian regime member, you say, OK, we tried that route.
    0:05:56 Now, you know what we’re going to do?
    0:05:57 We’re going to do what North Korea did.
    0:06:04 We’re going to do what Pakistan did, is we’re going to go underground and the world is going to learn about our progress when we actually test a device.
    0:06:06 And at that point, guess what?
    0:06:08 There are going to be no more military attacks on Iran.
    0:06:11 That, to me, is the really kind of horrifying scenario here.
    0:06:30 So, Representative, so if the president had come to Congress and sought congressional approval and laid out exactly and very detailed plans what he was planning to do, the ordinance, the armaments, the risks, the upside, the downside, would you have voted yes or no and why?
    0:06:41 Well, it’s sort of hard to answer that hypothetical question because there would be all sorts of other questions you would need to answer, like what we’ve been sort of alluding to.
    0:06:48 OK, we can make very big explosions in ventilation shafts in Fordo and Natanz, but what else?
    0:06:49 What else?
    0:06:54 What do we do if the 60 percent uranium is in a warehouse somewhere, as it may very well be?
    0:07:00 But let me let me not try to entirely dance around that question, and I’ll tell you what my bias is.
    0:07:02 All I’ve got is history to go on, right?
    0:07:08 And the history of our military interventions in the region in my lifetime is pretty darn bad, right?
    0:07:10 We took out Muammar Gaddafi.
    0:07:14 Libya is now a chaotic dystopia.
    0:07:20 We know the story of Iraq, where we empowered Iran and lost 4,400 troops in our efforts there.
    0:07:24 And of course, we don’t need to talk about Afghanistan to know that that’s not something.
    0:07:32 So anyway, my point, obviously, is what do I have to go on other than the history and the question of whether we have been successful in achieving our strategic aims in the region?
    0:07:36 And the answer to that question is pretty much generally no.
    0:07:48 So let me just say facts matter, but I would have had a very, very strong bias based on our history of ending up with outcomes that none of us would have either predicted or wanted when we get involved militarily in the Middle East.
    0:07:54 I understand, you know, we can’t get in a time machine and we can’t go back and do this differently.
    0:08:03 So we are where we are today, and I saw former Secretary of State Antony Blinken was out in the New York Times with an op-ed saying that he thought the strike was a mistake and he hopes it’s a success.
    0:08:09 Can you talk us through what you think a success looks like at this point?
    0:08:14 Do you think there is any chance at an Iranian and Israeli lasting ceasefire?
    0:08:20 And Donald Trump did float the idea of regime change just over Truth Social a couple of days ago.
    0:08:24 Do you think that that is still any part of the conversation?
    0:08:27 Well, yeah, I mean, your question is not too hard to answer.
    0:08:38 And just because I’m concerned, as you might imagine, I can envision and even accept the possibility that, yeah, you know, the Iranian people might finally do what the Argentine people did in 1982.
    0:08:43 And it turned out that the dictatorial generals that governed them couldn’t even defend the Falkland Islands.
    0:08:45 And the Argentine people said, guess what?
    0:08:51 If you bunch of generals can’t even keep us safe from a country that’s 12,000 miles away, out you go.
    0:08:59 So wouldn’t that be amazing if the Iranian people had the capacity and the will to finally overthrow this truly evil regime?
    0:09:06 Again, I’m not sure the United States should be in the business of of promoting that kind of regime change because we don’t have a very good track record.
    0:09:09 But, oh, my God, what an amazing outcome that would be.
    0:09:11 And look, it’s possible. It’s possible.
    0:09:18 It would also be amazing if the administration and the Israelis would say, OK, Iran, you’re probably in your weakest point in a generation.
    0:09:20 Let’s now sit down at the negotiating table.
    0:09:26 That’s a little bit of a hard sell, right, because if you’re an Iranian regime member, you say, oh, really, now we’re going to sit down at the negotiating table.
    0:09:30 And if you don’t like what we do, you know, we get another B2 flight over our nation.
    0:09:31 So that’s a hard sell.
    0:09:33 But I wouldn’t completely rule it out.
    0:09:43 The problem is if we had two hours to do it, we could talk about gasoline prices at six dollars, about dead American soldiers and sailors, about missiles, about terrorist cells activated in London and Rome.
    0:10:00 We could talk about the possibility of destabilization in the region and the fact that the Jordanian king, who’s really, really important to us, sits atop a powder keg and that, you know, real volatility could result in regime changes in other places like Jordan, where it would be a catastrophe for us.
    0:10:04 So anyway, let’s acknowledge that there could be a good outcome here.
    0:10:07 It’s just, you know, you’d have to go and get the odds from a bookie.
    0:10:11 You know, how much do you bet on the best case scenario coming out of the Middle East?
    0:10:21 Representative, I worry that as someone who’s a Democrat and is committed to retaking the House and the White House, I worry that, as always, we figure out a way to come across as incredibly weak.
    0:10:28 And that is, we’re angry that they didn’t come to us, as they should have constitutionally, that they bypassed the Constitution.
    0:10:36 That now seems to be the norm, almost a given, and not enough conversation around whether or not this was the right move.
    0:10:41 And I want to applaud you for actually addressing the question, but let’s steel man this a little bit because you brought up some issues.
    0:10:42 The price of oil.
    0:10:49 It looks as if right now the oil markets have yawned and don’t believe that this threatens oil prices.
    0:10:56 If the Strait of Hormuz, if in fact it is compromised, it’ll hurt India and China more than it would hurt us.
    0:10:57 We’re fairly energy self-sufficient.
    0:10:57 That Khamenei, at 85 years of age, leading a theocracy that has had its hands cut off, is on the brink of collapse.
    0:11:12 And this might tip it over into collapse, and that we are not planning, as far as I can tell, to put boots on the ground.
    0:11:16 We’re just always remiss to take a victory lap.
    0:11:17 We’re kicking Russia’s ass.
    0:11:20 It feels like Iran’s air defenses are down because of the brave work of the IDF.
    0:11:27 And we have demonstrated that we spend $800 billion for a reason and that we have armaments that no one else has.
    0:11:32 And that the capacity to get closer to a bomb, we know they didn’t get any closer.
    0:11:34 We know that they’re further away.
    0:11:36 We just don’t know how much they’re further away.
    0:11:45 Isn’t this potentially, or most likely even, something that will be looked back on as America exerting its power in a thoughtful way,
    0:11:49 and that the Democrats were more focused on procedure than actual outcomes?
    0:11:55 Yeah, well, you know, okay, fair point, Scott.
    0:12:05 And, you know, I really like when we’re talking about military activity and war and our troops to not collapse into a consideration of the politics of this.
    0:12:11 But you ask an interesting question, to which I would say these things can break either way.
    0:12:22 You know, if we were having this conversation in the early first decade of the 2000s and talking to Hillary Clinton and Hillary Clinton says, you know, we Democrats make the argument that you just made.
    0:12:25 We Democrats always end up looking weak.
    0:12:31 So I’m voting yes to give George W. Bush the authority to go into Iraq.
    0:12:32 Great analogy.
    0:12:50 At that moment in time, I’m not sure Hillary Clinton thought to herself that it is that vote, this hawkish vote cast because I’m afraid of looking weak, that is probably going to be the single largest factor in how an unknown state senator from Illinois named Barack Obama is going to take me out as the presidential candidate of 2008, right?
    0:12:52 So I think these things can turn on a dime.
    0:12:54 And look, let’s not be silly about this.
    0:13:03 If the best case scenario happens and the regime falls and the new regime or the new government says we’re never going to mess around with uranium or nuclear weapons again.
    0:13:03 Yeah.
    0:13:05 You know, we will have gotten very lucky.
    0:13:08 And I’ll be sad because you say process.
    0:13:13 You know, to me, abiding by the Constitution is not just a reversion to process.
    0:13:17 It’s actually something that every two years I raise my hand and swear to do.
    0:13:23 So I’m sort of a little sad that I would say, well, we’re having a process argument because I think the Constitution is worth defending.
    0:13:27 But anyway, my larger point is that in these sorts of situations, you’re right.
    0:13:30 You know, there is a political implication.
    0:13:37 But again, if you were thinking purely politically, would you have said, yeah, let’s take that Gaddafi guy out?
    0:13:42 Yeah, let’s, you know, try to nation build in Afghanistan because we’ve got the best capabilities everywhere.
    0:13:46 In retrospect, you would say, boy, pretty ugly political position.
    0:14:04 I want to stay on the politics issue, but frame it in a little bit of a different way, because it’s been reported that Democrats were not briefed about the strike ahead of time, including yourself and Senator Mark Warner, both the ranking members on the Intel Committee as members of the Gang of Eight.
    0:14:11 That is something deeply concerning to me, that the Republicans feel like they’re just going to go it alone.
    0:14:14 Can you talk about whether that’s true, the implications of that?
    0:14:26 And if there’s any chance that we can make foreign policy, which has historically been a space that could be fairly bipartisan, return to the norm or at least get a bit better than it is right now?
    0:14:33 Yeah, look, I’ll absolutely acknowledge that there are, you know, issues with congressional consultation, right?
    0:14:40 I mean, Scott didn’t ask this specifically, but implied it, which is, hey, what if we have a four week debate over this attack?
    0:14:43 At that point, haven’t the Iranians completely hidden all their uranium?
    0:14:45 That’s a fair point. Right.
    0:14:53 And we could have that argument. And maybe you would think about things like informing small numbers of members of Congress, gang of eight leadership, whatever you want to do.
    0:14:57 So there’s a reasonable argument to have there. But it does stop at the law. Right.
    0:15:03 You know, just because something is hard or inconvenient doesn’t mean that you could violate the law or the Constitution.
    0:15:06 I keep saying that it’s not just process. It’s the law.
    0:15:20 But, yeah, I mean, one thing is unambiguous, Jess, which is that letting members of your own party know, but not letting the opposition know, is a sort of ugly innovation of the Trump administration.
    0:15:31 And look, it’s sort of dumb, too. Right. Because now if this thing goes horribly wrong, which it could, politically speaking, yeah, you, Mr. President, own this.
    0:15:35 And by the way, the four or five Republicans you chose to reach out to own it as well.
    0:15:40 And, you know, we’ve got the political defense of, you didn’t even tell us, I read about this on Twitter, you know.
    0:15:44 So anyway, that’s a pretty ugly new innovation from this administration.
    0:15:48 Representative Himes is the ranking member on the House Intelligence Committee.
    0:15:53 You’re privy to color and detail that the general public and the media aren’t.
    0:16:05 And one of the things that struck me about this attack or specifically the aftermath of the attack is whether it was Iraq or Afghanistan or expelling Hussein from Kuwait, regardless of the success or lack thereof of those interventions.
    0:16:18 The next day we had big nations with substantial armies weighing in in support; there was clearly a lot of groundwork laid to say, all right, we support this.
    0:16:24 It was clear that we’re not acting alone, that we might be the leadership and have the biggest military in the West.
    0:16:27 But we are, in fact, hand in hand with the West.
    0:16:42 And one of the things that was so striking here and so disappointing was that the only nations that commented on this the next day were the Chinese, saying, there they go again, making the world more unstable, and Russia, mocking us for not diminishing Iran’s nuclear capabilities to the extent we were bragging about.
    0:16:54 The lack of alliances, the lack of support, this go-alone arrogance to me was so distressing and something that the public didn’t discuss.
    0:17:06 As somebody who is obviously in conversation with our allies, both in, you know, open formats and behind closed doors, can you speak a little bit to, one, do you buy the thesis that we don’t have the support we typically have?
    0:17:12 And two, what you’re seeing across our alliances around this type of activity?
    0:17:15 Yeah, I mean, not surprised, right?
    0:17:25 We know that the Trump administration, you know, doesn’t put much, to put it mildly, value on our allies or about acting together.
    0:17:37 But these interventions that we’ve talked about, some of which didn’t go very well, almost always involved us working with our allies just because practically that’s a good thing.
    0:17:43 And also because we care that we speak as the West and not just as the U.S.
    0:17:54 So George H.W. Bush, when Saddam Hussein invaded Kuwait, famously spent weeks working the phone to put together the coalition that ultimately was successful in removing Saddam Hussein from Kuwait.
    0:18:03 And, you know, the famous coalition of the willing going into Iraq with us again, I think we can look back on that and say, gosh, that didn’t work out quite the way we had hoped.
    0:18:08 But, you know, George W. Bush did do the work to get our NATO allies and others.
    0:18:12 Even in Libya, we were operating under the auspices of NATO.
    0:18:14 So that’s generally a good idea.
    0:18:23 It gives credibility and it gives us, to be fair, on the margin, some operational capacity that we might not otherwise have.
    0:18:32 So, you know, when you take action like this, it’s always a good idea for no other reason than to hear what the Brits and the others have to say about how we can do this well.
    0:18:37 But this is not, of course, the way this administration thinks about taking action abroad.
    0:18:40 And it’s interesting timing.
    0:18:43 The president is on his way to the NATO summit in The Hague.
    0:18:51 He definitely wanted a big win coming in since everybody is pretty mad at him about tariffs and the general state of the world.
    0:18:54 How do you think this is going to play out over the next couple of days?
    0:19:06 You know, I can’t emphasize enough how much the facts on the ground matter to the answer to that question.
    0:19:17 Again, on one extreme, maybe the Iranian people finally say we’ve had enough and they, you know, have both the willingness and the capability to overthrow this hideous regime, in which case we’re all going to feel good.
    0:19:25 On the other extreme, of course, is, you know, continued Israeli attacks on Iran, Iran claiming that they’re violating the ceasefire.
    0:19:31 And Israel would do that because they realize that we probably haven’t significantly damaged the nuclear capability.
    0:19:33 And now we’re back to a shooting war in the Middle East.
    0:19:37 Or, again, my worst case scenario is the quiet scenario.
    0:19:41 It’s not bombs going off or missiles landing in Bahrain.
    0:19:47 It’s the Iranians go dead quiet for six months and seven months from now, there’s a test of a nuclear device.
    0:19:57 So, you know, where we land on that spectrum of, you know, magnificent to horrible is going to have a lot to do with how, you know, to Scott’s point, the domestic politics play here.
    0:20:08 And to the way the rest of the world thinks about it now, let me make one last point here, because I think those of us who are interested in international affairs should be self-reflective.
    0:20:29 If you had told me two years ago that Israel was going to be able to largely take out Hezbollah, to assassinate Hamas leadership in downtown Tehran and basically crush their leadership, and to disable the Iranian air defenses by, whatever, 40, 50 percent,
    0:20:31 I would have said that’s overambitious.
    0:20:37 And so let’s not be overly biased towards the pessimistic here.
    0:20:46 What the Israelis, whatever you think about, you know, its wisdom or its justice, what the Israelis have accomplished, you know, since October 7th.
    0:20:58 And I set aside their activities in Gaza when I say this: militarily against Hezbollah, and militarily against Hamas and in Iran, let’s just say, nobody, I think, would have put a big bet on that outcome.
    0:21:07 Just along those lines, Representative, if you think of us as having four enemies, mostly China, North Korea, Iran, and Russia, I would argue China’s not an enemy.
    0:21:08 That’s just us Americans.
    0:21:11 When we have a competitor that gets too successful, we think of them as an enemy.
    0:21:12 I think of them as a competitor.
    0:21:16 So that was Russia, North Korea, and Iran.
    0:21:22 I mean, we are, quite frankly, kind of kicking ass and taking names.
    0:21:26 I mean, I think Russia and Iran are just not in the same place they were 24 months ago.
    0:21:37 And just a pointed question, hasn’t the Ukrainian army and the IDF, quite frankly, been doing the West’s dirty work and kind of kicking ass and asking questions later?
    0:21:53 Don’t we owe, as someone who’s on the Intelligence Committee, with exponentially more budget, exponentially better equipment, haven’t they demonstrated the kind of confidence and courage that has advanced our objectives and made us safer?
    0:21:57 Don’t we owe the Ukrainian army and the IDF a huge debt of gratitude?
    0:22:02 I would separate those two questions.
    0:22:05 You know, everyone thought Ukraine was going down.
    0:22:05 Yeah.
    0:22:15 And what Ukraine has managed to pull off has been nothing short of epic, especially in the context of our wavering support, where we get sort of partial credit for helping the Ukrainians.
    0:22:28 And the lesson that has come out of that war is hopefully being learned by dictators everywhere, which is that when you’re on someone else’s land, even if you have overwhelming firepower, you’re going to have a hard time.
    0:22:41 A million casualties in Russia right now, Putin doesn’t care about that, but, you know, hopefully the other dictators around the world who are thinking about a Ukraine-like incursion are taking that a little bit more seriously.
    0:22:53 And again, I just I won’t repeat myself, but what the IDF achieved against Hezbollah, what the IDF achieved against Hamas in Tehran and what they achieved against the Iranians is pretty spectacular.
    0:23:00 I’m putting an asterisk on that because too much of what we see happening in Gaza right now should not be happening.
    0:23:05 There is too much humanitarian suffering and civilian loss.
    0:23:08 And I do think that over time, the IDF will need to grapple with that.
    0:23:17 But the last answer on your question about the IDF, Scott, is, again, it really matters how this ends.
    0:23:25 And Middle East experts will tell you, you sometimes don’t know the answer to the famous question, tell me how this ends in the Middle East for a couple of years.
    0:23:31 So, again, I’m not going to beat this dead horse too much, but a regime change and a giving up of the nuclear weapons.
    0:23:32 Wow. Incredible.
    0:23:35 But there are a lot of other scenarios.
    0:23:44 And until we know which door gets opened, I think it’s a little early to celebrate or to say that the IDF has been doing our dirty work.
    0:23:46 Look, again, let me just say it again.
    0:23:52 If the Iranians give up their nuclear weapons or, you know, let us all hope for regime change, remarkable.
    0:23:53 But we just we’re not there yet.
    0:23:57 Congressman Himes, thank you so much for your time.
    0:23:58 It’s invaluable that you could join us.
    0:23:59 Thanks a lot.
    0:24:00 Thanks for having me.
    0:24:02 Yeah, Congressman, you’re thoughtful and direct.
    0:24:03 You’re in the right seat.
    0:24:07 It makes us feel good that you’ve decided to do what you do.
    0:24:10 And Scott rarely says that to anyone that we talk to.
    0:24:16 So I’m just chuckling because I’m not sure that thoughtful and direct is actually in the job description of a member of Congress.
    0:24:17 But OK, I’ll take it.
    0:24:18 It should be.
    0:24:20 Keep on keeping on.
    0:24:20 Right on.
    0:24:21 Thanks, Representative.
    0:24:22 Thank you for your time.
    0:24:22 All right.
    0:24:23 Take care.
    0:24:24 Thank you very much.
    0:24:26 OK, let’s take a quick break.
    0:24:27 Stay with us.
    0:24:34 It won’t take long to tell you Neutral’s ingredients.
    0:24:40 Vodka, soda, natural flavors.
    0:24:47 So, what should we talk about?
    0:24:52 No sugar added?
    0:24:58 Neutral.
    0:25:00 Refreshingly simple.
    0:25:07 This week on Net Worth and Chill, we’re diving into uncharted territory with our first ever Am I the Financial Asshole? episode.
    0:25:12 You sent in your messiest money dilemmas, and I’m here to deliver the verdicts.
    0:25:17 From the couple stuck in a mortgage with their brother-in-law to the wedding and bachelorette parties costing an arm and a leg,
    0:25:22 we’re unpacking the most cringeworthy cash conflicts that are testing relationships and moral boundaries.
    0:25:26 Whether you’re team “justified financial boundaries” or team “that’s just cold,”
    0:25:31 this episode will have you questioning everything you thought you knew about money etiquette.
    0:25:41 I am so bummed that your partner can’t see that because regardless of whether or not you draft and sign your own prenup, you get one.
    0:25:45 The other alternative is that the government gets to write it for you.
    0:25:50 Listen wherever you get your podcasts or watch on youtube.com slash yourrichbff.
    0:25:53 Hi, this is Scott Galloway.
    0:25:56 If you’re listening to this, you likely already know who I am.
    0:25:57 Kind of a big deal.
    0:26:00 Everyone’s laughing.
    0:26:03 This message is for you, our loyal listeners.
    0:26:07 Prof G Markets is now, drumroll, daily.
    0:26:08 That’s right.
    0:26:12 Monday through Friday, Prof G Markets breaks down market-moving news,
    0:26:14 helping you build financial literacy and security.
    0:26:15 Don’t miss it.
    0:26:19 Subscribe to Prof G Markets wherever you get your podcasts.
    0:26:27 Welcome back.
    0:26:29 Jess, what did you think of Representative Himes?
    0:26:30 I loved him.
    0:26:31 You loved him.
    0:26:38 I’m a big fan of his, and I appreciate also that he comes on Fox, which not every Democrat does,
    0:26:46 but having the chance to hear from the ranking member on the intel committee is really special.
    0:26:50 And I thought he did a lot of things that are different from how many members did,
    0:26:54 but he was open to criticizing himself and the party.
    0:27:02 He talked about moments of humility, and he was also able to, I think, thoughtfully reflect on a best-case scenario coming out of this,
    0:27:06 and then also to prepare us for what he’s afraid of.
    0:27:14 I thought it was a very well-rounded approach to a very fast-moving situation that carries a lot of danger to it, frankly.
    0:27:14 What did you think?
    0:27:22 The more I’m exposed, in the last 10 years, I had never, I don’t think, other than occasionally,
    0:27:25 you know, when I took my sister to Washington when she was in college,
    0:27:28 and I would just walk into congressional offices and meet with some aide,
    0:27:31 I had no exposure to elected representatives.
    0:27:35 And in the last 10 years, I’ve had a lot, mostly because they want my money, to be honest.
    0:27:37 Money’s nice.
    0:27:38 Yeah, money’s access.
    0:27:41 And so I have access to a lot of elected representatives.
    0:27:48 And I am consistently impressed by what thoughtful, intelligent, patriotic, committed people they are.
    0:27:53 And it bothers me how lazy people are to constantly shitpost our government,
    0:27:56 believing that everyone’s corrupt and nobody’s smart.
    0:28:00 There’s a lot of really, really impressive people who give up.
    0:28:04 You know, a guy like that could easily be running a private equity firm,
    0:28:07 clocking a shit ton of money, and at Bezos’ wedding this weekend.
    0:28:12 And instead, he chooses to, you know, be in D.C. trying to sort through this shit.
    0:28:20 So I’m, you know, I’m always impressed or consistently impressed to the upside by these individuals.
    0:28:23 So back to the issue at hand.
    0:28:28 Trump announced what he called a complete and total ceasefire between Israel and Iran.
    0:28:32 The truce was supposed to be phased in over 24 hours, but already it’s showing signs of strain.
    0:28:38 Israel reportedly struck a radar site near Tehran after claiming Iran violated the ceasefire first.
    0:28:42 And behind the scenes, Trump is said to be furious with Israeli Prime Minister Netanyahu,
    0:28:46 pressing him on a tense call on Tuesday morning.
    0:28:52 So we went from bunker busters to a ceasefire in less than 48 hours,
    0:28:53 and now the ceasefire is already cracking.
    0:28:59 Any sense for what changed behind the scenes to make this deal happen in the first place and it’s already falling apart?
    0:29:06 I’m not sure how much of the deal was really together or how much it’s fallen apart, actually.
    0:29:13 A ceasefire is, in a lot of ways, I know it sounds like a final thing, but it’s a moving target constantly.
    0:29:19 And it ebbs and flows, and I’m still hopeful that we will be able to get to one.
    0:29:24 I don’t know what that looks like in the long term, because some people just can’t be friends.
    0:29:28 And I think Israel and Iran are two of those kinds of some people.
    0:29:30 But I remain optimistic.
    0:29:36 I think part of what got our hopes up is that we have a Truth Social-happy president
    0:29:41 who feels that he can post through a foreign policy crisis.
    0:29:43 And that has some benefits.
    0:29:46 I think the transparency, to some degree, is good.
    0:29:53 It has some negative effects, like the fact that we had to send, you know, a decoy fleet
    0:30:00 and the real fleet to try to throw Iran off the scent because Donald Trump was posting
    0:30:01 through the entire thing.
    0:30:04 And that’s something that you don’t want to see from the commander-in-chief.
    0:30:08 But I went to bed, very hopeful.
    0:30:09 It was ceasefire news.
    0:30:10 I woke up this morning.
    0:30:12 The ceasefire is off.
    0:30:13 And maybe it’s back on.
    0:30:18 This was as President Trump was boarding to head to The Hague for the NATO summit.
    0:30:21 And I hope something good can come out of this.
    0:30:26 But I was struck by, and it was interesting, that Congressman Himes has introduced this
    0:30:27 resolution.
    0:30:29 He wants us to follow the Constitution.
    0:30:34 And he did have a defense for why this was different than actions past presidents have
    0:30:40 taken and also said past presidents shouldn’t have done these kinds of things without authorization.
    0:30:42 So at least it was a bit of a nuanced take.
    0:30:46 But I was struck by what German Chancellor Merz said about it.
    0:30:50 And he said, there is no reason to criticize what America did at the weekend.
    0:30:51 Yes, it is not without risk.
    0:30:54 But leaving things as they were was not an option either.
    0:31:03 I think that that speaks most accurately to how I’m feeling in my heart about what happened.
    0:31:10 I understand the American intel community did not have the same assessment as the Israelis.
    0:31:12 The Israelis are obviously closer to it.
    0:31:16 But I’m fundamentally concerned that it seems like Bibi Netanyahu is now our DNI.
    0:31:18 That’s a dangerous place to be in.
    0:31:21 But so is having Tulsi Gabbard as your DNI, also dangerous.
    0:31:29 But we know that past presidents have tried and failed to stop Iran’s nuclear ambitions.
    0:31:35 I know that they should have stayed in the JCPOA, that we were slowing their enrichment development
    0:31:36 by a lot.
    0:31:39 I also know we had to give them money that was used to fund terrorism.
    0:31:41 And that’s not a good outcome either.
    0:31:44 But Merz’s comments really struck me.
    0:31:49 And I do feel it was unsustainable to let things just keep going on, as it were.
    0:31:53 And at this particular moment, and I like that you brought it up to Congressman Himes,
    0:32:00 because of the work of the Israelis and the Ukrainians, the allies of the Iranians, like
    0:32:02 the Russians, are unable to help them.
    0:32:09 They have been so utterly decimated between going after Hamas and Hezbollah and the Russians
    0:32:16 that we have an opportunity with a weak axis of evil to do something really important for
    0:32:18 the safety of the region and the world.
    0:32:20 And that was the opportunity that I saw.
    0:32:22 Yeah, I thought that was really well put.
    0:32:29 I mean, again, self-hating Americans, we can never actually take credit or give credit
    0:32:29 where it’s due.
    0:32:37 And that is, if Russia, specifically the perception of Russia’s fierce fighting force was intact,
    0:32:39 I don’t think we could have done this.
    0:32:42 Or I don’t think we would have had the balls to do it, because we would have been worried
    0:32:46 they’d be arming their proxies in Syria with surface-to-air missiles that could take out
    0:32:47 B-2 bombers.
    0:32:52 One of those B-2s going down and then a bunch of Iranian kids jumping on the wings of
    0:32:55 B-2s would be a really bad image for us.
    0:33:02 And we would have been scared that Russia’s long arms would be, you know, within reach
    0:33:08 or this would have been within the grasp of Russia arming Syrians or potentially arming or
    0:33:10 helping or supporting Iran.
    0:33:13 And the way I see this is the following.
    0:33:14 I’m very much in favor of this.
    0:33:22 I’ve never understood how far-right Republicans can be isolationist and then vote for a $200
    0:33:28 billion increase in the military budget, from $800 billion to a trillion, such that we don’t just have
    0:33:32 a bigger budget than the next 10 biggest nations, but than the entire world.
    0:33:33 It’s like, well, what’s the point?
    0:33:37 Canada’s not going to invade Buffalo anytime soon.
    0:33:43 When you spend $800 billion on our military, you are making a decision to get off of your
    0:33:51 heels and onto your toes and project power and deliver violence to other places in a very imperialist,
    0:33:55 aggressive way to represent our interests offensively and proactively.
    0:33:57 And that’s what this is.
    0:34:00 And I don’t, I think we’re looking at the wrong metric.
    0:34:02 I understand that we want to diminish their nuclear capability.
    0:34:05 But for me, the outcome here is the following.
    0:34:12 I think the IRGC, or the Islamic Republic, has been a cancer, an occupying force, and has very little
    0:34:14 support amongst the Iranian people.
    0:34:19 I think two of the biggest unlocks, you know, as a dork: one, overthrowing or nudging
    0:34:23 the Venezuelan government over the edge such that we’re even more energy independent.
    0:34:25 Venezuela has more oil than Saudi Arabia.
    0:34:30 And two, seeing the Islamic Republic come to an end. I think that would be one of the most
    0:34:37 accretive actions for the 45 million women in Iran, if we really
    0:34:41 did give a flying fuck about human rights and stability in the region.
    0:34:46 And I’ve always thought Iran and America could be incredible allies. You know, I’ve said
    0:34:49 this before: the Iranians I know are more American than Americans.
    0:34:55 So I see this more as, while they’re kind of, quite frankly, down and out, hopefully tipping
    0:35:02 over the Iranian people, giving them the confidence to perhaps not overthrow this regime, but create
    0:35:03 their own regime change.
    0:35:08 You can’t, you can’t, you can’t create regime change from the outside.
    0:35:11 You can potentially inspire it.
    0:35:14 And that’s what I’m hoping, that’s what I’m hoping this was.
    0:35:19 The other thing that comes out here for me or the observation is there’s a reason that
    0:35:21 business people make such shitty presidents.
    0:35:26 It’s easy to believe that you call the two CEOs of companies and you can do this and say,
    0:35:29 okay, hey, Steve Jobs, it’s Bill Gates.
    0:35:31 We’re not going to hire each other’s employees.
    0:35:32 Stop it.
    0:35:35 I forget which, but one of them called the other and said, stop hiring my employees.
    0:35:37 Yeah, I remember the story, but I don’t.
    0:35:40 It was Steve Jobs and, I mean, it was the guy from Google.
    0:35:44 Anyways, you’re not supposed to do that, but they can call each other and handshake and then
    0:35:48 send out an email to all the key people and boom, it’s in place.
    0:35:49 Cease fires don’t work that way.
    0:35:51 You’ve got to give it time.
    0:35:52 You’ve got to phase it in.
    0:35:57 You’ve got to relay information to your surface-to-air missile battery commanders.
    0:36:04 You’ve got to have checks and balances, means of observation, ensure that the entire command
    0:36:09 chain is on board with it and you need to phase it in over weeks, if not months sometimes, but to
    0:36:12 believe that, oh, it’s like a business deal.
    0:36:16 And if I get the two top guys to agree to it on the phone with me, it’s going to happen.
    0:36:21 It’s just so incredibly naive that this thing was going to hold.
    0:36:25 I don’t think there’s ever been a truce where someone has called and said, oh, agree to it.
    0:36:25 Okay.
    0:36:26 I got your agreement.
    0:36:28 And then you go out and announce it.
    0:36:33 Folks, geopolitical truces don’t work that way.
    0:36:35 There’s too many moving parts.
    0:36:42 There’s too many. The IRGC right now isn’t even able to communicate with the different portions
    0:36:46 of its armed services, because they’re afraid to use the internet for fear that the
    0:36:49 IDF uses it as a signal to drone strike them.
    0:36:55 So for, again, for Donald Trump to think he can come in and say, oh, you own the Plaza.
    0:36:57 I own the Hilton.
    0:37:01 We’re going to stop trying to poach each other’s employees and get the CEOs to agree.
    0:37:05 That’s not how this works in the Middle East.
    0:37:08 And then the final observation is our director of national intelligence.
    0:37:11 I mean, I see three legs of the stool here.
    0:38:14 One, kinetic power, which we demonstrated in spades, which I’m a fan of.
    0:38:17 Two, alliances, where we fell down.
    0:37:18 It’s embarrassing.
    0:37:22 And one thing I don’t think the media is observing is that Britain, France,
    0:37:26 even the Kingdom didn’t come out with direct statements of support.
    0:37:29 Both Bushes would have made sure that would have happened.
    0:37:30 Obama would have made sure that would happen.
    0:37:34 Biden would have made sure that happened such that this was a move from the West
    0:37:37 and from democracy, not just from Trump.
    0:37:38 And then the third thing is competence.
    0:37:42 And who the fuck are we supposed to believe here?
    0:37:47 We have a director of national intelligence stating that they aren’t any closer to a bomb.
    0:37:50 And then Trump directly contradicting his director of national intelligence.
    0:37:57 We have secretaries Hegseth and Rubio stating that we are not pursuing regime change.
    0:38:05 And then we have Trump saying in all caps, make Iran great again and saying he’s in favor of regime change.
    0:38:08 No one knows what is going on here.
    0:38:13 Who on earth is actually going to report on what has happened?
    0:38:14 Who has the credibility?
    0:38:15 What institution?
    0:38:24 What experts are going to be able to put out any credible evidence one way or the other of the level of damage or lack thereof of these facilities?
    0:38:29 Because we now have the fucking Bad News Bears running the government.
    0:38:31 You don’t even know who to believe.
    0:38:33 They can’t stay on message.
    0:38:34 They’re not consistent.
    0:38:38 The military, thank God, still demonstrates more competence than any organization in history.
    0:38:44 But we have a president who does not understand this is not a business deal.
    0:38:49 Truces between warring nations take weeks, if not months, to implement.
    0:38:53 And there has to be a series of checks and they have to be wound down incrementally.
    0:38:55 They can’t happen overnight.
    0:39:00 And when you announce them, like you want to take a victory lap because it’s some big deal or something,
    0:39:03 you are setting yourself and the nation up for embarrassment and failure.
    0:39:09 And the level of incompetence here is starting to seep into everything this guy does.
    0:39:09 Your thoughts.
    0:39:18 Well, it also speaks to why he tore up the nuclear deal in 2018 without a solution of what we were going to do instead.
    0:39:26 I mean, the numbers are staggering in terms of the increase in enriched uranium going from under 4% to 60%
    0:39:31 and adding an extra 100 kilograms at least to the stockpile.
    0:39:37 And we don’t know what will happen with their nuclear stockpile and how they’ll rebuild.
    0:39:43 And the timeline that Congressman Himes was giving was startling to me, where he said six or seven months.
    0:39:51 And so the intel community’s assessment was that they hadn’t made a final decision as to whether they were trying to build a nuclear bomb.
    0:39:57 And I know that Jon Stewart is a very funny guy, but he’s also a very serious guy.
    0:40:05 And everyone should check out the montage that he had on the show last week of Netanyahu saying the bomb is coming, the bomb is coming.
    0:40:10 And it’s him, over the course of the last 20 years, saying that we’re at that 90% level.
    0:40:13 Remember that graphic that he showed on the floor of the U.N.
    0:40:16 And our intel community says that that isn’t the case.
    0:40:18 That doesn’t mean that Iran isn’t a danger.
    0:40:21 That doesn’t mean that Iran isn’t the largest state sponsor of terrorism.
    0:40:25 That doesn’t mean that Iran isn’t responsible for killing innocents all over the Middle East.
    0:40:38 And also Americans, when the IRGC threatened to activate sleeper cells in the United States, I completely freaked out because I’m sure that they have them here.
    0:40:47 And we could be in, I’m in Washington, D.C. right now as we’re speaking, and I’m walking around thinking what could happen to any of us.
    0:40:48 I live in New York City.
    0:40:50 Great place to do a terrorist attack.
    0:40:51 They’ve done it before.
    0:40:54 So all of that is deeply concerning to me.
    0:41:02 To the point about the yahoos that are in charge, it does feel like Donald Trump isn’t really listening to anybody other than Bibi Netanyahu.
    0:41:07 And I sound like a bit of a broken record about it, but he has essentially supplanted everybody else.
    0:41:11 His intelligence is the intelligence that the United States trusts.
    0:41:17 Donald Trump, I think, doesn’t understand how good Bibi is at doing his job.
    0:41:19 This is how he’s managed to stay in power for this long.
    0:41:23 This is a man that is staying in power so that he can stay out of jail.
    0:41:26 And he has Trump wrapped around his finger.
    0:41:33 He can get him to trust the Israelis over the United States at the drop of a hat.
    0:41:35 And that’s what we’re seeing here.
    0:41:45 You noticed DNI Tulsi Gabbard was out of the frame when Trump came out to make his address after the strike was carried out and said, you know, total and complete success.
    0:41:46 Tulsi was not standing behind him.
    0:41:48 It was just Hegseth, Rubio, and J.D. Vance.
    0:41:53 So clearly, that’s the imagery that he wants to project forward, that Tulsi has nothing to do with this.
    0:42:03 But the New York Times, which has done some incredible reporting on what’s been going on behind the scenes, shows a very insular group that’s informing him.
    0:42:08 And the fact that we are hand in glove with the Israelis every step of the way.
    0:42:10 They’re our strongest ally in the region.
    0:42:12 Both you and I are strong supporters.
    0:42:20 And I was very appreciative that Congressman Himes also stipulated that the situation in Gaza is very different than what we are talking about here.
    0:42:26 But you essentially have a president that is all but going it alone.
    0:42:33 And he has a bit of a toddler sensibility about how things should happen.
    0:42:35 Like, I want it and I want it now.
    0:42:42 And that makes sense, looking back at the way that he’s conducted his business deals over, you know, the course of the last 50, 60 years.
    0:42:47 But it’s very different when you’re playing in the big leagues like this.
    0:42:53 And he seems to be completely myopically focused on how do I get that Nobel Peace Prize?
    0:42:55 How do I get that Nobel Peace Prize?
    0:43:00 And ending the Iranian nuclear program is certainly a good way to head in that direction.
    0:43:21 I wanted to bring this up because you talked about politics a bit during the interview, and I saw so many Democrats just reflexively opposing this, not even willing to consider that there might be merit to it, or to go so far as to praise what the Air Force was able to pull off, which was absolutely incredible.
    0:43:36 And I feel like there’s this strong argument that Democrats can be making or, frankly, people who are just observing what’s going on about how Joe Biden governed and the foreign policy moves that he made that set Trump up for success in this moment.
    0:43:42 And I really wish that we could have a broader, contextualized conversation about foreign policy.
    0:43:46 We didn’t just, like, wake up on January 21st of 2025.
    0:43:48 And that was the beginning of all of this.
    0:44:01 And there’s so much that went on over the course of the last four years, from weakening Russia, to what Israel did using our weapons, to what the Ukrainians did using our weapons, to President Biden allowing this to happen.
    0:44:05 That has provided for hopefully what is a good result.
    0:44:07 And I’m very focused on that.
    0:44:12 And I think that there is, to some degree, a victory lap that the Democrats should be able to take on this.
    0:44:14 OK, let’s take a quick break.
    0:44:15 Stay with us.
    0:44:24 There are nine people credibly running in New York City’s Democratic mayoral primary.
    0:44:29 The city’s deranged ranked-choice voting ensures every New Yorker gets to vote for five of them.
    0:44:34 Yesterday, candidate Brad Lander briefly made headlines when he got arrested by ICE.
    0:44:37 You don’t have the authority to arrest U.S. citizens.
    0:44:42 But the smart money says this race comes down to two, Andrew Cuomo and Zohran Mamdani.
    0:44:43 Let’s see what the tabs are saying.
    0:44:44 The Post?
    0:44:52 Zohran Mamdani has barely ever had a job with just three years in the workforce, including his rap career and a gig for his mom.
    0:44:53 The Times?
    0:44:56 Mamdani narrows Cuomo’s lead in New York City mayor’s race.
    0:44:58 New poll finds.
    0:44:59 The Daily News?
    0:45:03 Accused New Jersey killer used fake police lights to pull over wife’s lover.
    0:45:04 Cops say.
    0:45:08 It’s New York’s most exciting vote since Jimmy Fallon asked Robert De Niro to pick a movie.
    0:45:11 Coming up on Today Explained, the old guard versus Young Cardamom.
    0:45:14 Listen, weekday afternoons.
    0:45:23 In 2001, Lindsay met a man named Carlo.
    0:45:26 About a week later, they went on a date.
    0:45:32 And almost 15 years after that, she found out Carlo had been keeping a secret.
    0:45:41 Did you just go through every single moment of your relationship trying to see if you picked up on anything?
    0:45:43 Yeah, I didn’t sleep for days.
    0:45:45 I ran over things again and again in my head.
    0:45:50 And part of me didn’t really still believe it.
    0:45:52 It took quite a while to sink in.
    0:45:54 I’m Phoebe Judge.
    0:45:58 Listen right now on Criminal, wherever you get your podcasts.
    0:46:13 The Trump Organization launched a wireless carrier this week, which is weird.
    0:46:19 But even weirder is the phone, which is supposedly $500 made in the United States and coming in September.
    0:46:22 And I am here to tell you, I don’t believe any of it.
    0:46:27 This week on The Vergecast, we talk about what is going on with Trump Mobile and the T1 phone,
    0:46:31 plus what this all says about how we might buy phones in the future.
    0:46:37 All that, plus our review of the Nintendo Switch 2 and lots more on The Vergecast, wherever you get podcasts.
    0:46:46 Welcome back.
    0:46:52 It’s going to be, I mean, first off, and maybe we can play the clip.
    0:47:00 We basically have two countries that have been fighting so long and so hard that they don’t know what the fuck they’re doing.
    0:47:01 Do you understand that?
    0:47:08 For the president to come out and say, these guys have been at war so long that they don’t know what the fuck they’re doing.
    0:47:13 As someone who’s fond of expletives myself, I still think the president should not be using them.
    0:47:18 The president of the United States, that just diminishes his authority and respect.
    0:47:24 And also, what I believe happened here, and I think this was a good idea,
    0:47:29 but that doesn’t mean the strategy and the incentives here don’t reflect poorly on the current leadership.
    0:47:35 I believe the only reason Trump did this was because he looked at Netanyahu’s dick and thought,
    0:47:38 wait, I want some big dick energy of my own.
    0:47:44 I think this was seen globally as such an extraordinarily competent, aggressive, and courageous move,
    0:47:50 what the IDF was able to pull off in Iran, that he wanted to jump on the medal podium and say,
    0:47:57 look at me, which is the wrong reason to do this, even if it was the right tactical maneuver.
    0:48:02 And what seems clear to me is Bibi Netanyahu thinks he’s on top.
    0:48:07 He can do whatever he wants and that the president will go along with it.
    0:48:07 He’s right.
    0:48:12 So essentially, you know, Middle East policy right now is being run by the superpower there,
    0:48:13 and the superpower there is Israel.
    0:48:22 And then the really dangerous thing about all of this is that Israeli leadership wants to be on a war footing.
    0:48:26 Whether it’s the right thing or not, he’s on a war footing trying to stay out of jail.
    0:48:35 And his only chance of staying out of jail is to get people to rally around the flag because they are at war.
    0:48:53 And I believe that he is very excited to go into Iran because he realizes the only thing standing between him and jail is the rallying around the flag that happens when you’re at war.
    0:49:02 And that is a frightening place to be when you have a place as unstable as the Middle East and you have a nation with nukes.
    0:49:10 So this is a very – as is everything in the Middle East, this is a very complex, upsetting situation.
    0:49:15 And where we will see, I think, unintended consequences.
    0:49:17 And right – so far we haven’t.
    0:49:20 It looks as if Khamenei’s response has been performative.
    0:49:30 You know, the missile barrages into American bases in Qatar and, I believe, in Iraq have so far been totally ineffective.
    0:49:32 I think he even gave the heads up to –
    0:49:32 He did.
    0:49:38 And then you had Qatar helping with brokering the ceasefire of last night.
    0:49:38 So –
    0:49:39 Right.
    0:49:44 Qatar working with us, essentially, to make sure that things can simmer down.
    0:49:44 Right.
    0:49:50 So it looks as if it was basically performative, such that Khamenei can say to his people, I’m tough, I respond.
    0:49:52 But not risk escalation.
    0:49:54 And if it stops there, then great.
    0:49:57 Then everyone can take – Israel and America can take a victory lap.
    0:50:19 But I think the president’s inability to appreciate that strength and greatness is in the agency of others meant not having our traditional allies around us supporting us with intelligence and with, you know, perceptual support, if you will, ensuring that the world knows this was an action of the West, not just of a guy who demonstrates incompetence.
    0:50:28 An intelligence apparatus that seems totally sclerotic and bipolar; you don’t know who to listen to, you don’t know what they mean.
    0:50:44 And then the thing that, you know, as supporters of Israel, I think is really concerning right now, is when he comes out and says they don’t know what the fuck they’re doing and I’m angry at Israel. That’s language and a statement he should be making to Bibi privately on a secure phone.
    0:51:12 And when Trump says that Israel doesn’t know what the fuck it’s doing and leaks discreetly or overtly a real dissatisfaction and frustration with Israel, he is emboldening Israel’s enemies to take more aggressive and bold action than they might otherwise.
    0:51:16 So, you know, fighting with your allies is bad.
    0:51:17 Fighting without them is worse.
    0:51:23 When you have allies, you put on a unified front, even when it sucks.
    0:51:25 Yeah, but I agree with you.
    0:51:30 I want the president of the United States of America to behave like he’s the president of the United States of America.
    0:51:45 But this is Donald Trump, and the American public picked this consciously, and they probably like that they have somebody who lets them see what’s actually going on behind the scenes, or what he perceives to be going on behind the scenes.
    0:51:50 I think that this is the most transparent administration in American history, as I’m told regularly.
    0:51:59 I spoke to two Democrats last night who are running for president, who haven’t officially announced, but take my calls and call me because they’re obviously running for president.
    0:52:05 And I said the opportunity here is to come out and say, I agree with the action.
    0:52:07 I don’t support the president’s policies.
    0:52:08 I don’t support how he’s gone about this.
    0:52:11 He’s injected more risk into this than he needed to.
    0:52:12 But I support the actions.
    0:52:17 And it’s important that we rally around our military and the flag and the president in a time like this.
    0:52:20 Because, again, I think the Democrats have fucked up here.
    0:52:21 Totally.
    0:52:22 Well, it’s the reflexive no.
    0:52:26 And I mean, maybe we’ll hear from those two later in the day.
    0:52:28 But so far, I haven’t really seen that.
    0:52:31 Well, the only one who’s done it is Fetterman.
    0:52:35 Well, I assume he was not the one who called you last night.
    0:52:41 Fetterman has basically come out and said, you know, look at the action, not the politics.
    0:52:43 And a lot of people on the front.
    0:52:47 I go to the same place whenever the far left and the far right agree on anything.
    0:52:49 That means we’re at negative 40.
    0:52:51 Negative 40 is where Celsius and Fahrenheit meet.
    0:52:52 It’s inhospitable.
    0:52:56 Whenever the far left and the far right agree on anything, it’s a really bad idea.
    0:53:01 Whether it’s anti-vaccination or isolationism, whatever it is, you know it’s a really bad idea.
    0:53:08 And when you have Marjorie Taylor Greene and AOC agreeing on something, it means you should probably go the other way.
    0:53:09 And they’re both agreeing.
    0:53:16 You know, they’re both spouting off this isolationist and, in my opinion, very dangerous bullshit.
    0:53:19 And again, I come back to the same place.
    0:53:22 And I apologize, I’m being redundant here.
    0:53:30 Why on earth are we spending the GDP of Argentina on our military if we’re not going to exert this kind of power?
    0:53:32 Well, we’re always going to exert it.
    0:53:33 We’re just going to complain about it.
    0:53:39 Or some people are going to feign outrage and say, you know, we’re not these people.
    0:53:42 And the truth is that we are fundamentally these people.
    0:53:48 But I just want to say, on the Fetterman front, while I agree with some of his positions, he’s just completely in lockstep with Israel.
    0:53:54 He doesn’t even acknowledge what’s going on in Gaza as a humanitarian disaster.
    0:54:01 So John Fetterman is out on an island on his own when it comes to these kinds of actions.
    0:54:05 So we’ll see what the mainstream of the party does.
    0:54:14 But I think it’s totally an opportunity, again, to sound like a normal human being, to meet people where they are, and to rally around the flag.
    0:54:24 So, by the way, I almost forgot, our little girl, I could not be more proud of you than if you were up reading the Torah.
    0:54:25 Oh, my God.
    0:54:35 Donald Trump, a personal attack on our very own, literally a badge of honor.
    0:54:51 Donald Trump came out and mentioned you by name, saying on Truth Social, why does Fox News allow failed TV personality Jessica Tarlov to soil The Five?
    0:54:53 Oh, my God, you’re ruining The Five.
    0:54:59 Even Fox viewers who are about 105 and fucking crazy love you.
    0:55:01 Love you, literally love you.
    0:55:06 Her voice, her manner, and above all else, what she says are a disgrace.
    0:55:08 You’re a disgrace, Jess.
    0:55:08 So I hear.
    0:55:15 To television broadcasting, while claiming the network is alienating MAGA supporters by giving her regular airtime.
    0:55:19 I could not be more proud of you.
    0:55:26 This is a big, I know you thought when I called you and said I wanted to do a show with you, you thought this is my big moment.
    0:55:29 But this is your big moment.
    0:55:32 Ladies and gentlemen, Jessica Tarlop.
    0:55:34 Soiling The Five.
    0:55:35 There you go.
    0:55:37 What did you think when you saw that?
    0:55:39 It took my breath away.
    0:55:40 Took your breath away?
    0:55:40 Yeah.
    0:55:46 Well, he’s posted about me before, but not quite as meanly.
    0:55:53 And I was just thinking, why aren’t you busier?
    0:56:02 And then, this was Friday early evening, and I’m thinking, the B-2 bombers left a few hours later.
    0:56:04 Like, you really should have been busier.
    0:56:05 He’s focused on you.
    0:56:06 Yeah.
    0:56:23 I mean, he was watching the show and the Times has reported that TV coverage, specifically on Fox, has informed his view on getting involved and that he wanted to be part of the action, had a bit of FOMO when it came to what the Israelis were able to pull off.
    0:56:27 But it’s something to have that happen.
    0:56:30 It’s a very uncomfortable feeling.
    0:56:31 Oh, OK.
    0:56:31 Hold on.
    0:56:31 Hold on.
    0:56:32 Hold on.
    0:56:33 Let me just break it down for you.
    0:56:34 Yeah.
    0:56:57 This is, I think, arguably, other than, of course, meeting Scott Galloway, the biggest thing to happen to you, because in the midst of probably the biggest geopolitical event of his career, he takes time to shitpost you, which absolutely means every senator in Congress would kill to have the president call them out by name.
    0:57:06 Because when he disagrees with you, it basically means you’re doing something right, and you are now more in his head than anyone who’s running for president.
    0:57:09 He doesn’t give a shit what Senator Schumer thinks or says.
    0:57:13 He’s not worried about Governor Newsom running for president.
    0:57:15 He’s worried about you.
    0:57:18 I think this is—I’m very excited.
    0:57:19 I’m very excited.
    0:57:21 This made my day when I saw this.
    0:57:23 I was surprised not to hear from you, though.
    0:57:24 You know me.
    0:57:24 I don’t like—
    0:57:26 No text from Scott.
    0:57:27 I got some good texts.
    0:57:28 I don’t like to talk to people.
    0:57:29 No, you don’t.
    0:57:29 It’s awful.
    0:57:31 Yeah, yeah, I don’t like to talk to people.
    0:57:35 I feel desperate sometimes with the amount of times that I’ve texted to no response.
    0:57:37 Sometimes I get a thumbs up.
    0:57:39 But I’m like, I’m just going to keep doing it.
    0:57:43 I’m the Hermes of fake intellects in that it’s all about scarcity.
    0:57:45 It’s all about managing fake scarcity.
    0:57:52 I’m an elite university that rejects more people than it needs to, to give the impression of some sort of value or scarcity.
    0:57:53 It’s all an act.
    0:57:55 I’m going to defund you over that.
    0:57:55 It’s not cool.
    0:57:57 It’s all an act.
    0:58:01 And again, I wish I’d figured this out when I was in my mating years.
    0:58:02 Okay.
    0:58:03 I think we should end it there.
    0:58:08 I think we’re going to watch Jessica Tarloff take a victory lap.
    0:58:21 I think we’re going to see her on the medal podium living rent-free in President Trump’s head because she is so articulate, so unafraid, so bold, so numero cinco on The Five.
    0:58:28 The most watched program in the world has one person who the president is listening to.
    0:58:29 It’s not the Senate minority leader.
    0:58:30 It’s not the Senate minority leader.
    0:58:31 It’s not Leader Jeffries.
    0:58:34 It’s not Tom Friedman.
    0:58:40 Literally, the most important person in the world with President Trump right now is Bibi Netanyahu.
    0:58:53 Number two, ladies and gentlemen, running through the tape, collecting the gold, bronze, and silver of people shaping geopolitical conversations around the world.
    0:58:55 That’s right, the co-host of Raging Moderates.
    0:59:02 If we are not number one this week, literally, I am going to weep crocodile tears while listening to Megyn Kelly.
    0:59:03 I’ll be so upset.
    0:59:05 This is a big moment for you, Jess.
    0:59:07 We’re going to leave it there.
    0:59:08 All right.
    0:59:09 Let’s read us out.
    0:59:10 That’s all for this episode.
    0:59:12 Thank you for listening to Raging Moderates.
    0:59:15 Our producers are David Toledo and Eric Jenicus.
    0:59:17 Our technical director is Drew Burroughs.
    0:59:19 Going forward, you’ll find Raging Moderates every Wednesday and Friday.
    0:59:24 Subscribe to Raging Moderates on its own feed to hear exclusive interviews with sharp political minds
    0:59:26 you won’t hear anywhere else.
    0:59:34 This week, Jess is talking to Congressman Greg Casar, who I heard the president does not listen to, nor does he care what he says, because he is not Jess Tarlov.
    0:59:38 Make sure to follow us wherever you get your podcasts
    0:59:40 so you don’t miss an episode.
    0:59:43 Keep on soiling, my woman.
    0:59:44 Keep on soiling.
    0:59:46 Couldn’t stop even if I wanted to.
    0:59:47 All right.

    Scott and Jessica talk through the aftermath of the weekend’s airstrikes in Iran — the lack of coordination in the lead-up, differing accounts of the damage, and confusion about a ceasefire. They’re joined by Rep. Jim Himes, Ranking Member on the House Intelligence Committee, to discuss possible consequences for Iran’s regime, citizens, and nuclear capabilities. Plus: Trump publicly lashes out at Israel, Iran, and… one of the hosts of Raging Moderates.

    Follow Jessica Tarlov, @JessicaTarlov

    Follow Prof G, @profgalloway.

    Follow Raging Moderates, @RagingModeratesPod.

    Learn more about your ad choices. Visit podcastchoices.com/adchoices