Category: Uncategorized

  • Latin America’s IPO Market, How to Mentor Young Men, and Scott’s Stake in La Equidad Football Club

    AI transcript
    0:00:03 Support for Prof G comes from Cresset Family Office.
    0:00:05 As an entrepreneur, you spend a lot of time
    0:00:06 and energy building your business.
    0:00:08 And chances are, you’ve been so busy,
    0:00:09 there hasn’t been a ton of time to think about preparing
    0:00:13 for an exit, tax strategies, and wealth management.
    0:00:14 Cresset is here to help wealth creators
    0:00:17 and families like yours streamline complexity
    0:00:18 and invest for the future.
    0:00:19 Cresset was built by entrepreneurs,
    0:00:22 for entrepreneurs, with financial advisory teams
    0:00:23 who embrace the fiduciary duty
    0:00:26 to place the client’s interests first.
    0:00:28 You can learn how to optimize your life
    0:00:29 by scheduling a call with a Cresset founder
    0:00:32 at CressetCapital.com.
    0:00:33 We are not clients of Cresset.
    0:00:34 There are no material conflicts
    0:00:36 other than this paid endorsement.
    0:00:39 All investing involves risk, including loss of principal.
    0:00:41 (funky music)
    0:00:45 – Hey, what you doing?
    0:00:48 – Programming our thermostat to 17 degrees
    0:00:50 when we’re out at work or asleep.
    0:00:51 We’re taking control of our energy
    0:00:54 use this winter with some easy energy saving tips
    0:00:55 I got from FortisBC.
    0:00:58 – Ooh, conserve energy and save money?
    0:01:00 Maybe to buy those matching winter jackets?
    0:01:02 – Uh, no, we’re also getting
    0:01:04 that whole matching outfit thing under control.
    0:01:07 – Discover low and no cost energy saving tips
    0:01:10 at fortisbc.com/energysavingtips.
    0:01:12 Matching track suits?
    0:01:13 – Please no.
    0:01:17 – Mike has always wondered
    0:01:20 what makes some people want to play action hero.
    0:01:23 – When I see people speeding or skydiving
    0:01:25 or surfing on hundred foot waves,
    0:01:26 I always think, are you crazy?
    0:01:27 What are you doing?
    0:01:28 Because I’m like risk averse.
    0:01:32 I’m trying to like, you know, keep Mike in one piece.
    0:01:34 – But what about people who get their thrills
    0:01:36 from playing cop?
    0:01:40 This week on Explain It To Me, citizens arrest.
    0:01:43 Is it a real thing and should it be?
    0:01:45 New episodes every Wednesday.
    0:01:50 – Welcome to Office Hours with Prof G.
    0:01:52 This is the part of the show where we answer your questions
    0:01:54 about business, big tech, entrepreneurship,
    0:01:55 and whatever else is on your mind.
    0:01:57 If you’d like to submit a question,
    0:01:59 please email a voice recording
    0:02:00 to officehours@profgmedia.com.
    0:02:04 Again, that’s officehours@profgmedia.com.
    0:02:06 I have not seen or read these questions.
    0:02:08 Question number one.
    0:02:09 – Hi, Prof G.
    0:02:12 I’m Anna from Mexico, currently living in Chicago, Illinois.
    0:02:15 First of all, thank you for all your hard work.
    0:02:18 I listen to your podcast during my commute to school
    0:02:21 and it always makes the ride more enjoyable.
    0:02:25 My question is about IPOs in Latin America.
    0:02:27 One of the predictions you made for 2025
    0:02:30 was that emerging markets will become
    0:02:32 more attractive to investors.
    0:02:35 I had the chance to work for a unicorn in the region
    0:02:38 and ever since, I have wondered,
    0:02:42 why we don’t see many IPOs in Latin America?
    0:02:45 I truly believe that there are amazing companies
    0:02:47 led by talented entrepreneurs,
    0:02:49 but not many of them go public.
    0:02:50 Why do you think this is the case
    0:02:52 and what do you think could be improved
    0:02:54 to help more companies in the region
    0:02:55 to take that step?
    0:02:59 Thank you so much and thank you for choosing that question.
    0:03:00 – Anna from Mexico.
    0:03:03 I love your accent and the IPO market.
    0:03:06 Since the pandemic, the IPO market in Latin America
    0:03:08 has really fluctuated.
    0:03:11 In 2021, the region raised about $19 billion,
    0:03:13 mostly due to Brazil,
    0:03:15 which accounted for about 85% of that capital.
    0:03:19 Mexico, in comparison, had only one IPO in 2021.
    0:03:21 So basically the IPO market in Mexico
    0:03:24 could best be described as kind of like dead.
    0:03:25 Other than that, it’s fine.
    0:03:26 By 2022 and 2023,
    0:03:30 IPO listings came to a complete halt in Brazil.
    0:03:31 In the first half of 2024,
    0:03:33 the entire Latin American and Caribbean region
    0:03:35 registered only four IPOs.
    0:03:36 Why?
    0:03:39 Essentially, the boring stuff, interest rates.
    0:03:42 Rates in Brazil rose over seven points,
    0:03:43 to 9.25%.
    0:03:45 Similarly, interest rates in Mexico
    0:03:48 increased from 5.5% to 10.5% in the same period.
    0:03:50 At the end of 2022,
    0:03:52 the average policy rate in Latin America
    0:03:56 stood at 18.9%.
    0:03:59 Rising interest rates usually decrease investor confidence,
    0:04:01 which typically means an outflow of capital
    0:04:03 from equity markets into safer fixed income investments.
    0:04:06 In short, when you’re getting paid a lot of money
    0:04:08 just to put your money in bonds
    0:04:12 or what feel like less volatile or safer investments,
    0:04:13 the bar to go public gets much higher
    0:04:15 and money flows out of equities
    0:04:17 and into fixed income instruments,
    0:04:20 thereby creating a less hospitable market for IPOs.
    0:04:23 Additionally, the rise of populist left-leaning governments
    0:04:25 in Latin America has deterred private investment
    0:04:29 in the region, resulting in nearly flatline GDP per capita.
    0:04:31 I’m actually quite bullish on Mexico.
    0:04:32 I think for a few reasons.
    0:04:34 One, more political stability,
    0:04:37 I would argue, than the US. Two, proximity.
    0:04:40 As we try and divest away from China,
    0:04:42 one because of geopolitical tensions,
    0:04:45 but also just because of supply chain diversification.
    0:04:46 I was on the board of Urban Outfitters.
    0:04:48 We woke up one day in the middle of COVID
    0:04:51 and realized that a disproportionate 60, 70% of our tops
    0:04:55 were being manufactured within a five-mile radius of Shenzhen.
    0:04:58 And basically all 550 of our stores couldn’t get tops in time
    0:05:01 and said, okay, even if it costs us a little bit more,
    0:05:03 we need some supply chain diversification.
    0:05:04 Basically up until COVID,
    0:05:07 supply chain was run for the lowest cost, full stop.
    0:05:09 How do we eke out more and more costs?
    0:05:12 And it ended up we had absolutely no slack,
    0:05:13 meaning with any interruption at all,
    0:05:16 everything from your refrigerator to your garage door,
    0:05:17 they couldn’t find parts
    0:05:18 and the whole thing just kind of collapsed on itself.
    0:05:21 So essentially the biggest threat to these companies
    0:05:23 is no longer consumer demand,
    0:05:26 which has been really strong for the last 16 odd years,
    0:05:28 but supply chain interruptions
    0:05:31 where basically consumers can’t get what they want.
    0:05:33 So there’s this amazing
    0:05:35 or this incredible inspired effort
    0:05:37 towards supply chain heterogeneity
    0:05:39 and then there’s supply chain diversification,
    0:05:43 moving stuff out of China into Vietnam,
    0:05:44 other Southeast Asian countries.
    0:05:46 And also the big beneficiary has been Mexico,
    0:05:48 which is now our largest trading partner.
    0:05:50 So I’m actually quite bullish on Mexico.
    0:07:53 IPOs in general seem to be in decline, and the question is,
    0:07:55 are they in cyclical decline or structural decline?
    0:05:57 And that is, it used to be,
    0:05:59 if you wanted a company to be worth more
    0:06:00 than a few hundred million dollars,
    0:06:03 you had to access this deep pool of capital
    0:06:04 called the public markets.
    0:06:05 There just wasn’t enough capital available
    0:06:06 in the private markets.
    0:06:10 The biggest VC funds only had $200 or $300 million funds.
    0:06:12 So you went public to access
    0:06:14 these big pools of public capital.
    0:06:16 That’s no longer the case.
    0:06:17 People have gone further downstream
    0:06:19 and when they see opportunities,
    0:06:21 when they see the company accelerating in value,
    0:06:22 they wanna squeeze more and more juice
    0:06:24 in the private markets as institutional investors.
    0:06:26 So they’re like, we don’t need to go public.
    0:06:27 As a matter of fact,
    0:06:30 if the existing shareholders or employees
    0:06:32 want some liquidity, we can give it to them.
    0:06:34 We can do secondary offerings.
    0:06:35 We don’t want the scrutiny,
    0:06:37 the regulation of the public markets.
    0:06:38 So we’ll just stay private.
    0:06:40 And oftentimes the private markets
    0:06:43 are actually trading at a premium to the public markets.
    0:06:44 So what do you have?
    0:06:47 The number of public companies has been reduced,
    0:06:49 I think by two thirds over the last 30 or 40 years.
    0:06:50 One, it’s harder to go public.
    0:06:51 Two, there’s more regulation.
    0:06:53 Three, it’s expensive.
    0:06:56 And four, mergers and acquisitions
    0:06:58 have taken a lot of public companies off of the rolls.
    0:07:03 And finally, finally, being private is kind of equivalent
    0:07:05 or maybe even more ideal than going public.
    0:07:08 And typically, I think also what’s hurt the public markets
    0:07:10 is by the time a company decides to go public,
    0:07:12 it’s sort of the last stop on the train.
    0:07:16 And that is Google when they went public, enormous upside.
    0:07:17 Same with Meta.
    0:07:19 These companies still had enormous upside left.
    0:07:20 Now, again, most of
    0:07:22 the juice is being squeezed out in the private markets.
    0:07:24 So by the time it gets to the public markets,
    0:07:26 it’s sort of like, well, we’ve run out of people
    0:07:28 in the private markets who will bid us up.
    0:07:30 Let’s see if we can find stupid retail investors
    0:07:33 and some of the public market IPOs have underperformed
    0:07:37 and it’s made a less hospitable environment.
    0:07:40 But yes, the IPO market in Mexico is dead.
    0:07:42 I hope it comes back for all of us.
    0:07:44 Thanks for the question.
    0:07:45 Question number two.
    0:07:49 Hey, Scott, this is Kevin from Colorado.
    0:07:50 Long time listener of the pod
    0:07:53 and want to tell you how much I appreciate your insights
    0:07:54 on helping young men.
    0:07:57 Although I’d still like to consider myself young.
    0:08:00 In 2025, I’m turning 40
    0:08:02 and I’m increasingly feeling the responsibility
    0:08:04 to make a positive impact
    0:08:07 with the abundance of young men I interact with in my life.
    0:08:10 I have three young boys, a younger brother,
    0:08:14 four younger brothers-in-law and eight young nephews.
    0:08:17 Outside of my family, I own two businesses,
    0:08:20 each having many young male employees.
    0:08:22 And I’m also active in my church
    0:08:26 where I help mentor dozens of young men under 18.
    0:08:28 I’m finding it hard to get involved
    0:08:31 and help these boys that aren’t my sons
    0:08:33 because I don’t want to overstep my bounds.
    0:08:35 My question, Scott, is this.
    0:08:38 With no shortage of opportunities,
    0:08:41 how do you recommend I be more proactive
    0:08:43 in helping mentor these young men,
    0:08:46 especially those I can see are struggling,
    0:08:50 without being overbearing or coming off as weird?
    0:08:51 – Boss, you’re doing it.
    0:08:54 You know, raising good men,
    0:08:57 raising confident, loving, patriotic boys,
    0:09:01 that’s kind of 90% of what you’re supposed to be doing.
    0:09:02 In terms of helping other boys,
    0:09:04 I can’t stand the gestalt in our society
    0:09:07 where we suspect men who want to get involved
    0:09:08 in other boys’ lives,
    0:09:09 I think that is one of the terrible things
    0:09:12 about our society that is largely the fault
    0:09:13 of the Catholic Church and Michael Jackson,
    0:09:15 which I will not go into,
    0:09:17 but your inclinations are the right ones.
    0:09:18 One, getting involved in groups,
    0:09:20 being a scout leader, you know,
    0:09:22 teaching Sunday school, whatever it might be,
    0:09:25 but sharing some of that confidence,
    0:09:27 being a male role model in groups,
    0:09:29 also recognizing, or what I think is a great trick
    0:09:33 or not a trick, but I do this sometimes with my boys.
    0:09:35 One, my boys are 14 and 17,
    0:09:38 so no matter how wonderful or fat the vacation is,
    0:09:40 unless they’re with other boys or other people their age,
    0:09:41 they’re just not gonna have a good time
    0:09:42 and they’re gonna get bored and angry
    0:09:44 and make our lives miserable.
    0:09:45 So I bring friends.
    0:09:49 And what I find is I try and find or encourage them
    0:09:52 to invite boys that may not have the opportunity,
    0:09:56 for whatever reason, to hang out or do the kind of things we do.
    0:09:58 So I like to bring my boys’ friends along with them.
    0:10:01 And I find you don’t need to do that much.
    0:10:05 I think just seeing a healthy father-son relationship,
    0:10:06 seeing, you know,
    0:10:08 you trying to occasionally ask a few questions,
    0:10:10 them asking you a few questions,
    0:10:12 it’s really powerful ’cause the reality is,
    0:10:14 and this is sort of, I don’t know,
    0:10:17 disappointing, hurtful as the dad,
    0:10:19 what you’re gonna find is with your sons,
    0:10:22 they withdraw a little bit, or they’re just less inclined.
    0:10:25 I have young men asking me for advice every goddamn day.
    0:10:26 My boys don’t ask me for advice
    0:10:29 ’cause they have this very healthy instinct
    0:10:33 that says I need to break away from the pride.
    0:10:35 And in order to make that a little less painful,
    0:10:38 I start thinking my parents are idiots.
    0:10:39 And what’s interesting is there’s research
    0:10:41 that shows with teen boys,
    0:10:44 they’re more apt to listen to their friends’ fathers
    0:10:45 than their own fathers.
    0:10:46 So what I would say is,
    0:10:48 in addition to just doing the good work you’re doing
    0:10:52 to get involved in groups where you might have an audience
    0:10:54 of more young men or young people that you can influence,
    0:10:56 whether it’s a church group,
    0:10:58 or as I said, the Boy Scouts or volunteer groups
    0:11:00 or coaching or sports leagues,
    0:11:03 if you really feel like you have that ability
    0:11:05 to connect with young men.
    0:11:06 But also on a very basic level,
    0:11:08 when you do stuff with your boys,
    0:11:12 encourage them, ask them or just invite other boys
    0:11:13 their age from your friends.
    0:11:15 Especially, I think it’s doing God’s work
    0:11:18 to kind of do a little bit of poking around
    0:11:19 and find out where there’s single mothers
    0:11:21 in your universe and say,
    0:11:24 hey, I’m taking my kids to the football game
    0:11:27 or I’m taking my kids to whatever it might be
    0:11:30 to the beach, would John like to join us?
    0:11:32 And I think you’re gonna find a lot of single mothers
    0:11:36 are very open to their son spending more time
    0:11:38 in the company of men trying to lead a good life
    0:11:41 and around other boys, but thanks for the question.
    0:11:44 We have one quick break before our final question.
    0:11:45 Stay with us.
    – Support for Prof G comes from Quince.
    0:11:54 Your wardrobe is the first thing you upgrade
    0:11:56 when you wanna look and feel your best.
    0:11:58 But what if you’re someone who doesn’t like spending
    0:12:00 a ton of mental energy on your clothes?
    0:12:02 The solution is to find classic timeless pieces
    0:12:05 that actually last, and that’s when you turn to Quince.
    0:12:08 They offer high quality clothing at an affordable price.
    0:12:10 They’re really well known for their Mongolian cashmere sweaters,
    0:12:12 which start at $60.
    0:12:14 They also have great active wear
    0:12:16 including performance tees and tech shorts.
    0:12:19 Our producers tried some sheets from Quince
    0:12:20 and said they love them.
    0:12:22 Made with organic cotton, comfort
    0:12:25 and sustainability all in one package.
    0:12:27 Whatever you’re looking for,
    0:12:28 Quince says that all of their items are priced
    0:12:31 50 to 80% less than similar brands.
    0:12:32 They say they’re able to do that
    0:12:34 by partnering directly with top factories
    0:12:35 and they use premium fabrics and finishes
    0:12:38 so every item feels like a nice luxury item.
    0:12:40 Upgrade your closet this year
    0:12:41 without getting the upgraded price tag.
    0:12:45 Go to quince.com/propg for 365-day returns
    0:12:47 plus free shipping on your order.
    0:12:51 That’s Q-U-I-N-C-E.com/propg
    0:12:54 to get free shipping and 365 day returns.
    0:12:56 Quince.com/propg.
    This episode is brought to you by Cresset Family Office.
    0:13:05 Entrepreneurs understand the challenges
    0:13:07 of building a successful business.
    0:13:09 You’ve probably spent years pouring your heart
    0:13:10 into different ventures
    0:13:12 and maybe even had some serious wins.
    0:13:14 Still, I bet a lot of these successes
    0:13:16 came with headaches, complex financial planning,
    0:13:19 optimizing tax strategies and timely exit planning.
    0:13:22 It can be overwhelming to figure those things out
    0:13:24 and charting a path forward can take a lot of time away
    0:13:25 from what you love most.
    0:13:27 If that sounds familiar,
    0:13:29 you might want to check out Cresset.
    0:13:30 They’re a prestigious family office
    0:13:32 for CEOs, founders, and entrepreneurs.
    0:13:35 Cresset’s advisory teams can simplify your financial life.
    0:13:37 They handle the tedious stuff behind the scenes,
    0:13:39 freeing you up to focus on growing your business
    0:13:41 and enjoying your life.
    0:13:43 It’s the sort of help that can be transformative.
    0:13:47 Optimize your life and optimize your wealth with Cresset.
    0:13:49 If you want the freedom to follow what really matters,
    0:13:51 you should schedule a call with a Cresset founder
    0:13:53 at CressetCapital.com.
    0:13:55 We are not clients of Cresset.
    0:13:56 There are no material conflicts
    0:13:57 other than this paid endorsement.
    0:13:59 All investing involves risk,
    0:14:01 including loss of principal.
    Support for the show comes from NerdWallet.
    0:14:11 Listener, a new year is finally here.
    0:14:12 And if you’re anything like me,
    0:14:15 you’ve got a lot on your plate, new habits to build,
    0:14:17 travel plans to make, podcasts to host.
    0:14:19 Good thing our sponsor, NerdWallet,
    0:14:21 is here to take one of those things off your plate,
    0:14:24 finding the best financial products.
    0:14:26 Introducing Nerd Wallet’s best of awards list,
    0:14:29 your shortcut to the best credit card savings accounts
    0:14:30 and more.
    0:14:32 The Nerds at NerdWallet have done the work for you,
    0:14:35 researching and reviewing over 1,100 financial products
    0:14:37 to bring you only the best of the best.
    0:14:39 Looking for a balance transfer credit card
    0:14:42 with a 0% APR, they’ve got a winner for that.
    0:14:44 How about a bank account with a top rate
    0:14:46 to hit your savings goals?
    0:14:48 They’ve got a winner for that too.
    0:14:50 That way you can know you’re getting the best products
    0:14:52 for you without doing all the research yourself.
    0:14:54 So let NerdWallet do the heavy lifting
    0:14:56 for your financial future this year
    0:14:58 and head over to their 2025 Best of Awards
    0:15:01 at nerdwallet.com/awards
    0:15:03 to find the best financial products today.
    0:15:11 – Welcome back, question number three.
    0:15:15 – Hi, Scott, this is Marcela Terra from Miami.
    0:15:18 I am a Colombian immigrant, and just when I thought
    0:15:21 I couldn’t love your content any more,
    0:15:25 I find out you have bought a football team,
    0:15:29 not soccer, football, from my beloved country, Colombia.
    0:15:33 I was just curious what motivated you
    0:15:35 to join this group that is making the purchase.
    0:15:37 And if you have any plans to ace it,
    0:15:40 my country, I’m just so excited.
    0:15:43 Thank you so much for all your wonderful content
    0:15:46 and keep up the good work.
    0:15:48 In another non-football note,
    0:15:52 just wanted to thank you for showing up vulnerable
    0:15:56 and emotional and loving and caring
    0:15:58 and also being a baller because I think
    0:16:01 that’s your very own brand of healing the world
    0:16:06 and helping young men know that those two things
    0:16:09 can coexist. Bye, Scott.
    0:16:10 Thank you.
    – I’m gonna print this out and read it to myself every night.
    0:16:17 This is one, I oftentimes think my producers hate me
    0:16:19 and are just so sick of me because I know me
    0:16:20 and I know how difficult I can be.
    0:16:21 Actually, I’m not difficult.
    0:16:23 I don’t think I’m difficult, am I difficult?
    0:16:26 But I’m shocked that they let this get through.
    0:16:28 So thank you for the really kind words.
    0:16:33 So, La Equidad,
    0:16:35 the second-best team in Bogotá, Colombia.
    0:16:38 This is, I don’t wanna say it’s a dream of mine,
    0:16:41 but I had always toyed with the idea of I really wanted
    0:16:45 to buy or to make an investment in Rangers,
    0:16:48 the best team in Scotland.
    0:16:49 Because my dad used to go to Rangers games,
    0:16:50 I absolutely love football.
    0:16:54 I love the idea of investing in a community
    0:16:55 and I think it would just be a lot of fun.
    0:16:59 Also, quite frankly, it’s sort of a,
    0:17:01 I don’t know, a symbol of your success,
    0:17:03 midlife crisis, I’m worried I’m gonna die soon,
    0:17:05 arrested adolescence.
    0:17:07 I mean, all of that less admirable shit too.
    0:17:11 And I met, so how did this come about?
    0:17:14 I met some, I met this guy at an investor conference
    0:17:16 who’s sort of the Yoda or the Svengali
    0:17:19 of investing in teams and he scans the world for teams
    0:17:23 and he tries to find teams that are economically viable
    0:17:26 because they have a great player academy
    0:17:30 and cultivate and gestate players and then sell them
    0:17:34 or they have kind of unrealized or unlocked TV rights,
    0:17:36 whatever it might be, or they’re just not a well run team.
    0:17:39 This kid, and he literally is a kid who’s in his 30s,
    0:17:41 scours the globe for opportunities.
    0:17:42 And he came to me and said,
    0:17:43 I’m putting together an investor group,
    0:17:45 which includes people much more famous than me,
    0:17:48 including Ryan Reynolds and Rob McElhenney.
    0:17:52 And I got to know Rob because of my work on “Young Man”
    0:17:54 and I was in “Welcome to Rexmer,” I was in an episode
    0:17:56 and I just was so impressed with this guy.
    0:17:59 He’s just such a lovely, talented guy.
    0:18:01 And when I saw they were part of the investor group
    0:18:04 and it was also an opportunity to spend more time in Colombia,
    0:18:07 which I just think is such an incredible country.
    0:18:11 And also quite frankly, this is a bite size.
    0:18:12 I’m not buying Chelsea.
    0:18:14 Chelsea cost $4 or $5 billion.
    0:18:15 I’m not in that weight class.
    0:18:18 This is a company that went for a dramatically,
    0:18:21 dramatically smaller purchase price.
    0:18:23 And I’m also a small owner,
    0:18:28 or I think I’m gonna own something like 5% of the team.
    0:18:31 But anyways, this is one part investment.
    0:18:34 I do believe, I think I’m gonna make money here.
    0:18:34 Why?
    0:18:35 The fastest growing demographic group
    0:18:38 isn’t seniors or Latinos, it’s billionaires
    0:18:40 and the super wealthy.
    0:18:40 What does that mean?
    0:18:42 It means the assets is super wealthy by,
    0:18:46 whether it’s hotel rooms in Santropé or Gulf Streams
    0:18:49 or Loro Piano or Brunella Cuccinelli
    0:18:52 or sports teams are gonna go up in values.
    0:18:55 Sports teams typically have sort of a natural monopoly.
    0:19:00 And so kind of Columbia League One does not have only,
    0:19:04 they sequester, we can’t just start a football team.
    0:19:05 MLS rights in the United States
    0:19:07 are going for several hundred million dollars now.
    0:19:09 These are essentially regulated monopolies.
    0:19:10 Is that a good thing?
    0:19:13 Probably not, but I’m taking advantage of it.
    0:19:14 So what do you have?
    0:19:18 You have an explosion in the customer base,
    0:19:20 specifically wealthy people who buy teams.
    0:19:23 And two, you have limited supply.
    0:19:25 I like my math here, I like my prospects.
    0:19:29 Also, you have this dynamic in terms of media
    0:19:32 where people can now avoid ads.
    0:19:33 Advertising has become a tax
    0:19:35 that the technologically illiterate
    0:19:37 or the poor have to pay except for sports.
    0:19:40 And that is the only time I ever see adverts,
    0:19:41 as they call them here in Britain,
    0:19:43 is when I’m watching Arsenal play.
    0:19:45 And that is the only time I’ll endure ads
    0:19:49 is when I’m watching live TV, which is almost never.
    0:19:50 I never watch live news anymore.
    0:19:53 I don’t watch, I watch original scripted dramas.
    0:19:55 So I’m able to avoid almost every ad
    0:19:56 except when I watch live sports,
    0:19:59 meaning the TV contracts will go up in value
    0:20:01 because advertisers still need to reach people
    0:20:02 in the fewer and fewer places
    0:20:04 they can reach them, see above: live sports,
    0:20:06 which means the TV rights deals will go up,
    0:20:09 which means the value of the teams will go up in concert
    0:20:11 with the number of buyers,
    0:20:14 specifically very wealthy people going up.
    0:20:17 But I don’t wanna pretend this is also not consumption.
    0:20:19 I just think this is gonna be just so much fun.
    0:20:22 I can’t wait to take my boys.
    0:20:23 I can’t wait to go with my friends.
    0:20:25 I can’t wait to have an excuse
    0:20:28 to spend more time in Medellin and Cartagena.
    0:20:31 I just think it’s, I’m just so excited about this.
    0:20:32 I love Latin American culture.
    0:20:34 I love the idea of spending more time in Colombia.
    0:20:37 I love the whole midlife-crisis part of it.
    0:20:39 I like the investor group.
    0:20:40 So this was an easy one.
    0:20:43 And it’s not, you know, this is a lot of money,
    0:20:44 but it’s not hundreds of millions,
    0:20:45 not even tens of millions.
    0:20:49 It’s for someone like me who is incredibly blessed
    0:20:51 and economically secure, but not a billionaire.
    0:20:53 So what is this?
    0:20:55 This is capitalism meets consumption.
    0:20:58 I just think it’s gonna be so much fun.
    0:21:01 So famous last words, but I’m really excited about this.
    0:21:05 And maybe I will see you at a La Equidad game.
    0:21:08 Oh my gosh, that’s right.
    0:21:10 He’s coming in, El Pato’s coming in.
    0:21:13 Moyase color, Moyat.
    0:21:15 – That’s all for this episode.
    0:21:16 If you’d like to submit a question,
    0:21:18 please email a voice recording
    0:21:20 to officehours@profgmedia.com.
    0:21:22 Again, that’s officehours@profgmedia.com.
    0:21:25 (upbeat music)
    0:21:28 (crickets chirping)

    Scott discusses the lack of IPOs in Latin America, specifically the difference between cyclical and structural decline. He then gives advice on mentoring young men and wraps up by discussing his stake in a Colombian soccer team, La Equidad Football Club.

    Music: https://www.davidcuttermusic.com / @dcuttermusic

    Subscribe to No Mercy / No Malice

    Buy “The Algebra of Wealth,” out now.

    Follow the podcast across socials @profgpod:

    Learn more about your ad choices. Visit podcastchoices.com/adchoices

  • Mike Maples Jr.: Inflection Points and Startup Success

    AI transcript
    0:00:02 (upbeat music)
    0:00:10 – Hello, I’m Guy Kawasaki.
    0:00:12 This is the Remarkable People podcast,
    0:00:16 and we’re in the business of helping you become remarkable.
    0:00:19 So we scour the globe for the most remarkable people,
    0:00:23 and we found a remarkable person named Mike Maples Jr.
    0:00:26 And in the prelude, we were just discussing,
    0:00:28 he and I go way back.
    0:00:30 In college, he read my books,
    0:00:32 which is kind of a disturbing thing
    0:00:36 when people tell you that they read your book in college,
    0:00:39 and now they’re way out of college.
    0:00:43 But anyway, so he is a venture capitalist today.
    0:00:47 He’s the co-founder of a leading Silicon Valley seed fund
    0:00:51 called Floodgate, and he has invested in companies
    0:00:54 like Twitter, and Twitch, and Okta.
    0:00:57 He’s really one of the pioneers of seed capital
    0:00:59 in the mid-2000s.
    0:01:02 So with no further ado, Mike Maples Jr.,
    0:01:04 welcome to Remarkable People.
    0:01:07 – Hey, thanks Guy, and I’ve been looking forward to this.
    0:01:09 I think I’ve probably known you much longer
    0:01:11 than you’ve known me, because I got to know you
    0:01:14 somewhat through your books when I was in college.
    0:01:17 – Wow, like I said, that’s kind of a double edged sword.
    0:01:20 Well, as long as you don’t think that I wrote Rich Dad,
    0:01:22 Poor Dad, I call that a win.
    0:01:24 So I’m easy to please that way, Mike.
    0:01:26 – Yeah, yeah.
    0:01:29 The ones I really remember in my formative years
    0:01:31 were The Macintosh Way,
    0:01:34 and then the other one was Selling the Dream.
    0:01:36 And I think in Selling the Dream,
    0:01:40 I was actually at Kepler’s bookstore when you launched it.
    0:01:42 I probably have a signed copy.
    0:01:43 And one of the aspects of it I remember
    0:01:46 is I think it had the Macintosh product introduction plan.
    0:01:48 – Yes, in the back.
    0:01:50 – So at the time, I was a young product manager
    0:01:53 at Silicon Graphics, and you didn’t really get trained
    0:01:55 in marketing back in those days.
    0:01:57 And so everything I knew about marketing
    0:01:59 is just what I read in marketing books.
    0:02:01 That was an actual introduction plan.
    0:02:03 I was like, “Hey, man, I could really learn something
    0:02:04 “from this.”
    0:02:08 And then reading your books led me to Regis McKenna
    0:02:09 and some of his work.
    0:02:11 I always thought that he had some really good ideas
    0:02:12 around relationship marketing
    0:02:15 and how do you evangelize ideas.
    0:02:17 Yeah, so I feel like I’ve probably known you,
    0:02:19 hopefully I don’t seem like a stalker,
    0:02:23 but I’ve known you since you were CEO of ACIUS.
    0:02:24 – Wow.
    0:02:26 – Man, you’re going back.
    0:02:28 You’re going back so far.
    0:02:29 I was young.
    0:02:31 (laughs)
    0:02:35 So I loved your book and I see so much stuff
    0:02:37 that I agree with in your book.
    0:02:39 And let’s start with something very basic.
    0:02:43 Your book is all about getting these inflections
    0:02:46 and getting insights from the inflections
    0:02:48 and turning it into ideas.
    0:02:51 So let’s start with a definition to help everybody.
    0:02:55 What exactly do you consider an inflection?
    0:02:59 – Yeah, so an inflection is a change event
    0:03:02 that’s actually external to a startup
    0:03:04 or any business for that matter.
    0:03:08 And it allows a startup capitalist
    0:03:10 to compete by changing the subject.
    0:03:13 So what I like to say is that in startups,
    0:03:15 better doesn’t matter
    0:03:18 because business is never a fair fight.
    0:03:19 And if you’re a startup,
    0:03:23 you have to have some form of weapons
    0:03:25 to wage asymmetric warfare on the present.
    0:03:29 You have to turn the incumbents’ greatest perceived strengths
    0:03:31 into their biggest weaknesses.
    0:03:33 And the way entrepreneurs do that
    0:03:34 is they harness inflections.
    0:03:37 They use inflections to bend the arc of the present
    0:03:39 to a radically different future.
    0:03:41 And by doing that,
    0:03:45 they deny the premise of the rules of competition.
    0:03:47 They show up seemingly out of left field
    0:03:50 and now all of a sudden they disorient the incumbents.
    0:03:53 Airbnb did this in hospitality
    0:03:55 and Twitter did this in blogging and communication.
    0:03:58 But nobody ever when they saw Airbnb said,
    0:04:00 “Well, how does that compare to Four Seasons?”
    0:04:02 And nobody when they saw Twitter said,
    0:04:04 “How does that compare to WordPress?”
    0:04:07 It stood alone as an entirely new thing.
    0:04:09 I like to say better doesn’t matter when you’re a startup
    0:04:11 because if you’re better,
    0:04:14 then the customer’s gonna have an alternative
    0:04:15 to your startup.
    0:04:17 And why would they pick your product
    0:04:19 when you’re 80% likely to go out of business?
    0:04:20 They’re only gonna pick your product
    0:04:23 if it can’t be reconciled with anything
    0:04:24 they’ve ever seen before.
    0:04:25 And if they say, “Oh my gosh, where have you been
    0:04:26 all my life?”
    0:04:29 And so the way that the founder achieves that goal
    0:04:31 is they harness inflections.
    0:04:33 They use the power of inflections to offer something
    0:04:36 that would seem unthinkable before the inflection happened.
    0:04:38 – But Mike. – Yes.
    0:04:43 – Do inflections cause companies to succeed
    0:04:47 or do companies cause inflections to succeed?
    0:04:49 Which way does it flow?
    0:04:51 – Yeah, so I like that question
    0:04:54 and it is one of these kind of recursively existential
    0:04:56 questions too, right?
    0:04:58 And the way I internalized it was,
    0:05:01 I was like, okay, what should founders care about?
    0:05:04 And what I thought was, okay,
    0:05:08 founders need some type of a power to change the subject.
    0:05:11 And so when I think about an inflection,
    0:05:13 whether the founder caused it or leveraged it,
    0:05:16 what matters I think from the founder point of view
    0:05:17 is three things.
    0:05:21 One is just what is the specific empowerment?
    0:05:24 Does it offer something that can provide a 10x benefit
    0:05:29 or something radically unique that’s never been seen before?
    0:05:31 Or is it just an API call that got added
    0:05:33 to Stripe’s API list, right?
    0:05:36 So a good example of an empowering inflection
    0:05:41 would have been when the iPhone 4S got a GPS chip in it.
    0:05:43 And you could have had the idea for ride sharing
    0:05:45 before the iPhone 4S, but it wouldn’t have mattered
    0:05:47 because you couldn’t have implemented a system
    0:05:49 that embodied that idea.
    0:05:52 But now all of a sudden you have a GPS chip in the phone,
    0:05:55 you can locate riders and drivers with an algorithm.
    0:05:57 So that’s an example of something very empowering.
    0:06:01 The second thing we wanna see in an inflection is,
    0:06:02 who does it empower?
    0:06:03 Who cares?
    0:06:05 And in the case of the iPhone 4S,
    0:06:08 a lot of people care ’cause a lot of people have smartphones.
    0:06:11 And so the potential surface area of the empowerment
    0:06:13 is very high because if you believe
    0:06:15 that smartphones will keep happening
    0:06:17 and they’ll keep having GPS chips in them,
    0:06:19 then potentially that’s hundreds of millions
    0:06:21 if not billions of people someday.
    0:06:25 And then the third aspect of the inflection is,
    0:06:27 I call it the empowerment conditions.
    0:06:31 And so we’ve had nuclear power since the ’40s,
    0:06:33 but we haven’t built a nuclear power plant
    0:06:35 in the United States since the 1970s.
    0:06:37 And so just because you have a power
    0:06:39 doesn’t mean you’re gonna use it or be allowed to use it.
    0:06:41 And there might be political factors,
    0:06:43 there might be trust factors,
    0:06:44 there could be a lot of reasons
    0:06:47 that people don’t decide to adopt it.
    0:06:49 So those are the three things I look for.
    0:06:50 What’s the magnitude of the empowerment?
    0:06:52 Who does it empower?
    0:06:54 And under what conditions will people decide
    0:06:55 to take advantage of it?
    0:06:58 And under what conditions will they decide not to?
    0:06:59 And if you can get all three of those things
    0:07:02 going your way as a founder,
    0:07:05 now it’s kind of like a rock in David’s slingshot
    0:07:06 against Goliath, right?
    0:07:09 Now you have something that you can bring to the party
    0:07:11 that changes the subject.
    0:07:14 – When I read your book and I read that concept,
    0:07:18 I thought to myself, oh my God, this is such a high fence.
    0:07:23 In a sense, you’re saying that every successful startup
    0:07:27 either caused or significantly jumped on an inflection.
    0:07:31 And I gotta tell you, I see a lot of startups
    0:07:36 that I would hardly associate with a true inflection.
    0:07:39 So are you that tough?
    0:07:41 I mean, if somebody shows up at Floodgate
    0:07:45 and they just have something better, better,
    0:07:47 do you just throw them out the door?
    0:07:48 – Well, I don’t throw them out the door.
    0:07:50 I wish them well, right?
    0:07:52 But what I say to them is, look,
    0:07:55 do you want to pursue an idea that has outlier,
    0:07:58 unbounded upside potential or not?
    0:08:01 And if you’re not harnessing an inflection
    0:08:02 in your startup idea,
    0:08:06 you’re competing in somebody else’s sandbox.
    0:08:09 So the mistake most startups make is they say,
    0:08:11 I want to go after a big market.
    0:08:14 And that makes sense on the surface
    0:08:17 because big markets have lots of customers and revenue.
    0:08:20 But the problem is that the founder often unwittingly
    0:08:23 buys into a context, which is the market
    0:08:26 as it’s already been defined.
    0:08:28 And if the market’s already defined,
    0:08:30 then somebody’s defined it.
    0:08:32 And that person has the advantage over you
    0:08:35 because they get to define the discussion that occurs
    0:08:37 and they already have the advantages of the incumbency.
    0:08:40 And so therefore, it’s kind of like
    0:08:41 if you’re competing over territory
    0:08:43 that’s a tiny little municipality
    0:08:45 and an already discovered thing,
    0:08:47 you’re not gonna have as big of an upside
    0:08:49 as if you’re Lewis and Clark mapping
    0:08:51 the Louisiana Purchase territory
    0:08:55 and discovering the undiscovered land.
    0:08:58 And so I’m not interested in the total existing
    0:09:00 or total available market.
    0:09:01 I’m interested in the total future market.
    0:09:04 I’m interested in a product that defines a future market
    0:09:06 because it harnesses these inflections.
    0:09:08 And so I just say to a founder, hey, look,
    0:09:11 I’m not for everybody, but that’s what I’m in it for.
    0:09:14 I’m trying to find startups that harness these inflections
    0:09:17 in unconventional ways to create a product
    0:09:20 that radically changes how people feel and act.
    0:09:22 And to me, that’s where the startup wins
    0:09:24 is when they do that.
    0:09:25 There’s two ways to look at the future.
    0:09:26 Do you believe it’s gonna be a new
    0:09:28 and improved version of the present?
    0:09:31 In which case the incumbents are gonna usually win that.
    0:09:33 Or do you believe that the future is not gonna be able
    0:09:35 to be reconciled with the present?
    0:09:37 In which case the startup can win.
    0:09:40 And by the way, you work for a guy who’s the master at this
    0:09:42 and he did it even in a big company.
    0:09:45 So like Steve Jobs, when he comes back to Apple,
    0:09:48 everybody says, hey, you should license the Mac OS
    0:09:51 and run it on Intel and you should have a clone market
    0:09:54 just like Microsoft and IBM do and all this stuff.
    0:09:56 Jobs didn’t do that.
    0:09:59 Jobs always found a way to change the subject.
    0:10:01 With the iPod, he found this inflection
    0:10:03 in the sense of a tiny little hard disk
    0:10:06 that he could put into a music player.
    0:10:09 And with the iPhone, he waited until the technology was ready
    0:10:11 that the touch screen was good enough
    0:10:13 and that the power in the phone was good enough
    0:10:16 that he could put Mac OS in a phone device.
    0:10:20 But Jobs never competed by being better at something.
    0:10:23 He always competed by showing up
    0:10:24 with something radically different
    0:10:26 and helping the rest of us understand
    0:10:28 what was important about it.
    0:10:30 And he did that by harnessing inflections.
    0:10:32 He just naturally knew how to do that
    0:10:34 because he knew that was his weapon
    0:10:36 to sort of change the discussion.
    0:10:39 – So let’s talk about what I consider
    0:10:42 maybe the biggest inflection certainly in my career.
    0:10:45 And I might argue in the history of mankind,
    0:10:48 which is artificial intelligence.
    0:10:53 And it seems to me that, you know, I use ChatGPT,
    0:10:58 I use Claude, I use Perplexity, I use four or five LLMs.
    0:11:01 And for the life of me, I cannot tell you why
    0:11:04 I use one or over the other.
    0:11:08 If you were to talk to the CEO of Claude or Perplexity
    0:11:09 and you would tell them like,
    0:11:12 listen, you’re just doing something better.
    0:11:14 Maybe your model is better or something,
    0:11:16 but you are not fundamentally changing.
    0:11:19 How does Perplexity or Claude compete
    0:11:22 against Gemini and Google?
    0:11:26 I just don’t see how they differentiate.
    0:11:27 – Yeah, it’s interesting.
    0:11:30 And I love this question because it highlights
    0:11:32 a real challenge I think right now,
    0:11:35 which is yes, you want an inflection,
    0:11:37 but you also want to have an insight.
    0:11:39 It’s not enough to just have a powerful idea.
    0:11:41 If a whole bunch of other people have
    0:11:43 that same powerful idea,
    0:11:47 then the opportunity gets somewhat competed away.
    0:11:49 You want to be non-consensus and right.
    0:11:51 So if we go back to the example of Lyft
    0:11:54 and then we can relate it to AI, I suppose,
    0:11:57 the iPhone 4S had an inflection
    0:12:02 in terms of empowering people to locate riders and drivers.
    0:12:06 But the founders of Lyft and Uber had to have the insight
    0:12:07 that, oh, that means you could create
    0:12:11 a transportation network that has network effects.
    0:12:14 And that’s where the creativity of the founder comes in.
    0:12:16 And now it’s obvious.
    0:12:18 Now people are like ride-sharing, of course.
    0:12:20 But at the time, it seemed crazy.
    0:12:22 Like, who’s going to get in a stranger’s car?
    0:12:23 They’re like, nobody’s going to do that.
    0:12:24 That’s crazy.
    0:12:27 Just like, who’s going to stay in a stranger’s house?
    0:12:28 That’s crazy.
    0:12:30 And so most of the great startups,
    0:12:33 they have to have an insight, because
    0:12:35 human beings are conditioned to like familiar things,
    0:12:37 and so if everybody likes your idea,
    0:12:40 it’s too similar to what they already know,
    0:12:42 which means it’s too much of an incremental improvement.
    0:12:45 It’s too much competing on better versus different.
    0:12:46 So the best startup ideas I’ve seen
    0:12:50 have this quality of most people don’t like it at first
    0:12:51 or don’t think it’s going to work
    0:12:54 or they think it’s irrelevant.
    0:12:56 But there’s a small subset of people who are like,
    0:12:58 oh, my gosh, where have you been all my life?
    0:12:59 I’ve seen the light.
    0:13:00 This is amazing.
    0:13:01 You want that.
    0:13:03 You want to be non-consensus and right.
    0:13:04 It’s not enough just to be right.
    0:13:06 You have to be non-consensus and right.
    0:13:10 So when I go back to Perplexity and Claude and these guys,
    0:13:12 I don’t really think of them as startups
    0:13:15 in the way that maybe you and I think of startups.
    0:13:18 The way I think of Perplexity and those guys
    0:13:20 is I think of them more as like small versions
    0:13:22 of big companies.
    0:13:25 And if I think about those companies,
    0:13:30 they’re not really trying to, in a capital-efficient way,
    0:13:34 create something asymmetrically, radically different.
    0:13:36 They’re partnering with the big tech companies
    0:13:38 to create scalable technology.
    0:13:40 And I think they just happen to be,
    0:13:44 right now, small versions of big tech companies.
    0:13:47 But I don’t think of ChatGPT and Claude and those guys
    0:13:49 as like startup capitalists.
    0:13:52 I think of them as probably someday a large division
    0:13:53 of a big tech company.
    0:13:55 We just don’t know which one yet.
    0:13:58 In a sense, even though what they do is so exciting,
    0:14:02 it sounds kind of boring when you put it that way.
    0:14:03 I think what they’re doing is great.
    0:14:06 I just don’t think of them as startup people
    0:14:09 in the same way that I thought of the Lyft guys
    0:14:10 or the Twitter folks, right?
    0:14:12 I think of them as more like back in the day
    0:14:14 when you’d have these joint ventures
    0:14:17 between IBM and Apple or when you’d have big companies
    0:14:18 spinning off divisions and stuff like that.
    0:14:20 I think of it as more like that
    0:14:23 than I think of it as classic startup capitalism.
    0:14:28 – So with all this talk about the entrepreneurs,
    0:14:31 the true entrepreneurs who are creating
    0:14:34 or riding this inflection point,
    0:14:37 I guess my big question for you, Mike,
    0:14:41 is how do you separate the nutcases
    0:14:42 from the pattern breakers?
    0:14:45 Because at the time you hear some of these pitches,
    0:14:48 you must think these people are nuts.
    0:14:50 And then five years later, you say,
    0:14:51 “Oh, they were so right.”
    0:14:53 So how do you figure that out?
    0:14:55 Who are the nutcases and who are the pattern breakers?
    0:14:57 – Yeah, and it’s funny, like some books
    0:15:02 are born of experience and competence and expertise.
    0:15:06 This book that I wrote started more out of embarrassment,
    0:15:07 right?
    0:15:08 I had passed on air bed and breakfast,
    0:15:10 which became Airbnb.
    0:15:13 And I would have made like thousands of times my money
    0:15:15 if I’d done it.
    0:15:17 And then I noticed that 80% of my exit profits
    0:15:18 had come from pivots.
    0:15:23 So Twitter had started as a podcasting company called Odio.
    0:15:25 Twitch had started out as a terrible idea
    0:15:27 called Justin.tv.
    0:15:28 And so I’m looking at this and I’m like,
    0:15:30 “What business am I even in here?
    0:15:31 What am I doing?
    0:15:32 Am I just throwing darts?
    0:15:35 Should I just retire before I get exposed?”
    0:15:36 And what I started to realize was
    0:15:38 that these startup ideas that were working,
    0:15:40 they were harnessing inflections
    0:15:42 and they were non-consensus and right.
    0:15:44 Now, here’s the tricky part,
    0:15:45 and it gets to your question, I think,
    0:15:48 which is when you’re non-consensus and right,
    0:15:51 you don’t know that you’re right at first.
    0:15:52 You only know that you’re non-consensus.
    0:15:55 If you knew you were right, it would be too obvious.
    0:15:57 And so the less obvious it is,
    0:15:59 the less you can know for sure that you’re right.
    0:16:03 So on some level, you have to try the idea.
    0:16:04 You have to decide,
    0:16:07 I don’t know if I’m 100% right yet,
    0:16:10 but this is a non-consensus area that’s worth pursuing.
    0:16:14 It’s worth my energy and time to figure out.
    0:16:16 And if I’m not 100% right, I might pivot.
    0:16:19 So how do you tell the difference?
    0:16:21 What’s the right kind of crazy?
    0:16:22 What’s the wrong kind of crazy?
    0:16:25 The question I like to ask is,
    0:16:27 is this from the future?
    0:16:28 And I think William Gibson,
    0:16:31 the cyberpunk author was right when he said
    0:16:32 the future’s already here,
    0:16:34 it’s just not evenly distributed.
    0:16:37 For example, when Mark Andreessen worked
    0:16:41 on the Mosaic browser at the University of Illinois,
    0:16:44 he was in a supercomputer lab with a really fast network
    0:16:45 and powerful computers.
    0:16:49 His idea of what networks were gonna be
    0:16:52 was a better version of the future
    0:16:54 than Microsoft’s top-down or AOL
    0:16:57 or the US government or AT&T or Time Warner’s view
    0:16:58 of a top-down network.
    0:17:01 I think that the way you know
    0:17:04 that a non-consensus idea is worth pursuing
    0:17:07 is the authenticity of the founder
    0:17:09 to that future that they’re pursuing.
    0:17:11 I think that if you’re living in the future
    0:17:13 before other people,
    0:17:15 and you’re harnessing powerful inflections
    0:17:17 before other people,
    0:17:19 the odds that you’re gonna build the right thing
    0:17:22 are much higher.
    0:17:24 And so that your intuition about what to build
    0:17:26 is far more likely to be right.
    0:17:28 And so that’s what I really look for is,
    0:17:31 I wanna know is this founder living in the future
    0:17:33 for the rest of us?
    0:17:36 Are they intrinsically motivated by that future?
    0:17:39 Are they just pursuing what they think is hot?
    0:17:43 And do they have insights about what to build
    0:17:47 because of their authentic obsession with that future?
    0:17:51 And I believe that that causes you to be more likely
    0:17:53 to build the right thing,
    0:17:55 but it also makes you more credible to early believers.
    0:17:57 It causes other people to say,
    0:18:00 hey, I agree that this Guy Kawasaki guy
    0:18:02 with this ACIUS relational database here,
    0:18:04 I think he’s noticed something
    0:18:05 that others haven’t noticed.
    0:18:07 And I wanna join that movement.
    0:18:10 And I remember this from when you were a founder.
    0:18:14 I think in your database, 4th Dimension,
    0:18:15 tell me if I’m wrong about this,
    0:18:18 but your contacts, you would have a tag next to them.
    0:18:21 And one of the tags was like true believer.
    0:18:24 And Joe Lemont, my college roommate,
    0:18:26 you had tagged him as a true believer.
    0:18:29 And so that’s like these early startup markets
    0:18:32 are animated by belief, not utility, right?
    0:18:35 People buy from startups for aesthetic reasons,
    0:18:36 not practical reasons.
    0:18:38 And they do it because they co-create the future
    0:18:39 with the founders.
    0:18:42 It’s the early believers that do that with the founders.
    0:18:45 And it’s the belief in a different
    0:18:48 but aesthetically superior future
    0:18:51 that drives people to move the present
    0:18:53 to a different future together.
    0:18:55 And so that’s what I’m looking for, right?
    0:18:57 I want the idea that’s not consensus,
    0:18:59 but I wanna believe that the person comes
    0:19:02 by the idea honestly, that they come by it
    0:19:06 by authentically pursuing a future that I can get behind.
    0:19:08 (upbeat music)
    0:19:27 – So today in artificial intelligence,
    0:19:32 do you have any examples of this non-consensus
    0:19:36 kind of inflection idea?
    0:19:39 – Yeah, so I’d say that AI has been really challenging
    0:19:42 for me because I see inflections every week,
    0:19:46 but somebody shows up and says I have this great idea.
    0:19:49 And I’m like, I would totally use that product,
    0:19:52 but I don’t know why there’s not gonna be 10 just like it,
    0:19:54 which means it doesn’t have enough of an insight.
    0:19:57 So I’ll give an example of one that I think does.
    0:20:00 So I’m involved with this company called Applied Intuition,
    0:20:04 and they make autonomous vehicle simulation software
    0:20:09 and software-defined platforms for the car companies.
    0:20:12 And so let’s say you’re General Motors or Porsche
    0:20:15 or one of these big car companies.
    0:20:19 Tesla has a software-defined car
    0:20:21 and you don’t really know how to build one,
    0:20:26 because electric vehicles are more like a software platform
    0:20:30 architecture versus a supply chain with a bajillion suppliers
    0:20:33 that you’ve all done business with for 50 years.
    0:20:36 And so Qasar Younis started this company
    0:20:39 with some friends of his from Google.
    0:20:41 And so they’d grown up in Detroit.
    0:20:43 Qasar had worked at General Motors
    0:20:45 after graduating from undergrad.
    0:20:48 Then he went to Silicon Valley, was at Google for a while,
    0:20:50 worked on the Google Maps team,
    0:20:52 knew all the guys at Waymo.
    0:20:56 And so he can go to the CEO of a company like Porsche
    0:20:59 and say, hey, if you wanna have a software-defined car,
    0:21:01 I’m your only path to getting there.
    0:21:03 You don’t have the internal talent.
    0:21:06 I have the best developers in the world
    0:21:07 tackling this problem.
    0:21:11 And you can only get this talent if you do business with me.
    0:21:14 And now is Sam Altman gonna release a new version
    0:21:15 of ChatGPT that does that?
    0:21:17 No, right, because he’s not in the business
    0:21:19 of creating software-defined cars.
    0:21:22 Those are examples of businesses I like
    0:21:24 because it involves deep technology,
    0:21:29 but it involves a multidisciplinary approach.
    0:21:30 You have to not just be good at the AI,
    0:21:32 but you have to be able to speak the language
    0:21:33 of the car companies.
    0:21:36 And you have to be able to implement systems
    0:21:38 at global enterprise scale
    0:21:41 and help the customer reach the promised land.
    0:21:44 And it’s very strategic to these car companies.
    0:21:45 They know they have to move in the direction
    0:21:47 of being a software-defined car.
    0:21:50 And so that’s the kind of stuff I’ve been seeing lately
    0:21:52 that I get excited about.
    0:21:55 – But what if somebody says there’s only 25 major car companies
    0:21:56 in the world?
    0:21:59 How big is the market, Mike?
    0:22:00 – Yeah, and in that case,
    0:22:03 you have to be able to get giant contracts, right?
    0:22:05 You have to have individual customers
    0:22:07 in the hundreds of millions of dollars
    0:22:09 to make that business work.
    0:22:10 But you can, right?
    0:22:12 Bosch doesn’t have that many customers in the car industry,
    0:22:15 but they probably do about $50 billion a year
    0:22:16 selling to the car companies.
    0:22:18 And so if you become the complete answer
    0:22:20 to something that company cares about,
    0:22:22 you can do pretty well.
    0:22:25 But companies aren’t just gonna give $100 million
    0:22:27 to just some bozo, right?
    0:22:29 They’re not gonna give you a giant contract
    0:22:31 unless they think you can really do the job
    0:22:32 and you’re the only guy that can do it.
    0:22:35 And so you have to have a particular set of skills
    0:22:38 that people buy into, or it’s gonna be hard.
    0:22:41 – I don’t wanna really catalyze PTSD in you,
    0:22:43 but can you discuss the cases
    0:22:46 where you had a false positive?
    0:22:49 In other words, where you thought someone was a pattern breaker,
    0:22:52 but they turned out not to be?
    0:22:54 – Oh, sure, in fact, it’s most of the time.
    0:22:57 My business is very strange.
    0:23:01 Warren Buffett, I’ve heard him say rule number one,
    0:23:02 don’t lose money.
    0:23:05 Rule number two, don’t forget rule number one.
    0:23:08 For me, rule number one is don’t pass on Airbnb.
    0:23:09 If I had been wrong about Airbnb,
    0:23:11 I could only lose 100% of my money.
    0:23:12 But if you’re right,
    0:23:15 you can sometimes make a thousand times or more.
    0:23:18 Buffett talks about his margin of safety.
    0:23:22 What I care about is my margin of asymmetric upside, right?
    0:23:24 I care about how big could it be
    0:23:26 in the rare event that it works.
    0:23:28 So I have this really weird way
    0:23:30 of coming up with an investment thesis.
    0:23:33 I’m like, okay, given that it’s 80% likely I’m wrong,
    0:23:36 how big does it need to be if I’m right?
    0:23:39 And if it can’t be big enough,
    0:23:41 no matter what the odds are,
    0:23:42 I just can’t invest in it
    0:23:45 because I have to get paid for the risk I take, right?
    0:23:47 I’m taking crazy high risk.
    0:23:48 And so I’m like, you know,
    0:23:51 unlike say Buffett who never wants to lose money,
    0:23:53 I’m saying to myself, okay,
    0:23:57 given that it’s 80% likely I’m gonna lose money,
    0:24:00 in the 20% case, how big does it need to get?
    0:24:02 Which is just a different way of showing up in the world, right?
    0:24:05 It’s a different way of thinking about success.
    0:24:10 So I don’t look at risk as the chance of success or failure.
    0:24:12 I think of risk through an expected value lens.
    0:24:15 I’m like, okay, in the 20% case it’s right,
    0:24:16 how big does it need to be
    0:24:18 to be a good expected value bet,
    0:24:20 even though it’s risky.
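Mike’s expected-value lens can be made concrete with a quick sketch. The 80/20 odds are his; the specific win multiples below are hypothetical illustrations, not figures from the conversation:

```python
# A minimal sketch of the expected-value framing Mike describes:
# if an investment fails 80% of the time (returning nothing),
# how big must the rare win be for the bet to make sense?

def breakeven_multiple(p_win: float) -> float:
    """Multiple a winner must return so the bet's expected value equals the stake."""
    return 1.0 / p_win

def expected_value(p_win: float, win_multiple: float, stake: float = 1.0) -> float:
    """EV of a bet returning win_multiple * stake with probability p_win, else zero."""
    return p_win * win_multiple * stake - stake

# With a 20% chance of success, a 5x outcome merely breaks even...
print(breakeven_multiple(0.2))      # 5.0
# ...while a rare 1000x outcome dominates the 80% of losses.
print(expected_value(0.2, 1000.0))  # 199.0 per dollar staked
```

This is why, as Mike says, the downside is capped at losing the stake while the upside is unbounded, so what matters is the margin of asymmetric upside, not the probability of loss.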
    0:24:23 – What if at the moment of making an investment,
    0:24:25 it’s really not an assessment
    0:24:28 of how big can something be
    0:24:32 because you don’t know what this company’s gonna pivot to,
    0:24:33 right?
    0:24:35 So when you talk about Justin TV,
    0:24:40 you couldn’t do an analysis of how big Justin TV would be,
    0:24:44 but when Justin TV pivoted to Twitch, then you could,
    0:24:47 but that wasn’t at the point of investment.
    0:24:50 So how do you factor in the pivots?
    0:24:52 – Yeah, and so the way I do it is,
    0:24:55 I like to say, I don’t wanna study
    0:24:57 the total addressable market
    0:24:59 for the reason that you mentioned, right?
    0:25:03 What I believe is that companies that harness inflections
    0:25:04 define new markets.
    0:25:07 And I can’t size a future market, right?
    0:25:09 Future hasn’t happened yet.
    0:25:11 So what I need to do is it reminds me in physics,
    0:25:14 you have potential energy and mechanical energy.
    0:25:18 And what I’m looking for is a future potential market.
    0:25:21 And I believe that big future potential markets happen
    0:25:23 from powerful inflections.
    0:25:26 So the more powerful the inflection is,
    0:25:29 the more capacity it has to change the future.
    0:25:32 And the more capacity it has to impact the future
    0:25:33 in a broad way.
    0:25:36 And so what I do is I take a leap of faith.
    0:25:40 I say, based upon this inflection, based upon this insight,
    0:25:42 and based on the founders’ authentic match
    0:25:44 to the future they’re pursuing,
    0:25:46 I think the odds are in my favor.
    0:25:50 I don’t have to know exactly how the dots will connect.
    0:25:53 I only have to think that the odds are highly probable
    0:25:56 that the founder will find a way to connect those dots.
    0:25:57 I love this Jobs saying,
    0:25:59 you can only connect the dots looking backwards.
    0:26:01 And I think that’s what he was getting at, right?
    0:26:06 You’re pursuing opportunities that are ambiguous,
    0:26:09 but that doesn’t mean they’re not a risk worth taking.
    0:26:11 And so you’re betting on the founder’s ability
    0:26:15 to navigate their insight to the right product proposition.
    0:26:17 To the example that you gave with Twitch,
    0:26:18 JustinTV is a terrible idea,
    0:26:20 but they navigated the idea to Twitch,
    0:26:22 which was a great idea.
    0:26:24 But the inflections, the underlying inflections
    0:26:26 were always there from the very beginning.
    0:26:28 – Now, what about false negatives
    0:26:31 where you turned down somebody and they turned out great?
    0:26:33 What have you learned about why you made
    0:26:35 false negative decisions?
    0:26:37 – To me, those are the biggest errors in my business.
    0:26:40 Passing on air bed and breakfast at the time
    0:26:41 was a terrible idea.
    0:26:43 I’ve passed on other good ones too.
    0:26:45 We passed on Pinterest early.
    0:26:49 We passed on a company called Anaplan, passed on Figma.
    0:26:52 And so whenever we pass on these,
    0:26:54 in fact, whether I saw them or not,
    0:26:58 I keep this database of what I call 100-bagger startups.
    0:27:01 So I have this list of a little over a hundred companies
    0:27:02 and I study them.
    0:27:05 I create a time capsule of what it looked like
    0:27:07 at the seed round.
    0:27:10 So I have the seed deck from Pinterest and Dropbox
    0:27:13 and Airbnb and all these companies.
    0:27:14 And I look at it and I say to myself,
    0:27:17 okay, where was the signal?
    0:27:20 What was the thing that would have told you to say yes?
    0:27:24 And why did I have a failure of imagination and say no?
    0:27:27 And with Airbnb, my failure of imagination was,
    0:27:30 I thought people aren’t gonna want to stay
    0:27:31 in a stranger’s house.
    0:27:32 That’s crazy, right?
    0:27:34 And it was around the time that Craigslist killer,
    0:27:35 I was like, somebody’s gonna get killed
    0:27:36 in one of these things.
    0:27:38 And it was a screwed up meeting.
    0:27:40 Brian couldn’t get the site to work in our meeting.
    0:27:42 He had a room full of cereal boxes.
    0:27:44 He was trying to sell me Obama O’s
    0:27:45 and Captain McCain Crunch.
    0:27:48 And so that’s the other thing I learned
    0:27:50 is sometimes the pitch doesn’t go well,
    0:27:52 but that doesn’t mean it’s not gonna succeed.
    0:27:55 So I try to go back in time.
    0:27:56 The other thing I’ve learned, Guy,
    0:27:59 is that even the founders misremember how it happened
    0:28:00 a lot of the time, right?
    0:28:01 And so when it works,
    0:28:04 everybody remembers the stuff they knew
    0:28:05 and the good decisions they made.
    0:28:08 But a lot of times they know things that weren’t so
    0:28:10 at the time.
    0:28:13 And so you have to be almost like Columbo, the detective,
    0:28:14 right?
    0:28:16 You do a forensic analysis of what it looked like
    0:28:17 at the time.
    0:28:19 And then you got to ask yourself,
    0:28:21 do any of the frameworks that we use
    0:28:23 to evaluate these things,
    0:28:25 would they have been useful here?
    0:28:26 Would they have caused us to say yes,
    0:28:28 or are we just breathing our own fumes?
    0:28:31 Do we believe a bunch of stuff about what's true
    0:28:32 in the world that just isn't true
    0:28:35 or doesn’t capture the reality of this situation?
    0:28:36 That’s what I try to do.
    0:28:40 I try to go back in time and really understand it.
    0:28:42 There’s a lot of things you can do to understand it.
    0:28:44 You can look at the initial seed pitch deck
    0:28:45 and you can say,
    0:28:47 is that what the product ended up being?
    0:28:49 Or did it end up being something different?
    0:28:51 How long did it take for them to get to a million revenue,
    0:28:53 10 million, 100 million revenue?
    0:28:54 What caused that?
    0:28:56 What kind of business model was it?
    0:28:59 What was the defensible moat that they created?
    0:29:02 You could start to kind of look for the signals
    0:29:04 that would have clued you in that,
    0:29:06 “Hey, this is a bet we’re taking.”
    0:29:09 – I have not done what you just described
    0:29:14 in any form as organized or rational or careful as you,
    0:29:16 but the more I am in Silicon Valley
    0:29:20 and around tech startups,
    0:29:24 the more my conclusion is that the older I get,
    0:29:25 the less I know,
    0:29:27 and I could almost make the case
    0:29:29 that you should just invest in the stupidest things
    0:29:31 that come across your desk
    0:29:34 because those are the ones that are gonna succeed.
    0:29:37 I cannot figure this out.
    0:29:39 I would have never invested in,
    0:29:40 as you say,
    0:29:41 you’re gonna let a stranger stay in a house,
    0:29:44 you’re gonna let a stranger rent your car,
    0:29:46 you’re gonna let a stranger ride with you,
    0:29:48 you would get in a stranger’s car,
    0:29:50 get in a stranger’s house,
    0:29:53 you would take a stranger’s tour or rental.
    0:29:56 I would do none of those and look at that.
    0:29:59 I’d be zero for three right there.
    0:30:00 – Yeah, it’s tough, right?
    0:30:02 And it’s a humbling game for sure.
    0:30:04 And the other thing about it,
    0:30:07 Guy, is like massive success is one Airbnb away, right?
    0:30:11 You get one of those every 10 years, you’re really good.
    0:30:12 It is a humbling game,
    0:30:16 but like for me, it’s just so darn interesting, right?
    0:30:19 We know a whole lot about business now,
    0:30:20 but a hundred years ago,
    0:30:23 there were no org charts and there wasn’t accounting
    0:30:26 and there wasn’t corporate strategy as we know it.
    0:30:28 There wasn’t Michael Porter Five Forces,
    0:30:30 there wasn’t any of this stuff.
    0:30:32 All that stuff had to be figured out.
    0:30:34 And I think we’re at the beginning of infinity
    0:30:37 of really understanding startup capitalism.
    0:30:39 And so I just think a startup capitalist
    0:30:40 is a different type of capitalist.
    0:30:44 They don’t create value by persistently compounding
    0:30:46 an advantage like a big company does.
    0:30:50 They create value by doing something radically different.
    0:30:51 And that’s another way to show up in the world
    0:30:53 and be valuable.
    0:30:56 And so I just think there’s so much to learn about that.
    0:30:58 When I think that just like there’s a process
    0:31:01 for building a great company that’s a going concern,
    0:31:05 I think that there are processes and patterns
    0:31:07 that can be understood about what it takes
    0:31:10 to create a great startup that changes the future.
    0:31:12 And there’s so little that’s known about it.
    0:31:14 So we’re having this conversation right now, right?
    0:31:16 It seems kind of random, right?
    0:31:19 It seems like trying to predict the weather or something.
    0:31:21 But to me, that’s what makes it so interesting
    0:31:23 is it’s so hard to figure out
    0:31:25 that it’s worth trying to figure out.
    0:31:28 – I tell you, Mike, as I get older and older,
    0:31:30 my strategy is get lucky,
    0:31:34 but get lucky is not a good strategy.
    0:31:35 – That’s pretty good sometimes.
    0:31:40 – So I’m gonna take you down a rat hole
    0:31:42 and you can tell me we don’t wanna go down this rat hole,
    0:31:45 but you have the data to know these people
    0:31:46 better than most people.
    0:31:51 So I look at the current Mark Andreessen,
    0:31:56 the current Elon Musk, the current Mark Zuckerberg.
    0:31:59 And I ask myself, what happens to these people?
    0:32:01 These people started off trying to make the world
    0:32:05 a better place, democratizing, computing,
    0:32:07 freedom of information, all that.
    0:32:10 And now they turn into these MAGA people
    0:32:13 and you got any thoughts about what happens?
    0:32:17 Is it age or is it when you start being worth
    0:32:19 a billion dollars or more or something just happens in you?
    0:32:22 What happens to these people?
    0:32:23 – Let’s see.
    0:32:26 So I’m gonna tread carefully on the political questions.
    0:32:27 – Why?
    0:32:29 – But I mean, I guess we could talk about it some.
    0:32:33 But one thing that I’ve observed in a lot of these folks
    0:32:36 is that they tend to be disagreeable.
    0:32:39 When you think about it, a startup
    0:32:41 is a fundamentally provocative act.
    0:32:43 It’s a disagreement with the present.
    0:32:46 You’re showing up as a founder and you’re saying,
    0:32:49 hey, the way you think the world is, I’m challenging that.
    0:32:52 The way you think the world is, that’s not how it’s gonna be.
    0:32:53 It’s not gonna be about taxis anymore.
    0:32:54 It’s gonna be about ride sharing.
    0:32:57 It’s not gonna be about cars with drivers.
    0:32:58 It’s gonna be autonomous vehicles.
    0:33:00 It’s not gonna be gas, ice cars.
    0:33:02 It’s gonna be electric vehicles.
    0:33:04 Yes, I am gonna start a company
    0:33:06 that blasts rockets into outer space.
    0:33:08 Even though my competition is the US government
    0:33:11 and they have an infinite supply of money that they can print.
    0:33:14 These people tend to be disagreeable.
    0:33:16 And by that, I don’t just mean being a jerk.
    0:33:19 They tend to just be willing to disagree.
    0:33:23 Justin Kan, before he started Justin TV,
    0:33:25 he started a calendaring company
    0:33:28 and Google launches a Google calendar.
    0:33:31 And Justin decides to sell the company on eBay.
    0:33:33 And I was like, I didn’t even know it’s possible
    0:33:34 to sell a company on eBay.
    0:33:36 He sold it, I think, for $250,000.
    0:33:38 Some of the disagreeableness is just the willingness
    0:33:42 to be unconventional in how you pursue your ideas
    0:33:43 and your mission.
    0:33:47 I think that Elon is naturally disagreeable.
    0:33:48 There are aspects of disagreeableness
    0:33:50 that people don’t like,
    0:33:52 but if you make the diamond,
    0:33:54 not have the properties of the diamond,
    0:33:56 it won’t cut glass anymore, right?
    0:33:57 I think that most of these people
    0:33:59 who create these outlier results,
    0:34:01 they’re not normal people.
    0:34:03 And they’re not normal in ways
    0:34:05 that change the world for the better.
    0:34:06 And they’re not normal in ways
    0:34:08 that some people may not like.
    0:34:11 But like Elon said on Saturday Night Live that time,
    0:34:13 he’s like, I blast rockets into outer space.
    0:34:16 I’m trying to change humanity.
    0:34:18 Did you expect me to be a normal chill dude?
    0:34:19 He’s just not gonna be.
    0:34:21 That’s not who he is.
    0:34:24 And so I’ve kind of learned that these people
    0:34:26 don’t end up fitting neatly into a box
    0:34:28 that you wish they would fit into.
    0:34:30 And I imagine that you’ve known some founders,
    0:34:33 you knew Steve Jobs way better than I ever would have.
    0:34:35 But I imagine there were things that Steve did
    0:34:38 that probably maybe you wish he didn’t do.
    0:34:39 But it is what it is.
    0:34:41 It comes with a territory.
    0:35:02 – I wanna end up with kind of a,
    0:35:05 just like a quick series of questions.
    0:35:07 I’m assuming that many entrepreneurs
    0:35:08 are gonna listen to this podcast
    0:35:11 and your legitimate successful VC,
    0:35:14 they probably would love to be in front of you.
    0:35:16 So just some quick questions,
    0:35:19 very tactical and practical for an entrepreneur.
    0:35:21 So question number one is,
    0:35:23 how do people get to a VC?
    0:35:26 Do they like send out 2,000 emails?
    0:35:28 But how do people get to you?
    0:35:31 – The best way is to get a referral
    0:35:33 from somebody that we respect.
    0:35:36 And people tend to trivialize that discussion.
    0:35:38 So people tend to say that just means
    0:35:40 that whoever has the best network wins
    0:35:42 and there’s gonna be certain people
    0:35:45 who get excluded from being entrepreneurs.
    0:35:46 But there’s a reality guy,
    0:35:49 which is if you’re starting a startup,
    0:35:52 you have to convince people to join your movement.
    0:35:55 And if you’re sitting out there in the future by yourself
    0:35:56 and nobody joins your movement,
    0:35:58 that future is not gonna happen.
    0:36:01 And so to me, it’s not a function
    0:36:03 of how well connected you are.
    0:36:06 Getting an intro from someone I respect
    0:36:10 is a function of your ability to persuade somebody credible
    0:36:12 that your future matters.
    0:36:14 And that other people are gonna wanna join you
    0:36:15 in that future.
    0:36:17 And so, yeah, something could come over the transom,
    0:36:19 but how am I supposed to know?
    0:36:21 How am I supposed to know how good they are?
    0:36:25 I’ve gotta find some way to validate
    0:36:29 that their ideas about the future make sense
    0:36:30 to credible people.
    0:36:32 And by the way, it doesn’t have to be like Reed Hoffman
    0:36:34 or Mark Andreessen that makes the intro.
    0:36:37 It could be you’re living in a certain future
    0:36:39 in synthetic bio,
    0:36:42 and it’s a scientist who’s very credible in that area
    0:36:44 is like this is one of the most amazing things
    0:36:45 I’ve ever seen.
    0:36:49 And so I just need some signal from the future
    0:36:53 that’s valid, that validates the idea and the insight.
    0:36:55 I’d say that’s the key thing.
    0:36:59 If your insight is starting to get traction,
    0:37:01 that should be a solvable problem, right?
    0:37:02 There should always be somebody credible
    0:37:05 who embraces the idea, who’ll make the intro.
    0:37:09 So I’d say that that’s the primary thing that we look for.
    0:37:11 – Okay, next question.
    0:37:12 What do you want in the pitch?
    0:37:15 What’s the content of the pitch?
    0:37:18 – Okay, yeah, so I’ll give some super tactical advice on that.
    0:37:22 All things being equal, I like to say slide number one,
    0:37:26 say what you do as if I know literally nothing.
    0:37:29 You don’t say we’re Airbnb,
    0:37:34 we’re a marketplace for unused residential housing space.
    0:37:35 I don’t know what that is.
    0:37:38 That’s jibber jabber jargon.
    0:37:40 What you want to say is something like,
    0:37:44 we’re Airbnb, we let you read an extra room in your house.
    0:37:46 And here’s why that’s important.
    0:37:49 A lot of times you’ll get a pitch
    0:37:53 and you're 10 slides in, I don't know what the startup does still.
    0:37:56 And that’s hard to process, right?
    0:37:58 And I’m sympathetic to founders on this front
    0:38:01 because they get advice, they get bad advice.
    0:38:03 They get advice that says millennials are a thing,
    0:38:05 put that slide up front.
    0:38:08 Marketplaces are a thing, talk about marketplaces.
    0:38:10 And I’m 10 minutes in, I don’t know what you do yet.
    0:38:14 So slide number one is: what do we do, as if I know nothing?
    0:38:17 Slide number two is what do I know
    0:38:19 about the future that’s not obvious?
    0:38:22 That's really your chance to convey the insight.
    0:38:25 Slide number three is anything impressive
    0:38:28 that’s happened so far objectively?
    0:38:32 Customers, patents, letters of intent,
    0:38:37 just some proof that people in the world care about this thing
    0:38:40 and that they’re joining the movement and signing up.
    0:38:41 What I find is that if the founder
    0:38:46 can get those three things right quickly, 10 minutes in,
    0:38:49 now the venture capitalist is leaning forward.
    0:38:52 Now, it’s funny, I think we rift on this earlier.
    0:38:56 What happens too often is founders get bad advice.
    0:38:59 And so they create what I like to call a Franken Deck.
    0:39:01 And so what happens is they’ll pitch their advisor
    0:39:03 and the advisor wants the best for them.
    0:39:05 They’ll say, hey, VCs love marketplaces
    0:39:07 ’cause they have network effects.
    0:39:09 Don’t say you rent an extra room in your house.
    0:39:12 Say we’re marketplace for residential real estate
    0:39:15 ’cause marketplace, hot network effect,
    0:39:17 those are the real estate big.
    0:39:18 Okay, put that in there.
    0:39:20 And then the founder say, oh yeah, you’re right,
    0:39:21 okay, I’ll do that.
    0:39:23 And then they’ll say the other thing is
    0:39:25 this is an appealing service for millennials.
    0:39:27 So you should have a few slides up front
    0:39:29 that talk about the importance of millennials
    0:39:31 and handcrafted experiences, all that stuff.
    0:39:34 And you haven’t talked about the total available market yet.
    0:39:35 So you ought to talk about that.
    0:39:37 You need a few slides on that.
    0:39:39 You need a few slides on the real estate market,
    0:39:41 residential– – You need AI in there too.
    0:39:43 – Today you’d say you need AI.
    0:39:46 And so then what happens is you go in
    0:39:50 with this deck of 20 slides and you pitch somebody
    0:39:54 and the VC passes partly ’cause they don’t know what you do.
    0:39:57 And then they give a reason in their pass note
    0:39:58 for why they passed.
    0:40:00 And it’s like, oh man, there’s another objection.
    0:40:02 I better have a slide that counters that objection.
    0:40:04 And so before you know it, you have 30 slides,
    0:40:07 each of which is designed to anticipate
    0:40:09 and counter an objection.
    0:40:11 And what I like to say to founders is
    0:40:14 the only people that matter are the people
    0:40:15 who believe your insight.
    0:40:17 If a VC doesn’t believe your insight,
    0:40:18 they’re not gonna invest.
    0:40:21 There’s no way to overcome their objection.
    0:40:24 They’re not gonna invest no matter what until it’s proven.
    0:40:27 What you wanna find is the subset of people in this world
    0:40:28 who believe what you believe.
    0:40:30 That’s true of customers, it’s true of investors,
    0:40:32 true of early employees.
    0:40:35 Don’t waste any urges of energy on anybody else.
    0:40:37 And the problem with the Franken Deck is
    0:40:40 the person who was ready to believe doesn’t know what you do
    0:40:45 because they got confused by just how convoluted
    0:40:46 the pitch was.
    0:40:50 And that person was gonna be much more likely to say yes,
    0:40:51 if you just show up and say,
    0:40:54 we let you rent an extra room in your house,
    0:40:57 you’re gonna be able to do this because Facebook Connect
    0:41:00 lets hosts and guests know who each other are
    0:41:02 and people are used to online reviews
    0:41:03 and everybody’s connected.
    0:41:06 Millennials want these kinds of experiences and all that.
    0:41:08 That’s the conversation you need to have
    0:41:10 with the person who’s prepared to believe.
    0:41:12 And the person who’s not prepared to believe doesn’t matter
    0:41:14 because they’re not gonna do anything anyway.
    0:41:16 And so their opinion doesn’t matter either.
    0:41:18 This is the important point.
    0:41:20 The source of their objection doesn’t matter
    0:41:22 because they’re not gonna join your movement.
    0:41:24 Only those who are prepared to join your movement
    0:41:27 have valid input about your strategy.
    0:41:29 And that’s really important I find,
    0:41:31 is to say, hey, I’m only gonna spend time
    0:41:34 with the people I think are ready to move with me
    0:41:38 and I’m gonna bias my feedback collection
    0:41:39 to what they say.
    0:41:42 – I gotta tell you Mike, I gotta believe
    0:41:44 that many entrepreneurs listening to this,
    0:41:47 their heads are basically exploding
    0:41:51 because they’ve been hammered and I gotta take feedback.
    0:41:54 I gotta check off all the boxes and I gotta do all this.
    0:41:57 And basically you’re saying you got them in three slides
    0:41:59 and if you don’t get them in three slides,
    0:42:00 you’re never gonna get them.
    0:42:03 So just cut your losses, stop wasting time
    0:42:07 and go find somebody who does believe the three slides.
    0:42:10 – I think so and some people may disagree with me here,
    0:42:13 but when you think about it, it’s inspiring.
    0:42:16 When you realize that great ideas
    0:42:20 are usually disliked by most at first.
    0:42:21 By the way, that’s true of everything.
    0:42:24 That’s true of Euclidean geometry.
    0:42:27 It’s, you know, Copernicus, when he says the sun
    0:42:29 is at the center, not the earth,
    0:42:33 the pope puts him under house arrest and says,
    0:42:35 hey, maybe you ought to change your opinion about that.
    0:42:37 A lot of the great ideas in human history,
    0:42:39 you know, people when Einstein proposed
    0:42:40 the general theory of relativity,
    0:42:43 they’re like, this guy sounds like he’s smoking weed.
    0:42:47 That’s one of the most abstract, crazy things I’ve ever heard.
    0:42:52 And so most great ideas are heretical at first.
    0:42:57 And actually, if most people don’t like your startup idea,
    0:42:59 that’s a positive sign.
    0:43:03 If everybody liked your startup idea, it’s too incremental.
    0:43:06 When you realize that, it’s inspiring, right?
    0:43:09 When you realize, hey, given that most people
    0:43:12 are gonna dislike my idea, who cares about them?
    0:43:13 They don’t matter.
    0:43:16 They’re not creating the future, I am.
    0:43:19 I and my early believers are gonna create the future.
    0:43:21 They’re not gonna have a say in it.
    0:43:23 And so I need to go find who those people are
    0:43:25 and not waste a single erg of energy
    0:43:27 on anybody who’s not those people.
    0:43:29 And so what we wanna do is we wanna find the people
    0:43:32 who say, oh my gosh, where have you been all my life?
    0:43:33 This is amazing.
    0:43:36 I can’t wait to join your call to adventure
    0:43:38 and go do this with you.
    0:43:39 That’s what you’re looking for.
    0:43:42 And the people who aren’t ready to accept your call
    0:43:44 to adventure, their objections don’t matter
    0:43:47 because they don’t apply to your adventure, right?
    0:43:50 Only objections that matter are the objections
    0:43:53 from fellow believers because they help you see the future
    0:43:54 in a more clear way.
    0:43:59 – Mike, I have more questions,
    0:44:02 but there’s no question I’m gonna ask
    0:44:05 that’s gonna elicit an answer that is a better way
    0:44:08 to end this podcast than what you just said.
    0:44:10 So we’re gonna (laughs)
    0:44:11 – Okay, I like it.
    0:44:14 – I call this the casino theory
    0:44:16 and I often apply it to surfing.
    0:44:20 So I’ll tell you the casino theory of surfing and podcasting.
    0:44:23 So sometimes when you go to Las Vegas
    0:44:25 and you have 50 bucks in your pocket,
    0:44:29 you go to a casino and you bet it on blackjack or craps
    0:44:33 or whatever and just magically you have $500.
    0:44:37 So my casino theory is that most people have that $500
    0:44:40 and they keep playing until they lose it.
    0:44:42 But if you’re smart and you got lucky,
    0:44:45 you got 500 bucks, you walk out, right?
    0:44:49 You quit gambling, you walk out with the 500.
    0:44:51 So the casino theory of surfing is
    0:44:53 after you caught a great wave,
    0:44:54 don’t try to keep catching waves,
    0:44:56 you’re just gonna get disappointed.
    0:44:58 And the casino theory of podcasting is
    0:45:01 when you had a great answer like that,
    0:45:02 you don’t ask more questions,
    0:45:05 you just quit now and you end the podcast
    0:45:07 ’cause that was a great answer
    0:45:09 and all these entrepreneurs around the world.
    0:45:11 They’re putting pieces of their brain back in their head
    0:45:14 because you just said something that’s contrary
    0:45:16 to what they’ve heard for the last two years.
    0:45:20 So that’s the way to end this podcast, Mike.
    0:45:22 – All right, I appreciate your taking the time, Guy,
    0:45:24 and it’s great to see you
    0:45:26 and congrats on all the success you’ve had
    0:46:30 in many types of ways and scenarios, right?
    0:45:34 You’ve been a polymath when it comes to the tech industry.
    0:45:35 It was great to see you.
    0:45:38 – I wish I could say that I caused
    0:45:42 or really capitalized on inflections
    0:45:47 as much as some of the stories that you wrote about.
    0:45:49 And actually, if I think about it,
    0:45:50 at the start of my career,
    0:45:53 I got on the Macintosh inflection
    0:45:54 and at the end of my career,
    0:45:56 I got on the Canva inflection.
    0:45:58 And I gotta tell you, in both cases,
    0:46:01 I consider myself lucky, not smart.
    0:46:02 – Yeah, and it’s interesting.
    0:46:04 Probably you don’t wanna sell past the order
    0:46:06 like what you said, you wanna end on the right note.
    0:46:09 But the other thing about this surfing thing is,
    0:46:11 if you go after the right waves,
    0:46:13 you only have to be right once.
    0:46:15 And so that’s the way I look at it,
    0:46:17 is you wanna pursue opportunities
    0:46:19 where you only have to be right once
    0:46:22 because you’ll be spectacularly right.
    0:46:24 The only way to really lose in entrepreneurship
    0:46:26 is to lose your time.
    0:46:28 Pursuing something that you realize
    0:46:30 in hindsight wasn’t worthy of your talent and time.
    0:46:32 And so we wanna go after ideas
    0:46:35 that are waves that are worth surfing.
    0:46:36 Because like you said,
    0:46:38 if you catch the ideal wave, you did it.
    0:46:39 – Thank you, Mike.
    0:46:42 That was just a remarkable interview.
    0:46:43 I’m Guy Kawasaki.
    0:46:46 This is remarkable people and all you entrepreneurs
    0:46:50 who just had all your myths exploded.
    0:46:51 I empathize with you,
    0:46:54 but better you hear it now from Guy and Mike
    0:46:56 than you hear it two years from now
    0:46:59 after all these rejections and disappointments.
    0:47:02 So that’s how to be a remarkable entrepreneur.
    0:47:05 I wanna thank the rest of the remarkable people team.
    0:47:08 That’s of course, Matt as a Nismar producer,
    0:47:10 Tessa Nismar researcher,
    0:47:14 Luis Magana, Fallon Yates, and Alexis Nishimuro.
    0:47:16 We are the remarkable people team
    0:47:18 and we are hell bent for leather
    0:47:21 on a mission to make you remarkable.
    0:47:25 Until next time, Mahalo and Aloha.
    0:47:30 (orchestral music)
    0:47:32 This is remarkable people.

    Buckle up for a mind-bending journey into the heart of startup innovation! On this episode of Remarkable People, Guy Kawasaki goes deep with Silicon Valley’s master of disruption, Mike Maples Jr. As the wizard behind Floodgate who spotted Twitter and Twitch before they exploded, Mike shatters conventional wisdom about what makes startups soar. Forget everything you think you know about “better products” – Mike reveals why the craziest ideas often win big and why being dismissed might be your biggest advantage. Warning: this episode may permanently rewire your entrepreneurial brain!

    Guy Kawasaki is on a mission to make you remarkable. His Remarkable People podcast features interviews with remarkable people such as Jane Goodall, Marc Benioff, Woz, Kristi Yamaguchi, and Bob Cialdini. Every episode will make you more remarkable.

    With his decades of experience in Silicon Valley as a Venture Capitalist and advisor to the top entrepreneurs in the world, Guy’s questions come from a place of curiosity and passion for technology, start-ups, entrepreneurship, and marketing. If you love society and culture, documentaries, and business podcasts, take a second to follow Remarkable People.

    Listeners of the Remarkable People podcast will learn from some of the most successful people in the world with practical tips and inspiring stories that will help you be more remarkable.

    Episodes of Remarkable People organized by topic: https://bit.ly/rptopology

    Listen to Remarkable People here: **https://podcasts.apple.com/us/podcast/guy-kawasakis-remarkable-people/id1483081827**

    Like this show? Please leave us a review — even one sentence helps! Consider including your Twitter handle so we can thank you personally!

    Thank you for your support; it helps the show!

    See Privacy Policy at https://art19.com/privacy and California Privacy Notice at https://art19.com/privacy#do-not-sell-my-info.

  • Re-imagining the energy grid … through batteries (Two Indicators)

    When it comes to solar and wind power, renewable energy has always had a caveat: it can only run when the wind blows or the sun shines.

    The idea of a battery was floated around to make renewables available 24/7. For years, it existed as an expensive, little-used technology. Then in 2021, it took off.

    In this episode, we explore how this new energy market works in two states: California and Texas.

    In California, there is now enough grid-scale battery storage to power millions of homes — at least for a few hours — and it’s growing fast. What does this success mean for the grid, and how did the state support it?

    Then, we visit Texas, whose approach is more free-market rodeo. The state has the second-most battery storage capacity in the U.S. And in Texas, their system was recently put to the test. So, can these large-scale batteries help prevent blackouts?

    These two stories come from our sister show The Indicator, which recently reported a series about the electric battery market.

    Today’s show was hosted by Cooper Katz McKim, Darian Woods and Wailin Wong. The original Indicator episodes were produced by Cooper Katz McKim and Corey Bridges, and edited by Kate Concannon. It was fact-checked by Sierra Juarez and engineered by Jimmy Keeley and Neil Tevault. Alex Goldmark is Planet Money’s executive producer.

    Help support Planet Money and hear our bonus episodes by subscribing to Planet Money+ in Apple Podcasts or at plus.npr.org/planetmoney.

    Learn more about sponsor message choices: podcastchoices.com/adchoices

    NPR Privacy Policy

  • AI Agents Take Digital Experiences to the Next Level in Gaming and Beyond, Featuring Chris Covert from Inworld AI – Episode 243

    AI transcript
    0:00:10 [MUSIC]
    0:00:13 Hello, and welcome to the NVIDIA AI podcast.
    0:00:15 I’m your host, Noah Kravitz.
    0:00:19 Digital humans and AI agents are poised to be big news this year.
    0:00:21 They’re already making waves, in fact.
    0:00:26 At CES 2025 in Las Vegas, Logitech G's Streamlabs unveiled its Intelligent
    0:00:31 Streaming Assistant, powered by technologies from Inworld AI and NVIDIA.
    0:00:35 The Intelligent Streaming Assistant is an AI agent designed to provide real-time
    0:00:39 commentary during downtime and amplify excitement during high stakes moments
    0:00:41 like boss fights or chases.
    0:00:45 The collaboration brings together Streamlabs' expertise in live streaming tools,
    0:00:49 NVIDIA ACE technology for digital humans, including AI vision models that can
    0:00:53 understand what's happening on screen, and Inworld's advanced generative AI
    0:00:57 capabilities for perception, cognition, and adaptive output.
    0:00:58 But what are digital humans?
    0:01:00 Where are they going to be used?
    0:01:03 And where are they going to make an impact in the enterprise and
    0:01:05 in gaming and entertainment in particular?
    0:01:08 And how does a designer design in the age of digital humans,
    0:01:10 agentic AI, and beyond?
    0:01:14 Chris Covert, director of product experiences at Inworld AI,
    0:01:18 whose press release, by the way, or blog post, I paraphrased heartily from in
    0:01:21 that introduction, so credit to them.
    0:01:24 Chris is here to dive into these points and more as we talk about some of the
    0:01:28 technology that I think is really poised to really make a big impact this year
    0:01:33 going forward and kind of shape this new area of digital experience for all of us
    0:01:35 in what we’ll broadly call the AI age.
    0:01:36 But at any rate, Chris is here.
    0:01:41 So Chris, thank you so much for joining the AI podcast and welcome.
    0:01:42 Thank you for having me today.
    0:01:45 It’s a pleasure to be here, not only as a longtime partner of NVIDIA,
    0:01:49 but also as a huge fan of your recent announcements in AI at CES this year as
    0:01:53 well, genuinely amazing time to be talking about AI in this space.
    0:01:54 It really is.
    0:01:55 And so you were at the show.
    0:01:56 You were in Vegas.
    0:01:57 Couldn’t make it personally.
    0:01:58 I’ll be doing dice.
    0:02:00 I’ll be doing GDC, but our team was there.
    0:02:02 We worked on the project.
    0:02:03 All right, so let’s get into it.
    0:02:08 Tell us a bit about Inworld, Inworld AI, for listeners who may not know.
    0:02:11 We can get deeper into the assistant in particular,
    0:02:14 a little later in the conversation, if that’s cool, just so you can kind of set
    0:02:18 the stage for all of us about what it is that we’re going to be talking about.
    0:02:20 Yeah, so I, you know, extremely biased.
    0:02:22 I have the best job in the world.
    0:02:25 And at Inworld, we make the leading AI engine for video games.
    0:02:28 And I get to work with the most creative minds in the industry,
    0:02:31 most creative minds in the world of gaming and entertainment to answer the
    0:02:35 question, how can we make fun more accessible to the people that make our
    0:02:36 favorite experiences?
    0:02:40 And I’m not going to lie, it sounds like a great job.
    0:02:42 It is, it is.
    0:02:43 And I’m not just blowing smoke here.
    0:02:47 And we’ll definitely, as we go through this, you know, emphasize fun,
    0:02:50 accessible in people, and why I think those are the most important here.
    0:02:54 But Inworld’s mission is to be that AI engine for this industry, right?
    0:02:58 The more that the technology progresses, the lower the barrier
    0:03:00 to entry to make AI experiences.
    0:03:04 But we find there are still challenges, regardless of if you’re a studio
    0:03:08 with a massive staff or you’re an AI kind of native company building
    0:03:12 up these experiences from grassroots, that there’s a massive gap
    0:03:14 between prototyping and deployment of AI systems.
    0:03:19 And then to do that and find the fun for a user for an experience
    0:03:20 is still incredibly challenging.
    0:03:24 So we offer not only the platform, but the services to help you
    0:03:26 create that fun in a deployable manner.
    0:03:31 So kind of briefly run down maybe what you guys talked about at CES this year
    0:03:34 and then we’ll put a pin in the assistant, like I said, and come back.
    0:03:35 Yeah, that’s awesome.
    0:03:38 So the blog post is out there, fantastic videos out there.
    0:03:41 A lot of demonstrations of what happened, but succinctly, you know,
    0:03:45 to plug the collaborative effort here: Inworld, NVIDIA, and Streamlabs
    0:03:50 got together for CES and we put to the test this, you know, streaming companion
    0:03:52 that could not only act as a cohost to a streamer,
    0:03:54 but also serve to support the creator in other ways,
    0:03:58 like handling production for them on their stream as well.
    0:04:01 We showed a demo of this happening at the conference, always a risk
    0:04:04 because you’re playing in a big space with low, you know, internet bandwidth.
    0:04:08 But we played this over Fortnite, and the streamer, the demoer in this case,
    0:04:11 and their agentic cohost, you know, chatting during the game,
    0:04:15 grilling each other for bad rounds, reacting to trends that are happening in chat,
    0:04:17 cheering together when you manage to get kills.
    0:04:21 But then interestingly enough, you know, when chat or when the streamer wants a replay,
    0:04:24 streamer has to just ask the agent, they’ll clip it,
    0:04:27 they’ll set up the screen for that replay and they’ll keep that stream feeling fluid.
    0:04:30 So it’s one of those use cases, again, trying to align on, like,
    0:04:35 where do agentic systems actually play in enterprise and this entertainment industry?
    0:04:37 And we’re finding they’re hyper-specialized.
    0:04:40 And in this use case, we have this perfect, you know,
    0:04:43 one streamer has to wear all of these different hats during their stream.
    0:04:45 How can an agentic system come assist them
    0:04:48 so that they can focus on making the best content
    0:04:51 and all of the other things in managing that content can be done for them.
    0:04:54 So back in the day, and this is part of why we got to come back to it.
    0:04:58 I did a lot of YouTube stuff and we tried to do some live streaming.
    0:05:00 This was in like the early mid 2000s.
    0:05:05 And I remember setting up, you know, open source plugins to try to get graphics
    0:05:06 over the live stream.
    0:05:10 And then I had like a secondary camera, you know,
    0:05:13 like a phone with a camera with a USB cable and all this stuff.
    0:05:14 But that sounds amazing.
    0:05:17 So I’m excited to talk about that to kind of then back into it
    0:05:19 or start with the technology and go forward.
    0:05:23 Agentic AI, cohosts, digital humans.
    0:05:24 Let’s talk about those things.
    0:05:27 You want to start with digital humans, because we’ve talked about agentic AI
    0:05:30 and we’ll get to it, and the definition of what an agent is or isn’t.
    0:05:33 I think it’s still a little bit malleable and context specific.
    0:05:37 But when we talk about digital humans, when InWorld talks about digital humans,
    0:05:38 what are we talking about?
    0:05:39 You know, that’s such a great question.
    0:05:41 And it brings up an important distinction.
    0:05:44 When most people say digital humans, they’re thinking of chatbots.
    0:05:48 Text-based tools that respond with mostly pre-programmed scripts
    0:05:52 to help a specific task or guide users through a set of questions or instructions.
    0:05:56 But at InWorld, we focus on AI agents that go far beyond simple chat.
    0:05:59 These agents can autonomously plan, take actions,
    0:06:01 and proactively engage with their environment.
    0:06:07 Unlike a typical chatbot, which waits for a user to have some kind of question
    0:06:12 or response in and then that AI agent provides a canned response,
    0:06:15 our agents at InWorld are designed to interpret multiple types of inputs,
    0:06:19 whether that’s speech or text or even vision or sensor data.
    0:06:22 And then dynamically think and respond in real time.
    0:06:23 They don’t just answer questions.
    0:06:25 They can initiate tasks.
    0:06:26 They can adapt to different contexts.
    0:06:28 And they can carry out sophisticated interactions
    0:06:31 that make sense within the context of what you’re doing.
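To make the chatbot-versus-agent distinction Chris draws here concrete, a minimal Python sketch follows; every name in it is hypothetical, illustrating the pattern rather than Inworld’s actual API:

```python
# Hypothetical sketch of the chatbot-vs-agent distinction discussed above.
# A chatbot maps an input to a canned reply; an agent runs a
# perceive -> plan -> act loop over multiple input modalities.

CANNED = {"hello": "Hi! How can I help?"}

def chatbot(text: str) -> str:
    # Waits for input, returns a pre-programmed response.
    return CANNED.get(text.lower(), "Sorry, I don't understand.")

class Agent:
    def __init__(self):
        self.actions = []  # log of actions taken in the environment

    def perceive(self, inputs: dict) -> dict:
        # Fuse whatever modalities are present (speech, text, vision, sensors).
        return {k: v for k, v in inputs.items() if v is not None}

    def plan(self, context: dict) -> list:
        # Decide on actions proactively, not just replies.
        plans = []
        if context.get("vision") == "player_idle":
            plans.append("suggest_objective")
        if context.get("speech"):
            plans.append("respond_to_speech")
        return plans

    def act(self, plans: list) -> None:
        self.actions.extend(plans)

agent = Agent()
ctx = agent.perceive({"speech": "any tips?", "vision": "player_idle", "sensor": None})
agent.act(agent.plan(ctx))
```

The chatbot can only return what it was scripted to return; the agent fuses whichever inputs are available and decides on actions, including ones nobody explicitly asked for.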
    0:06:34 So if you look at where digital humans are used mostly today,
    0:06:35 and again, I’m super biased,
    0:06:38 it’s often in the high volume chatbot or, you know,
    0:06:40 personal digital assistant space.
    0:06:43 But if you consider where they can have the biggest impact,
    0:06:45 that’s when they function as these truly autonomous agents.
    0:06:48 Planning and acting and proactively helping people
    0:06:50 in ways that traditional chatbots can’t.
    0:06:52 It’s not question and answer.
    0:06:54 It’s an answer to a question that I didn’t even know I had yet.
    0:06:56 And that’s the core of our focus at InWorld,
    0:06:58 building AI agents that can not only look human,
    0:07:02 but can think and react and solve problems like one,
    0:07:05 using multiple modalities to interact with their world.
    0:07:06 In our case, a digital one.
    0:07:09 When it comes to building with agentic AI,
    0:07:13 do you feel even like you have kind of a solid vision
    0:07:15 of what is to come even near term?
    0:07:20 Or is it still very early days of figuring out everything from,
    0:07:23 you know, trying different types of infrastructure
    0:07:26 to, you know, sort of the design and higher level thinking
    0:07:28 about how to architect these things?
    0:07:28 That’s a great question.
    0:07:32 And the reality is we’re shifting toward an agentic AI framework,
    0:07:33 as I mentioned before.
    0:07:37 And shifting toward that framework is no longer really an option.
    0:07:40 It’s becoming essential for us and our partners.
    0:07:44 When you need any sense of autonomy or kind of real time
    0:07:46 or runtime adaptability, latency optimization,
    0:07:48 any sense of custom decision making,
    0:07:52 you need a robust framework that lets you take control of that stack.
    0:07:55 No one platform is going to serve the needs of our industry
    0:07:56 as standalone.
    0:07:58 So building a framework that is flexible,
    0:08:00 that is robust to the needs and growing needs,
    0:08:03 changing needs, super diverse needs of our industry
    0:08:05 is incredibly important.
    0:08:06 What we’re seeing more and more of is enterprises
    0:08:09 that want to own that architecture end to end.
    0:08:11 What I mean by that is they need the flexibility
    0:08:14 to decide what models to use, where to use them,
    0:08:16 how data is fed into their systems,
    0:08:18 how to manage compliance and risk,
    0:08:21 all of these factors that require really custom logic
    0:08:22 and custom implementations.
    0:08:24 So it’s adopting this, you know,
    0:08:28 agentic AI framework that really puts the power in their hands.
    0:08:30 It’s not just about future proofing.
    0:08:33 It’s about giving them full control over how AI evolves
    0:08:35 within their organization at any given time, right?
    0:08:37 The industry is going to move quickly.
    0:08:38 We want to make sure our partners can move
    0:08:41 just as quickly as they get inspired.
    0:08:45 But to me, you know, my favorite key differentiator here,
    0:08:47 Future of Inworld, why the framework model,
    0:08:50 why all of these changes is that such a framework
    0:08:52 doesn’t just guarantee technical flexibility.
    0:08:54 That’s great, and that’s going to make a lot of people happy,
    0:08:57 but it also opens up the creative design space as well, right?
    0:09:00 By collaborating with people that are thinking
    0:09:02 of the most outrageously beautiful,
    0:09:05 outrageously wonderfully weird experiences
    0:09:07 you could possibly imagine and giving them tools
    0:09:09 that are opened up to really let them craft
    0:09:11 their own AI architectures.
    0:09:14 We’re really, you know, helping push the boundaries
    0:09:18 of what and how we can deliver innovative experiences, right?
    0:09:21 We made the jokes earlier, but if we say that conversational AI,
    0:09:22 you know, this like chat bot nature,
    0:09:26 is now feeling very entry level, it’s feeling baseline,
    0:09:28 then what we’re doing with this agentic framework
    0:09:30 is building out that blueprint for the future
    0:09:32 so that our partners and our customers
    0:09:37 can help imagine, build, and deploy AI experiences
    0:09:40 that genuinely didn’t feel possible even six months ago, right?
    0:09:41 Taking that to a whole new level.
    0:09:44 If we go more than like two years out,
    0:09:47 I’m at a 10% confidence level of where this is actually going to be.
    0:09:50 Yeah, I appreciate the candor.
    0:09:52 Yeah, it’s just, you know,
    0:09:54 and I’ve been doing this a couple of times in my career throughout
    0:09:57 where I’ll work with somebody in an advisory capacity
    0:10:00 where we’re looking at the future three to five years out
    0:10:02 and by the time we finish our analysis,
    0:10:04 the thing that we had two years out is now open sourced
    0:10:06 and you’re like, cool, so back to the drawing board
    0:10:09 because it’s just the industry moves so quickly
    0:10:13 that what is possible, it’s actually really hard to nail down
    0:10:16 and we will get into kind of, hopefully in this conversation,
    0:10:20 how designing around AI and these systems is that challenge
    0:10:23 because I think we’ve learned some fantastic lessons
    0:10:25 that hopefully people listening can take away as well.
    0:10:29 I really see there, you know, when it comes to trying not to sound too obvious
    0:10:31 when we’re talking about agents,
    0:10:34 the evolutionary chain of how an agent will grow,
    0:10:36 how an agent, you know, the technology behind it
    0:10:38 will proceed in even the near future.
    0:10:41 It comes all down to agency and again, it’s agents.
    0:10:44 It sounds obvious, but what I really mean by that
    0:10:46 is the complexity of that cognition engine
    0:10:49 and I’ll try and crack this nut with an analogy
    0:10:51 and I’ll try and make it fast because I could talk about this.
    0:10:52 Now, take your time.
    0:10:53 We’ll take your time with the analogy anyway.
    0:10:58 We have this, you know, first again is that conversational AI phase
    0:11:00 and I’ll use a gaming analogy, right?
    0:11:04 The conversational AI phase gives avatars, gives agents,
    0:11:05 I’ll use them interchangeably today,
    0:11:08 extremely little agency in doing anything other than speaking, right?
    0:11:12 It may be able to respond to my input if I ask it to do something,
    0:11:14 but it’s not going to physically change the state of something
    0:11:17 other than the dialogue it’s going to tell me back.
    0:11:19 So, I think in terms of an analogy,
    0:11:21 like a city-building simulation game,
    0:11:24 in this phase, conversational AI,
    0:11:25 I can ask it about architecture
    0:11:27 and it could give me some facts about architecture.
    0:11:30 I can ask it about plans I have to build this little area
    0:11:31 and it could give me some advice.
    0:11:34 I could ask it, “Hey, how do I limit the congestion
    0:11:37 or how do I maximize traffic over this little commercial area?”
    0:11:41 And it could tell me in words how to potentially do those things,
    0:11:43 but it won’t be able to do things like place buildings
    0:11:46 or reconstruct roads or even select the optimal facade
    0:11:48 for the style of the region
    0:11:50 because that’s completely out of its scope.
    0:11:50 And not to…
    0:11:51 It can’t act.
    0:11:52 Yeah, it can’t act.
    0:11:53 It can’t act at all.
    0:11:53 It is…
    0:11:55 Okay, just making sure that I’m following.
    0:11:55 Yeah.
    0:11:58 It is a very rudimentary perception,
    0:11:59 kind of what is it seeing,
    0:12:02 what context does it have and cognition,
    0:12:04 how is it planning, how is it reasoning,
    0:12:06 very simple engines that drive that.
    0:12:08 Really, I’m going to talk to it.
    0:12:10 It’s going to say, “Oh, you know,
    0:12:12 this person cares about architecture.
    0:12:14 I’m going to pull everything I know about architecture,
    0:12:16 turn that into a nice response,
    0:12:19 weave together kind of their question into my response,
    0:12:22 and then boom, now we have simple conversation AI.”
    0:12:24 But not too far ahead of that.
    0:12:26 And I say that lovingly and endearingly
    0:12:29 is a simple task completion AI.
    0:12:31 It’s very much still call and response.
    0:12:33 You have an action engine.
    0:12:34 You say, “I’m having an agent
    0:12:37 that can only construct new buildings
    0:12:38 exactly where and when I tell them.
    0:12:40 I can say place a building at this intersection
    0:12:41 and it does it, boom.
    0:12:43 I can say change the building to this type
    0:12:44 and boom, it does it.”
    0:12:44 Right.
    0:12:46 And while you argue that
    0:12:47 that requires a little bit more perception,
    0:12:49 maybe a little bit more reasoning,
    0:12:50 definitely some action,
    0:12:53 it’s limited to an extremely small set of actions
    0:12:56 and it’s pretty much just scripting with extra steps.
    0:12:58 It knows what it’s going to expect.
    0:12:59 It’s waiting for me to say it in a way
    0:13:02 that it can map to the place-a-building action.
    0:13:04 I’m going to place a building done.
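The phase-two “simple task completion” system Chris describes, scripting with extra steps, can be sketched as a command parser that only recognizes utterances it was told to expect and executes them exactly where and when the user says. A hedged sketch, with hypothetical commands and world state:

```python
# Hypothetical phase-two "simple task completion" agent: scripting with
# extra steps. It handles only a small scripted set of commands and
# changes state exactly where and when it is told.
import re

world = {}  # intersection -> building type

def handle(command: str) -> bool:
    text = command.lower()
    m = re.match(r"place a (\w+) at (\w+)", text)
    if m:
        building, intersection = m.groups()
        world[intersection] = building
        return True
    m = re.match(r"change (\w+) to (\w+)", text)
    if m:
        intersection, building = m.groups()
        if intersection in world:
            world[intersection] = building
            return True
    return False  # anything outside the scripted set is ignored
```

There is a little perception (parsing) and a little action (mutating state), but no planning: if a request doesn’t match a known pattern, nothing happens.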
    0:13:08 If a language model provides the description,
    0:13:11 including like I wish I could change the background
    0:13:13 to mountains or whatever, right?
    0:13:15 Or envision a background with mountains
    0:13:16 and it can’t act, right?
    0:13:17 Can the system be automated?
    0:13:18 What you’re talking about
    0:13:23 so that the language model can tell the action cognition model?
    0:13:24 What to do?
    0:13:25 I forget what term you used.
    0:13:28 I have a large action model in my head,
    0:13:30 which yeah, I have thoughts about that terminology,
    0:13:32 but I don’t know if they’re my own thoughts or not.
    0:13:34 So we’ll get into it.
    0:13:36 But you’re definitely hinting at, and again,
    0:13:38 I should have upfront said this.
    0:13:39 There are four phases.
    0:13:41 We are at phase two.
    0:13:43 I think a lot of virtual assistants are at that phase two.
    0:13:46 What you’re describing in kind of intent recognition
    0:13:49 or kind of passive intent recognition even
    0:13:51 is there are no formal names for this,
    0:13:54 but I’ll call this like an adaptive partner phase
    0:13:56 where the AI is observing
    0:13:58 and responding to changes on its own, right?
    0:13:59 And this is the natural evolution
    0:14:02 of where a lot of AI systems are heading today.
    0:14:03 CES, a good example of that.
    0:14:05 I have no doubt throughout the year,
    0:14:07 you’ll get really close to getting here
    0:14:09 as another core standard.
    0:14:10 I think right now,
    0:14:12 a lot of the kind of simple task completion AI
    0:14:15 is a core standard across enterprise and gaming.
    0:14:16 This adaptive partner phase
    0:14:20 where this agent in this analogy, extended analogy,
    0:14:22 would notice changes like newly built roads
    0:14:25 or new residents influx into this tiny area
    0:14:28 and automatically adapt construction plans.
    0:14:30 It’s not micromanaging every decision,
    0:14:33 but it feels like you’re collaborating with an agent
    0:14:36 or a unit that has just enough context
    0:14:37 to make smart decisions on its own,
    0:14:40 like an evolution of a recommendation engine
    0:14:43 being driven by a cognition engine here.
    0:14:44 So it’s not just learning,
    0:14:47 but it feels like it’s learning what we need
    0:14:48 even before we ask it.
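The adaptive partner phase described above can be sketched as an agent that diffs the environment against what it last observed and proposes plans unprompted. All names here are illustrative assumptions, not Inworld’s API:

```python
# Hypothetical phase-three "adaptive partner": instead of waiting for a
# command, the agent watches for state changes and proposes adapted
# construction plans on its own.

class AdaptivePartner:
    def __init__(self):
        self.known_roads = set()  # everything observed so far
        self.proposals = []       # plans suggested without being asked

    def observe(self, city_state: dict) -> None:
        # Notice changes (e.g. newly built roads) since the last observation.
        new_roads = set(city_state.get("roads", [])) - self.known_roads
        for road in sorted(new_roads):
            # Adapt plans around what just changed, unprompted.
            self.proposals.append(f"zone commercial along {road}")
        self.known_roads |= new_roads

partner = AdaptivePartner()
partner.observe({"roads": ["elm"]})
partner.observe({"roads": ["elm", "oak"]})
```

Nothing here is micromanaged: the user never issues a command, yet the partner surfaces suggestions the moment the world changes, which is the “learning what we need before we ask” feel Chris describes.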
    0:14:51 And I expect to see, again, a lot of this in the next year.
    0:14:53 Again, I think that’s phase three.
    0:14:55 I think there’s still a phase four.
    0:14:57 And I think that’s a fully autonomous agent.
    0:15:00 And that stage, again, continuing our analogy,
    0:15:02 is a player two, right?
    0:15:04 Where phase three is, it’s adapting to us.
    0:15:07 Stage four is, hey, this thing is an agent all on its own.
    0:15:10 It feels like I’m playing against another human.
    0:15:11 It is making decisions that feel optimal
    0:15:14 to its own objectives, aligned with mine or not.
    0:15:18 And I think this is where a lot of people think
    0:15:19 about agentic AI.
    0:15:21 They think we are there today.
    0:15:24 There’s quite a big gap in the deployment of these systems
    0:15:26 to get to that truly autonomous agent.
    0:15:29 That is kind of the gold rush right now
    0:15:32 is creating a cognition and a perception
    0:15:34 and an action engine ecosystem
    0:15:38 that can feel natural to a user.
    0:15:39 I’m speaking with Chris Covert.
    0:15:43 Chris is Director of Product Experiences at Inworld AI.
    0:15:45 I made him kind of, I made us kind of put a pin
    0:15:47 in their product announcement to kind of bring this
    0:15:50 into the, from the abstract a little bit into the concrete.
    0:15:52 I’m glad you humored me, Chris, because I think kind of digging
    0:15:54 into, well, what does this stuff actually mean?
    0:15:55 And it’s sort of moving.
    0:15:57 And one of the things I should say is it’s kind of moving
    0:15:58 and evolving.
    0:16:00 Maybe the lingo is a moving target,
    0:16:03 but the tech and what’s actually happening is evolving
    0:16:04 and still so fast.
    0:16:04 All right.
    0:16:07 So let’s talk about the big announcement from CES.
    0:16:10 You said you wanted to talk through kind of the design
    0:16:12 process a little bit and how you make these choices.
    0:16:14 So if that’s something that makes sense to you to do
    0:16:17 in the context of talking about the streaming assistant,
    0:16:18 great.
    0:16:20 And if not, take it whichever way you want to go first.
    0:16:21 That sounds perfect.
    0:16:21 Yeah.
    0:16:23 So I’ll be honest around here, right?
    0:16:26 In any demo that you see publicly,
    0:16:28 especially when you have more than one company demoing it,
    0:16:31 there’s a lot of design, but also a lot of urgency
    0:16:33 to show something quickly and cool.
    0:16:37 So of the things that I’ll say are design axioms
    0:16:40 that I love to live by, we had to sacrifice some
    0:16:42 just to make sure that something awesome could
    0:16:44 be demonstrated at CES.
    0:16:46 Putting a timeline on creative vision is always challenging.
    0:16:47 But–
    0:16:47 Right.
    0:16:49 And that’s like you can’t– we live in–
    0:16:51 I mean, I don’t build software for a living.
    0:16:52 You do.
    0:16:55 But we live in such an age of like, we’re past this state,
    0:16:58 where beta became, all right, everybody ships beta.
    0:16:59 And then we got used to it.
    0:17:01 And it’s like, right, there’ll be updates
    0:17:03 and downloadable content and this and that.
    0:17:05 But not CES.
    0:17:05 That’s the date.
    0:17:06 Show up or don’t.
    0:17:07 So–
    0:17:08 Exactly.
    0:17:08 Exactly.
    0:17:11 But in terms of design, and this might sound odd,
    0:17:13 coming from a tech startup on the call here today.
    0:17:16 But my first piece of advice, regardless of who we’re
    0:17:18 working with, is always to ignore the technology
    0:17:20 and focus solely on the experience
    0:17:22 that you really want to create.
    0:17:24 Where is the value in the experience?
    0:17:25 The why?
    0:17:28 And as soon as you can lead, or not as soon as you can lead
    0:17:31 with the tech, but as soon as you do lead with the tech,
    0:17:32 hey, I have an idea.
    0:17:34 What if we had an LLM that could?
    0:17:36 You already are risking thinking too narrowly
    0:17:38 and missing bigger opportunities.
    0:17:41 And I genuinely would love to say the reason behind this
    0:17:42 is because it’s human-centered.
    0:17:45 I come from the intersection of autonomous systems, AI,
    0:17:46 and human-centered design.
    0:17:49 So I would love to say that it’s a human-centered approach
    0:17:50 to design, and that’s why I do it.
    0:17:53 But the reality is that the technology just advances
    0:17:55 so quickly by the time your idea goes into production,
    0:17:58 that tech-grounded idea that you had now
    0:18:00 looks like a stone wheel compared to the race car
    0:18:01 you thought it was six months ago.
    0:18:05 So again, like the urgency behind some of these things
    0:18:08 is challenging, especially when you come from a tech-grounded
    0:18:10 design principle standard.
    0:18:14 So I’m a firm believer in the moonshot first approach,
    0:18:17 where you begin by clarifying why you want to build something
    0:18:20 before how you decide to build it, especially for anything
    0:18:22 with a shelf life greater than just a few months at this stage.
    0:18:24 Again, if you start with how, you end up
    0:18:26 with a bunch of low-hanging fruit,
    0:18:28 and a bunch of low-hanging fruit makes for a really bad smoothie.
    0:18:32 So what is a moonshot experience?
    0:18:35 Like, I often refer in design thinking workshops
    0:18:38 to the classic impact versus feasibility matrix.
    0:18:41 And this isn’t like a very exciting conversation.
    0:18:42 So I’ll try and keep it high energy.
    0:18:44 But typically, when you’re looking at impacts–
    0:18:46 I was just saying the smoothie.
    0:18:48 Sorry, I couldn’t let the smoothie go unacknowledged,
    0:18:50 but I didn’t want to interrupt it.
    0:18:51 Well, we’ll come back to smoothie right for now.
    0:18:52 OK, good.
    0:18:55 Because again, this is that it’s in that quadrant,
    0:18:58 or it’s in that matrix, impact feasibility.
    0:19:00 Your low-hanging fruit ideas are the things
    0:19:03 that are going to be low feasibility or low impact.
    0:19:06 But typically, you’re aiming for the upper right quadrant,
    0:19:09 which is the highest feasibility and the highest value,
    0:19:11 where impact and feasibility are so high,
    0:19:13 the ideas feel easy to build
    0:19:15 and have a lot of inherent value in them.
    0:19:17 But what I’ve experienced when working with partners
    0:19:20 and their AI ambitions is that that’s actually a trap,
    0:19:23 that that quadrant right there is doomed for failure.
    0:19:28 Because often the ideas that are the ones worthy of pursuit
    0:19:31 the most are the ideas that almost feel impossible,
    0:19:34 but if real would deliver extraordinary impact.
    0:19:35 Right, that’s right.
    0:19:37 As you get closer to building that value,
    0:19:40 and you’re going from idea to prototype to implementation,
    0:19:45 the technology has likely grown in not only capability,
    0:19:47 efficiency, but also accessibility in ways
    0:19:51 that probably outpace our own kind of traditional lens
    0:19:54 of what is feasible and our imagination of where the tech can go.
    0:19:56 So that sounds like a lot of nothing.
    0:19:58 So my actual advice is,
    0:20:01 when we’re designing these types of experiences,
    0:20:03 start from that moonshot that feels impossible
    0:20:06 and then break it down into a functional roadmap.
    0:20:09 What you saw at CES in this demo with Streamlabs
    0:20:11 and NVIDIA and Inworld technology
    0:20:13 creating the streamer assistant
    0:20:15 is actually just Horizon 1, that insight.
    0:20:17 What can we build that gives us insight
    0:20:20 into whether this is something that provides value?
    0:20:22 Where does it provide value and to who?
    0:20:25 The thing we demonstrated at CES
    0:20:30 was just the first pass proof of concept of this capability.
    0:20:34 The kind of upper potential of where this can go,
    0:20:35 where we want this to go,
    0:20:37 and where we’re going to explore its value
    0:20:42 is still very much architected out toward a much longer roadmap.
    0:20:45 And again, showing what we showed at CES
    0:20:47 and what I love about this industry right now
    0:20:50 is we’re able to show really compelling integrations
    0:20:53 with very meaningful use cases in this industry.
    0:20:55 I’m not about to wax poetic,
    0:20:58 like our industry is the same as healthcare
    0:20:59 or things like that and enterprise,
    0:21:02 but showing digital companions, virtual assistants,
    0:21:05 digital humans at the state that we’re showing them now,
    0:21:08 knowing what’s to come is an incredible place
    0:21:10 to look ahead and say, “Okay, cool.
    0:21:13 This scratches an itch that either the market needed
    0:21:16 or that technologically is doing something
    0:21:17 that could never be done traditionally before.
    0:21:23 Where this moves in the next six months to six years
    0:21:24 is anyone’s game.”
    0:21:27 Okay. So two questions, but they’re related.
    0:21:29 One’s easy. The other’s a follow-up.
    0:21:32 What is the availability of the assistant?
    0:21:34 It was demo state very early.
    0:21:36 Do you have kind of a roadmap for that?
    0:21:38 I shouldn’t say availability of a roadmap.
    0:21:40 And that might be the answer to the second part,
    0:21:44 but what else is on the horizon for you for in-world AI?
    0:21:46 I mean, you talked a little bit before
    0:21:49 about the broader horizon for agentic AI
    0:21:51 and avatars and assistants and such,
    0:21:52 but you can go back there if you like.
    0:21:53 What’s coming up this year,
    0:21:56 kind of maybe near-term that you’re excited about?
    0:21:58 Well, man, this year being near-term
    0:22:01 is funny at InWorld because we move very quickly.
    0:22:03 This year is many near-terms stacked against each other.
    0:22:06 Well, right. And I’m figuring we’re taping now,
    0:22:08 but it’s going to go live.
    0:22:10 You know, there’s a little bit of a buffer
    0:22:12 and we reference CES.
    0:22:14 Yeah. So we certainly have productization ambitions
    0:22:15 for this demo.
    0:22:18 What we’re doing post-CES is we’re taking the feedback
    0:22:21 of what we’ve built and we’re augmenting it in many new ways.
    0:22:24 Again, what was built for the demo was a proof of concept.
    0:22:27 As many proof of concepts on show floors are,
    0:22:29 it wasn’t robust to all the inputs we would want.
    0:22:30 It wasn’t robust to all the games
    0:22:32 that we would want it to be playable with.
    0:22:34 So we’re trying to build out that ecosystem
    0:22:35 in an intelligent strategic way
    0:22:37 so that if it were to go to market,
    0:22:40 it would be usable to as many streamers as possible
    0:22:42 who wanted to leverage this type of technology.
    0:22:46 So keep your ears on the beat for what’s about to come out.
    0:22:50 I have no doubt that between Nvidia, InWorld, and Streamlabs,
    0:22:52 all announcements and all possible show floors
    0:22:54 that we can show our advancements on
    0:22:56 will be shown at the right time.
    0:22:57 So super exciting for that.
    0:22:59 As it relates to InWorld, oh boy,
    0:23:01 it’s such a fascinating question
    0:23:03 because a lot of what we’re doing,
    0:23:04 we’re doing with partners
    0:23:07 with such fascinating game development lead times
    0:23:09 that I hope that we’ll be playing
    0:23:11 more Inworld-driven experiences
    0:23:13 in the next, you know, when this releases,
    0:23:15 but also over the next four or five years,
    0:23:17 depending on the scale of the company we’re working with.
    0:23:21 So I’m genuinely excited for what’s in store
    0:23:22 as our platform develops.
    0:23:25 Again, as we’ve seen in just the last year or so
    0:23:27 with more and more competitive players
    0:23:30 in the space of providing AI tools,
    0:23:32 and then just fantastic partners
    0:23:35 that are helping this industry become more accessible
    0:23:37 to people of all different backgrounds,
    0:23:41 really hope to see the, I wouldn’t say consolidation of tools,
    0:23:43 but the accessibility, I’ll keep using that word,
    0:23:47 the accessibility of different AI platforms.
    0:23:50 Hey, I want to use this model, but in this engine,
    0:23:51 to become a lot easier.
    0:23:53 And InWorld’s goal is certainly to make that happen
    0:23:55 for as many industries as we can,
    0:23:58 in particular the gaming and entertainment industry,
    0:24:00 but it doesn’t happen without partners like Nvidia
    0:24:02 and the work that you guys are doing with ACE.
    0:24:05 So where I, you know, where I think InWorld is going
    0:24:08 is to make that easier, to continue to work with studios
    0:24:11 to find the fun and to convince every player.
    0:24:14 And in particular, you know, let me be honest,
    0:24:16 the YouTube commenters that, hey,
    0:24:19 there is actually a world here where this technology
    0:24:21 is not only fun and immersive,
    0:24:23 but it’s something that the entire industry views
    0:24:25 as its gold standard.
    0:24:27 So I think we’re there a lot sooner than we think.
    0:24:29 I think it’s right around the corner,
    0:24:33 but could not be more excited to continue
    0:24:35 to work with creatives to help them tell stories,
    0:24:39 to help them, you know, flex and use their imagination
    0:24:42 as much as possible to make the best possible experiences.
    0:24:43 We do it every day.
    0:24:45 We may not see it in the near term
    0:24:47 because games take a while to make,
    0:24:49 but genuinely excited.
    0:24:51 Excellent. Well, I’ll offer more fun.
    0:24:53 The world always needs more fun to counterbalance everything else.
    0:24:56 Chris, for listeners who would like to learn more
    0:24:59 about anything specific we might have talked about
    0:25:01 or just broadly about InWorld,
    0:25:02 where can they go online?
    0:25:05 Website, obviously, but are there social handles?
    0:25:08 Is there a separate blog or even research blog?
    0:25:09 Where would you direct folks?
    0:25:10 Yes, to all of the above.
    0:25:13 You can find all of that at inworld.ai.
    0:25:17 And we have our blog for partners and experiences there.
    0:25:18 We have technical releases.
    0:25:19 We have research done.
    0:25:21 All of our socials are linked there.
    0:25:22 If you want to stay up to date
    0:25:24 with all the announcements that we make,
    0:25:26 we have a lot of fun and we like to talk about it.
    0:25:28 So definitely stay up to date.
    0:25:30 Fantastic. Well, thanks for taking a minute
    0:25:31 to come talk about it with us.
    0:25:32 We appreciate it.
    0:25:35 And best of luck to you and InWorld AI
    0:25:36 with everything you’re doing this year.
    0:25:39 And maybe we can do it again down the road.
    0:25:41 I love it. Thank you so much, Noah. Appreciate it.
    0:25:44 [MUSIC PLAYING]

    AI agents with advanced perception and cognition capabilities are making digital experiences more dynamic and personalized across industries. In this episode of the NVIDIA AI Podcast, Inworld AI’s Chris Covert discusses how intelligent digital humans are reshaping interactive experiences, from gaming to healthcare, and emphasizes that the key to meaningful AI experiences lies in focusing on user value rather than just technology.

  • How AI Is Changing Warfare with Brian Schimpf, CEO of Anduril

    AI transcript
    0:00:06 The American defense industry is the largest in the world at nearly $1 trillion,
    0:00:11 accounting for about 40% of military spending around the world,
    0:00:15 and also arguably impacting every person on Earth.
    0:00:18 Now this sector has also phase shifted throughout the decades,
    0:00:20 including the consolidation of primes,
    0:00:24 shrinking from over 50 to fewer than 10 large primes,
    0:00:27 receiving a majority of defense dollars.
    0:00:30 Those are companies like Lockheed, Raytheon, or Boeing.
    0:00:33 But there are some new companies in town,
    0:00:37 trying to disrupt how defense is done through new hardware and software.
    0:00:42 One of those is Anduril, a company that just announced Arsenal One,
    0:00:45 a billion dollar factory in Columbus, Ohio,
    0:00:48 expected to create 4,000 jobs in the region.
    0:00:50 Now in today’s episode,
    0:00:52 Anduril co-founder and CEO, Brian Schimpf,
    0:00:57 sits down with a16z Growth general partner, David George.
    0:01:00 Together they discuss how Anduril got its first product off the ground,
    0:01:03 competing with some of the largest companies in the world
    0:01:08 and navigating the US government’s complex procurement processes.
    0:01:12 They also discuss how AI changes the modern battlefield.
    0:01:13 Being able to pull out that signal
    0:01:17 from this overwhelming amount of information that exists.
    0:01:20 Plus, what most people get wrong about these technologies.
    0:01:25 It is unethical to not apply these technologies to these problems.
    0:01:27 And how we shape up to the competition.
    0:01:31 They are running hundreds of tests a year of hypersonic weapons.
    0:01:33 Right, the US is running like four.
    0:01:35 Now if you do like this episode,
    0:01:38 it comes straight from our AI Revolution series.
    0:01:41 So if you missed previous episodes of that series,
    0:01:43 with guests like AMD CEO Lisa Su,
    0:01:46 Anthropic co-founder Dario Amodei,
    0:01:50 or the founders of companies like Databricks, Waymo, Figma, and more,
    0:01:54 head on over to a16z.com/ai-revolution.
    0:01:58 All right, let’s get started.
    0:02:01 As a reminder, the content here is for informational purposes only.
    0:02:04 Should not be taken as legal, business, tax, or investment advice.
    0:02:07 Or be used to evaluate any investment or security.
    0:02:11 And is not directed at any investors or potential investors in any a16z fund.
    0:02:14 Please note that a16z and its affiliates
    0:02:17 may also maintain investments in the companies discussed in this podcast.
    0:02:20 For more details, including a link to our investments,
    0:02:23 please see a16z.com/disclosures.
    0:02:28 (Music)
    0:02:36 (Music)
    0:02:39 Let’s jump right in. What is Anduril? Tell us what you do.
    0:02:42 All right, so we were founded in 2017. We’re about seven years in.
    0:02:46 The basic idea was we thought there was a better way to make defense technology.
    0:02:51 So number one, the tech for the next 20 or 30 years was going to be primarily,
    0:02:56 how do you just have more cheap autonomous systems on the battlefield,
    0:02:59 just more sensors, just more information flowing in.
    0:03:01 That seemed like it had to be true.
    0:03:04 So we invested in the core software platform we call Lattice
    0:03:06 that enables us to make sense of all these things.
    0:03:10 We have built a variety of autonomous products that we fielded over the last seven years,
    0:03:12 at just an outrageous pace.
    0:03:16 And we’re really working on all aspects of national security and defense.
    0:03:20 And how did you guys get on to national defense as the place to go spend your time?
    0:03:22 I mean, I know your background, but maybe you can share that.
    0:03:25 Yeah. So I was at Palantir for about 10 years.
    0:03:29 I had been working on a variety of government programs and then several of the co-founders.
    0:03:33 So Trae Stephens was also at Palantir; Matt Grimm, our COO, was at Palantir.
    0:03:34 We’re all really good friends.
    0:03:39 And we’d been talking about this idea that there needs to be a next-generation defense company.
    0:03:44 And then Trae and Palmer met through the VC world and Palmer was just getting out of Oculus
    0:03:46 and he was like, “It’s the same thing I want to do.”
    0:03:49 And so we decided to kick this off together.
    0:03:53 But for me, working in defense, it was just obvious the degree to which there was a problem.
    0:03:58 You work in this space, the tech is old, it is not moving fast, it is very lethargic.
    0:04:00 There are relatively few competitors at this point.
    0:04:02 It just felt very ripe to do something different.
    0:04:07 And it’s the sort of thing that once you get into it, the people who are actually serving
    0:04:10 just have this patriotic motivation to solve the problem.
    0:04:14 It’s just very, very motivating problem to work on.
    0:04:16 How did you land on the first product?
    0:04:19 So first product we worked on was what we call Sentry.
    0:04:20 It’s for border security.
    0:04:22 And this was a Palmer idea.
    0:04:24 He believed that tech could actually solve this.
    0:04:27 So we have these automated cameras with radars.
    0:04:31 We can monitor the border for miles away from these cameras.
    0:04:36 And he was like, “This is something we can solve super fast with technology.”
    0:04:42 And it really kind of fit what has ended up being a very good pattern for us: find an urgent problem
    0:04:45 that actually has a real tech solution.
    0:04:49 That we can apply the cutting edge technology to.
    0:04:53 So early on in 2017, computer vision was just starting to work.
    0:04:55 It wasn’t even really embedded GPUs yet.
    0:05:01 We were literally taking desktop GPUs and liquid cooling them to get these things to work in a hot desert under solar power.
    0:05:04 But we were able to go and get a prototype up in about three months
    0:05:09 and then move into like a pilot in about six months and then full scale in about two and a half years.
    0:05:10 So really, really quick timeline.
    0:05:15 But it kind of fit this problem set: we had a technical insight into how you can do this better.
    0:05:18 And there was urgency to solve the problem.
    0:05:20 They actually wanted to make a dent in this.
    0:05:22 Alright, I’m going to ask you a lot more about that stuff.
    0:05:26 But one of the things that people say to me all the time, and you hear it in speeches and all this stuff,
    0:05:28 like AI is going to change the nature of warfare.
    0:05:29 Yeah.
    0:05:33 On the one hand, the major breakthrough that we just had, the way everyone interacts with it,
    0:05:35 is like a chat bot and an LLM.
    0:05:36 It’s pretty cool.
    0:05:37 It’s amazing.
    0:05:38 It’s awesome.
    0:05:39 I use it for everything.
    0:05:47 But what are the implications of this new wave of generative AI on modern warfare, in the physical sense?
    0:05:48 You know the software side.
    0:05:49 Let’s talk about that.
    0:05:54 So when I think about where AI is going to drive the most value for warfare,
    0:06:00 it is dealing with the scale problem, which is really the amount of information that is the number of sensors,
    0:06:03 the sheer volume of systems that are going to be fielded and it’s going to go through the roof.
    0:06:05 So this is like lattice.
    0:06:06 Maybe start even there.
    0:06:08 Everything has a sensor.
    0:06:09 That’s right.
    0:06:11 So what do people do in the DoD?
    0:06:15 There’s a lot of things they do, but what’s the primary war fighting function?
    0:06:18 They are trying to find where the adversaries are.
    0:06:19 Yes.
    0:06:21 They need to then deploy effects against them.
    0:06:23 That can be a strike.
    0:06:25 That can be deterring them by a show of force.
    0:06:28 That can be jamming and non-kinetic things.
    0:06:31 And they’ve got to then assess, did that actually work?
    0:06:32 Find them.
    0:06:35 You’ve got to engage and you’ve got to assess.
    0:06:36 It’s pretty straightforward.
    0:06:39 That is the primary thing that the military does.
    0:06:40 And so, okay, what do you need to do that?
    0:06:42 You need a ton of sensors.
    0:06:47 You need a ton of information on what is going on with an adversary who is constantly trying to hide from you and deceive you.
    0:06:52 So just huge amounts of information to make that problem as intractable as possible for them to be able to hide
    0:06:55 or when they are deceiving, you can figure it out.
    0:06:58 Hard to deceive in every single phenomenology of sensing.
    0:07:00 This technology exists, right?
    0:07:01 The sensors exist.
    0:07:02 The sensors all exist.
    0:07:03 The sensors are deployed, right?
    0:07:06 They’re going to get better and cheaper and you’re going to be able to do more of them.
    0:07:10 But a lot of the limit of why can’t we do more is what the hell are you going to do with the data?
    0:07:11 Processing capabilities, yeah.
    0:07:13 Processing, but also just operationally.
    0:07:20 So, okay, now I say I had a perfect AI system that could tell me where every ship, aircraft, and soldier was in the world.
    0:07:21 What are you going to do with that?
    0:07:22 Now I know everything.
    0:07:24 That is overwhelming, right?
    0:07:28 And so then being able to sift through that information to, well, okay, they’re maneuvering here.
    0:07:29 What does that imply?
    0:07:31 Is this an aggressive action?
    0:07:32 Is it outside their norms?
    0:07:35 Is this different than we’ve seen in the past?
    0:07:41 Being able to pull out that signal from just this overwhelming amount of information that exists.
    0:07:43 And then the other side, you got to act, right?
    0:07:46 So now I’ve got to actually be able to carry out these missions.
    0:07:47 Yeah.
    0:07:51 So this is where on the autonomy side, it really comes in, which is, okay, I want to send fighter pilots out.
    0:07:54 So the way they do like a predator drone today is like a guy with a joystick.
    0:07:55 Yeah.
    0:07:56 We’ve all seen that Ukraine, Russia.
    0:07:57 Yeah, exactly.
    0:07:59 It’s all like manually piloted.
    0:08:01 But that doesn’t really scale.
    0:08:05 And that presents a lot of limitations around communications, jamming, all these things.
    0:08:11 So I want to be able to task a team of drones to go out and say, hey, go in this area and find any ships and tell me where they are.
    0:08:12 I just want it to be that simple.
    0:08:14 They just need to figure out their own route.
    0:08:17 And if I lose some of them, they rebalance, they just go out and handle it.
    0:08:18 They’re running target recognition.
    0:08:20 They can pop back whatever is relevant.
    0:08:26 That is where I think the autonomy side really comes in, which is I can just drive scale into the number of systems I can operate in the environment.
    0:08:33 The promise of AI in a lot of ways in the long run with this is just the ability to scale the types of operations I’m doing, the amount of information I have.
    0:08:40 And if done very well, it will put humans into a place of sort of better decision making, right?
    0:08:45 Instead of being like inundated by a volume of data and then all of our capacity goes to these mechanical tasks.
    0:08:53 We can have humans with much better context, much better understanding, historical understanding of what this means, what the implications of different choices are.
    0:08:54 Yeah.
    0:08:56 Those are all things that AI can enable over time.
    0:08:58 Ideally better decision making.
    0:09:00 Ideally, because human decision making is far from perfect.
    0:09:04 We’re working with both limited information and imperfect judgment.
    0:09:05 That’s right.
    0:09:06 I guess, right?
    0:09:07 Yeah.
    0:09:13 And so the more you can have AI augmentation for these things and synthesis and like clarity, that is where the promise of this is.
    0:09:19 And so the U.S. posture on this is very much, we want to have humans accountable for what happens in war.
    0:09:20 Yes.
    0:09:21 That is how it should be, right?
    0:09:25 The military commander that employs a weapon is accountable for the impact of those weapons.
    0:09:26 Yeah.
    0:09:27 That is correct.
    0:09:29 I think that is the system we should have.
    0:09:34 And so then nobody is talking about having full-blown AI is going to decide who lives and dies.
    0:09:38 That is a crazy version that nobody wants to have.
    0:09:44 Well, I think it’s also far-fetched in the sense that it presumes some sort of objective function that isn’t driven by us.
    0:09:45 Exactly.
    0:09:46 This is my conversation with everybody.
    0:09:47 Yeah.
    0:09:49 Oh, my God, what about when the AI goes Terminator on us?
    0:09:51 I’m like, it’s a tool for humans.
    0:09:53 It doesn’t have an objective function.
    0:09:55 That’s a leap that is not on the scientific roadmap today.
    0:09:56 That’s right.
    0:09:58 So, like, why would that be the case in warfare?
    0:09:59 That’s right.
    0:10:02 And so I think the reality for these things is it’s going to be human augmentation.
    0:10:07 It is going to be enabling humans to operate at a much larger scale with much higher precision on these things.
    0:10:09 And that is the opportunity with it.
    0:10:10 Yeah.
    0:10:15 And so to me, it is unethical to not apply these technologies to these problems.
    0:10:21 Our view has always been, if we’re the best technologists on these problems or we can get the best technologists to it,
    0:10:26 giving the best tools on these absolutely critical decisions that are extremely material,
    0:10:28 that seems like probably a good thing.
    0:10:33 And engaging in the question of how can you use this technology responsibly and ethically is incredibly important.
    0:10:34 Yeah.
    0:10:39 Is it more humane to have a fighter pilot in harm’s way, or an autonomous system?
    0:10:40 That’s right.
    0:10:41 Piloting in a conflict?
    0:10:42 That’s right.
    0:10:43 And by the way, I have friends who are fighter pilots.
    0:10:44 I love fighter pilots.
    0:10:45 Yeah.
    0:10:47 But the technology has advanced significantly.
    0:10:52 And you can make the argument that it is more humane not to put them in the line of fire in the way of danger, right?
    0:10:53 Yeah.
    0:10:54 We’re not going to want to put US troops at risk.
    0:10:55 Yes.
    0:11:04 And I think there’s the deterrence factor of the US saying, I have this capability and I’ve reduced my political cost of engaging on these things.
    0:11:06 It’s actually a pretty good deterrent as well.
    0:11:07 Yeah.
    0:11:09 I’m not putting US troops or I can give this to allies.
    0:11:10 Yes.
    0:11:11 And they can defend themselves.
    0:11:12 Yep.
    0:11:13 Keep us out of the fight.
    0:11:14 Yeah.
    0:11:15 Keep our troops out of the fight.
    0:11:16 Keep the troops out of the fight.
    0:11:17 And it changes the calculus quite a bit.
    0:11:25 And so I think that actually, in a lot of ways, if done well, has a significant kind of stabilizing impact and a deterrent impact.
    0:11:29 It just is harder to use force to get your political ends.
    0:11:30 Yes.
    0:11:31 Exactly.
    0:11:32 And I think that can be a very positive thing.
    0:11:33 Yeah, exactly.
    0:11:34 Yeah.
    0:11:39 I keep coming back to deterrence and we need to find a way to create a sense of urgency for the sake of deterrence.
    0:11:40 Yeah.
    0:11:41 Not for the sake of going to war.
    0:11:48 And so it feels like that’s universally like people, people we talk to, I feel like that’s universally known and hopefully we can make some progress.
    0:11:49 Yeah.
    0:11:50 I think people largely agree.
    0:11:52 Well, Vladimir Putin was very convincing of this.
    0:11:53 Yes.
    0:12:00 Like it turns out invading Ukraine was probably the single biggest shift I’ve seen in terms of people recognizing that, look, there are still bad actors in the world.
    0:12:04 They will use force to get their political ends if they think it will work.
    0:12:05 Yeah.
    0:12:07 If the cost is worth it, they’re going to do it.
    0:12:09 And I don’t think there’s any reason to believe that’s going to stop.
    0:12:11 It’s been true for tens of thousands of years.
    0:12:16 So, the future of war. First, you said AI is an augmentation for humans?
    0:12:17 Yep.
    0:12:21 How fully automated do you think a conflict can become, say in the next 10 years?
    0:12:31 Look, I think the mechanics of, okay, there’s this airfield and you want to go surveil it and take it.
    0:12:32 You’re going to do some strike.
    0:12:33 You’re going to do some surveillance.
    0:12:34 You’re going to do all these things.
    0:12:37 There will be a large degree of automation in that, right?
    0:12:43 Like I can just say, hey, send this team of drones out in these waves to go conduct this operation,
    0:12:45 find things that pop up that are a threat.
    0:12:48 Then pop it up to the human to say engage or not.
    0:12:49 It goes, right?
    0:12:51 Like you can move at a much faster pace.
    0:12:53 I think a lot of the things that were starting to happen in Ukraine,
    0:12:56 a lot of the great work Palatier did was on things like this,
    0:12:59 where it was like the targeting process of going from satellite imagery through to,
    0:13:05 hey, this looks like a tank through to an approval of, is this a legitimate military target or not?
    0:13:06 It was streamlined and compressed.
    0:13:07 Much faster workflow.
    0:13:08 Much, much faster.
    0:13:11 So I think those things will happen very, very quickly.
    0:13:12 Like very, very quickly.
    0:13:17 Then, okay, now it turns into a matter of policy and degree and scope.
    0:13:22 That is a thing that I think we’re just going to have to figure out as we work through it with the military.
    0:13:24 So what we think about from the technology side is,
    0:13:28 okay, I don’t want to design anything that precludes more advanced forms of this over time,
    0:13:30 architect it correctly.
    0:13:35 But the crawl phases just get a lot of the basics just automated, very mechanical things.
    0:13:36 Make it very predictable.
    0:13:37 Don’t have any surprises.
    0:13:39 And then you can add more sophistication.
    0:13:43 As you build trust, the AI advances, these things get more sophisticated over time.
    0:13:48 And one of the best examples is on the defensive side, where it’s ripe for AI.
    0:13:50 So we do a lot of work on counter drone systems.
    0:13:52 This is one of the areas we’re partnering with OpenAI on.
    0:13:57 And it’s looking at this question of if you have multiple drones flying at you
    0:14:02 and you have minutes to respond before the strike happens on you.
    0:14:03 How do you make an optimal decision?
    0:14:04 How do you make an optimal decision?
    0:14:07 When you are panicked, you are nervous and your life is at risk.
    0:14:12 Is that a person manually sitting there making those decisions today?
    0:14:16 Yeah, it’s often three people, because they have a separate radar from a camera,
    0:14:19 separate from the guy pulling the trigger on the weapon systems.
    0:14:22 And so then the coordination costs can be significant.
    0:14:24 So you can automate a lot of this.
    0:14:26 And then the other problem with this is then, as we’ve seen in Ukraine,
    0:14:30 every single unit, every single soldier is now at risk of drones.
    0:14:35 So this has to proliferate out from being a specialty that you do in an operation center
    0:14:37 now to every vehicle in the field.
    0:14:40 In the field, everyone has to have this capability.
    0:14:45 You need the ability to have these systems just process all that sensor data
    0:14:49 automatically, fuse it together, tell you viable options for countering this
    0:14:51 and tell you what’s a threat and what’s not a threat.
    0:14:53 Like these are the types of things you need to be able to do,
    0:14:56 respond with intelligence suggestions,
    0:14:59 and then have the system just automatically carry it out from there.
    0:15:01 Yeah, these are the types of problems we’re working on.
    0:15:05 And the defensive side is just you need it, right?
    0:15:09 There’s no choice because the timelines are too short and the urgency is too high.
    0:15:17 And it’s a very straightforward area to understand where technology can really improve the problem.
    0:15:23 Yeah, it’s like the highest stakes version of decisioning that autonomous driving cars are doing today,
    0:15:25 but with way more sensor information.
    0:15:27 Yeah, it’s not a road.
    0:15:31 Yes, with an adversary who’s constantly trying to fool you and deceive you.
    0:15:33 And yes, it’s very, very hard.
    0:15:36 So that’s one of the big parts of the partnership with OpenAI.
    0:15:37 Yeah, yeah.
    0:15:38 So they’ve been great.
    0:15:43 And I think Sam especially has been very clear that he supports our warfighters
    0:15:48 and he cares about having the best minds in AI working on national security
    0:15:51 and who better exists to work through these hard problems.
    0:15:52 Yeah.
    0:15:55 And so I was just incredibly proud of them for coming out in favor of this
    0:15:56 and saying they’re going to work on this.
    0:15:57 They’re going to do it responsibly.
    0:15:58 They’re going to do it ethically.
    0:16:02 But this is an important problem that the best people should be working on.
    0:16:07 The defense industry is notoriously difficult for startups to navigate.
    0:16:11 So how did you guys actually get traction in the first place?
    0:16:16 And do you think that’s going to change in the future?
    0:16:18 Do you think it will continue to be hard?
    0:16:20 Do you think the primes will continue to have a stranglehold?
    0:16:21 I’d love your take on that.
    0:16:22 It is very hard.
    0:16:27 And I think we built a lot of the right technology and the right business model
    0:16:30 of investing in things that we believe need to exist.
    0:16:32 I think we’re picking a lot of the right problems to go after,
    0:16:34 but probably more than anything,
    0:16:37 I think we understood the nature of what it took to sell, right?
    0:16:40 And the congressional relationships, the Pentagon relationships,
    0:16:44 the military relationships, like all of this that you need to be able to say,
    0:16:45 “Hey, we have the right tech.
    0:16:46 You can trust us.
    0:16:47 We can scale.
    0:16:49 We can actually solve these problems for you.”
    0:16:55 Proving that it works and then like catalyzing all of these really complex processes around it.
    0:17:00 I think the other part that we’ve done quite well is we’re just finding ways
    0:17:03 to find those early adopters and we understand those playbooks.
    0:17:04 Who’s going to move quick?
    0:17:08 How do you just build that momentum and advocacy in the government to make this go?
    0:17:10 Look, it’s like more bureaucratic in certain ways.
    0:17:13 Is it much worse than selling to a bank or an oil and gas company?
    0:17:17 It’s, I don’t know, maybe 30% worse, but like probably not 5x worse.
    0:17:18 Yeah.
    0:17:22 And I think the reality is it’s like enterprise sales are actually very hard.
    0:17:25 Especially the ones with long sales cycles and massive commitments.
    0:17:26 That’s right.
    0:17:28 These are large capital investments customers making.
    0:17:29 That is a slow sales cycle.
    0:17:30 That is how it works.
    0:17:31 Yeah.
    0:17:35 And so I think there’s like a lot of complaining and frustration.
    0:17:36 It’s okay.
    0:17:38 Well, also being bad at business means you’re bad at business.
    0:17:40 If you don’t understand your customer, you’re going to lose.
    0:17:41 Yeah.
    0:17:42 That’s how it works.
    0:17:44 So do I think the government needs to be a better buyer of these things?
    0:17:49 Do I think they need to like take better strategies that’ll get them more what they want?
    0:17:50 Absolutely.
    0:17:55 They’re taking observably bad strategies to get to the outcome they actually want.
    0:17:58 Do I think it’s necessary to change for us to be successful?
    0:17:59 Not really.
    0:18:01 We’re just going to play the game that they present.
    0:18:02 Okay.
    0:18:04 I want to talk about the observably bad strategies.
    0:18:05 Yeah.
    0:18:06 What are the observably bad strategies?
    0:18:07 Right.
    0:18:08 And then what are the good ones?
    0:18:13 And maybe also wrap it into this idea that how do you actually convince the government
    0:18:15 that your ideas are the right ideas?
    0:18:16 Yeah.
    0:18:23 So take: should you go spend money building a whole new generation of F-35s with human pilots
    0:18:27 or a whole new generation of aircraft carriers, or should you do something different?
    0:18:29 And how do you actually get your points across to them?
    0:18:30 Okay.
    0:18:34 So they’re sort of like, how do they contract and buy and what’s been going wrong there?
    0:18:38 And then it’s what’s the right composition of even if you could buy perfectly well,
    0:18:39 what should you be buying?
    0:18:40 What should you buy?
    0:18:41 Yeah.
    0:18:42 Two different questions.
    0:18:43 Yeah.
    0:18:46 So the typical government contracts are done in what’s called cost plus fixed fee.
    0:18:51 And this actually came out of World War II when we were retooling industry to work on
    0:18:52 national security problems.
    0:18:55 We’re just like, we’re going to cover all your costs and we’ll give you a fixed profit percentage
    0:18:56 on top.
    0:18:58 And so the incentives here are sort of obvious.
    0:19:00 If it’s more expensive, you get more profit.
    0:19:02 If it is less reliable, you get more profit, right?
    0:19:03 The longer it takes.
    0:19:04 Yeah.
    0:19:07 The longer it takes, the less viable it is, the more complicated it is.
    0:19:11 There’s no point of incentive in there to actually drive down costs.
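    The incentive structure described here can be sketched with toy numbers. This is purely illustrative; the 10% fee rate and the dollar figures below are assumptions, not figures from the episode:

    ```python
    def cost_plus_profit(cost: float, fee_rate: float = 0.10) -> float:
        """Cost-plus-fixed-fee: the government covers all costs and pays
        a fixed profit percentage on top, so profit grows with cost."""
        return cost * fee_rate

    def fixed_price_profit(price: float, cost: float) -> float:
        """Fixed-price: the award is set up front, so every dollar of
        cost overrun comes straight out of the contractor's margin."""
        return price - cost

    # Under cost-plus, a $100M program that balloons to $150M pays MORE profit.
    print(cost_plus_profit(100e6), cost_plus_profit(150e6))

    # Under a $110M fixed-price award, the same overrun turns profit negative.
    print(fixed_price_profit(110e6, 100e6), fixed_price_profit(110e6, 150e6))
    ```

    That is the flip Schimpf points to: cost-plus rewards overruns, while fixed-price punishes them.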
    0:19:12 And you see this play out, right?
    0:19:16 It’s like the companies have gotten so used to this where you look at even something like
    0:19:20 Starliner where, you know, I think SpaceX had a third the amount of money that Boeing
    0:19:23 was given to make Starliner work.
    0:19:27 And SpaceX did it on time probably faster than they even predicted.
    0:19:30 Did it probably incredibly profitably and it worked.
    0:19:33 And so I think these incentives that don’t hold you accountable are actually bad for
    0:19:34 your company.
    0:19:36 It just makes you a worse company.
    0:19:40 But do people in the government realize that it’s bad for the country?
    0:19:41 I think they are frustrated.
    0:19:44 I think they understand that this is not really working.
    0:19:50 So you look at like F-35 as an example of one of these programs, it took 25 years to
    0:19:53 get it from initial concept to fielding.
    0:19:57 There’s this awesome chart which is like shows how long it takes to get commercial aircraft
    0:20:00 or autos from kickoff to fielding.
    0:20:03 And it’s been like flat to slightly better for all of those things, like on the order
    0:20:05 of two to three years.
    0:20:08 The military aircraft side just went linearly straight up.
    0:20:12 If these things are taking longer and longer and longer, there’s an amazing quote that
    0:20:15 if you extrapolate this out, this is in the 90s, this guy made this quote.
    0:20:20 If you extrapolate this out, by like 2046 the US government will be able to afford one
    0:20:25 airplane that the Air Force and Navy share, and the Marine Corps gets it every other
    0:20:26 day of the week.
    0:20:27 It better be a good airplane.
    0:20:28 Yeah.
    0:20:29 These things are just crazy.
    0:20:31 And so I think they recognize that this is not working, right?
    0:20:32 This is broken.
    0:20:34 Now the other part of this, they haven’t had a lot of alternatives.
    0:20:40 So you have a relatively small cartel of these companies who sort of all say we won’t do
    0:20:41 fixed price programs anymore.
    0:20:44 They won’t do things on a fixed cost basis.
    0:20:46 So okay, if you’re the government and you’re a buyer, what are you going to do?
    0:20:47 Yeah, of course.
    0:20:48 You don’t have a lot of choice here.
    0:20:52 And there’s been a lot of problems with trying to get this model right.
    0:20:55 Now, there’s some things that can work a lot better.
    0:21:01 I think SpaceX really proved this where they literally built a reusable rocket that you
    0:21:03 catch with chopsticks commercially.
    0:21:07 I think we could solve these things, guys.
    0:21:08 What thing can’t we build?
    0:21:09 I think we could build an airplane.
    0:21:11 I think we could build it.
    0:22:16 So there’s not really a question anymore that this is some magical thing
    0:22:18 that only a few companies can do.
    0:22:19 Not anymore.
    0:21:20 And you guys with autonomous fighter jets.
    0:21:21 Exactly.
    0:21:22 Yeah.
    0:21:23 Like it’s proven now that the future can be built.
    0:21:24 Yeah.
    0:21:25 Yeah.
    0:21:26 Exactly.
    0:21:27 And so I think the alternatives are there now.
    0:21:28 And then models that can work a lot better.
    0:21:33 It’s like one of my crazier stats: a new missile takes about 12 years to go from
    0:21:35 concept through to fielding. Like, 12 years.
    0:21:36 That’s insane.
    0:21:37 It’s insane.
    0:21:38 And so, okay, if you’re in that world…
    0:21:40 But how fast is the technology evolving?
    0:21:41 Oh, like…
    0:21:44 This is like 12 years from now, like what we’ll be able to do.
    0:21:45 Right.
    0:21:46 Exactly.
    0:21:47 And then we’ll still be on the previous system.
    0:21:48 No, there’s even crazier examples.
    0:21:52 Like the Columbia-class nuclear submarine is going into service in 2035, and its expected
    0:21:54 lifetime runs through 2085.
    0:21:57 So how good were we in 1960 at guessing where we'd be today?
    0:21:58 It’s like…
    0:21:59 So unclear.
    0:22:00 So unclear.
    0:22:01 Yeah, we had technology to go to the moon.
    0:22:02 Yeah.
    0:22:03 Exactly.
    0:22:04 Like the quality of the phone.
    0:22:05 Yeah.
    0:22:06 Or like the computing power of the phone.
    0:22:07 Yeah.
    0:22:11 And so, these timelines just get longer and longer, and it's a death spiral, these things.
    0:22:14 Contrast that cycle of development with China.
    0:22:15 Do they take 12 years?
    0:22:17 How does their tech stack up to ours?
    0:22:22 The single best stat for this is they are running hundreds of tests a year of hypersonic
    0:22:23 weapons.
    0:22:24 Right.
    0:22:25 The U.S. is running like four.
    0:22:26 Right.
    0:22:29 Anyone who’s worked in technology understands the compounding value of iterating on these
    0:22:33 things, and it is just so undervalued.
    0:22:34 Why is that the case?
    0:22:39 Look, the U.S., all these tests are very expensive, very complicated.
    0:22:42 There’s so much build up because every test has to go well, because we do relatively
    0:22:43 few tests.
    0:22:46 So then it increases the risk and the duration that you prep for these tests and increase
    0:22:47 the cost.
    0:22:48 The cycle times are long.
    0:22:49 Yeah.
    0:22:50 And you’re just in this vicious negative cycle.
    0:22:53 Like anyone who’s worked in software understands this, like the old school way of releasing
    0:22:54 software.
    0:22:56 If you did a yearly release, you try to shove everything you can into that.
    0:22:59 The risk goes through the roof, quality is a disaster.
    0:23:06 Speed has an insane quality of its own, in just how quickly you can learn and how
    0:23:09 much you can actually reduce costs on these things.
    0:23:14 And so they’re just much more willing to test and iterate in a way that the U.S. is not
    0:23:15 right now.
    0:23:18 And so I think that is like long-term, the biggest thing I worry about for the U.S. is
    0:23:22 the pace, the pace of iteration, pace of iteration on these things.
    0:23:27 It probably is the single biggest determining factor of how successful you’re going to be
    0:23:29 over a 20 to 30 year period.
    0:23:30 How do we create a sense of urgency?
    0:23:31 Yeah.
    0:23:35 Like you look at that retooling, we had a two-year period of Lend-Lease, and the amount
    0:23:38 of GDP that was spent on Lend-Lease at the time was through the roof.
    0:23:41 And we weren't at war then, it was for other people.
    0:23:46 So we had a two year head start to recondition U.S. industry around this before we even entered
    0:23:47 into a conflict.
    0:23:48 And that's about how long it took.
    0:23:53 And Russia in particular, about the same duration, about two years to retool their industry toward
    0:23:58 defense production, and they are now outproducing all of NATO on munitions.
    0:23:59 Russia.
    0:24:00 Yeah.
    0:24:01 Russia.
    0:24:02 Well, I believe it.
    0:24:04 And we've sanctioned them to hell and they're still doing it, right?
    0:24:05 Well, they still have gas.
    0:24:07 They still have plenty of gas.
    0:24:11 And so it’s quite tricky to think you’re going to reconstitute this in a single day.
    0:24:13 I think the department has a lot of urgency on it.
    0:24:15 One of the areas where we see it is showing up with weapons.
    0:24:20 So when you look at these wargaming sort of scenarios, all these wargames are sort of
    0:24:22 questionable in their own ways.
    0:24:27 But pretty consistently, the stockpile of key U.S. munitions is exhausted in about eight
    0:24:28 days.
    0:24:30 And that’s usually problematic.
    0:24:37 And that is because we have gone down this path of thinking that we’ll be able to have
    0:24:40 this Gulf War strategy of concluding a conflict in two or three days and that’s how we’re
    0:24:41 going to fight our wars.
    0:24:42 And it’s just not true.
    0:24:43 Right?
    0:24:45 It’s like any of these high intensity conflicts.
    0:24:47 Well, for any adversary that matters.
    0:24:48 That’s right.
    0:24:49 It’s not even close.
    0:24:51 We’ve got to be prepared to sustain these protracted conflicts.
    0:24:54 And that in and of itself is probably one of the best deterrent factors we can have.
    0:24:55 Exactly.
    0:24:56 It’s like, we will not stop.
    0:24:57 We will not back down.
    0:24:59 We will have the capacity to withstand anything.
    0:25:00 Right?
    0:25:03 That is a message we need to send to our adversaries worldwide.
    0:25:06 We have critical gaps on a lot of the kind of constituent parts of supply chain.
    0:25:08 This is a national security issue.
    0:25:11 So I think there is a feeling that this is a problem.
    0:25:13 I don’t think anyone thinks everything’s going great.
    0:25:15 Now the question is, what are the strategies to get a way out?
    0:25:16 Right?
    0:25:19 I don’t think there’s any debate that we’re on our back foot in terms of the capacity
    0:24:22 we need, the mass we need, the types of systems we need.
    0:25:24 Now like, how do you get out of it?
    0:25:25 That’s a much harder question.
    0:25:31 And do it in a way that is going to work with Congress, is affordable, is actually something
    0:25:32 we can sustain.
    0:25:36 The path we're on is probably more incremental than revolutionary, I would say, with the
    0:25:40 US government, where companies like us are going to come in and win incremental new programs
    0:25:41 and show the different problems.
    0:25:42 Yes.
    0:25:43 We’ll be more innovative.
    0:25:44 We’ll be more innovative.
    0:25:45 I think that flywheel is really starting to go.
    0:25:47 It's still a volume issue.
    0:25:48 But it is a major volume issue.
    0:25:53 And I think on the weapons production side, look, the only solve out of this is to actually
    0:25:56 tap into the commercial and industrial supply chains that exist.
    0:25:57 We’re pretty good at building cars.
    0:25:58 We’re pretty good at building electronics.
    0:25:59 Certainly the components that go into it.
    0:26:00 The components for sure.
    0:26:01 We’ve been building a lot of components.
    0:26:02 Like we can do this stuff.
    0:26:03 Yeah.
    0:26:06 And you could design your systems in a way that take advantage of those commercial supply
    0:26:07 chains.
    0:26:09 Like one example we have is maybe like a low cost cruise missile.
    0:26:10 Yeah.
    0:26:11 It’s very cool.
    0:26:12 Several hundred mile range.
    0:26:16 And we made the exterior fuselage in this process that’s used for making acrylic bathtubs.
    0:26:17 It’s this hot press process.
    0:26:19 It’s like we’re making the gas tank.
    0:26:20 This is a mad scientist thing.
    0:26:21 It’s incredible.
    0:26:22 It’s awesome.
    0:26:25 And the fuel tank is made with the same rotomolding process you use for, like, making
    0:26:26 plastic toys.
    0:26:27 And it works great.
    0:26:28 Yeah.
    0:26:30 There’s a huge supply base that’s available to do these things.
    0:26:35 And contrast that with most of these traditional weapons, where it's like overly bespoke components,
    0:26:40 we've got to get the dude that knows how to solder this one thing out of retirement, and the supply
    0:26:44 chains are super deep, like four-year lead times on these weapons.
    0:26:46 Like it's really, really bad once you get into it.
    0:26:50 Like I saw this thing, the defense primes were like, we need to change the federal acquisition
    0:26:55 rules so that we can stockpile four-year lead time parts.
    0:26:56 Like, a four-year lead time part!
    0:26:57 What do we hear?
    0:26:58 What do we do?
    0:26:59 What do we do?
    0:27:00 The world has changed in four years.
    0:27:01 Yeah.
    0:27:02 Like, what could be happening by then?
    0:27:04 And so I think there’s a problem, but then the government doesn’t help or they don’t
    0:27:06 allow them to change the components.
    0:27:08 There’s no incentive to change the components.
    0:27:09 Well, so this is the problem.
    0:27:10 There’s no origin.
    0:27:11 It goes back to there’s no origin.
    0:27:12 Exactly.
    0:27:14 And so look, I think a lot of the traditional players are like patriots and they really
    0:27:18 care, but it's like they're in a system that doesn't encourage them, support them, or even reward them.
    0:27:21 I kind of boil it down to like two key things.
    0:27:23 One is meaningful redirection of resources.
    0:27:29 So like right now, the amount of money that's actually spent on capabilities, like the types
    0:27:33 of things we're working on, is somewhere between 0.1 and 0.2% of the defense budget.
    0:27:34 That seems pretty low.
    0:27:40 Even if we got to 2%, at 2% we are like in a wildly different world in terms of what you
    0:27:42 can do with that type of money.
    0:27:44 You’re making like a VC sounding pitch.
    0:27:50 Yeah, if I could even, if I could just get 1%, but that’s actually very helpful context
    0:27:51 all kidding aside.
    0:27:53 This is a crazy small number.
    0:27:57 It’s a crazy small number and even the small numbers are pretty big, but you really need
    0:27:58 to up this.
    0:28:03 So like number one is make the hard choices to drive redirection of resources into the
    0:28:06 technologies that are actually going to be what you need, right?
    0:28:10 Where they’re so stuck with these legacy costs.
    0:28:15 Number two is every company in the world gets this, which is you need to empower good people
    0:28:20 to run hard at a problem and put all the things that they need to do it and all the approvals
    0:28:23 and all of that under their command to just get to yes.
    0:28:24 Yes.
    0:28:25 Yes.
    0:28:26 It’s very simple, right?
    0:28:29 That’s how every company operates and that is how you’re successful.
    0:28:31 Just empower good leaders to get results.
    0:28:32 Yeah.
    0:28:33 Hold them accountable.
    0:28:36 It is the opposite of how it works in the Pentagon where every time something has gone
    0:28:41 wrong, a new process and a new office has been added to check the homework and say no
    0:28:43 and they slow all progress down.
    0:28:46 And so I think there’s relatively simple things that can be done with some combination of
    0:28:52 congressional action and executive action to flip that on its head, say, nope, these
    0:28:57 program offices are fully empowered to field their capabilities and they’re just accountable
    0:29:00 to senior leaders on the risk of trade offs.
    0:29:01 Yeah.
    0:29:02 And that’s it.
    0:29:05 And you give them a budget, give them a budget, give them a target and they have to understand
    0:29:06 the risk.
    0:29:10 They have to do all this, but they’re going to make informed choices on risk and cost
    0:29:11 and schedule and performance trade offs.
    0:29:12 Yeah.
    0:29:13 That’s their job.
    0:29:14 That’s what we’re hiring them to do.
    0:29:18 If we create really empowered people to actually field stuff, you will get amazing results
    0:29:20 because there are really good people in the government.
    0:29:24 It’s just there are 10 times as many people who say no as there are to people who are
    0:29:25 accountable for doing it.
    0:29:26 Oh, that’s fascinating.
    0:28:28 10 times more people who hang around to say no than say yes.
    0:29:29 That’s right.
    0:29:32 Could you do just like a project warp speed for defense?
    0:29:37 I know that’s like, that implies something short term, like it’s like a one-time catch-up
    0:29:38 or something.
    0:29:40 Yeah, this probably needs to be just like a permanent shift.
    0:29:41 I think you have to do both.
    0:29:42 Right.
    0:29:45 So you’ve got to say, look, yeah, we need a warp speed for autonomous systems or weapons.
    0:29:46 We need that.
    0:29:47 Right.
    0:29:48 That’s a no brainer that we need to have.
    0:29:53 And in doing that, you can tease out what are those things that you cut and everything
    0:29:55 worked out fine.
    0:29:58 And you just didn’t need to do it again.
    0:30:02 And then in parallel, you do the painful and slow process of just whacking back all
    0:30:04 these like bureaucratic things that exist.
    0:30:07 I think you got to do something right and use that as a template.
    0:30:12 And so these sort of like things that prove you can be successful, do more of them, go
    0:30:16 at bigger scale, while also cutting back all the nonsense on things that just don’t need
    0:30:17 to exist anymore.
    0:30:19 They made sense at the time.
    0:30:23 Now let’s revert, walk back and reset where we actually need to be for where we are.
    0:30:24 Like tech has changed.
    0:30:25 The pace has changed.
    0:30:26 Reflect that in your process.
    0:30:31 It seems even before the stuff we were just talking about in 2019, when you guys started
    0:30:37 the company in 2017, starting a company in defense was extremely unpopular.
    0:30:41 And when you talk about what do you need to succeed as a startup, there’s so many things,
    0:30:46 but capital talent, relationships with customers, like all of those things are way, way harder
    0:30:52 or were way, way harder in defense in 2017, and in fact, like radioactive for some in
    0:30:53 2017.
    0:30:57 A lot of the engineers and people were just, like, religiously opposed.
    0:31:02 Now it seems that there’s this whole new burgeoning interest in defense startups and we have an
    0:31:06 American Dynamism Fund and lots of people are interested.
    0:31:07 How did that happen?
    0:31:10 Because it seemed to happen a little bit before Ukraine too.
    0:31:12 Started to shift just before Ukraine.
    0:31:13 Yeah.
    0:31:14 What was the cause of that?
    0:31:19 So yeah, when we started, I mean, the number of VCs who gave us like ethics interviews or
    0:31:27 just said no or look, like my crass take is that Silicon Valley is like quite mimetic.
    0:31:29 The VC world as well.
    0:31:36 And once the mainline funds like you guys, Founders Fund, General Catalyst all came out
    0:31:41 and said, we’re doing this and like our valuation was high enough, then everyone was like, then
    0:31:42 they got it.
    0:31:43 Chase, chase.
    0:31:44 Yeah.
    0:31:46 I think that was like step one was it was sort of normalized.
    0:31:47 Yeah.
    0:31:50 Mainstream VC funds were saying, no, we're doing this, this is important.
    0:31:53 I know Marc put out a post on it at the time.
    0:31:58 And so I think that was like the snowball then of, okay, this is succeeding.
    0:31:59 It’s actually okay.
    0:32:01 Everyone’s been told it’s okay.
    0:32:05 And then there was this catalyzing event around Ukraine.
    0:32:09 And then I think on the why so many defense tech startups, it’s like, look, this stuff
    0:32:11 is, I think, very important work.
    0:32:14 It's also, as an engineer, just some of the hardest and most interesting problems you're
    0:32:15 going to work on.
    0:32:16 Yeah.
    0:32:23 When engineers grew up looking at Skunk Works and seeing the SR-71 Blackbird, all these
    0:32:28 wild things that the US was able to pull off, that was your inspiration growing up as an
    0:32:29 engineer.
    0:32:30 Yeah.
    0:32:31 Like this stuff is iconic.
    0:32:32 People want to work on these things.
    0:32:36 And so I think it just really mobilized people who really cared about this.
    0:32:40 And then you have a ton of vets who are leaving the military and just want to solve problems
    0:32:41 that they encountered.
    0:32:42 Yeah.
    0:32:44 And so you just have all these kind of a ton of interest in working on it now, a ton of
    0:32:49 capital because they’ve seen our success, they know it can be done, and then just the
    0:32:52 social normalization of the whole thing really flipped the narrative.
    0:32:53 Yeah.
    0:32:59 And I would say the evolution of the sort of primitives for technology has actually advanced
    0:33:01 the opportunity big time, right?
    0:33:05 So like a lot of the dollars that would go to something like an aircraft carrier, which
    0:33:11 is untouchable for a startup, should go to smaller form factor, attritable, fully autonomous
    0:33:12 equipment.
    0:33:13 You’re 100% right.
    0:33:17 And a big part of our strategy on this has been like, we are leaning into everywhere
    0:33:19 where there’s commercial investment.
    0:33:24 And so many of the things that historically have been like defense exclusive are no longer
    0:33:25 the case.
    0:33:26 Totally.
    0:33:28 One of the examples of this is we built this electronic warfare system.
    0:33:29 It’s really cool.
    0:33:31 It’s a jammer, sensors, jams, radio signals.
    0:33:35 If we did that five years ago, 10 years ago, you would have custom tape out chips.
    0:33:36 It’s hundreds of millions.
    0:33:37 Yeah.
    0:33:38 And it’s a huge thing.
    0:33:40 So only government funded things did it.
    0:33:42 It was on a really slow cycle.
    0:33:47 Well, now with all the 5G tech, this is like the performance of these things is through
    0:33:48 the roof.
    0:33:49 You just take commercial parts.
    0:33:50 Yeah.
    0:33:53 And then just being the fastest to integrate and understand how to utilize these technologies
    0:33:54 becomes the advantage.
    0:33:55 Same with AI.
    0:33:57 It was like, we don’t do AI model research.
    0:33:58 Yep.
    0:33:59 We don’t need to.
    0:34:00 Yeah.
    0:34:01 We just take the best things that are there.
    0:34:02 The best models.
    0:34:03 Yeah, exactly.
    0:34:04 So riding these tech waves has been a huge part of it.
    0:34:08 And that is the macro shift that occurred that the department hasn't reconciled yet,
    0:34:13 which is like the innovation is much more coming from the commercial world.
    0:34:15 So it becomes being the best adopter.
    0:34:18 It is no longer these 10-year tech road maps of the department controls.
    0:34:19 Yes, exactly.
    0:34:21 It is a totally different world we’re living in.
    0:34:25 And so I think, yeah, the macro piece of why a company like us can succeed: major technology
    0:34:30 shifts around where the innovation is coming from, huge geopolitical shifts.
    0:34:31 Yes.
    0:34:35 And then the consolidation of the existing industrial base with the bad incentives has
    0:34:37 led to an erosion of capacity.
    0:34:41 And so you combine all these things together and you’re like, the conditions were sort
    0:34:44 of set for us to be successful.
    0:34:45 Yes.
    0:34:46 Yeah.
    0:34:47 I don't think we could have done it five years later.
    0:34:48 It would be too late.
    0:34:49 Five years earlier.
    0:34:50 Probably would have been too early.
    0:34:51 It wouldn’t have worked.
    0:34:52 Yeah.
    0:34:53 I think we were in this like two to three year window where we could ride all those waves
    0:34:54 correctly.
    0:34:55 Yeah.
    0:34:56 Brian, it’s so fun to be with you.
    0:34:57 Thanks a ton for spending the time.
    0:34:58 Yeah.
    0:35:01 Thank you for what you’re building as your investor, but more importantly, for all of
    0:35:02 America.
    0:35:02 Thank you.
    0:35:22 [MUSIC]

    How is AI reshaping modern warfare? 

    Speaking with a16z Growth General Partner David George, Anduril cofounder and CEO Brian Schimpf discusses how AI helps humans make better strategic decisions by sorting through the enormous amount of data collected from modern battlefields. Schimpf also discusses navigating the US government’s complex procurement processes, using commercial technologies to kickstart their own product development, and the growing opportunities for startups in defense. Throughout, Brian offers a deep dive into the intersection of technology, geopolitics, and the future of defense.

    This episode is part of our AI Revolution series, where we explore how industry leaders are leveraging generative AI to steer innovation and navigate the next major platform shift. Discover more insights and content from the AI Revolution series at a16z.com/AIRevolution.

     

    Resources: 

    Find Brian on X: https://x.com/schimpfbrian

    Find David on X: https://x.com/davidgeorge83

     

    Stay Updated: 

    Let us know what you think: https://ratethispodcast.com/a16z

    Find a16z on Twitter: https://twitter.com/a16z

    Find a16z on LinkedIn: https://www.linkedin.com/company/a16z

    Subscribe on your favorite podcast app: https://a16z.simplecast.com/

    Follow our host: https://twitter.com/stephsmithio

    Please note that the content here is for informational purposes only; should NOT be taken as legal, business, tax, or investment advice or be used to evaluate any investment or security; and is not directed at any investors or potential investors in any a16z fund. a16z and its affiliates may maintain investments in the companies discussed. For more details please see a16z.com/disclosures.

  • Raging Moderates: Trump’s Immigration Crackdown and the Democrats’ Muted Response

    AI transcript
    0:00:01 – Nice.
    0:00:03 – Support for the show comes from Nerd Wallet.
    0:00:05 When it comes to finding the best financial products,
    0:00:06 have you ever wished someone
    0:00:08 would do the heavy lifting for you?
    0:00:10 Take all that research off your plate?
    0:00:13 Well, with Nerd Wallet's 2025 Best of Awards,
    0:00:14 that wish has come true.
    0:00:16 The Nerds at Nerd Wallet are on it.
    0:00:19 They have already reviewed more than 1,100 financial products
    0:00:22 like credit cards, savings accounts, and more
    0:00:25 in order to highlight and bring you the best of the best.
    0:00:28 Check out the 2025 Best of Awards today
    0:00:30 at nerdwallet.com/awards.
    0:00:35 – Hey, whatcha doing?
    0:00:37 – Programming our thermostat to 17 degrees
    0:00:39 when we’re out at work or asleep.
    0:00:42 We’re taking control of our energy use this winter
    0:00:45 with some easy energy saving tips I got from FortisBC.
    0:00:47 – Ooh, conserve energy and save money?
    0:00:49 Maybe to buy those matching winter jackets?
    0:00:51 – Uh, no.
    0:00:53 We’re also getting that whole matching outfit thing
    0:00:54 under control.
    0:00:57 – Discover low and no-cost energy saving tips
    0:01:00 at FortisBC.com/EnergySavingTips.
    0:01:01 Matching track suits?
    0:01:02 – Please, no.
    0:01:06 – Support for Prop G comes from Crescent Family Office.
    0:01:08 As an entrepreneur, you spend a lot of time
    0:01:09 and energy building your business.
    0:01:11 And chances are, you’ve been so busy,
    0:01:12 there hasn’t been a ton of time to think about preparing
    0:01:16 for an exit, tax strategies, and wealth management.
    0:01:18 Crescent is here to help wealth creators and families
    0:01:21 like yours streamline complexity and invest for the future.
    0:01:23 Crescent was built by entrepreneurs for entrepreneurs
    0:01:25 with financial advisory teams who embraced
    0:01:29 a fiduciary duty to place the client’s interests first.
    0:01:31 You can learn how to optimize your life
    0:01:33 by scheduling a call with a Crescent founder
    0:01:35 at CrescentCapital.com.
    0:01:36 We are not clients of Crescent.
    0:01:37 There are no material conflicts
    0:01:39 other than this paid endorsement.
    0:01:42 All investing involves risk, including loss of principal.
    0:01:49 – If you’re enjoying our content, just a quick ask.
    0:01:53 Please, right now, hit that subscribe button on YouTube
    0:01:55 and also hit subscribe
    0:01:58 in our dedicated Raging Moderates feed.
    0:02:00 Jess has two young kids at home.
    0:02:01 Be supportive.
    0:02:02 – They have to go to college,
    0:02:05 even though higher ed is such a disaster.
    0:02:06 – Hit it now.
    0:02:09 Please help us out and in exchange, we will do our best.
    0:02:11 Maybe we won’t be as moderate as you like,
    0:02:14 but we promise we’ll be very, very raging.
    0:02:15 – We’re not supposed to tell them that.
    0:02:16 – That’s right.
    0:02:17 Now we’ll be both raging and moderate.
    0:02:19 Please hit the subscribe button.
    0:02:24 – Welcome to Raging Moderates, I’m Scott Galloway.
    0:02:25 – And I’m Jessica Tarlov.
    0:02:28 – Jess, this is the part of the show where we banter.
    0:02:29 – I’m reading the same note.
    0:02:31 What would you like to banter about?
    0:02:33 – Well, let’s bring this back to me.
    0:02:35 Show, ask me what I do this weekend.
    0:02:36 – What did you do this weekend?
    0:02:37 How fantastic was it?
    0:02:38 – Well, Jess, I don’t like to talk about me
    0:02:41 or my personal life, but on Saturday morning,
    0:02:45 I had my 14 year old, just me and him.
    0:02:46 And so I said, what do you wanna do?
    0:02:48 And I knew it involves something in football.
    0:02:51 So we jumped on the Euro star on Saturday morning.
    0:02:52 – Which is lovely.
    0:02:53 – Amazing.
    0:02:54 – Yeah.
    0:02:55 – It takes two hours and 20 minutes.
    0:02:59 Pancras, or St. Pancras Station is about 10 minutes
    0:03:02 from where we’re living.
    0:03:05 Two hours and 20 minutes later, we’re at Gare du Nord,
    0:03:07 which is I think train station of North or something
    0:03:10 in French, just for those of you out there.
    0:03:12 And then we’re at the hotel.
    0:03:15 We went and had an amazing, my favorite,
    0:03:19 or our favorite lunch, we had steak frites.
    0:03:22 And then we went to the most amazing little stores,
    0:03:25 mostly chocolatiers and crepe stores.
    0:03:27 And then we went back and napped and went to the pool.
    0:03:29 When you have a 14 year old, you gotta go to the pool.
    0:03:30 I’ve learned that.
    0:03:32 – That lasts until high school?
    0:03:33 – Yeah, no.
    0:03:34 – That they’re into going to the pool?
    0:03:35 – You know, his only criteria for a hotel
    0:03:36 is do they have a pool?
    0:03:39 And we always go to the pool.
    0:03:42 And then we went and then we had a dinner
    0:03:45 at the most fabulous restaurant at the top
    0:03:48 of this new fancy hotel called Cheval Blanc.
    0:03:52 And we got dressed up and then we got dressed down
    0:03:56 and we went to the PSG, Paris Saint-Germain, football match,
    0:03:58 that’s the major team in Paris.
    0:04:00 Oh, someone’s gonna say, “No, they’re not.”
    0:04:03 They tied Reims, which was a big disappointment for PSG,
    0:04:05 but we went and saw the football game,
    0:04:08 came back next morning, woke up, had a lovely breakfast
    0:04:12 and then went to Notre Dame, which is spectacular.
    0:04:13 – Oh, it looks amazing.
    0:04:16 – Spectacular and then caught the Eurostar back,
    0:04:18 I mean, 24 hours and just it was one of those
    0:04:22 incredible weekends with my son.
    0:04:23 What did you do?
    0:04:25 – Well, I went to a three-year-old’s birthday party
    0:04:26 on Saturday morning.
    0:04:30 There was cake, funfetti flavor, which is my favorite.
    0:04:31 So–
    0:04:32 – Do they rent a place or is it one of those
    0:04:33 where it’s rich people in Tribeca?
    0:04:34 Do they have to rent a place?
    0:04:37 – Well, I think both of those types classify as–
    0:04:38 – Wealthy.
    0:04:39 – Rich people, ’cause if you can rent a place
    0:04:41 for your toddler’s birthday,
    0:04:44 they did it at their building, actually.
    0:04:47 So, and I don’t know what they earn,
    0:04:50 so I can’t comment on that, but it was a lovely party.
    0:04:52 And then Saturday night,
    0:04:55 we went out to dinner with three other couples,
    0:04:58 which was very lively, a lot of fun.
    0:05:01 We’re trying out a new babysitter.
    0:05:02 It seemed to go okay.
    0:05:06 The toddler by the end said, “I liked new babysitter.”
    0:05:09 So, great, but it was nice to go out.
    0:05:12 You know how you usually are out with one other couple,
    0:05:15 but being out with three other couples,
    0:05:16 tons of conversations going on.
    0:05:19 We had a nice time, but home by 9.30,
    0:05:20 asleep by 10, the usual.
    0:05:23 – So, four couples that usually gives everyone permission
    0:05:24 to drink more is what I find.
    0:05:25 And two–
    0:05:28 – There was a lot of drinking and a discussion
    0:05:30 about how one person thinks
    0:05:32 that their partner drinks too much.
    0:05:34 – It makes company, that’s nice.
    0:05:37 – Yeah, I asked to be excluded from the conversation,
    0:05:40 ’cause also I barely drink, I’m not.
    0:05:43 I’m a lightweight, and I guess not that fun.
    0:05:45 And so, I’m always very uncomfortable
    0:05:46 with those conversations.
    0:05:48 – Yeah, so I’ll give you a little bit of a heads up
    0:05:50 on what’s coming your way.
    0:05:53 In about seven to 10 years,
    0:05:55 you’re gonna freak out about the fact
    0:05:58 that you’ve spent so much time just raising kids,
    0:05:59 and that you’re losing your youth,
    0:06:01 and you and your female friends will start partying
    0:06:04 like fucking rock stars diagnosed with ass cancer.
    0:06:06 You’re gonna start drinking like crazy.
    0:06:09 You’re gonna start doing girls trips all the time.
    0:06:10 You’re gonna abuse alcohol,
    0:06:11 you’re gonna experiment with drugs,
    0:06:14 and you’re gonna make all the guys trips
    0:06:18 that movies have depicted seem like a tea party.
    0:06:21 They talk about the midlife crisis that men have.
    0:06:24 I think men’s are longer, but less severe.
    0:06:27 The midlife crisis for women happens earlier.
    0:06:28 Specifically, I think when they leave
    0:06:30 kind of their birthing years and they’re worried
    0:06:34 they’re losing kind of their hot girl 20s and 30s,
    0:06:36 and they go ape shit.
    0:06:37 So anyways, you got that coming.
    0:06:38 You got that coming.
    0:06:40 Well, well, that sounds like fun,
    0:06:42 ’cause I, yeah, I’m definitely not doing any of that.
    0:06:42 No, none of that.
    0:06:43 At this point.
    0:06:45 Well, it’s hard, you know, what are you gonna do?
    0:06:47 Every once in a while, I have a lot of good friends
    0:06:48 because I–
    0:06:50 You have a second glass of Chardonnay.
    0:06:51 Is that when you–
    0:06:54 No, I’m not, wait, I have one drink also.
    0:06:54 What is your drink?
    0:06:58 I only drink Tito’s grapefruit and soda.
    0:06:59 You’re so fancy.
    0:06:59 It’s like a modified Paloma.
    0:07:01 So here’s another prediction.
    0:07:02 I mean–
    0:07:04 You’re gonna start getting unsolicited bottles of Tito’s.
    0:07:07 When I started talking about how much I love Zacapa,
    0:07:09 I’d show up to events and they’d have a bottle of Zacapa.
    0:07:13 So everyone, Tito’s for the lady.
    0:07:15 Good, good, all right, enough of that.
    0:07:17 Great banter, I loved it.
    0:07:18 Yeah, that worked, check that box.
    0:07:20 In today’s episode of Raging Moderates,
    0:07:23 we’re discussing Trump’s whirlwind first week in office,
    0:07:25 how Democrats are responding to Trump,
    0:07:27 and what it really means to be a moderate
    0:07:29 in today’s political climate.
    0:07:31 Okay, so let’s bust right into it.
    0:07:33 Donald Trump has hit the ground running
    0:07:35 in his first week back in office,
    0:07:38 signing nearly 50 executive orders and actions
    0:07:41 that are already sort of reshaping politics in our country.
0:07:43 These include escalating immigration crackdowns,
0:07:48 deportation flights, and a southern border troop surge,
    0:07:49 as well as targeting birthright citizenship
    0:07:51 and freezing asylum programs.
    0:07:54 On Sunday, he also got into a feud
    0:07:57 with the president of Colombia over deportation flights.
    0:07:59 Meanwhile, he has halted foreign aid worldwide
    0:08:01 and revoked security clearances
    0:08:03 from former officials critical of him.
    0:08:05 Add to that his controversial pardons of January 6th
    0:08:08 defendants, the confirmation of cabinet appointments,
0:08:10 including Pete Hegseth, despite numerous allegations
    0:08:13 against him and threats to eliminate FEMA
    0:08:15 while visiting disaster zones,
    0:08:18 including Hurricane Helene’s wreckage in North Carolina
    0:08:20 and wildfire ravaged California.
    0:08:22 It’s a whirlwind start that is already redefining
0:08:24 how America governs, communicates,
    0:08:26 and is viewed globally.
    0:08:31 Jess, there’s a lot to unpack here from Trump’s first week.
    0:08:32 What stood out to you the most?
    0:08:34 – That there is a lot.
    0:08:36 That’s the point, right?
    0:08:41 We’re so quickly back to where we left Trump 1.0,
    0:08:45 which is I’m gonna throw everything at the wall
    0:08:47 and just see what sticks.
    0:08:51 And it feels as though we have a bunch of angry teens
    0:08:52 that are in charge of the government, right?
    0:08:55 Trying everything, pushing boundaries,
    0:08:58 waiting for someone to slap them on the wrist to push back.
    0:09:02 Maybe that comes from, you know, part of the internal caucus,
    0:09:05 but we’ll talk about that later on
    0:09:07 with some of the DEI initiatives that they scaled back,
    0:09:10 but like waiting for the courts to come and get them,
    0:09:13 you know, hoping that they just can skate through
    0:09:15 with some of this stuff.
    0:09:19 I feel there’s an overwhelming sense of manufactured chaos
    0:09:23 to everything and that we are living in the midst of,
    0:09:24 you would know better than me.
    0:09:28 If it is, in fact, the biggest branding exercise in history,
    0:09:30 but this golden age of America,
    0:09:33 and he has lots of social media spots about it
    0:09:37 out there on all of his channels,
    0:09:40 it feels like he’s just going full steam ahead
    0:09:45 with this social media driven approach to governance,
    0:09:49 loyalty tests, everything that he left on the table
    0:09:50 from the first time around,
    0:09:53 he’s picking it up and doing it to the nth degree
    0:09:55 if he can get away with it.
    0:09:58 And he’s just calling our bluff, right?
    0:10:01 Just saying, well, come and get me, right?
    0:10:04 So that’s my feeling writ large
    0:10:07 about what’s gone on so far, what about you?
    0:10:09 – Yeah, I think that’s right.
    0:10:12 It feels as if one, you could argue this is leadership
    0:10:14 that he’s had kind of four years to prepare
    0:10:17 for what he would do and just hitting the ground running
    0:10:19 and has decided, okay, I’m gonna, you know,
    0:10:21 promises made, promises kept,
    0:10:24 and it’s going aggressively at everything he talked about
    0:10:26 and moving, you know, their fleet of foot,
    0:10:28 signing executive orders on the dais.
    0:10:30 So you could argue it’s leadership.
    0:10:32 I like what you said, the world’s largest branding event.
    0:10:33 I hadn’t thought of it that way.
    0:10:35 I think that’s really interesting.
    0:10:38 At the same time, you sort of flooding the zone,
    0:10:41 which is so much shit or actions
    0:10:45 that wouldn’t have flown before.
    0:10:47 And one of the things about our government,
    0:10:48 the reason why we have three branches
    0:10:51 is it’s meant to have this wonderful intransigence
    0:10:53 where we sometimes to a fault,
    0:10:56 wrestle stuff to the ground and really examine it.
    0:10:58 And there’s just none of that now.
    0:11:01 The, I would give more points to the Trump administration
    0:11:03 than I would give to the Democratic party right now
    0:11:05 who appears to be just caught flat footed.
0:11:07 In this mix of, can you believe this shit?
    0:11:08 We don’t know what to do.
    0:11:11 And trying to sign up for this PBS Hallmark Channel
    0:11:14 bullshit of we need to come together
    0:11:16 and we need to cooperate with the president.
    0:11:17 And they’re all trying to,
    0:11:21 they’re all in such shock about this win.
    0:11:24 And, you know, while it was a small number of votes,
0:11:26 he went kind of seven for seven in swing states,
    0:11:28 they’re all trying to pretend to be more moderate
    0:11:30 and say, I’ll work with the president
    0:11:32 and they don’t want to come right out of the box
    0:11:33 shitposting them.
    0:11:34 I think that’s a failed strategy.
    0:11:35 I think they wanted war.
    0:11:37 I think we should have it.
    0:11:41 I’m not up for normalizing an insurrectionist and a rapist.
    0:11:43 So I guess that doesn’t make me a moderate
    0:11:44 because I won’t sign up to this.
    0:11:45 – Great.
    0:11:47 We’ll just end the podcast right now.
    0:11:50 – But I would argue that there’s this,
    0:11:54 we have this sort of notion it’s time to come together.
    0:11:55 – I don’t agree.
    0:11:58 I think Democrats and moderate Republicans
    0:12:00 need to come to the rescue.
    0:12:02 I think some of this stuff is just so over the line
    0:12:06 and so un-American, you know,
    0:12:10 rescinding the security detail of people who he doesn’t like.
    0:12:14 – With live Iranian threats to their lives.
    0:12:16 I mean, if you have Tom Cotton,
    0:12:19 maybe the hero in all of this,
    0:12:21 which is frightening to me, but.
    0:12:24 – Well, he wants a security detail when he’s out of office.
    0:12:26 – Well, he definitely wants one on January 6th.
    0:12:28 Oh, no, it was Josh Hawley.
0:12:31 He was running away like a little girl.
    0:12:32 No offense to little girls.
    0:12:34 Some of it is just petty.
    0:12:37 Some of it is the revenge tour.
    0:12:40 Some of it I do think is rooted in a genuine ideology,
    0:12:42 but then it’s always taken one step further.
    0:12:46 And I’ve been thinking about this notion of
    0:12:50 what is the kernel of truth that gets us
    0:12:52 to the crazy place that we’ve ended up, right?
    0:12:55 Because a lot of what’s going on right now,
    0:12:56 I believe is Democrats fault, right?
    0:12:59 Like we walked into very specific traps.
    0:13:02 And so we ended up with people telling us,
    0:13:04 essentially I want a bloodless revolution, right?
    0:13:07 I want the establishment so far away
    0:13:09 that I can’t even see Nancy Pelosi.
    0:13:11 All of this makes me sick.
    0:13:14 And I would rather take a flyer on the chaos agent
    0:13:19 than have someone who is routine and boring
    0:13:23 and also is afraid to say things that are common sense.
    0:13:25 Like the thing that’s right in front of you.
    0:13:29 So if we look at with the DEI revolution
    0:13:32 that’s going on now, that comes from, you know,
    0:13:34 after the murder of George Floyd,
    0:13:36 that companies just added, you know,
0:13:40 50 to 100 DEI employees, it was crazy.
    0:13:44 I was looking at data from Loudoun County, Virginia.
    0:13:47 They have a DEI office that they’re spending enough on it
    0:13:50 that they could hire 125 new teachers
    0:13:51 if they took that money and did that.
    0:13:54 Obviously a parent in that district is going to say,
    0:13:56 I would rather have new teachers
    0:14:00 than have dozens of employees in the DEI office.
    0:14:02 For instance, on the trans issue,
    0:14:04 we talk about this regularly.
0:14:06 If you’re going to say that it’s fine for Lia Thomas
    0:14:09 to compete at the collegiate level,
    0:14:12 people are going to think that you are insane on trans issues
    0:14:13 and then they’re going to pick someone
    0:14:14 who’s going to over correct
    0:14:17 and take away protections for trans people.
    0:14:20 On immigration, if you’re going to say the border is secure,
    0:14:22 people are going to look at you like you’re a lunatic,
0:14:25 especially once migrants are getting bused
    0:14:27 to northeastern liberal cities
    0:14:29 and they’re understanding the implications
0:14:31 of how El Paso, Texas has been living.
    0:14:32 And we’re going to have the overcorrection
    0:14:35 and we’re going to see stuff like this.
    0:14:39 Like, I don’t even want to call it a rise in deportations.
    0:14:40 I find it fascinating,
    0:14:42 and this goes to the social media presidency
    0:14:45 or just who has the best branding exercise of this
    0:14:47 because they’re out there saying,
    0:14:48 promises made, promises kept.
    0:14:53 We deported 310 undocumented people or illegals,
    0:14:57 they would say, I would say undocumented because I’m polite.
    0:15:00 And Biden was deporting,
    0:15:02 sometimes it got over 400 people per day.
    0:15:05 I think on average, he was at 310 per day,
    0:15:07 but the Democrats never talked about anything
    0:15:09 that they were doing.
    0:15:11 And that’s the difference in this.
    0:15:13 If you leave people in the dark
    0:15:16 about what your administration is actually doing,
    0:15:18 and I’m talking about the stuff that matters to them,
    0:15:21 not the stuff that feels ancillary or for show,
    0:15:24 they’re going to pick the other person.
    0:15:26 And now people are running around saying,
    0:15:28 oh, well, Donald Trump is the toughest on immigration.
    0:15:30 No, actually Barack Obama was the toughest on immigration.
    0:15:32 And we have been deporting a lot of people.
    0:15:35 People don’t know that border crossings are down 55%
    0:15:37 at the end of the Biden administration,
    0:15:39 after obviously a huge surge in the beginning.
    0:15:44 So we need a new digital strategist, that’s for sure,
    0:15:46 the next time around that we do this.
    0:15:51 And I don’t know, have you listened to any of Chris Hayes’
    0:15:54 interviews about his new book about attention
    0:15:56 and like the attention economy?
    0:15:58 – Oh, that’s a new term, attention economy.
    0:15:59 Let me think, I started using that
    0:16:00 about 15 fucking years ago.
0:16:02 Anyways, what did Chris Hayes say?
    0:16:03 He’s discovered the attention economy?
    0:16:04 What does Chris have to say?
    0:16:07 – No, he didn’t, no, he didn’t discover it,
    0:16:11 but he’s talking about it in context of recent outcomes
    0:16:15 and this race just to be the first person to say something
    0:16:17 that someone sees on their phone
    0:16:18 and how meaningful that is for that.
    0:16:21 And I cannot think of an instance where Democrats
    0:16:24 were the first people to be able to say something.
    0:16:26 It’s always the Republicans and usually it’s Trump.
    0:16:28 – Well, just starting there.
    0:16:31 So first off, I would like to see
    0:16:34 the, what committee would it be?
    0:16:36 – We need immediately, in my opinion, to get,
    0:16:41 probably not Musk,
    0:16:43 because I think it would be too much of a spectacle
    0:16:45 and just bring him more power.
    0:16:48 It’s like one of those villains in a comic book
    0:16:50 that the more you throw shit at him,
    0:16:53 the more he absorbs it and becomes more powerful.
    0:16:56 But the Commerce, Science and Transportation Committee
0:17:01 should, in my opinion, have hearings on the tech industry’s
0:17:03 influence on democracy and our elections
    0:17:06 because there, I think, is now emerging evidence
0:17:09 that basically Musk and Yaccarino weaponized Twitter,
    0:17:12 including creating thousands and thousands of accounts
    0:17:17 to elevate misinformation and essentially spread
    0:17:19 just a ton of propaganda misinformation
    0:17:21 that had a real impact on the election.
    0:17:22 I’m not sure it’s illegal.
    0:17:23 It’s a private company.
    0:17:25 He can do what he wants with it.
0:17:29 But I want her up there, under oath, to say,
    0:17:31 yeah, I knew that he was creating thousands of bots
    0:17:33 pretending to be humans.
    0:17:37 And we were elevating information or lies,
    0:17:39 even though we knew they were lies,
    0:17:42 such that it would influence the outcome of the election.
    0:17:44 I just want her to go on record saying that
    0:17:47 so Americans know what they’re dealing with.
    0:17:50 And they have very effectively,
0:17:53 even if it was Umberto Eco, the Italian philosopher, who said,
    0:17:55 along the lines of the attention economy,
    0:17:57 that it’s not what you’re famous for,
    0:17:59 it’s just about being famous.
    0:18:00 So say something incendiary,
    0:18:02 and as long as you’re dominating the news cycle,
    0:18:04 I mean, I feel like Republicans are dominating
    0:18:06 90% of the news cycle right now.
    0:18:09 And unfortunately, we have Senator Schumer,
    0:18:11 who brightens up a room by leaving it,
    0:18:14 just kind of doing nothing or saying nothing.
    0:18:16 We have Speaker Emerita Pelosi,
    0:18:21 who just purchased $50,000 to $100,000 worth of call options
0:18:23 on Tempus AI.
    0:18:24 And when that was disclosed,
    0:18:27 the company had its best one day performance in history
    0:18:28 at 35%.
    0:18:30 So she’s spending more time on Robinhood,
0:18:32 engaging in what is effectively insider trading,
    0:18:36 than actually paying attention to real issues.
    0:18:39 You know, my question is, where the fuck are Democrats?
    0:18:41 I don’t agree with a lot of AOC’s policies,
    0:18:42 but at least she’s out there.
    0:18:43 At least she’s trying to push back.
    0:18:48 Where is Senator Klobuchar talking about,
    0:18:51 you know, antitrust and competition and inflation,
0:18:53 and talking about how the 10-year is surging,
    0:18:56 and that these policies are incredibly inflationary.
    0:19:00 We have, we’re literally fighting fire
    0:19:01 with fucking squirt guns.
    0:19:02 And when I say squirt guns,
    0:19:06 I mean senior leadership in the Democratic Party
    0:19:08 that is too old, too tame,
    0:19:10 thinks they’re in a PBS drama,
0:19:11 where they say, good sir,
0:19:14 and like hit them with their glove.
    0:19:15 I mean, enough already.
    0:19:18 That this is insane that we don’t have,
    0:19:20 I wanna see hearings on,
    0:19:22 let’s immediately have a hearing
0:19:24 on this new crypto and AI committee.
0:19:27 And the first thing is we need them to come,
0:19:29 this committee that was organized by the president.
0:19:33 We should have a Senate Judiciary Committee hearing
0:19:37 investigate the legal implications of Trump’s meme coins.
0:19:41 I want to bring the new head of this AI and crypto committee
    0:19:44 to explain the Trump and Melania coin.
    0:19:46 Let’s have him go on the record and say,
    0:19:49 what is this and is it good for the economy?
    0:19:52 And we’ll be able to invite dozens of the 60,000 people
    0:19:55 who bought this coin and are now off 70 or 80%
    0:19:57 in about 72 hours.
    0:19:58 Instead, we just sort of sit there
0:20:01 and give this bullshit, it’s time to come together.
0:20:04 There are things we can work on together.
    0:20:07 We had an immigration bill that the president basically
    0:20:10 from Mar-a-Lago killed, so he could take credit for it.
    0:20:12 And we’re all sitting around thinking,
    0:20:14 it reminds me of the movie, “The Mission.”
    0:20:16 I don’t know if you saw that movie
    0:20:19 with Robert De Niro and Jeremy Irons.
    0:20:21 But Jeremy Irons plays this,
    0:20:23 I’m gonna call priest, a religious figure,
    0:20:26 and the British are coming basically to slaughter
    0:20:28 this indigenous community.
0:20:30 I don’t know if it’s Argentinian or Brazilian.
    0:20:32 They’re missionaries, anyways.
0:20:34 And Robert De Niro is trying to get everyone,
0:20:37 they know they’re coming, to prepare for war.
    0:20:39 And the priest says, we’re about non-violence.
    0:20:41 And they’re slaughtered, of course.
0:20:42 I just watched a movie called “Number 24,”
    0:20:44 which was about the most famous Norwegian spy
    0:20:47 of all things, and a woman stands up,
    0:20:48 as he’s speaking to this university, saying,
    0:20:51 why didn’t you try non-violence?
    0:20:54 I feel like the Democrats have decided to try non-violence.
    0:20:58 And I’m like, sir, say, I choose violence.
    0:20:58 This is not working.
    0:21:00 Sitting around trying to pretend
    0:21:01 we’re taking the higher ground
    0:21:04 and we’re gonna work with the president.
    0:21:06 It hasn’t worked, folks.
    0:21:09 We need to be calling balls and strikes here
    0:21:11 and saying that this is, when you have the president
    0:21:13 saying things like polluting blood,
    0:21:15 and then you have the person who, in my opinion,
    0:21:18 weaponized a platform to get him elected,
    0:21:20 telling the far-right party in Germany,
    0:21:22 you shouldn’t dilute your culture.
    0:21:24 I mean, this is pre-Hitler shit.
    0:21:27 And yet, I don’t see a single Democrat
    0:21:30 with anything resembling a following of social media
    0:21:33 out there saying, fuck all.
    0:21:36 So yeah, right now, as far as I can tell,
    0:21:38 we have one party and another party
0:21:40 that thinks they’re at cotillion
    0:21:42 training their kids to be polite
    0:21:46 and that peace and love will win out.
    0:21:47 Thank you for my TED talk.
    0:21:49 Okay, let’s take a quick break.
    0:21:50 Stay with us.
    0:21:54 It’s Today Explained.
    0:21:57 I’m Noelle King with Miles Bryan.
    0:21:59 Senior reporter and producer for the program, hello.
    0:22:01 Hi, you went to public school, right, Miles?
    0:22:03 Yes, go South High Tigers.
    0:22:05 What do you remember about school lunch?
    0:22:09 Oh, I remember sad lasagna, shrink-wrapped
    0:22:10 in little containers.
    0:22:11 I remember avoiding it.
    0:22:13 Do you remember the nugs, the chicken nuggets?
    0:22:17 Yeah, if I had to eat school lunch,
    0:22:18 that was a pretty good option.
    0:22:19 I actually liked them.
    0:22:22 But in addition to being very tasty,
    0:22:23 those nugs were very processed.
    0:22:26 And at the moment, America has got processed foods
    0:22:28 in its crosshairs.
    0:22:30 It’s true, we are collectively very down
    0:22:32 on processed food right now, none more so
    0:22:35 than Health and Human Services’ secretary nominee,
0:22:37 Robert “Fluoride” Kennedy, Jr.
    0:22:41 I’ll get processed food out of school lunch immediately.
    0:22:43 About half the school lunch program
    0:22:45 goes to processed food.
0:22:49 Can the man who once saved a dead bear cub for a snack
0:22:50 fix school lunches?
    0:22:54 Today Explained, every weekday,
    0:22:55 wherever you get your podcasts.
    0:22:59 Support for the show comes from Vanta.
    0:23:01 Trust isn’t just earned, it’s demanded
    0:23:02 whether you’re a startup founder navigating
    0:23:04 your first audit or a seasoned security professional
0:23:06 scaling your GRC program.
    0:23:08 Proving your commitment to security
    0:23:11 has never been more critical or more complex.
    0:23:13 That’s where Vanta comes in.
    0:23:15 Businesses use Vanta to establish trust
    0:23:18 by automating compliance needs across over 35 frameworks,
    0:23:22 including SOC2 and ISO 27001.
0:23:23 They also centralize security workflows,
    0:23:26 complete questionnaires up to five times faster
    0:23:28 and proactively manage vendor risk.
    0:23:31 Vanta not only saves you time, it can also save you money.
    0:23:34 A new IDC white paper found that Vanta customers
    0:23:37 achieve $535,000 per year in benefits
    0:23:40 and the platform pays for itself in just three months.
    0:23:42 Join over 9,000 global companies
    0:23:44 including Atlassian, Quora and Factory
    0:23:46 who use Vanta to manage risk
    0:23:48 and prove security in real time.
    0:23:52 For a limited time, our audiences get $1,000 off Vanta
    0:23:54 at Vanta.com/PropG.
    0:23:59 That’s V-A-N-T-A.com/PropG for $1,000 off.
    0:24:07 – So a couple of things.
    0:24:10 Democrats said all the stuff that you just said
    0:24:12 before the election for months
    0:24:16 and voters turned up and said, “I don’t care.”
    0:24:18 Right, so I care.
    0:24:21 Millions of people do care,
0:24:23 but the pivotal number is that seven million people
    0:24:26 who voted for Biden in 2020 sat home in 2024.
    0:24:29 That’s how little they cared about
    0:24:30 what you’re just talking about.
    0:24:32 AOC is out there.
    0:24:34 She posted before the inauguration,
    0:24:37 people are asking me why I’m not going to inauguration
    0:24:41 and I’m not going to the inauguration of a rapist
    0:24:42 to use the term that you use,
    0:24:44 though I know there’s a legal conversation about that
    0:24:46 and I’m not looking to have a defamation suit.
    0:24:48 So AOC is saying stuff like that.
0:24:49 She was on Jon Stewart’s podcast,
    0:24:52 talking a lot like you are just now.
    0:24:55 But I feel like for someone like Hakeem Jeffries,
    0:24:58 who is a very unifying leader,
    0:25:03 he is trying to figure out as Nancy Pelosi had to for years,
    0:25:06 how to manage a caucus that is being pulled
    0:25:07 in many different directions
    0:25:09 because the difference between what goes on
    0:25:13 for a safe seat Democrat and a swing seat Democrat
    0:25:14 is like night and day.
0:25:17 And we’re going to have Congressman Tom Suozzi
    0:25:19 on for an interview later in the week,
    0:25:22 talk to him about this as he’s from a swing district
    0:25:25 and he was one of the first people out there saying
    0:25:27 these are the issues I’ll be able to compromise on.
0:25:29 We got to work together like the Laken Riley Act
    0:25:31 for immigration and I’m definitely looking forward
    0:25:33 to pushing him about the parts of that bill
    0:25:37 that are definitely not good, right?
0:25:40 In terms of not protecting Dreamers and minors,
    0:25:44 but obviously you can’t affect any change
    0:25:45 if you’re not in office.
    0:25:48 And if these people want to continue to be reelected,
    0:25:51 so they can even make incremental progress,
    0:25:54 they’re going to have to work with the other side
    0:25:54 to some degree.
    0:25:58 It’s something like 83% of the American public
    0:26:01 wants the two sides to work together.
    0:26:06 And that’s why I think going back to the executive orders
    0:26:07 and kind of the beginnings of this,
    0:26:11 like it is important to look at the list of things
    0:26:16 and to say this is stuff that I kind of understand, right?
    0:26:19 Like if you want to call it a national emergency
    0:26:21 on the Southern border,
    0:26:23 if you want to put more resources down there,
    0:26:24 I can get on board with that.
    0:26:25 I completely understand it.
    0:26:27 All the people who live along the border would tell you
    0:26:29 that’s exactly the kind of conditions
    0:26:30 that they’re living in.
    0:26:34 But the stuff that you have to figure out a way
    0:26:36 to effectively hold the line,
    0:26:40 not just rail about it or post about it.
    0:26:43 He’s basically undoing the entire asylum system.
    0:26:46 I was watching Tom Homan, he was being interviewed
    0:26:49 about what happens now to all the people
0:26:52 who had their Customs and Border Protection appointments canceled
    0:26:54 ’cause they got rid of the CBP One app.
    0:26:56 And he said, well, go to a port of entry.
    0:26:59 And the whole point was that you don’t want people
    0:27:00 showing up at port of entries.
    0:27:02 I mean, there are tens of thousands of people
    0:27:04 who have been waiting in Mexico,
    0:27:07 some upwards of a year to do this the legal way.
    0:27:09 Also completely forgetting the fact
    0:27:12 that people who are here illegally do have rights.
    0:27:14 That is enshrined in our constitution
    0:27:15 that they have a right to legal counsel,
0:27:18 that they have a right to due process.
    0:27:20 And the DOJ has new directives.
    0:27:24 This is one that I thought this can’t be real,
    0:27:27 where they’re now telling legal service providers
    0:27:30 who get federal funding not to do their jobs,
    0:27:32 not to help these immigrants that are here
    0:27:36 who might have a completely legitimate asylum claim.
    0:27:40 I already mentioned the DEI offices
    0:27:41 in the Fairfax County, sorry,
    0:27:44 I said Loudoun County before it’s Fairfax County.
    0:27:47 So that’s obviously bad, but then you sprint ahead.
    0:27:50 Did you see this, that the Department of Defense,
    0:27:52 ’cause they took down all of their DEI stuff,
    0:27:55 removed promotional video material
    0:27:56 about the Tuskegee Airmen.
    0:27:58 – Yeah, and women in World War II.
0:28:01 – Yeah, the WASPs, which is such a great name for it.
0:28:04 And it was Katie Britt, the senator from Alabama,
    0:28:07 who tweeted, “Oh, this must be a mistake.”
    0:28:08 And within a couple of hours,
0:28:10 the new defense secretary, Pete Hegseth,
    0:28:12 was like, “I’ve fixed it.”
    0:28:14 But that feels like one of those circumstances
    0:28:16 where they were trying it on, right?
    0:28:18 They thought, well, we could just go ahead
    0:28:19 and get rid of these things.
    0:28:23 And if someone catches us, so what, we’ll put it back up.
    0:28:25 – And we’re in the news, as long as we can.
    0:28:27 – Right, and we’re dominating the cycle no matter what,
    0:28:31 because all news is good news
    0:28:33 or all press is good press, I guess.
    0:28:36 And that’s a credo that Trump has lived by forever.
    0:28:39 There’s also just such a lack of expertise
    0:28:41 and willingness to want to do the work, right?
    0:28:44 They want to eliminate things en masse
    0:28:46 and not spend the time going through
    0:28:48 and actually looking at what the relevance is,
    0:28:52 like purging the government of any of those checks.
    0:28:54 They got rid of, I think, 17 inspector generals
    0:28:57 over 12 huge bureaucracies, right?
    0:29:00 These are things that used to piss off
    0:29:02 storied members of the Senate, like Chuck Grassley,
    0:29:04 lost his mind in 2021.
    0:29:06 Trump got rid of two IGs.
    0:29:08 Now 17 have been removed.
    0:29:12 Did you see this communication freeze for the NIH and the CDC?
    0:29:15 Like in the midst of bird flu,
    0:29:18 they can’t tell people what’s going on with something.
    0:29:20 Yeah, exactly.
    0:29:24 And the foreign aid freeze is just totally frightening.
0:29:25 – Yeah, and we’re out of the World Health Organization.
    0:29:28 To your point about sticking our chin out,
    0:29:31 I believe Biden’s first executive order had to do
    0:29:34 with transgender athletes’ rights,
    0:29:35 and it took him three years
    0:29:36 for an executive order on immigration.
    0:29:39 – And to your point, illegal border crossings
    0:29:43 had dropped to about 45,000 in December of 2024,
    0:29:45 but in December of 2023,
    0:29:46 a quarter of a million people came across
    0:29:48 the border illegally.
    0:29:53 What I find sort of ironic and telling about these,
    0:29:54 I don’t know what the term is, roundups,
0:29:57 or when ICE shows up,
    0:29:58 they’ve decided the most efficient place
    0:30:02 to quote unquote find these unproductive people
    0:30:03 who are freeloading.
    0:30:05 Is it workplaces?
    0:30:06 – Right.
    0:30:08 – I thought, so if you wanted to deport Americans,
    0:30:09 you’d probably go to McDonald’s
    0:30:12 or to their basements where video games are,
    0:30:15 but with undocumented workers,
    0:30:18 you go to places of work ’cause that’s where they are.
    0:30:20 I thought that was sort of ironic,
    0:30:22 but we had this coming.
    0:30:25 We ignored the problem, it got out of control,
    0:30:28 and just as you can never actually visually spot
    0:30:30 a pendulum at the middle,
    0:30:32 they have swung, they have taken advantage of this
    0:30:33 and they swung back.
    0:30:35 And quite frankly, I don’t have a problem
    0:30:38 with deportations of undocumented workers,
    0:30:39 but let’s start with those who are in prison,
    0:30:41 let’s start with those who’ve now committed two crimes,
0:30:44 one crime coming over here illegally, the second one.
    0:30:47 I think that’s absolutely fair game.
    0:30:50 The, I mean, some of the other issues
    0:30:52 that we really screwed up on,
    0:30:55 we talked about transgender, we took just a,
    0:30:57 I don’t wanna say irrelevant,
    0:30:59 but an issue and gave them just a free gift
0:31:01 with purchase for parents all over the nation
    0:31:05 who don’t wanna have their daughters kind of run over.
    0:31:08 The macro, the biggest issue, hands down in my opinion,
    0:31:10 is that a mix of identity politics,
    0:31:13 weaponization by special interest groups,
    0:31:16 essentially had the Democrats implicitly
    0:31:19 and explicitly turn their backs on the group
    0:31:22 that has struggled the most the last 40 years.
    0:31:25 Everybody feels when young people aren’t doing well.
    0:31:28 Their parents, society,
    0:31:29 and these are the people on social media saying,
0:31:32 okay, great, Nvidia’s worth $3 trillion,
    0:31:33 I can’t afford rent.
    0:31:35 So instead of focusing on policies like inflation,
    0:31:38 how to build more houses, bring down costs,
    0:31:41 lower enrollment instead of being weaponized
    0:31:43 by these universities that are,
    0:31:46 I mean, essentially some of the most corrupt organizations
    0:31:48 in the world are seen as the center
    0:31:50 of democratic politics, specifically my industry.
    0:31:54 What is more of an epicenter of kind of democratic ideals
0:31:56 than elite institutions?
0:31:58 I just interviewed the president of Dartmouth.
    0:32:02 They have an $8 billion endowment and they let in 500 kids.
    0:32:03 – Okay, excuse me,
0:32:07 but you’re now this elite caste enforcer
    0:32:10 talking about big, big fancy ideals,
    0:32:12 but you don’t want to give people this drug
    0:32:14 that decreases obesity, anxiety,
    0:32:17 gives them a shot at getting married, making money.
    0:32:20 No, you’d rather hoard it just for you and your elite friends.
    0:32:23 So let’s create a misdirected DEI.
    0:32:26 Michigan, University of Michigan has 200 DEI officers.
    0:32:32 60% of Harvard’s freshman class identifies as non-white,
    0:32:34 but we need to have DEI on campuses
    0:32:35 such that we can discriminate against,
    0:32:38 what, white kids in rural states?
    0:32:40 I mean, it’s just, we got so out of control
    0:32:43 with the identity politics, the DEI apparatus,
    0:32:46 not focusing on inflation, not focusing on young people,
    0:32:50 that we just gave them a layup to become sort of,
    0:32:52 you know, just go overboard, flood the zone
    0:32:54 with a ton of shit.
    0:32:56 I get it, we deserve it, we had it coming.
    0:32:58 What I’m really disappointed about
    0:32:59 is why we’re all taking it
    0:33:02 and calling on people to work together.
    0:33:03 As far as I’m concerned,
    0:33:07 it used to be about a certain level of mutual respect.
    0:33:10 You know, Democrats and Republicans at the end of the day
    0:33:12 thought, well, if they get in power,
    0:33:14 I want them to show me some mutual respect.
    0:33:16 It seems to me like we need to move to mutual destruction
    0:33:20 and say, look, Marjorie Taylor Greene and Stephen Miller,
    0:33:23 if you want to start revoking security details,
    0:33:24 just be careful what you ask for,
    0:33:26 ’cause once you’re out of office,
    0:33:29 my guess is you’re gonna be real fond of a security detail.
    0:33:31 You know, if you want,
    0:33:33 I mean, if you want us to stick the DOJ on you
    0:33:35 after our guy gets in office,
    0:33:38 but right now we’re giving them the impression that
    0:33:39 if you hit us, we’re Gandhi,
    0:33:42 and we believe that peace is gonna work here,
    0:33:44 and I think it’s to our disadvantage.
    0:33:47 I think we come across as total wimps
    0:33:50 and there’s no incentive for them to say,
    0:33:52 well, maybe we shouldn’t be cutting
    0:33:54 the security detail of General Milley,
    0:33:57 in case our generals that we like are under threat
    0:33:58 after they leave office.
    0:33:59 I don’t think there’s any sense
    0:34:03 that we’ll ever hit back, so to speak, your thoughts.
    0:34:04 – Yeah, to back up your point,
    0:34:08 that I think it’s crazy that we had this whole transition
    0:34:12 knowing exactly who he is, what he’s going to do,
    0:34:14 he’s telegraphing it every day,
    0:34:19 and that we showed up on January 20th at inauguration
    0:34:24 and did not have a solid message or a plan for countering this.
    0:34:28 – The Democrats’ response post-inauguration,
    0:34:33 or the complexion of it, for me, defines the term flat-footed.
    0:34:36 Just on our heels, not even responding,
    0:34:41 just kind of paralyzed, just incredibly anencephalic
    0:34:44 and not counter punching at all.
    0:34:48 It feels to me like in terms of the viscosity or strength
    0:34:49 of the Democratic Party right now,
    0:34:52 we have, I’ve never seen us this weak.
    0:34:53 – That’s a major declaration.
    0:34:55 I don’t know if it’s necessarily wrong.
    0:34:57 I think the right attitude,
    0:34:59 like Congressman Golden’s team has said,
    0:35:01 we’re not gonna respond to everything that Trump does
    0:35:02 ’cause you can’t live in the midst
    0:35:05 of an outrage cycle constantly.
    0:35:08 But if we just take it on the chin constantly,
    0:35:11 I could see voters showing up again in 2026
    0:35:13 and just saying, well, what are you about?
    0:35:15 I still have absolutely no idea.
    0:35:17 So Hakeem Jeffries really wants to focus in
    0:35:18 on cost of living issues,
    0:35:23 and the Republicans have put forward all these EOs,
    0:35:26 they have a bill about banning transgender people
    0:35:29 from athletics, but they don’t have a cost of living bill
    0:35:31 and JD Vance was on with Margaret Brennan over the weekend
    0:35:33 and said, well, it takes time to bring prices down.
    0:35:35 When they had told us it would happen on day one.
    0:35:37 So I think you have to keep hammering that,
    0:35:40 but you also need to have a personality
    0:35:42 and be able to go on a long-form podcast
    0:35:45 and chill with people and talk about other things
    0:35:49 besides politics and I’m not seeing that
    0:35:52 from that many key players in the Democratic party.
    0:35:54 Most people, myself included,
    0:35:57 thought Biden went too far with his preemptive pardons,
    0:36:02 but he may be vindicated in that in the long term,
    0:36:04 that these are ruthless people
    0:36:07 who have said in public forums
    0:36:10 that they’re going to come after these folks
    0:36:12 and that that was actually the right thing to do.
    0:36:14 I mean, that’s a Mitch McConnell move, right?
    0:36:16 It’s not a traditional Joe Biden move
    0:36:18 to go for the absolute worst case scenario.
    0:36:23 So, you know, I hope that if Kash Patel gets confirmed
    0:36:25 and it looks like he will,
    0:36:27 because it looks like everybody will,
    0:36:32 maybe not Tulsi, that he isn’t as punitive
    0:36:38 and as motivated by retribution as he details in his book,
    0:36:40 but who knows?
    0:36:43 And I wanted to add to what you were saying
    0:36:44 about young people.
    0:36:48 I was talking to a friend of mine whose brother is 30 years
    0:36:51 old, went on a bachelor party, 13 guys,
    0:36:53 10 of 13 voted for Trump.
    0:36:55 And these were all liberal-minded guys
    0:36:58 that went to university together.
    0:37:01 And what we were discussing that I found so interesting
    0:37:03 and it links also to the discussion
    0:37:05 about what’s going on with higher education
    0:37:08 is that we just are seeing now amongst young people
    0:37:11 a new definition of what sounds smart.
    0:37:15 So, it used to be, you know, all of your degrees,
    0:37:19 your level of credibility was directly connected
    0:37:22 to how fancy the school you went to was, right?
    0:37:25 Like what kind of job you had,
    0:37:28 that you knew which fork went with which, right?
    0:37:30 Like going back to the Pretty Woman scene, right?
    0:37:31 Slippery little suckers, right?
    0:37:33 Like that that was what low-class looked like
    0:37:35 and high-class looks like someone
    0:37:38 who dedicates their life to public service,
    0:37:40 but also has a trust fund that they’re relying on
    0:37:42 and went to Harvard for everything.
    0:37:45 And now the people that are revered
    0:37:46 or that folks think are smart
    0:37:50 are the ones who are asking questions incessantly.
    0:37:52 And it doesn’t matter what they’re questioning.
    0:37:55 Like RFK, well, I’m just asking questions, right?
    0:37:58 About the measles vaccine or fluoride in the water
    0:38:00 or whatever it is that day.
    0:38:04 And the right has weaponized that against us
    0:38:06 to an incredible advantage
    0:38:10 because all of these young people who are smart
    0:38:14 and very well-educated now think that it is cool
    0:38:17 and forward-thinking and what they want to see in leadership
    0:38:20 for people to not actually know the answers to questions.
    0:38:23 And I don’t know how you rectify that
    0:38:25 because, you know, I talked to my toddler
    0:38:27 and she’s asking questions all the time, right?
    0:38:29 Like, but why, mom?
    0:38:30 Mommy, why?
    0:38:30 Why do we do this?
    0:38:32 Why are you going here?
    0:38:33 Why do I have to brush my teeth?
    0:38:35 Why do I have to make my bed?
    0:38:38 And then you fast forward
    0:38:41 to where she’ll be in 25, 30 years, let’s say.
    0:38:43 Is she gonna think Joe Rogan is the smartest person
    0:38:45 in the world because he’s just asking questions?
    0:38:47 I mean, even Lex Fridman,
    0:38:50 someone who is a very traditionally smart person, right?
    0:38:53 In terms of education and productivity
    0:38:54 is just asking questions.
    0:38:58 Like asking Zelensky, why don’t you just give up your country?
    0:39:00 And that’s what passes now
    0:39:02 as the folks that we should be looking up to.
    0:39:05 And that’s leading to a set of government officials
    0:39:07 who are also doing that same thing
    0:39:09 who are just flooding the zone with wild questions
    0:39:12 that then leave them with this route
    0:39:15 that they can go through to throw all of this
    0:39:16 administrative shit at the wall.
    0:39:19 And we’re ending up with a government
    0:39:21 that I’m sure will be in complete crisis
    0:39:22 in a matter of months.
    0:39:26 – Yeah, it’s definitely, I mean, we say this a bunch
    0:39:29 that we’re sort of in uncharted territory.
    0:39:31 It seemed like I’m just, you said something
    0:39:33 that sort of, I don’t know, piqued my interest
    0:39:35 because you know more about this than I do,
    0:39:39 but that you believe Kash Patel will be confirmed
    0:39:42 and that there’s a chance that Tulsi Gabbard won’t be.
    0:39:45 And I would have reversed that, but I don’t know the latest.
    0:39:46 What’s going on there?
    0:39:49 – What, well, what’s going on is, I mean,
    0:39:50 you saw Pete Hegseth go through.
    0:39:53 So there were three no votes on that.
    0:39:56 It was quite clear that Senator Tillis did want
    0:39:59 to vote against him ’cause he required Hegseth
    0:40:02 to write a letter to him answering specific questions
    0:40:06 about accusations of abuse against his second wife.
    0:40:09 There was a big New York Times piece about this.
    0:40:12 And you can see in Joni Ernst’s face
    0:40:13 that she didn’t want to do this.
    0:40:15 But when you’re threatened with a primary
    0:40:17 and probably with violence,
    0:40:20 knowing how the internet works, you do those things.
    0:40:22 The reason that I say that I think it’s possible,
    0:40:23 Tulsi doesn’t get confirmed,
    0:40:27 Lindsey Graham over the weekend wouldn’t answer, right?
    0:40:29 And how he was voting on it.
    0:40:32 And I feel like if there is anyone
    0:40:33 who’s not gonna get through,
    0:40:36 it’s the one that people are being the quietest about.
    0:40:39 And no one really talks about Tulsi Gabbard.
    0:40:42 They’re talking about the other ones.
    0:40:46 I think it’s feasible that Republicans decide
    0:40:48 that RFK Junior is not actually a threat
    0:40:51 to vaccines or whatever and let him through.
    0:40:54 But it seems like they’re advertising,
    0:40:56 having good meetings with Kash Patel.
    0:40:59 And I haven’t seen one thumbs up MAGA post
    0:41:01 with people standing with Tulsi Gabbard.
    0:41:05 So, but I’m prepared for all of it to get through.
    0:41:09 – I love someone said that Mitch McConnell voting
    0:41:11 against Hegseth would be like Hannibal Lecter
    0:41:13 going vegetarian on his death bed.
    0:41:18 It’s just like, don’t hold your breath.
    0:41:19 But yeah, it’s-
    0:41:20 – You did it.
    0:41:25 – Yeah, again, he’s got very little to lose at this point.
    0:41:27 Right?
    0:41:28 – Totally.
    0:41:30 – He’s leaving his old, I mean-
    0:41:32 – Profiles in courage though, right?
    0:41:36 That you’re doing it and then nothing really to show for it.
    0:41:40 But, you know, I guess I appreciate it anyway.
    0:41:41 – But you think Kash Patel is gonna get through?
    0:41:43 That’s super interesting.
    0:41:46 – Right now, I don’t, you know, who knows,
    0:41:51 but it looks likely and people seem to be thinking
    0:41:56 that Donald Trump is entitled to whatever he wants
    0:41:58 in terms of a cabinet.
    0:42:01 And usually people do get what they want.
    0:42:04 There have been cases where it hasn’t happened,
    0:42:07 but they strong arm everyone.
    0:42:09 And they have this fleet of people online
    0:42:11 just threatening anyone who opposes them,
    0:42:13 including, you know, members of their family
    0:42:14 will do things like that,
    0:42:17 whether that is a sitting elected representative
    0:42:19 or the president of another country.
    0:42:22 Like this game that went on with the Colombians
    0:42:25 about using military planes
    0:42:28 versus the regular detention planes.
    0:42:30 I mean, first of all, it costs three times more.
    0:42:35 They can get up to $852,000 to send back migrants
    0:42:39 on these C-130s, I think, or C-17s.
    0:42:40 – It’s all about branding, right?
    0:42:43 – And so it’s like a huge, it’s all the show.
    0:42:45 And CNN was reporting this morning
    0:42:48 that they want all of the ICE agents
    0:42:50 to be wearing their vests.
    0:42:53 And for this to be made for TV,
    0:42:54 ’cause we’re in the Truman Show,
    0:42:55 but like a really bad version.
    0:42:57 – Yeah, the one jet blue.
    0:42:59 Okay, let’s take a quick break.
    0:42:59 Stay with us.
    0:43:05 – Support for Prop G comes from Nutrafol.
    0:43:07 Hair is incredibly personal.
    0:43:09 Every man out there has his own relationship
    0:43:10 to his hair journey.
    0:43:12 Maybe you embrace the bald
    0:43:13 when you notice things thinning up top.
    0:43:14 I support you.
    0:43:16 I think that’s a wonderful decision.
    0:43:18 But if you’re not quite ready to say goodbye
    0:43:20 to your shedding thinning locks,
    0:43:22 you might want to check out Nutrafol.
    0:43:25 Nutrafol is the number one dermatologist recommended
    0:43:26 hair growth supplement brand.
    0:43:29 And it’s been trusted by over one million people.
    0:43:31 You may see thicker, stronger, faster growing hair
    0:43:33 with less shedding in just three to six months
    0:43:34 with Nutrafol.
    0:43:36 It’s physician formulated to target the common causes
    0:43:38 of thinning hair, including stress,
    0:43:40 lifestyle, hormones, aging, and more.
    0:43:41 If you’ve been waiting for a convenient
    0:43:43 and effective way to care for your hair,
    0:43:45 Nutrafol is a great place to start.
    0:43:48 You can start your hair growth journey now with Nutrafol.
    0:43:49 For a limited time,
    0:43:51 Nutrafol is offering our listeners $10 off
    0:43:54 your first month subscription and free shipping
    0:43:55 when you go to Nutrafol.com
    0:43:58 and enter the promo code PROPG.
    0:44:00 Find out why over 4,500 healthcare professionals
    0:44:03 and stylists recommend Nutrafol for healthier hair.
    0:44:08 Nutrafol.com spelled N-U-T-R-A-F-O-L.com promo code PROPG.
    0:44:13 That’s Nutrafol.com promo code PROPG.
    0:44:16 (upbeat music)
    0:44:20 – Welcome back.
    0:44:22 Before we wrap, we gotta address something
    0:44:23 we’ve been hearing from some of you.
    0:44:26 Apparently there’s a feeling that we’re not quite living up
    0:44:28 to the moderate in our show name.
    0:44:30 Maybe we’re leaning a bit more into the raging side.
    0:44:32 People say we’re more raging than moderate.
    0:44:34 That’s a fair point.
    0:44:37 So what does being a moderate really mean to us,
    0:44:38 especially during this new administration?
    0:44:40 Jess, you kick us off.
    0:44:42 What do you think it means to be a moderate?
    0:44:44 – Well, first, the fact that we’re talking about this
    0:44:48 is your fault because you made me a comments monster.
    0:44:51 And I went and looked at what people were saying
    0:44:54 and there’s a lot of positive stuff.
    0:44:54 But it–
    0:44:56 – Oh, don’t look at the comments.
    0:44:57 I mean, read the first five, learn from it
    0:45:00 and then ignore it, just your own mental health.
    0:45:03 – I stay up later than everyone in my household
    0:45:05 so I could spend a good amount of time
    0:45:07 comments doom scrolling.
    0:45:10 But seeing a lot of this, like this is not
    0:45:12 what a moderate means.
    0:45:15 And I am completely willing to accept
    0:45:17 that A, a moderate means different things
    0:45:19 to different people.
    0:45:24 And that also, I think of myself as a liberal moderate,
    0:45:28 not someone who is a swing voter at this point.
    0:45:31 And most people who advertise that are lying
    0:45:33 ’cause there’s usually a crucial issue
    0:45:35 that puts you into one camp or the other.
    0:45:37 For a lot of people, it’s whether you’re pro-choice
    0:45:39 or pro-life, and as a pro-choice person,
    0:45:42 I would be hard-pressed to support
    0:45:44 a candidate that was pro-life.
    0:45:46 But I guess I wanted to talk about this
    0:45:48 because I think what we share
    0:45:53 and why we wanted to do this specific podcast together
    0:45:57 under this name is because we wanna talk
    0:46:00 about politics through the framework of pragmatism,
    0:46:04 not just optimism or what we want to happen.
    0:46:07 And that it’s important to have political discussions
    0:46:10 that are cognizant of the guardrails
    0:46:12 of the way government actually operates.
    0:46:15 And also, I think most crucially,
    0:46:18 understanding that the framework of a partisan worldview
    0:46:21 is not how the general populace operates.
    0:46:25 And the 2024 results were so indicative
    0:46:27 of that transformation,
    0:46:30 that people are not interested in backing a team
    0:46:32 in the same way; if anything, they’re backing Trump
    0:46:33 because they back that player, right?
    0:46:35 Like that’s their favorite football player,
    0:46:38 their favorite basketball player or whatever sport they’re into.
    0:46:43 And I think that being a moderate right now is trying,
    0:46:47 as per our earlier discussion,
    0:46:50 to see the good in what the other side
    0:46:52 may be bringing to the table and saying like,
    0:46:54 “Sure, that works for me.”
    0:46:57 And also I have constituents or I have people that I know
    0:47:00 who absolutely feel that way,
    0:47:04 recognizing the faults of the party that we both belong to
    0:47:07 and then trying to find a way to constructively
    0:47:12 and effectively push back where we need to.
    0:47:15 And that’s how I see it right now.
    0:47:17 I don’t, how do you see it?
    0:47:19 Even though you said earlier, you’re not moderate.
    0:47:21 So the podcast is over.
    0:47:24 – I would define myself as kind of ground zero
    0:47:25 for moderates.
    0:47:26 And that is what, as far as I can tell moderates
    0:47:29 are basically people that everybody hates.
    0:47:34 And essentially, I mean, the generous
    0:47:36 or the actual Webster definition would be someone
    0:47:39 who has tempered views
    0:47:42 and is somewhere in the middle on the political spectrum.
    0:47:45 And the way I see it is I’ve tried as I’ve gotten older
    0:47:48 to not be lazy and sign up for any political orthodoxy.
    0:47:50 When I hear something crazy on the left,
    0:47:54 I like to call it out when I think our democratic leadership
    0:47:58 is too inefficient, feckless, cowardly, I call it out.
    0:48:01 When I think DEI is out of control,
    0:48:04 when I think that immigration is out of control,
    0:48:07 when I think that social security spending is out of control,
    0:48:10 you know, there’s some of their favorite policies
    0:48:12 of the left, I call it out.
    0:48:17 And when I am, you know, I’m vigorously pro-Israel.
    0:48:19 It’s like, I don’t bark up any one tree.
    0:48:20 I try to have my own views.
    0:48:24 In this environment, based on where the political spectrum is,
    0:48:25 I’m now seen as center left.
    0:48:28 In the 70s, I would have been a Rockefeller Republican.
    0:48:30 I just would have been, I would have,
    0:48:32 I would have been in that party.
    0:48:34 But I think it’s also just saying,
    0:48:36 look, I’m gonna look at, I’m gonna be a critical thinker.
    0:48:39 I’m gonna look at issue by issue.
    0:48:41 And regardless of the political orthodoxy
    0:48:42 you’re supposed to sign up to, you say,
    0:48:44 okay, I’m not down with this.
    0:48:47 And it’s almost like you become, unfortunately,
    0:48:50 to a certain extent, the left is much harsher on moderates.
    0:48:52 They treat you like an apostate.
    0:48:54 Yeah, Scott, I thought we could trust you.
    0:48:57 People from the Biden campaign, sign up.
    0:48:59 Don’t you understand the assignment?
    0:48:59 Sign up.
    0:49:01 Well, no, he’s too fucking old.
    0:49:03 What are we doing here?
    0:49:05 And then people on the right are just like,
    0:49:06 kind of write you off as a libtard.
    0:49:09 But they don’t come after you the same way the left does
    0:49:11 when they thought you were, quote unquote,
    0:49:13 when we thought you were one of us.
    0:49:17 So I see a moderate as someone who says,
    0:49:19 okay, I’m gonna go issue by issue.
    0:49:20 I’m gonna use critical thinking.
    0:49:22 I’m gonna be unafraid to say this makes no sense,
    0:49:26 regardless of the cult dynamics of pressure
    0:49:29 to sign up for the full orthodoxy and narrative.
    0:49:31 Because when the narrative gets crazy
    0:49:35 or makes no fucking sense, you say, okay, I don’t buy this.
    0:49:37 I don’t have a problem with deporting criminals.
    0:49:39 I get the symbolism of it.
    0:49:41 I don’t have a problem with a surge of troops
    0:49:44 at the border, fine.
    0:49:46 Deficits, anyways, my point is,
    0:49:49 I’d like to think a moderate is someone who says,
    0:49:50 I’m a critical thinker.
    0:49:52 I’m gonna look at the issues
    0:49:54 and I’m gonna decide one by one,
    0:49:57 what I think is the right view on this.
    0:50:00 I’m not gonna sign up and just say, okay,
    0:50:03 I’m a fan, I’m a cultist, no matter what they say.
    0:50:05 And this is true about the left and the right.
    0:50:08 But I will say as someone who’s seen or identified
    0:50:13 as a Democrat that I get more hate from the left
    0:50:13 than I do from the right.
    0:50:16 The right has just kind of written me off.
    0:50:17 And that’s how it is in our society.
    0:50:18 You gotta pick a side.
    0:50:21 You can’t say, well, I wanna go issue by issue, right?
    0:50:24 Do I believe women should have the right
    0:50:25 to terminate a pregnancy?
    0:50:26 Yes.
    0:50:28 And the third trimester?
    0:50:30 Okay, that’s worth a discussion, right?
    0:50:35 If the woman’s health or the baby’s health is not in danger,
    0:50:38 at the same time, nobody trusts each other.
    0:50:41 So nobody wants to have anything resembling
    0:50:45 kind of a moderate conversation.
    0:50:49 In addition with Citizens United and gerrymandering
    0:50:52 that has hard right and hard left districts,
    0:50:54 there’s no political room for moderates anymore.
    0:50:56 They can’t get elected, right?
    0:50:58 Because basically every election now
    0:51:00 with these hard blue and hard red districts
    0:51:03 is decided in the primary.
    0:51:05 So it’s basically who can be craziest.
    0:51:07 Who can be more crazy left or who can be more crazy right?
    0:51:10 And a moderate is just a recipe for not getting elected.
    0:51:14 So I think in the media or as a commentator
    0:51:16 or whatever you wanna call yourself a thought leader,
    0:51:18 I think it’s especially important that we demonstrate.
    0:51:20 It’s okay to be a critical thinker
    0:51:23 and occasionally have your followers on Threads
    0:51:25 or Bluesky come after you because you say,
    0:51:28 yeah, I don’t get this democratic policy.
    0:51:34 I think the vice president was a great senator.
    0:51:36 I think she’d be a great Supreme Court justice.
    0:51:39 I don’t know if she’d be a great president.
    0:51:41 And then everyone comes for you.
    0:51:45 And I’m not willing to sign up and blindly
    0:51:49 bend the knee for, see above, an insurrectionist and a rapist.
    0:51:51 – Critical thinking, look at every person,
    0:51:53 look at every issue and decide where you are on it.
    0:51:56 Because that’s my story and I’m sticking to it, Jess.
    0:51:58 – No, I think that’s the right story.
    0:52:01 And I think that there’s so much pressure
    0:52:02 to always be a good soldier.
    0:52:06 I certainly feel this in my role,
    0:52:08 being part of the conservative media ecosystem,
    0:52:13 that there are liberals who get enraged if I even say,
    0:52:14 well, this doesn’t make a ton of sense.
    0:52:19 Or this is an issue that 70% of Americans agree on.
    0:52:21 Like, why can’t that just be our position?
    0:52:24 It seems pretty normal.
    0:52:29 Like, you know, there was a Prop 36 on the California ballot,
    0:52:33 making it not okay to shoplift up to $950
    0:52:34 without getting arrested.
    0:52:37 Like these are just obvious things.
    0:52:40 And you should be able to have opinions about them
    0:52:42 without people flipping out on you.
    0:52:45 But I do think a critical component
    0:52:48 of how we approach politics
    0:52:53 and how people who are governing more in the middle do
    0:52:55 is that they fundamentally understand
    0:52:58 that it’s not the intentions that matter,
    0:53:00 it’s the outcomes that matter.
    0:53:03 And we just had an enormous outcomes election
    0:53:07 where people said that governance in blue cities and states
    0:53:12 is not meeting the moment, far from it.
    0:53:16 That riding the subway is not a good option anymore.
    0:53:18 That we don’t support law enforcement,
    0:53:21 that we have people who are incompetent,
    0:53:24 perhaps corrupt in big positions of power.
    0:53:27 And that we are not living up to the covenant
    0:53:31 that our elected officials make with the people
    0:53:32 who send them there.
    0:53:35 And that we’re actually failing ourselves
    0:53:37 when it comes to our values
    0:53:40 because of how poor that governance has gotten.
    0:53:44 And that’s really crucial to how I think about politics
    0:53:47 and how I think about my advocacy for policies
    0:53:49 that I think will improve quality of life.
    0:53:53 And more often than not, those policies are linked
    0:53:58 to more liberal legislators or people who see the world
    0:54:01 through a similar prism to me.
    0:54:03 But I am completely open to the fact
    0:54:05 that there are good representatives
    0:54:07 from the other side of the aisle
    0:54:10 that also live by those ideals
    0:54:12 or certainly can meet us somewhere in the middle
    0:54:13 to get something good done.
    0:54:18 And I remember when the Democrats were funding MAGA candidates
    0:54:21 to run against moderate Republicans.
    0:54:26 And I understood it from a, we got a win perspective.
    0:54:29 But it was upsetting to me
    0:54:34 that good people like Peter Mayer in Michigan lost his seat
    0:54:39 because we put in tons of money to back a crazy person
    0:54:41 that would then go on to lose to the Democrat.
    0:54:44 And I think that those are the kinds of things
    0:54:45 that we need to explore
    0:54:49 because we’re only gonna have a healthy political system
    0:54:52 if we do have two thriving parties, right?
    0:54:55 That are full of people that actually capture
    0:54:58 the cultural and political zeitgeist of the country.
    0:55:03 And the extremes on both sides are wildly dangerous.
    0:55:07 And I think that the right more dangerous than the left.
    0:55:10 But when you look at what people think
    0:55:12 and how they’re talking about the issues,
    0:55:14 you know that the Marjorie Taylor Greens of the world
    0:55:18 are not appealing to the general population writ large.
    0:55:21 And that folks like the Ilhan Omars
    0:55:22 or the Rashida Tlaibs of the world
    0:55:24 are not appealing to them either.
    0:55:29 And so I think it’s important to be trying to stake out ground
    0:55:31 to have these discussions about it to, you know,
    0:55:34 bring people on from the other side of the aisle
    0:55:37 or people who work regularly with Republicans
    0:55:41 so that we can hear about how progress can actually be made
    0:55:42 because they’re the ones,
    0:55:44 we can talk all we want from our studios,
    0:55:47 but they’re the ones actually casting the votes
    0:55:50 for all of this and hopefully making a big difference
    0:55:51 in people’s lives.
    0:55:52 – Yeah, something I think the Republicans
    0:55:54 have done much better than Democrats.
    0:55:57 And your buddy Tim Miller, I thought made a great point on
    0:56:01 the Bulwark podcast, that the coarseness
    0:56:06 and the, I don’t know, just the provocative,
    0:56:09 sometimes stupid, sometimes weirdness
    0:56:10 that’s come out of the right.
    0:56:13 It came across as authentic, whereas Democrats,
    0:56:15 it’s as if they’re reading off a press release
    0:56:17 or believe that they’re crossing the Delaware
    0:56:20 or giving an inauguration speech.
    0:56:21 I mean, they’re just so like,
    0:56:25 it’s like speak like a regular person for God’s sakes.
    0:56:28 And the Republicans do that better than Democrats.
    0:56:31 Also, it strikes me that a big component
    0:56:33 in terms of what impacts people’s lives
    0:56:38 and gives them the impression of the respective brands,
    0:56:40 we have to get our shit together around on the ground,
    0:56:44 running some of these Democratic cities.
    0:56:49 A bunch of my friends lost houses in the Pacific Palisades.
    0:56:50 And they basically all said the same thing.
    0:56:53 They’re like, I keep hearing these excuses
    0:56:54 where the reservoirs weren’t full,
    0:56:56 the water pressure was down, whatever it is.
    0:57:01 And he’s like, we pay 13% a year in additional taxes.
    0:57:02 If I’m gonna have my house burned down,
    0:57:05 I’ll move to Florida or Texas where I pay 0%.
    0:57:09 It’s like, we should have the supersize gold-plated VIP,
    0:57:15 a white glove government when you’re paying 13%.
    0:57:17 And instead, some of the highest tax,
    0:57:19 highest tax states are offering,
    0:57:21 I mean, they’re expensive but bad,
    0:57:23 which isn’t a recipe for a good product.
    0:57:27 And the most expensive but bad metros right now
    0:57:29 are governed by Democrats
    0:57:31 who seem to be weaponized by unions
    0:57:34 or whatever it might be, special interest groups
    0:57:37 and are just taxing the shit out of their local residents
    0:57:40 and doing head up your ass, you know,
    0:57:42 enforcement like you’re talking about
    0:57:45 where if you steal less than $900,
    0:57:47 they don’t even prosecute you.
    0:57:50 So I think until, and by the way,
    0:57:51 I don’t think that’s true of New York.
    0:57:53 New York has 12, I think it’s a total
    0:57:56 of about 12 or 13% also in taxes.
    0:57:58 But I would argue Manhattan is worth it.
    0:58:02 I think it’s actually, I think Manhattan is well run.
    0:58:05 And I don’t know about you, but I do feel,
    0:58:09 I do feel that the subway feels a little bit different.
    0:58:11 I’ve noticed that, I’m more aware.
    0:58:13 – That’s an understatement.
    0:58:15 – Yeah, you really feel it, don’t you?
    0:58:17 – Well, I do personally,
    0:58:20 but I don’t remember I grew up here
    0:58:22 and have ridden the subway my whole life,
    0:58:25 the number of people being like thrown in front of cars,
    0:58:29 being punched, slashed, a woman burned on the F train.
    0:58:33 That’s not normal.
    0:58:35 Oh, things happen every once in a while.
    0:58:36 – Agreed.
    0:58:38 – And Kathy Hochul is definitely getting the message
    0:58:42 because of Ritchie Torres, but it’s bad.
    0:58:46 – Yeah, and also I just, in terms of crime, I think it’s-
    0:58:48 – Well, crime is down, but those are also
    0:58:50 to the point about the messaging of this.
    0:58:54 No one, if they feel less safe and you come at them
    0:58:57 with a bunch of statistics, it doesn’t matter.
    0:58:58 – If I don’t feel safe on the subway,
    0:59:00 if I can’t have my AirPods in,
    0:59:03 ’cause I need to, I have to feel like I’m more aware.
    0:59:05 All right, that’s all for this episode, Jess.
    0:59:07 Thank you for listening to Raging Moderates.
    0:59:10 Our producers are David Toledo and Chenenye Onike,
    0:59:12 our technical director is Drew Burroughs.
    0:59:14 You can find Raging Moderates on its own feed every Tuesday.
    0:59:17 – That’s right, Raging Moderates on its own feed.
    0:59:20 Please follow us wherever you get your podcasts.
    0:59:22 Jess, have a great rest of the week.
    0:59:23 – You too, Scott.
    0:59:24 – Thank you.
    0:59:26 (upbeat music)

    Scott Galloway and Jessica Tarlov dive into Trump’s whirlwind first week back in office. From immigration crackdowns to controversial pardons and foreign aid freezes, they break down the chaos and its implications for the U.S. They also explore why the Democratic opposition is falling short, the balance between confrontation and compromise, and how moderates fit into the current political climate.

    Follow Jessica Tarlov, @JessicaTarlov

    Follow Prof G, @profgalloway.

    Learn more about your ad choices. Visit podcastchoices.com/adchoices

  • The $500B AI Project Stargate: Everything You Need to Know

    AI transcript
    0:00:09 Hey, welcome to the Next Wave Podcast. I’m Matt Wolf. I’m here with Nathan Lanz. And
    0:00:15 this has been a monumental, massive week in the world of AI. We got Project Stargate.
    0:00:21 We got DeepSeek R1. We got announcements about the next versions that OpenAI are about to
    0:00:27 release. So many huge things happen this week. And we’re both optimistic and maybe a little
    0:00:30 bit scared about some of it. But we’re going to go ahead and break it all down for you and
    0:00:35 really dive deep into our thoughts about all of this stuff. So let’s just get right into it.
    0:00:41 Look, if you’re curious about custom GPTs or are a pro that’s looking to up your game,
    0:00:47 listen up. I’ve actually built custom GPTs that helped me do the research and planning for my
    0:00:53 YouTube videos. And building my own custom GPTs has truly given me a huge advantage. Well,
    0:00:58 I want to do the same for you. HubSpot has just dropped a full guide on how you can create your
    0:01:03 own custom GPT. And they’ve taken the guesswork out of it. They’ve included templates and a step-by-step
    0:01:09 guide to design and implement custom models so you can focus on the best part, actually building
    0:01:13 it. If you want it, you can get it at the link in the description below. Now back to the show.
    0:01:23 This week has been a fairly monumental week in AI. It’s been a really, really big week,
    0:01:30 I think the biggest news in the world of AI was the Stargate project. Donald Trump announced it
    0:01:36 alongside Masayoshi-san and Sam Altman and Larry Ellison of Oracle, right? They were all together
    0:01:42 up during this press conference, all sort of taking turns explaining why they’re excited
    0:01:46 about this project Stargate. You know, it had been rumored for like probably the last six or
    0:01:50 even nine months that Sam Altman was working on something like Stargate. They had been having
    0:01:54 conversations with Masayoshi-san. You know, a lot of people in America don’t know who Masayoshi is.
    0:02:01 Masayoshi started SoftBank, which is almost like the AT&T of Japan, I guess is one way to describe it,
    0:02:05 but also they’ve branched out into many different ventures, including they raised one of
    0:02:09 the largest venture funds in the world in collaboration, I think, with some of the Saudis,
    0:02:14 I believe. They were also the largest investor in WeWork. Yeah, yeah. They had the Vision Fund,
    0:02:20 which I think was the largest venture capital fund ever, right? So Masayoshi aims big. In my previous
    0:02:24 startup, one of my main investors was Taizo-san, his younger brother. And actually, I got to be
    0:02:28 pretty good friends with Taizo-san and also really good friends, like even closer with his
    0:02:32 right-hand man, Atsushi Taira, but he aims very big. Anyway, so there have been rumors that they
    0:02:35 were trying to build something like this together, that there have been, you know, conversations
    0:02:40 about it. And then I think around the time when Trump won the election, I believe there was some
    0:02:45 kind of press conference with him and Masayoshi, where they kind of alluded to something like this,
    0:02:50 but they didn’t talk details. And they talked about, okay, $100 billion is what Masayoshi was
    0:02:55 saying. And Trump told him, make it $200. And so now they’re coming out and saying,
    0:03:01 oh, this is actually going to be a $500 billion AI infrastructure project. And so just to give
    0:03:05 people context, I mean, I did some research on Perplexity, like that’s the largest infrastructure
    0:03:11 project ever in human history. The closest thing is like maybe there’s a city that Saudi Arabia
    0:03:16 built, which was around $500 billion, but that’s an entire city. Definitely in terms of technology,
    0:03:21 this is the largest infrastructure project ever in human history. Yeah, yeah. So I’ve got the
    0:03:26 tweet up here. So I’ll sort of dig in on like a few of the elements just to sort of make it super
    0:03:32 clear. But the Stargate project is a new company, which intends to invest $500 billion over the
    0:03:38 next four years in building new AI infrastructure for open AI in the United States. So this is
    0:03:45 specifically saying we’re going to build this infrastructure for open AI to use, right? So
    0:03:48 that’s a very important element. And we’ll probably get into some discussion around that
    0:03:55 in a little bit here, but they are building this multi $100 billion infrastructure for open AI.
    0:04:00 So they’re going to begin deploying $100 billion immediately. They claim it’s going to
    0:04:04 secure American leadership in AI, create hundreds of thousands of American jobs,
    0:04:10 and generate massive economic benefit for the entire world. And if you watched any of the press
    0:04:15 conferences, there was the press conference inside the White House, and then there was another sort
    0:04:19 of interview that happened between those same three people out on the White House lawn. If you
    0:04:26 watched any of those, they really, really focused in on how this was going to benefit health. Larry
    0:04:30 Ellison talked a lot about how people are going to use AI and give all their health records to AI,
    0:04:35 and AI is going to sort of pre-diagnose things. And then you can take your sort of pre-diagnosis
    0:04:39 to a doctor and the doctor could sort of confirm the results. He also talked about how
    0:04:44 this infrastructure is going to find the cure for cancer and is going to create a vaccine for
    0:04:50 cancer. And it’s going to solve pretty much like every health ailment that plagues mankind, right?
    0:04:55 That sort of the vision that they pitch to everybody, creating hundreds of thousands of jobs,
    0:05:01 and also solving all of these health issues. Their tweet doesn’t really go into all the health
    0:05:06 issues, but that was what they really, really honed in on the press conference. So the initial
    0:05:12 founders in Stargate are SoftBank, which is Masayoshi Son, OpenAI, which is Sam Altman, Oracle,
    0:05:18 which is Larry Ellison, and then MGX, which I’m not super familiar with MGX. They’re the company
    0:05:22 that I guess is going to be really honed in and focused on the medicine stuff. So SoftBank and
    0:05:28 OpenAI are the lead partners for Stargate with SoftBank being the financial part of it. OpenAI
    0:05:33 having the operational part. Masayoshi Son’s actually going to be the chairman of this company.
    0:05:38 And then they have some key tech partners, ARM, who makes basically all the chips that go into
    0:05:42 mobile devices these days. And Masayoshi owns it. Yeah, that’s what I was going to say. I believe
    0:05:48 Masayoshi Son is like the majority owner in that company. Microsoft, I mean, I think Microsoft is
    0:05:54 in the mix by way of OpenAI, right? Yeah. NVIDIA, Oracle, and OpenAI. They’re all the key technology
    0:05:59 partners in this project. And they’ve already started building their first mega data center in
    0:06:04 Texas, where once that data center is built, will be the largest data center ever built on the planet,
    0:06:09 right? Yeah, they’re going to closely collaborate to build and operate this computing system.
    0:06:14 And in particular, AGI, we believe that this new step is critical on the path and will enable
    0:06:19 creative people to figure out how to use AI to elevate humanity. So that’s the big sort of pitch
    0:06:27 there is that it is this company that’s really kind of OpenAI, Oracle, and SoftBank coming together
    0:06:35 to create this $500 billion AI infrastructure to essentially get to AGI and to create new jobs.
    0:06:40 I feel like the create new jobs part might be like more of a short term thing. I don’t think
    0:06:44 over like a 10 year window. Yeah, I think there’s a reason they focused on the drug discovery and
    0:06:49 all that kind of stuff. Because like messaging wise, it’s like, okay, like long term, what also
    0:06:54 will I do? Yeah, there could be some job loss. But I think, you know, as we’ve said before on
    0:07:00 the podcast, you know, there’s good and bad sides of AI, but we both believe that AI in general
    0:07:04 will have a positive outcome for humanity. Yes. And that even if people end up doing less jobs,
    0:07:07 a lot of it will be jobs that they didn’t actually want to do. And there will be more
    0:07:13 abundance in society that people can live better lives. So I’m personally super excited for this.
    0:07:16 You know, it’s like when we first started the podcast, we’re talking about like how big of a
    0:07:19 moment this is in human history. Yeah. Right. That was like when the first episodes we ever did
    0:07:23 talking about that. And that’s exactly the stuff that Masayoshi’s talking about. I kind of wonder
    0:07:27 if he’s reading my newsletter. Yeah, it could be because I would talk about like the golden age
    0:07:32 of AI in America. Yeah. And he repeated that multiple times. Yeah, he kept on saying this is
    0:07:36 the golden age, right? I do believe it’s true. It’s like this is a moment where you could reimagine
    0:11:41 everything using AI. And also, you know, talking about like DeepSeek in China and the progress
    0:07:46 China is making. This is a moment where like, yeah, the AI wars have begun. Yeah. It’s a monumental
    0:07:51 moment. Like I really, really think this is a big moment in sort of the trajectory of human history.
    0:07:57 This is like the beginning of the Manhattan project, right? Like this is like a big step in
    0:08:02 saying we are going to be the dominant leader in the world. We are going to be the first ones to
    0:08:07 hit AGI and, you know, probably not long after that ASI. That is sort of like what they’re doing.
    0:08:12 They’re planting their flag in the sand and saying we are going to lead this. Yeah. It’s the beginning
    0:08:17 of what I feel like is going to be like essentially the space race between us and China, right?
    0:08:20 Yeah. And what I’ve said before, like, you know, I live here in Japan, I’m like,
    0:08:24 eventually America’s going to have a huge advantage because of their partnership with Japan,
    0:08:28 especially when it comes to robotics in the future. I still strongly believe that.
    0:08:32 And so the fact that this alliance is between an American company and a Japanese company
    0:08:39 is really promising for the future. Yeah. So I am very, very excited. I lean mostly optimistic,
    0:08:44 but I do have some things. I actually made a whole YouTube video about it and I made a whole
    0:08:50 tweet about it. And, you know, it might come off as a little tinfoil hat conspiracy theorist sort
    0:08:57 of thing. But I think my concern around all of this is specifically around Larry Ellison.
    0:09:02 I’m not sure how much you know about Larry Ellison, but he’s the CEO of Oracle.
    0:09:06 Yeah. He’s notorious in Silicon Valley and it’s interesting too. He is friends with Elon Musk.
    0:09:09 And now Elon Musk, he kind of seems to be pissed off about this whole thing.
    0:09:12 Yeah. Elon Musk is not happy about it. He’s already talking crap on Twitter about it.
    0:09:18 But yeah, anyway, with Larry Ellison. So Oracle was originally founded as a company
    0:09:24 to build databases for the CIA, right? So it was originally they had a different name.
    0:09:29 Their very first project was called Project Oracle. Project Oracle was designed to build
    0:09:36 databases for the CIA. And to this day, Oracle still has like government contracts with the CIA
    0:09:41 and various, you know, three letter government agencies here in the US, right? So you’ve got
    0:09:48 that element of it, right? Also, Larry Ellison just about four weeks ago did like this investor
    0:09:53 meeting to all of like the Oracle investors. And while he was on stage, he was talking about
    0:09:59 envisioning a future where everybody was under surveillance. Yeah, I saw that he was talking
    0:10:04 about how there was cameras on drones, cameras on buildings, cameras on police, cameras on all the
    0:10:09 newer car models all have cameras. And he was talking about how all of this data is going to
    0:10:15 get fed to a data center somewhere. And then AI is going to analyze all of this. And when there’s
    0:10:21 anything that pops up that the AI deems is worrisome, they’re going to alert the authorities
    0:10:27 automatically. So when I say I have like some concerns, Larry Ellison is the one that like
    0:10:32 his background with working with all the government agencies and also literally recent
    0:10:37 statements within the last like four months about how he wants all this surveillance and he
    0:10:42 sees a world where people and police officers will fall in line because they’re always being
    0:10:47 watched. Yeah, like that is the future he wants to build. He’s publicly talked about that. So
    0:10:53 that to me is a little worrisome, honestly. And then also just sort of going further down
    0:10:57 this rabbit hole. I feel like that meme from It’s Always Sunny in Philadelphia where he’s like
    0:11:01 connecting all the dots and he’s got like the pin board and he’s like tying strings together
    0:11:08 and stuff. The most recent board members that open AI brought in house to be in the open AI
    0:11:14 board, one of them is an ex member of the NSA. And the newest one is a member of BlackRock,
    0:11:18 one of the executives at BlackRock, which is, you know, the world’s largest investment firm
    0:11:24 that has huge political ties and tries to steer the politicians. Like if there is an Illuminati
    0:11:29 BlackRock is kind of part of it. Anyway, okay, done with all the conspiracy stuff there. But
    0:11:34 like when I’m starting to put all those pieces together, it makes me wonder if like outwardly
    0:11:40 the motives they talk about are building new jobs, curing cancers, creating new vaccines that will
    0:11:46 prevent cancer from ever happening in the first place. But inwardly, Larry Ellison needs a massive
    0:11:52 data center to collect all this video footage so that he can use AI to analyze it and keep tabs on
    0:11:58 the people. Just throwing that out there. That was my whole like rant and ramble that I put on
    0:12:02 Twitter. But I get what he’s saying. Like I do think that actually, you know, AI will be good
    0:12:06 in that way that you can have customized medicine in terms of the monitoring and the surveillance
    0:12:11 stuff. Yeah, I saw that. It was kind of, you know, alarming. You know, I’ve read a lot of sci-fi books
    0:12:15 on the topic. I’ve always been of the opinion that that’s probably going to happen in the future.
    0:12:19 That’s going to be like inevitable. And I don’t like it, but I don’t see any way around it because
    0:12:24 people do really value safety. And in the future, as AI gets better, it’s going to be harder
    0:12:28 and harder to argue against safety. And so I do think you will have AI systems that do like mass
    0:12:32 surveillance and stuff like that. I think that’s going to be really hard to avoid. So I get the
    0:12:37 concern. I don’t really know how you avoid that as technology gets better. You know, slightly
    0:12:41 comforting that, you know, Masayoshi and Sam Altman, I don’t think they’re all for surveillance,
    0:12:46 you know, like Larry is. So hopefully there’s a balance there. Yeah, I wouldn’t imagine so. But
    0:12:50 yeah, I don’t know where Trump stands on it. I don’t want to get into the politics of it all. I
    0:12:54 don’t know where he stands on it. But obviously, you know, this new company has the backing of the
    0:13:00 US government. One of the last executive orders that Biden signed before he left office was an
    0:13:06 executive order to allow AI data centers to be built on federal land. Yeah. Right. So basically,
    0:13:12 data centers can be built anywhere in the country. The government can basically give land for these
    0:13:18 data centers. And clearly Donald Trump was the one who sort of announced this new Stargate project
    0:13:22 before introducing, you know, the three main players. So they have the backing of the federal
    0:13:25 government. When you mean the backing, you mean the money though, because I’m not sure that it’s
    0:13:30 confirmed. No, no, no, not the money. I think the money’s mostly coming from Masayoshi-san and,
    0:13:34 you know, maybe some from Oracle and Open AI, but it seems like Masayoshi-san is sort of responsible
    0:13:38 for the financing of it. But it sounds like the government is essentially saying, we’re not going
    0:13:43 to get in your way to build whatever you want to build. That’s kind of my takeaway from it is like
    0:13:48 they have the backing, not in a financial sense, but in the like open doors sense, right?
    0:13:51 – They intend to support them. You know, whatever regulations they need to go through
    0:13:55 to get things set up will all be fast-tracked. That’s kind of my
    0:13:59 understanding. Yeah. Yeah. But then talking about Open AI, right? I sort of highlighted the fact that
    0:14:05 they are building this for Open AI. Well, to me, that sort of brings me back to when we’re talking
    0:14:10 about like the whole podcast between Joe Rogan and Mark Andreessen, right? Yeah. Mark Andreessen
    0:14:16 made a comment on that podcast about how he was in closed door meetings with the government where
    0:14:20 they basically said, don’t even pursue building AI companies at this point, right? Oh man, I didn’t
    0:14:25 make that connection. Yeah. Essentially saying that there’s going to be like one true king,
    0:14:29 like one main AI player. So if you’re trying to build an AI company, you’re probably going to
    0:14:34 fail because we’ve already sort of picked our winner. And Larry Ellison made an offhand comment
    0:14:39 during that sort of outside the White House interview the other day. He made a comment that
    0:14:43 this has been in the works for a while now. He didn’t say how long, but I’m assuming they’ve
    0:14:49 been working on this long before Donald Trump was in the picture with it, right? So it makes me think
    0:14:53 that maybe some of this stuff that Mark Andreessen was referring to, that like they already knew
    0:14:57 open AI was going to be it months ago. Yeah, I didn’t actually make that connection. That’s
    0:15:02 interesting. Yeah, that could be. I mean, I can get it from the government’s perspective. It’s
    0:15:07 kind of like, do you want multiple companies building the nuclear bomb? It’s like, no, you
    0:15:10 probably want one and you want to be in control of that. Yeah. Yeah, I don’t know. Like, I mean,
    0:15:14 even though open AI is going to have a lot of support, I don’t think that means that like,
    0:15:18 you know, X AI will not or that anthropic or Google, I think you’re going to see
    0:15:22 these kind of projects from all of them, I believe. You think so? I think so. I think so.
    0:15:27 I hope so. That’s what I would prefer to see, right? Like, I would kind of prefer to see
    0:15:30 not just one company controlling all the power with this kind of stuff.
    0:15:35 Yeah. But this is like what we talked about with open source before. It makes me way more
    0:15:40 pessimistic about what chances open source has. Yeah. Because obviously, one of the reasons that
    0:15:45 Sam is getting the finance to do this is because of stuff we’ve discussed on this podcast.
    0:15:50 You know, they are probably seeing some amazingly promising signs from the internal models that
    0:15:55 they’re building, right? They’ve learned how to scale up test time compute, and they got O3
    0:15:59 in three months, and they’re kind of mapping out what that means. And there was even an interview
    0:16:04 today where they start talking about like O4 and saying, like, yeah, we’re expecting that to also
    0:16:09 come kind of faster than people might anticipate. And the improvement seems to just kind of keep
    0:16:14 going up at a very fast rate. So if that’s true, they’re like, yeah, AGI is basically here,
    0:16:19 and all you need is more compute. And possibly we have ASI, you know, basically kind of a digital
    0:16:25 god that you can create as long as you throw, you know, $500 billion at it. So now that that’s
    0:16:29 a known thing, and like flags been planted there, and like, yeah, we know this is a possibility,
    0:16:32 I think everyone will be going after it. You know, that’s why I think you see Elon Musk talking so
    0:16:36 much crap, because before this announcement, it was announced kind of like the AI cluster he was
    0:16:41 building was going to be the largest in the world. And it’s like, oh, by the way, $500 billion.
    0:16:47 Yeah, they’re having a data center measuring contest. Yeah, yeah. But also, you know, not to
    0:16:51 get political, but there’s also reason they’re all going to Texas and places like that. As discussed
    0:16:57 before, this is going to require a major rethinking about like energy, the creation of energy and
    0:17:03 things like that, because these systems are going to require massive amounts of energy, like massive
    0:17:08 amounts. Yeah. Yeah, one of the things I heard about why they wanted to build out in like Western
    0:17:14 Texas specifically is it’s one of the spots in the country that gets the most sunshine and heat all
    0:17:19 year long, right? And there’s tons and tons and tons of open land in Texas, right? Especially
    0:17:26 West Texas. I’ve driven from Austin all the way to San Diego. There’s hours and hours of driving
    0:17:31 where there’s just nothing, right? And it’s also the area that gets like the most sun throughout
    0:17:37 the year. So yeah, the reasoning for it is partially political, but also just geographically,
    0:17:43 there is the land there and there is also the sunshine to get assistance from the solar power,
    0:17:47 obviously, right? Yeah, but there’s also like less restrictions on generating energy and using
    0:17:51 energy in Texas versus California. I mean, there’s definitely multiple reasons they’re choosing that
    0:17:58 location, but just the geography of it is also one of the big reasons as well. But yeah, I definitely
    0:18:02 have like mixed feelings about it. When I first saw it, I’m like, this is amazing. We’re going to see
    0:18:06 AGI like way sooner than anybody thinks. And that means we’re probably going to see ASI way
    0:18:11 sooner than anybody thinks. And maybe we are going to enter in this like sort of post-capitalistic
    0:18:17 world fairly soon, sooner than most people realize where we don’t have to work if we don’t want,
    0:18:21 because AI is just going to do everything for us. And then I started seeing a lot of the like
    0:18:26 Larry Ellison stuff. And then I started thinking about the more like regulatory capture sort of
    0:18:32 element of it that open AI now seems to be really tied in with the US government. I’m curious, like,
    0:18:38 who else do you think could provide the sort of financing to do this? Open AI has Masayoshi Son,
    0:18:43 who’s essentially going to help them get to $500 billion over the next several years to build
    0:18:47 these data centers. Who else has that capability? Oh, you mean outside of Stargate? Yeah, that’s
    0:18:52 interesting. And also you would like Masayoshi, I do wonder where the money is coming from. I don’t
    0:18:57 think he has all that money. No, no, he’s raising it. I mean, even Elon said like he’s only got 10
    0:19:01 billion dollars secured or something. But I don’t think Elon knows, actually. Yeah, I don’t
    0:19:04 think he does either. I mean, he might have heard from a friend or something like, oh, they were
    0:19:08 trying to raise money. This is how much they had committed at the time or something. But like I
    0:19:13 said, like with probably the internal data that open AI has, that is what’s making the fund raise
    0:19:18 that large. If they didn’t have the internal data showing, oh, yeah, here’s o4, and o5 is going to be
    0:19:22 like in three to six months after that, it’s going to be this much better. Like if they couldn’t show
    0:19:26 that, they wouldn’t be raising all this money. They have something incredible inside of open AI.
    0:19:30 Yeah. Well, I also think just the fact that, you know, they had the president announce it and they
    0:19:34 did the whole announcement with the White House and everything like that, that’s only going to help
    0:19:40 them raise, right? Like knowing that they’ve got the support of the US government, it’s not going
    0:19:44 to hurt their ability to raise. That’s only going to help them raise the money. I think after
    0:19:48 all the press conferences and all of that kind of stuff, I think it’s going to get a lot easier
    0:19:52 for them to actually come up with the funds to actually do this thing. You know, I mean, you’ve
    0:19:56 been saying it on podcasts since the very beginning, like nobody’s catching up with open AI. And if
    0:20:01 nobody else can build data centers like this, now I really believe nobody’s catching up with open AI.
    0:20:05 I have been saying that, right? For a long time. A lot of people are like, what are you talking
    0:20:10 about? Look, Claude’s great and all this stuff. I don’t know. Just things I’ve heard from friends
    0:20:15 who know Sam Altman is just the signs internally have been very positive for a long time, despite
    0:20:19 the drama that people saw, you know, from the company. Yeah, I don’t know. I do believe it’s
    0:20:23 eventually going to be open AI versus Elon Musk. Like I’ve been believing that for a long time.
    0:20:27 I think Google will try to catch up. Who knows, maybe Google will even have to try to make an
    0:20:30 alliance with Elon Musk or something. Like who knows, like what will happen there, like long term.
    0:20:35 But I do believe that the only one who could attract the talent and the capital would be
    0:20:39 Elon Musk. How big is the data center that Elon Musk is building? I can’t remember.
    0:20:46 I’m perplexing it right now. So he spent 2.5 billion on 100,000 H100s and an additional 2.5
    0:20:50 billion on other data center costs and infrastructure. So I’m seeing 5 billion.
    0:20:55 Okay. So I’m seeing 6 billion, and a lot of it came from the Middle East. And I’d say that even the
    0:20:59 thing with the open AI, you know, the Stargate, probably a lot of that’s Middle East money,
    0:21:04 quite honestly. Yeah. This actually does confirm the 6 billion because it does say a recent 1.08 billion
    0:21:08 order placed for NVIDIA GB200 AI servers brought it up to 6 billion. That investment
    0:21:12 just happened like within the last couple of weeks. So they were at 5 billion, just got
    0:21:16 another billion like a couple of weeks ago. Well, one thing that Scoble brought up too,
    0:21:21 he mentioned this in an X post that I saw earlier today is like, there’s been a lot of talk about
    0:21:26 essentially running out of data. Like if you’ve scraped the entire internet and all of the data
    0:21:33 has already been sort of grabbed, what do you need $100 billion data center for? Like where is
    0:21:39 the data coming from? So that is an interesting question too, right? I’m sure with a 5 billion,
    0:21:46 6 billion dollar data center, do you really need that $100 billion data center? I don’t know.
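
    The spending figures tossed around in the last few exchanges can be tallied in a quick back-of-the-envelope sketch. The line items and amounts below are just the hosts’ on-air estimates for xAI’s data center (sourced from a Perplexity search during the episode), not audited numbers:

    ```python
    # Rough tally of the xAI data-center spending figures quoted in the
    # conversation (amounts in billions of USD). These are the hosts'
    # on-air estimates, not verified financials.
    line_items = {
        "100,000 NVIDIA H100 GPUs": 2.5,
        "other data center costs and infrastructure": 2.5,
        "recent NVIDIA GB200 server order": 1.08,
    }

    total = sum(line_items.values())
    print(f"Estimated total: ${total:.2f}B")  # roughly the $6B cited
    ```

    That lands near the $6 billion figure cited, which is a small fraction of Stargate’s planned $100 billion initial deployment, which is the gap the hosts are reacting to here.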
    0:21:49 I don’t know. I kind of disagree with that. It was a question that he raised, but so
    0:21:54 there was some recent research that came out showing some success, basically using data,
    0:21:59 using content created by the AI to teach the AI, like in training the models.
    0:22:03 Yeah, it’s synthetic data, right? Yeah, it’s synthetic data. And so the early signs seem kind
    0:22:10 of promising, and this was not from open AI. So I assume that the o3 and o1 pro models are good enough
    0:22:15 to actually create synthetic data that’s actually helping improve the models. And so if that’s true,
    0:22:18 I mean, that’s what people have been saying for the last six months or so. If that’s true,
    0:22:23 in theory, that’s no longer a problem. They can just keep producing new content and actually
    0:22:28 train their models on that. And also, we’ve been saying they can keep scaling up with test time
    0:22:33 compute, so they can just throw more processing power to give these models more juice to actually
    0:22:38 think, yeah, you could spend infinite money. How much energy can you throw at it? The more you
    0:22:43 throw at it, the smarter it will be. Not to mention if there is a goal of putting cameras
    0:22:48 everywhere, that’s all new incoming data to build world models or whatever, right? If you’ve got
    0:22:54 cameras and drones and on bodycams and on cars, companies put them on their buildings or whatever
    0:22:59 for their own sort of security, and they’re trying to collect as much of that footage as they can,
    0:23:04 that’s going to require some pretty massive data centers to be able to pull all of that in
    0:23:10 and sort of analyze it for AI. But I also can see all of that data being what they use to train
    0:23:16 like actual world models to understand physics and the world around us and how things and people
    0:23:21 and objects move through the environment, right? Yeah, I think maybe our listeners are like,
    0:23:25 “Okay, cool. What does it mean for me?” Anything we’ve said on this show about timelines,
    0:23:31 just cut that in half or less now. Like literally a lot of things that we may have been saying
    0:23:36 three to five years, some of those things may be one or two years now. Development is going to
    0:23:40 increase. And like I said, the reason they’re able to raise this much money is OpenAI has something
    0:23:44 amazing internally that they’re showing investors. Yeah, I would believe it. And they still are
    0:23:49 maintaining their partnership with Microsoft and Microsoft is still getting access to any sort of
    0:23:54 new OpenAI models that they develop according to Microsoft, which means Clippy is going to get
    0:23:59 really good really quick. But Microsoft did not invest. I mean, to me that’s a signal. In Silicon
    0:24:04 Valley, if somebody invests and then they don’t follow into the next round, you try to frame it
    0:24:07 as not being a negative signal, but that is some kind of signal. And I don’t think it’s a negative
    0:24:12 signal on OpenAI’s part. I kind of think that OpenAI wanted to have other partners. It’s like,
    0:24:16 yeah, Microsoft’s one of our partners, but like, look, we got all these other partners. We don’t
    0:24:21 just need Microsoft. Well, the impression that I get is that the scale that OpenAI has in mind
    0:24:26 is bigger than what even Microsoft Azure can provide, right? Like I think they’re imagining
    0:24:31 this sort of scale that’s just sort of unfathomable for us. That’s what I said before. I said like
    0:24:35 people talking about how Microsoft is going to take over OpenAI. I’m like, I envision the potential
    0:24:40 future where OpenAI acquires Microsoft to not have to deal with their contracts with them anymore.
    0:24:44 Like it’s the other way around. Yeah, yeah. I wonder if there’s a world where
    0:24:50 xAI and Elon actually get rolled into this new project. I know Sam Altman and Elon Musk have
    0:24:55 been beefing on X and whatever, right? But if they’re both really in it for the good of the
    0:25:00 country and building like the super intelligence, makes me wonder if there’s a world where they
    0:25:04 bury the hatchet and work on this together. Well, it’s crazy, you know, because like Sam Altman,
    0:25:07 definitely like, you know, I think I told you before, like I did a speech at Stanford and Sam
    0:25:12 was there at the same time. And I know politically he was at the time pretty far left. I think he’s
    0:25:15 like kind of switched now, being more somewhere in the middle. I think they’re all opportunistic.
    0:25:20 That’s what I think. Yeah, he hated Trump for sure. Like that’s definitely for sure.
    0:25:24 And so I do wonder now, it’s going to be kind of odd, but like I wouldn’t be surprised if that
    0:25:28 happens. It literally would be some kind of deal brokered by Trump between the two of them. Like,
    0:25:33 hey guys, make up, it’s for the best of America. But I’ve been saying this for a while. I do believe
    0:25:38 that OpenAI owes Elon Musk something. I really do. Like I think it’s crazy that he owns no equity
    0:25:43 in OpenAI. I think that’s ridiculous. In the early days of OpenAI, like everyone in San
    0:25:47 Francisco, when they would talk about the company, they would talk about Elon Musk. And like Sam’s
    0:25:51 name would occasionally come up, but it was like, oh, OpenAI, oh, that’s Elon Musk’s thing.
    0:25:56 Yeah. The first time I ever heard of OpenAI, it was with Elon Musk attached, right? Like that
    0:26:00 very first time I ever heard of it. It was like, this was another Elon Musk company.
    0:26:03 Yeah. It was like his company and like Sam Altman helped run it. It was like, that was at least
    0:26:08 the messaging externally that was being presented to people. And then also like the main talent
    0:26:14 early on, he helped recruit the capital. He provided some of it, but also it was from his friends.
    0:26:18 Yeah. Yeah. Didn’t he bring on Ilya? Like I think Ilya was working with Elon in the beginning.
    0:26:21 I think he did. I could be wrong about that. Don’t quote me.
    0:26:26 Yeah. My understanding is he did. And so a lot of the main talent he brought on and the capital
    0:26:30 and just the reputation. And then also that’s having someone like Elon Musk brings so much
    0:26:34 talent to you too. Just like, oh, it’s Elon Musk’s AI company. I’ll go work there now.
    0:26:39 And so the fact that he owns nothing is crazy. So I would love if something happened where
    0:26:44 Elon Musk owns a small piece of OpenAI and they have some kind of technology sharing deal,
    0:26:48 but like, will that actually work? I have no idea. Yeah. Yeah. I don’t know. Anyway,
    0:26:53 there was another really big piece of news that came out this week and it is related to open
    0:26:58 source and China. Right. This week we got DeepSeek-R1. There were already these DeepSeek
    0:27:05 models out there, but R1 is this new like reasoning model. And I did test it on a live stream
    0:27:12 and I tested it side by side with o1 just standard and o1 Pro. o1 Pro definitely
    0:27:17 still outperforms DeepSeek-R1. It was definitely giving me more like quality in-depth
    0:27:25 responses, but comparing it to o1, they felt pretty even, which to me blew my mind for an
    0:27:29 open source model. I know you have some opinions on it. Obviously the model’s out of China. Yeah,
    0:27:34 our good friend Matthew Berman was doing some testing with it and he asked it about
    0:27:39 if like Taiwan was part of China and it basically said Taiwan is part of China and anybody who
    0:27:44 opposes this thought will be shut down or something weird like that. Right. It wasn’t exactly that,
    0:27:48 but yeah, it was in that vein. No, I’m definitely paraphrasing, but it was basically saying like
    0:27:52 any plans for independence will not work or something like that. Right. It was some wording
    0:27:57 like that, which I found really interesting when I actually asked that same exact question,
    0:28:02 like I put the exact same prompt into deep seek and it just said, sorry, I can’t answer that.
    0:28:05 Let’s talk about something else. That’s what it did when I put the prompt in. That’s why I’ve
    0:28:09 been calling it, like, open propaganda instead of open source. I don’t know. It’s open propaganda.
    0:28:14 I mean, because it is like, okay, so why is China allowing this model to be open source and be out
    0:28:18 there? The Chinese government is allowing it. And to me, that is why they’re doing it is because
    0:28:23 they know that people love open source. It’s a great way psychologically to have people go,
    0:28:27 oh, it’s open source. I love it. But there’s other things going on here. Like the reason they’re
    0:28:31 allowing it is because if it becomes one of the biggest models, they get to kind of control
    0:28:36 reality and distort history through that. Right. Like, okay, Tiananmen Square didn’t happen or different
    0:28:40 things like that. They can change that in the AI model. And, you know, full disclosure, I mean,
    0:28:46 I did live in Taiwan. I studied Mandarin there, kind of biased. I love Taiwan. But for that reason,
    0:28:49 I just, I can’t support the model. It’s because like, yeah, you talked to it about Taiwan and it’s
    0:28:54 like, yeah, it’s owned by China. No one in Taiwan sees it that way. Or maybe a few people, but like
    0:29:00 not many. Yeah, yeah. But I mean, like there are biases built into pretty much every AI model,
    0:29:06 right? Like, you know, a lot of the US models refuse to talk about certain things or
    0:29:11 sort of share their own political bias that was trained in as well. Right. But yeah,
    0:29:16 I definitely see that. And the fact that it’s actually open weights, though, people can take
    0:29:23 this DeepSeek-R1 and fine-tune it and sort of essentially train out all of that bias if they
    0:29:27 wanted to, right? Because the model weights are open source as well. Like you can actually take
    0:29:33 the weights and fine tune them if you want. So, you know, as a whole, as a model that’s open source,
    0:29:38 I do think it’s really, really impressive that we’ve gotten to o1 level this quickly,
    0:29:42 right? Like, you know, they’re constantly talking about the gap between when we released
    0:29:48 GPT-4 to o1. Look at how big that gap was and then o1 to o3. That was only like a three month
    0:29:52 difference. Well, we’re seeing that sort of same scaling happen in open source as well.
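The "open weights" point above is the technical crux: because the weights themselves are published, anyone can resume training on their own data and shift the model's behavior. As a loose toy illustration (a two-weight logistic model standing in for an LLM, nothing like a real fine-tuning pipeline), a few gradient steps on new examples are enough to flip what the "released" weights predict:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fine_tune(weights, x, y, lr=1.0, steps=200):
    """Resume gradient-descent training from published ('open') weights."""
    w = list(weights)
    for _ in range(steps):
        pred = sigmoid(sum(wi * xi for wi, xi in zip(w, x)))
        err = pred - y  # logistic-loss gradient factor
        w = [wi - lr * err * xi for wi, xi in zip(w, x)]
    return w

released = [2.0, 2.0]  # "released" weights: score input [1, 1] as class 1
x = [1.0, 1.0]
before = sigmoid(sum(wi * xi for wi, xi in zip(released, x)))

# Fine-tune on one new example with the opposite label.
tuned = fine_tune(released, x, 0.0)
after = sigmoid(sum(wi * xi for wi, xi in zip(tuned, x)))
print(before > 0.5, after < 0.5)  # True True
```

The same principle, at vastly larger scale, is what lets the community retrain a released model away from behavior baked in during its original training.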
    0:29:57 Yeah, there was a tweet from one of the OpenAI guys and he was kind of saying,
    0:30:02 I expect to see a lot of new reasoning models in the next few months in open source and other
    0:30:07 areas that don’t fully get, you know, how it’s done, or like he was basically kind of alluding that
    0:30:10 OpenAI is doing something a little bit different than you think. It’s not as straightforward as
    0:30:14 what you think. And I kind of still think that’s probably true. I think that’s why OpenAI is moving
    0:30:18 faster than anyone else. They have discovered something, but I did test DeepSeek. I mean,
    0:30:24 it was impressive. It’s better than Claude. It’s definitely in the ballpark of o1. I found that
    0:30:29 in some ways it was better than o1 and in some ways it was worse. I tested it on coding after
    0:30:34 you told me to try it. And I found that like in some ways the code was better in some areas.
    0:30:37 I was like, oh, that’s amazing. Like it’s actually better. And then it would do some things that
    0:30:42 were like pretty dumb that o1 would never do. Yeah. I was like, I think there’s something missing
    0:30:45 in the reasoning side of this. I could see it. There’s something they’re not getting that
    0:30:50 OpenAI is getting, some technique that they’ve discovered that this model is not using. It would
    0:30:53 hallucinate more. It would imagine files and things like that that didn’t exist. I was like,
    0:30:58 what? Like how this is a reasoning model? Like shouldn’t it have like checked to see that that
    0:31:02 file actually existed? It’s actually hallucinating that. That’s like, you know. Yeah. Well, one thing
    0:31:07 that I do like about the R1 model though is like you can literally see everything it’s thinking
    0:31:12 as it thinks, right? You can see it go, all right, let me test this. Okay, that didn’t work. Let me
    0:31:16 test this. That’s not the right way to do it. Let me test it. And it’s literally showing you
    0:31:21 everything it’s trying and testing and going back and forth. And I think that’s pretty fascinating.
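The visible trial-and-error the hosts describe can be sketched as a solver that records every attempt and then optionally condenses the log, roughly the difference between a model that shows its full chain of thought and one that surfaces only a summary (a toy sketch, not either vendor's actual mechanism):

```python
def solve_with_trace(candidates, works):
    """Try candidates in order, recording every attempt (the visible 'thinking')."""
    trace = []
    for c in candidates:
        if works(c):
            trace.append(f"test {c}: works, stopping")
            return c, trace
        trace.append(f"test {c}: didn't work, trying next")
    return None, trace

def summarize(trace):
    """A condensed view, like surfacing only a summary of the chain of thought."""
    return f"{len(trace)} attempts, final: {trace[-1]}"

answer, trace = solve_with_trace([1, 2, 3, 4], works=lambda c: c == 3)
print(answer)            # 3
print(summarize(trace))  # 3 attempts, final: test 3: works, stopping
```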
    0:31:26 But I also think there is a reason OpenAI isn’t showing you all of that, right? Like it does show
    0:31:30 you some, but it’s almost like a summarized version of how it thought as opposed to the whole
    0:31:34 thinking process. Yeah, there’s some secret sauce there. And that’s why I’ve been saying for a while,
    0:31:38 like I’m a huge fan of Elon Musk, but he was kind of saying, oh, we’re going to have this new model
    0:31:42 and it’s going to be better than OpenAI’s. It’s going to be the best in the world. You know,
    0:31:46 I don’t know if that’s the case. I think that you can’t just, you know, okay, you have more
    0:31:50 processors and now you trained a larger model. I don’t think that’s the game. Like I think OpenAI
    0:31:53 is playing a different game now. They’ve learned that it’s a combination. You’re training
    0:31:57 the model with more data, but also there’s a whole test-time compute there and some kind of secret
    0:32:02 sauce that they have discovered. With DeepSeek, I guess the one takeaway is, like you’ve been saying
    0:32:06 before, eventually we are going to have open source models that are like somewhat close to the best
    0:32:10 models. Maybe they’re not as good in some ways, but you’re going to be able to have like models
    0:32:14 that you can run on your local machine that are really freaking good. Well, and the reason that
    0:32:19 I pointed out to you that I thought you might be interested in it was more the fact that there’s
    0:32:22 actually an accessible API for it right now. Well, you can’t use o3 at all right now,
    0:32:28 but there is no like o1 API. So you can’t just use it inside of something like Cursor or
    0:32:32 Windsurf or something like that. Oh, you can, you can use o1 in Cursor. Oh, you just can’t use o1
    0:32:37 Pro yet, right? Correct, correct. Okay. And there’s a lot of limitations on o1 in Cursor too, I
    0:32:41 think, unless you put in your own API key, but like if you’re using it by default, it’s like very
    0:32:46 restrictive. Like you run out of like queries or prompts, whatever, very fast. Oh, so even when
    0:32:50 you’re using the API, they still rate limit you? Well, if you put in your own API key, they don’t,
    0:32:55 but if you’re using just like the Cursor plan or whatever, it’s very restrictive. It’s probably
    0:32:59 because the o1 API is more expensive. So they have some kind of like system where it’s like,
    0:33:02 if you’re paying us 20 bucks a month, we’re not going to allow you to run up some gigantic o1
    0:33:07 bill and then we pay for it. Yeah, yeah, yeah, yeah. Okay. Yeah. For some reason, it slipped my
    0:33:12 mind that o1 had an API available. And I’m like, oh, well, here’s an alternative to o1 that has
    0:33:18 an API. It’s o1 Pro that doesn’t. And actually, it’s worth noting too that Sam Altman has been
    0:33:22 being way more public about their like upcoming plans recently, like he’s talked about like
    0:33:28 o3-mini coming in the next two weeks, which is like, holy crap. Although he did say o1 Pro is
    0:33:33 still better than o3-mini. Did you see that? Yes. But under that tweet, somebody asked him,
    0:33:38 how is o3-mini compared to o1 Pro? And he said o1 Pro is still going to be way better at most
    0:33:43 things. Yeah. I mean, we had seen some benchmarks that had shown that like maybe a few weeks ago,
    0:33:48 but the big difference will be that o3-mini will be really fast. And also it’s really
    0:33:53 cheap for OpenAI to operate. So it should be a model that’s like better than DeepSeek,
    0:33:59 way better than o1. That’s very fast. And the other comment that he made on that same tweet
    0:34:04 is that they’re going to release it with an API, which is amazing. No, but also even that the o3
    0:34:09 Pro is going to have an API. So like apparently o3 and o3 Pro are coming. So they’re going to
    0:34:13 continue the thing where now that they’ve learned that you can just throw more compute at the models,
    0:34:18 they can always have a better model available, just like throwing more compute at it, right?
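One concrete, well-known form of "throwing more compute at it" at inference time is best-of-N sampling: generate several candidate answers and keep the highest-scoring one, so a larger sampling budget buys a better expected result. A toy sketch with random numbers standing in for sampled answers (illustrative only; OpenAI has not disclosed its actual method):

```python
import random

def best_of_n(candidates, score, budget):
    """More test-time compute = consider more candidate answers before replying."""
    return max(candidates[:budget], key=score)

random.seed(0)
target = 0.5
# Random numbers stand in for sampled model answers; closer to target = better.
candidates = [random.random() for _ in range(200)]
score = lambda x: -abs(x - target)

cheap = best_of_n(candidates, score, budget=1)        # small inference budget
expensive = best_of_n(candidates, score, budget=200)  # 200x the compute
print(abs(expensive - target) <= abs(cheap - target))  # True: more samples never hurt
```

Because the larger budget searches a superset of the smaller one, the answer can only stay the same or improve, which is why "just throw more compute at it" works as a dial.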
    0:34:22 And so they will continue to have like a pro model. And the amazing thing there was I was
    0:34:26 concerned that they’re going to like increase the price to like 2000 or something. Because I was
    0:34:30 like, well, I might actually pay it. Like if it continues to get better and it replaces me hiring
    0:34:36 an engineer, I might actually pay that. But he said it’s going to continue to be $200. And that
    0:34:41 the o3 Pro model will have an API as well. So that’s super exciting. So the o3 Pro is going
    0:34:46 to be on the $200 plan? Yes. And have an API. That’s the one where they were like, it costs
    0:34:51 like three grand per task right now. Well, they said that they’ll figure out how to make it cheaper.
    0:34:54 And apparently they seem to think that they probably have or they’re going to have by the
    0:34:58 time it comes out. And the latest information too, I forgot who it was. If it was like their chief
    0:35:01 product officer or something like that, he was interviewed by the Wall Street Journal. Kevin
    0:35:06 Weil. Yeah, exactly. Yeah. And he said that the current plan is that the o3 models, not the mini,
    0:35:09 the mini should be coming in the next week or two. Maybe by the time you’re listening to this,
    0:35:15 it may already even be out. But that the full-blown o3 models, which probably means o3 and o3 Pro,
    0:35:19 the timeline is like two to four months. For someone who’s been using o1 Pro, and like at the last
    0:35:23 episode we did where I showed you how much value you can get out of o1 Pro if you give it tons of
    0:35:28 context, to imagine that we’re about to get a model like three to five times smarter than that
    0:35:35 in the next three months. It’s just blowing my mind. Well, and we’re about to get agents. I mean,
    0:35:40 don’t hold me to this, but like the week that this episode is going live, we might already have
    0:35:45 agents or it might be announced this week. But like apparently, according to The Information,
    0:35:50 OpenAI is preparing to release a new ChatGPT feature this week that will automate complex tasks
    0:35:54 typically done through the web browser, such as making restaurant reservations or planning
    0:35:59 trips. According to a person with direct knowledge of the plans, they’re calling it Operator. And so
    0:36:04 this says this week. Yeah. And actually, there was some kind of benchmark leaked recently showing
    0:36:09 Operator versus Claude’s computer use, showing that Operator is way better, like not perfect. A lot of
    0:36:13 stuff like Claude was like in the 50% success rate. And then Operator was like in the 80%
    0:36:17 success rate, but still, yeah, dramatically better. Yeah. And that’s actually when we talked about
    0:36:21 computer use before. That’s what I said. I was like, I’ve heard that like OpenAI has stuff internally.
    0:36:26 That’s pretty good. They’re just not happy to release it yet. Right. And Claude released their
    0:36:30 thing very early, but no one really used it because it wasn’t that great. But it sounds like,
    0:36:34 I mean, they’re saying it’s going to be like do stuff for you, like make reservations online,
    0:36:38 buy things for you. Just like basic stuff you would do in the browser. Probably when operator
    0:36:42 comes out, you’ll be able to just tell it, hey, go do that for me. And then it’ll just do it.
    0:36:46 Yeah. It’s really a pain in the butt to set up the anthropic version, right? Like if you want to
    0:36:49 use computer use, you have to do it through Docker. You have to get it all set up. And you
    0:36:54 actually have to use it with their sort of crappy browser, like on a remote computer.
    0:37:00 It’s not a great experience. I would imagine when ChatGPT rolls it out, it’s just going to work
    0:37:04 in your own browser. It’s going to be a lot more seamless of an experience.
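At its core, a browser agent like the rumored Operator is an observe-act loop: take an action, check whether the goal state has been reached, repeat. A heavily simplified sketch with a dictionary standing in for browser state (the action names are invented for illustration; this is not the Operator API):

```python
def run_agent(goal_check, actions):
    """Toy observe-act loop: apply actions to the 'page' state until the goal holds."""
    state = {}
    for step, act in enumerate(actions, start=1):
        act(state)  # e.g. navigate, fill a form, click submit
        if goal_check(state):
            return state, step
    return state, len(actions)

# Hypothetical "book a table" flow; real agents would choose actions dynamically.
actions = [
    lambda s: s.update(page="restaurant"),
    lambda s: s.update(form_filled=True),
    lambda s: s.update(reserved=True),
]
state, steps = run_agent(lambda s: s.get("reserved", False), actions)
print(state["reserved"], steps)  # True 3
```

The hard part the hosts are describing is exactly what this toy skips: having the model pick the next action itself from a screenshot or page structure, rather than following a fixed script.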
    0:37:08 Yeah. Yeah. So yeah. So exciting times. I mean, we’re about to have like AI that can
    0:37:12 like just code full products for you. You just talk to it and it makes the whole thing. Like
    0:37:17 this is probably this year. And you have AI that can like just use websites for you and do whatever
    0:37:21 you would do on your computer, do it for you. It’s exciting. I mean, a lot of stuff that just
    0:37:24 is time consuming stuff that you don’t want to waste your life on. Soon you’re not going to have
    0:37:28 to waste your life on it. I can’t wait till the day where I’m just like, hey, I need to update my
    0:37:33 Future Tools website, go find all the news and put it on my website for me. I’m going to go take a
    0:37:39 nap, which is funny because I don’t think that’s actually that far off. Probably not. I actually
    0:37:43 set up that thing with ChatGPT where it sends you little notifications or whatever. It’s not,
    0:37:46 they definitely need to improve that experience, but it’s kind of cool. It’s been sending me little
    0:37:50 Japanese words to learn and sending me little summaries of AI news. Yeah. I’ve done it too.
    0:37:56 I set up a daily task list to find any AI news from that morning and don’t find anything from
    0:38:01 like before today. I only want the most current news. Yeah. And it sends me an email every morning
    0:38:05 to let me know that it’s done it and I click the link and it shows me what it found. Yeah,
    0:38:09 and it’s useful. I tried it recently in my newsletter issues to see if anyone would complain,
    0:38:16 but I just took a summary of the news and then used Wispr Flow just to talk to my computer
    0:38:21 about my own thoughts on the matter. Yeah. Right? Did that for like 10 minutes,
    0:38:25 handed it off to o1 Pro where I had provided all this context of what’s a good newsletter
    0:38:29 issue and what’s not, and it edited all my words and it looked really good and everyone seemed
    0:38:34 to like it. Nice. And it dramatically reduced how long it took me to create my newsletter issue.
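The "don't find anything from before today" rule in that scheduled daily task boils down to a date filter. A minimal sketch with made-up headlines (illustrative only, not ChatGPT's actual tasks feature):

```python
from datetime import date, timedelta

def todays_news(items, today):
    """Drop anything published before 'today' (the 'only current news' rule)."""
    return [title for title, published in items if published >= today]

today = date(2025, 1, 24)
items = [  # hypothetical headlines with publication dates
    ("Stargate announced", today - timedelta(days=2)),
    ("DeepSeek-R1 released", today - timedelta(days=1)),
    ("o3-mini rumored this week", today),
]
print(todays_news(items, today))  # ['o3-mini rumored this week']
```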
    0:38:37 Like, you know, instead of taking like a few hours, it probably took me an hour to finish
    0:38:41 everything. And so I’m just like, this new world is so exciting where you just like all the kind
    0:38:44 of work that you don’t like doing. You know, I like sharing my opinions. I don’t like sitting
    0:38:48 there like editing them for like hours and doing all that. Yeah. Yeah. It’s going to do all that
    0:38:52 for me. It’s awesome. Yeah. I saw Dario Amodei from Anthropic. Was it the Wall Street Journal
    0:38:57 that was doing all the interviews out at that sort of Davos thing? Yeah. So he was on there.
    0:39:03 They were actually asking about like the future of jobs. And he essentially said that like what
    0:39:08 they found and what a lot of research has shown is that when you give people all of these automations,
    0:39:13 it doesn’t actually take away too many jobs. It just makes people way, way, way more efficient
    0:39:18 at the stuff they actually want to be doing within their job. And he was talking about how
    0:39:24 basically so many people have been trying to use AI to replace jobs. But if you start using AI as a
    0:39:30 way to like sort of enhance jobs, they find that the effectiveness of those people is way better
    0:39:36 than when you use AI to replace jobs like the efficiency and the output and that sort of thing
    0:39:42 is like dramatically improved when they’re using AI to actually like improve the things and also
    0:39:47 hone in on just doing the things that are like their core competencies and letting AI do all the
    0:39:53 mundane stuff. So, you know, I think that’s sort of the next phase. I mean, there might be a phase
    0:39:57 beyond that where it’s just kind of like, all right, AI and the robots do everything and we just
    0:40:02 get to travel and live our best life. That might be like a future phase. But I think the next phase
    0:40:06 that we’re moving into is like, we get to focus on the stuff that we actually enjoy doing in the
    0:40:12 work that we do while AI does all the sort of mundane boring stuff we just don’t want to do
    0:40:16 because it’s repetitive or whatever. Yeah, totally. I mean, like, like I’m already seeing that in my
    0:40:20 life right now. The fact that like these things are about to get three to five times better.
    0:40:24 And it appears that that’s going to start happening like every three to six months,
    0:40:28 not every like two years. Yeah, it’s just amazing. I mean, because like, and then now that you got
    0:40:32 Stargate, I don’t think people are like processing like they’re not like reimagining like what’s
    0:40:36 going to happen based on the new data because like things are going to improve faster than you
    0:40:42 expect if you’re listening to this. And OpenAI also recently talked about like their last models
    0:40:45 took like, I think a year and a half or longer to train. And they were like,
    0:40:50 maybe like 50% to 100% better. You know, with the new o3 model, they’re seeing like a, you know,
    0:40:55 3x improvement in a matter of three months. We’re entering a new phase of development here. We’re
    0:40:59 talking about probably like improvements speeding up like 10 times or more. Yeah. And now there’s
    0:41:05 going to be a hundred billion dollar data center worth of compute behind OpenAI to move even faster.
    0:41:10 So I think you’re going to see that sort of exponential growth continue, right? Like,
    0:41:14 it’s just going to be a vertical line. And also there’s going to be now that that’s
    0:41:16 announced. I mean, you’re going to see Elon Musk and everyone else
    0:41:20 raising more money to go after this faster too. It’s not going to be just OpenAI. It’s going to
    0:41:24 be everyone else. Everyone’s going to be investing more into this for the race. For sure. It’s going to
    0:41:29 happen faster. Yeah. Well, we did say there was actually a lot to talk about. We probably rambled
    0:41:35 a lot. Sorry. Yeah. We had a lot to talk about. It felt like a pretty monumental week between
    0:41:41 the Stargate, between DeepSeek, between, you know, the announcements from OpenAI and o3 coming.
    0:41:46 There was a lot, a lot, a lot of big stuff that I felt like we needed to kind of deep dive and
    0:41:51 unload. And hopefully anybody who’s just sort of listening to this podcast and checking in to
    0:41:56 stay looped in, you feel a little more looped in. You might feel a little bit more optimistic.
    0:41:59 You might feel a little bit more scared. I don’t know. Or confused or whatever. Yeah.
    0:42:02 Maybe I’d like to share something about what Stargate means for the future. There was this tweet
    0:42:05 and it’s kind of philosophical, but Roon shared this tweet, you know, people like,
    0:42:10 why is it called Stargate? Roon’s like a well-known guy who works OpenAI. He’s like,
    0:42:13 a lot of people know who he is, but he keeps an anonymous. He’s not Sam Altman. A lot of people
    0:42:19 think he’s Sam Altman. He’s not. But he had this tweet, the Stargate blasting a hole into the
    0:42:25 Platonic ether to summon angels. First contact with alien civilizations. So I think that is kind
    0:42:29 of a summary of like what Stargate means. I mean, like this is we are like summoning the angels. We
    0:42:34 are making contact with a new intelligence, you know, an alien intelligence. And that will be
    0:42:40 artificial superintelligence. It will be like us discovering something beyond anything we can
    0:42:44 imagine. And that’s what this is designed to do. And so it’s just, you know, I think it’s
    0:42:47 important for people to take a moment to kind of try to take that in. It’s a lot to take in,
    0:42:50 but that is what humanity is trying to accomplish right now.
    0:42:57 Wild. Well, hopefully these aliens are coming to save us and not to destroy us. I choose to
    0:43:03 lean into the optimism and I believe that it’s all going to make humanity better and make us all
    0:43:08 better and more improved, augmented humans that help us get more done that we want to get done.
    0:43:12 So I’m looking forward to the future. I’m excited about it. There are some little concerns,
    0:43:17 but I still lean mostly optimistic on all of this. And I know you do as well.
    0:43:22 Yeah. And I think on that note, we can go ahead and wrap this one up. I think we’ve sort of
    0:43:27 unloaded everything we have to say about all of these big announcements that came out this week.
    0:43:33 If you liked this episode and you want to stay looped in on all of this latest AI news and hear
    0:43:38 some amazing interviews and discussions and deep dives around ways to practically use AI in your
    0:43:42 life. Make sure you’re subscribed to this, either our YouTube channel or audio channel,
    0:43:46 wherever you listen to podcasts. This podcast is available all over the place.
    0:43:50 We’d love for you to subscribe and thank you so much for tuning in. Hopefully we’ll see you in the next one.
    0:44:00 [Music]

    Episode 43: How massive is the $500 billion AI Stargate Project, and what does it mean for the future of AI? Matt Wolfe (https://x.com/mreflow) and Nathan Lands (https://x.com/NathanLands) break down this monumental development and discuss its implications.

    This episode delves into the ambitious AI infrastructure project announced by Donald Trump, alongside key tech players. Learn why this initiative is seen as a groundbreaking moment in AI history, its promises for job creation, health advancements, and the potential concerns around surveillance and data privacy. The hosts also touch on related advancements including DeepSeek-R1, OpenAI’s upcoming models, and how they all fit into the larger AI development landscape.

    Check out The Next Wave YouTube Channel if you want to see Matt and Nathan on screen: https://lnk.to/thenextwavepd

    Show Notes:

    • (00:00) Stargate’s $500B AI Infrastructure Investment
    • (03:17) AI Innovation: U.S. Health Revolution
    • (09:17) Surveillance Expansion Concerns
    • (10:27) Corporate Influence and Surveillance Speculation
    • (14:57) Sam’s Breakthrough in AI Development
    • (17:16) Mixed Feelings on AI’s Future
    • (20:15) NVIDIA’s $6B Data Center Expansion
    • (23:14) OpenAI Expands Beyond Microsoft
    • (27:43) AI Model Bias and Control
    • (29:11) OpenAI’s Edge: Unique Reasoning Models
    • (34:20) New O3 Models Release Timeline
    • (38:08) AI Boosts Job Efficiency, Not Loss
    • (39:32) Rapid Technological Advancements

    Mentions:

    Get the guide to build your own Custom GPT: https://clickhubspot.com/tnw

    Check Out Matt’s Stuff:

    • Future Tools – https://futuretools.beehiiv.com/

    • Blog – https://www.mattwolfe.com/

    • YouTube- https://www.youtube.com/@mreflow

    Check Out Nathan’s Stuff:

    The Next Wave is a HubSpot Original Podcast // Brought to you by The HubSpot Podcast Network // Production by Darren Clarke // Editing by Ezra Bakker Trupiano

  • #791: How to Feel at Peace Amidst High Stress — Guided Meditation with Zen Master Henry Shukman

    AI transcript
    0:00:04 Hello, boys and girls, ladies and germs. This is Tim Ferriss. Welcome to another episode of
    0:00:09 The Tim Ferriss Show. This episode is a brand new experiment called Meditation Monday. That means,
    0:00:14 in addition to my long-form interviews each week, every Monday, I will be bringing you a short 10
    0:00:19 minute or so meditation, which will help you for the rest of the week. Over this four-episode
    0:00:24 series, you’ll develop a Zen toolkit specifically to help you find greater calm, peace, and effectiveness
    0:00:29 in your daily life. The teacher, Henry Shukman, has been on my podcast twice before. He is one
    0:00:34 of only a few dozen masters in the world authorized to teach what is called Sanbo Zen,
    0:00:38 and I have found this particularly interesting and effective. And now he’ll be your teacher.
    0:00:46 I’ve been using Henry’s app The Way once, often twice a day for the last few months, and it has
    0:00:51 lowered my anxiety more than I thought possible. As a listener of the show, you yourself can get
    0:00:57 30 free sessions by visiting thewayapp.com/tim. So if you like what you hear in these meditations,
    0:01:01 which will be valuable in and of themselves, you can get 30 free sessions by going to
    0:01:07 thewayapp.com/tim. And for the time being, please enjoy this Meditation Monday with Henry Shukman.
    0:01:16 Welcome back to Meditation Monday with me, Henry Shukman. Thank you for joining.
    0:01:23 So I’ve already said that meditation can be an extraordinary journey of discovery. I believe
    0:01:30 that its deepest purpose is to reveal aspects of who we are and what our true relationship
    0:01:36 to the world really is, which can kind of blow our minds when we taste them. And of course,
    0:01:41 meditation is not the only way of finding out that kind of stuff, but it’s an incredible way
    0:01:46 of grounding it into our lives and integrating it into the way we actually live. However,
    0:01:55 the vast majority of us, myself included, come to meditation not for those reasons, but to handle
    0:02:01 stress. We may not even realize we’ve got stress. We’re just kind of miserable. I was in that category
    0:02:07 in my early to mid-20s when I started meditating. I was just really unhappy and had a good friend who
    0:02:14 meditated who was much less unhappy than I was, and I decided to try out what he was doing,
    0:02:19 and it changed my life. Here’s what we’re going to do today. The tool we’re going to be picking up
    0:02:27 is around how to handle stress, how to recognize stress and what to do about it through meditation.
    0:02:36 And essentially, the main thing we’re going to be working on is recognizing the signatures of stress
    0:02:44 in our bodies, coming into the body, getting to know it in the body, and just that recognition
    0:02:52 starts to dial it right down. So let’s come into our comfortable seated position.
    0:02:57 You can close your eyes or you can lower your gaze.
    0:03:04 And as always, we’re going to start by really arriving,
    0:03:15 so kind of catching up with ourselves. You know, here we are with this little space,
    0:03:26 this little gap in our day when we get to be still and quiet, and really it can be a kind of refuge.
    0:03:34 So just take stock. You know, how are you doing right now?
    0:03:43 So we’ve all just jumped in from our busy lives outside this space.
    0:03:52 What are we carrying? What’s in our bodies right now? What kind of momentum from the day?
    0:04:03 We’re not seeking to get rid of anything or change anything. We’re just starting to recognize
    0:04:11 what’s going on for us, catching up with ourselves.
    0:04:22 Now let’s give the body a chance to relax, to come into some kind of rest.
    0:04:27 So letting your shoulders settle.
    0:04:34 Letting the seat receive the weight of your body.
    0:04:42 Letting the floor beneath you receive the weight of your legs.
    0:04:52 Letting your whole upper body be at rest, so either balanced or really
    0:04:56 giving the weight of your upper body to whatever support it has.
    0:05:02 You get to unwind right now.
    0:05:12 You get to sort of unbind yourself from the threads of your days, your busy days.
    0:05:23 So now we may not particularly want to think about stress right now, but it’s really helpful to
    0:05:30 in this space, this kind of shelter, almost, of meditation.
    0:05:38 We can get to know what happens in our bodies when we feel anxious or stressed.
    0:05:52 You might just think about some minor stress or like there’s an email you haven’t answered yet or
    0:05:59 a phone call or maybe the tire pressure on your car needs to be checked.
    0:06:10 Just some minor thing and just feel what happens in your chest or your diaphragm just below the chest.
    0:06:13 Or perhaps in your throat.
    0:06:25 Is there some sensation that you can notice in your torso
    0:06:33 that seems to correlate with stress?
    0:06:41 Just do a kind of gentle scan of your torso.
    0:06:50 Is there some kind of energy, some kind of sensation, could be a tension,
    0:07:01 tightness, might be a sense of heat or some kind of agitation.
    0:07:11 Sometimes there’s a certain density or it might be more like weather.
    0:07:18 There’s a little high pressure system in the torso.
    0:07:29 Whatever you’re finding or even if you’re not finding much of anything, let it be the way it is.
    0:07:44 While keeping one part of your attention open to whatever’s going on in your torso,
    0:07:49 at the same time let your shoulders be soft.
    0:07:55 Let the flanks of your body be soft.
    0:08:04 Let it almost be like the whole of your torso is made of warm wax.
    0:08:15 There’s a warmth surrounding your chest area, your diaphragm area.
    0:08:21 Feel that warmth like warm wax.
    0:08:33 And feel how it can hold any uncomfortable feeling within.
    0:08:42 If there is a tightness or a heat or a density, any kind of unease.
    0:08:52 Notice that there’s a softness, a warmth in your body as well.
    0:09:01 That can welcome and hold any discomfort.
    0:09:17 We’re learning to let things be, not to get rid of them, to let things be.
    0:09:28 To be with our sometimes anxious troubled hearts.
    0:09:36 We don’t need to banish them or suppress them or change them.
    0:09:49 We ourselves can learn to allow them to bring a kindness towards our discomforts.
    0:10:05 So just resting a moment with a soft body and a sense of allowing
    0:10:18 of kindness and patience toward any stress we might be feeling.
    0:10:33 That we really can kindly welcome it and sort of tend it.
    0:10:45 As if we might almost be rediscovering a kind of kind attentiveness
    0:10:48 that we’ve always had.
    0:10:59 And can turn it toward ourselves, our own experience.
    0:11:23 Okay, so let’s in your own time come out of the meditation, come back to the space you’re in.
    0:11:32 Sometimes people like to stretch a little bit after coming out of a sit, whatever feels good for you.
    0:11:45 So another tool as we’re proceeding with these sessions, this time maybe
    0:11:51 a counterintuitive one. When we’re feeling stress, we don’t try to get rid of it.
    0:11:58 Instead we try to recognize it as sensation in the body and do maybe the last thing we want
    0:12:07 to do which is welcome it, allow it, let it be part of what our experience currently actually is.
    0:12:13 And in doing that we discover that we ourselves have a greater capacity than we might have
    0:12:21 remembered to give ourselves loving, kind awareness. Thanks so much for joining me.
    0:12:25 See you on the next Meditation Monday.

    This episode is part of a new experiment called Meditation Monday. The teacher, Henry Shukman, has been on my podcast twice before. He is one of only a few dozen masters in the world authorized to teach Sanbo Zen, and now, he’ll be your teacher.

    In addition to my long-form interviews each week, every Monday I’ll bring you a short 10-minute or so meditation, which will help you for the rest of the week.

    Over this four-episode series, you’ll develop a Zen toolkit to help you find greater calm, peace, and effectiveness in your daily life.

    Henry’s app, The Way, has changed my life since I first started using it. Unlike other meditation apps, where you’re overwhelmed with a thousand choices, The Way is a clear step-by-step training program guided entirely by Henry. Through a logical progression, you’ll develop real skills that stick with you.

    I’ve been using it daily, often twice a day, and it’s lowered my anxiety more than I thought possible.

    As a listener of my podcast, you can get 30 free sessions by visiting https://thewayapp.com/tim and downloading the app.

    See Privacy Policy at https://art19.com/privacy and California Privacy Notice at https://art19.com/privacy#do-not-sell-my-info.

  • Attention pays (with Chris Hayes)

    AI transcript
    0:00:05 Support for the gray area comes from Blue Nile. When someone proposes on the
    0:00:09 jumbotron thing at a ballgame, you see the joy, the tears, and the magic.
    0:00:14 Hopefully. What you don’t see is the awful process of finding the ring.
    0:00:19 Blue Nile offers a better way to find your dream ring. Blue Nile offers jewelry
    0:00:23 at prices that won’t make you reconsider the entire institution of marriage.
    0:00:28 Blue Nile says that every piece comes with 100% satisfaction guarantee, so
    0:00:32 if something isn’t right, you’re in the clear. Right now, you can get $50 off your
    0:00:38 purchase of $500 or more with code GRAYAREA at bluenile.com. That’s $50 off with
    0:00:40 code GRAYAREA at bluenile.com.
    0:00:42 bluenile.com.
    0:00:53 Thumbtack presents the ins and outs of caring for your home. Out, indecision, overthinking,
    0:01:00 second guessing, every choice you make. In, plans and guides that make it easy to
    0:01:10 get home projects done. Out, beige, on beige, on beige. In, knowing what to do, when
    0:01:16 to do it, and who to hire. Start caring for your home with confidence. Download
    0:01:19 Thumbtack today.
    0:01:27 A friend of mine once told me that you are where your attention is. That line
    0:01:31 always stuck with me. It was a reminder that the most important choice we all
    0:01:37 make is also the most common one. We make it a thousand times a day, every day of
    0:01:43 our lives. It’s the decision about what to pay attention to and what not to pay
    0:01:52 attention to. One of the primary features of this age, the age of the internet
    0:01:58 and smartphones and algorithmic feeds, is that our attention is everywhere and
    0:02:02 nowhere at the same time because we’re endlessly pushed around by a parade of
    0:02:07 distractions. Your phone’s ringing. Your Apple Watch is blinking. You got a ping
    0:02:11 on Slack from a co-worker. You’re getting an email notification as you’re
    0:02:18 sitting down for dinner. It’s always something. Which is probably why, if
    0:02:22 you’re like me, it’s hard to remember the last time you watched an entire movie
    0:02:26 or show without checking your phone. Hell, I barely made it through recording
    0:02:33 this intro without checking my phone. This level of distraction is not an
    0:02:38 accident. Our devices have engineered these impulses and a whole industry has
    0:02:43 emerged that’s devoted to capturing our attention in all these ways and then
    0:02:49 selling it to the highest bidder. And their tools and tactics are getting better
    0:02:58 every day. I’m Sean Illing and this is The Gray Area.
    0:03:15 Today’s guest is Chris Hayes. You of course know Chris as the host of All In
    0:03:23 with Chris Hayes on MSNBC. But he’s also a writer and has a new book out called
    0:03:30 The Sirens’ Call: How Attention Became the World’s Most Endangered Resource. The
    0:03:36 discourse on attention and digital technology is crowded. So when we have
    0:03:40 someone on the show to talk about it, it’s because we think they have something
    0:03:49 new to offer. And Hayes certainly does. For him, the reordering of our social and
    0:03:54 economic conditions around the pursuit of attention is, in his words, a
    0:04:01 transformation as profound as the dawn of industrial capitalism. It’s a bold
    0:04:07 claim and I’m not sure Chris is right about that, but he might be. And in any
    0:04:12 case, it’s a smart and ambitious book and I’m excited to have Chris back on the
    0:04:19 show to talk about it. Chris Hayes, welcome back to the show. It’s great to
    0:04:24 be back. I think you’re officially now a friend of the show. Oh, definitely. Yeah,
    0:04:29 absolutely. And is it true? A lot of people are saying that the ideas for this
    0:04:35 book were actually planted in our last conversation. Actually, yes. I mean, in
    0:04:39 large part, I remember that conversation and also your work, your book, which I
    0:04:45 read and really enjoyed. You opened the book with this famous image of Odysseus
    0:04:51 strapped to the mast of a ship by his own command in order to resist, shall we
    0:04:58 say, certain temptations. Why start your story with that story? It’s one of the
    0:05:01 most potent images in the entire Western canon and I’ve always been kind of
    0:05:07 obsessed with it because it’s a metaphor for so many things. I mean, one, I
    0:05:10 think we think of in terms of addiction. If you’ve ever been around to someone
    0:05:13 trying to quit smoking and they like buy a pack and then they throw the, they
    0:05:17 smoke one cigarette and they throw the pack out, right? As a sort of commitment
    0:05:21 mechanism to bind myself to the mast to resist the temptation. So it’s such a
    0:05:27 potent metaphor for so many things. But fundamentally, what, what started with, I
    0:05:32 was just thinking about the word siren and how weird it was that we, there are
    0:05:40 two meanings of that word. One, the, you know, from the Homeric epic, these
    0:05:46 creatures, some people say they look like birds. Usually in movies, they’re
    0:05:51 just like super hot women that seduce and warble you to death in the words of,
    0:05:57 of Homer. And then the thing that’s on top of an ambulance or a cop car. Very
    0:05:59 different things, but they’re both doing the same thing, which is they’re
    0:06:04 compelling your attention. And that experience of having your attention
    0:06:11 compelled and trying to manage that compulsion through, in the case of Odysseus,
    0:06:17 extremely elaborate means is to me, the experience of contemporary life at all
    0:06:21 times in some ways. And so that was sort of there almost from the beginning of
    0:06:28 working on the book. How do you define a word like attention? What are some of the
    0:06:34 more useful or practical ways to think about what it actually means in human
    0:06:39 life? So there’s a lot of debate about this. There are some people who say it’s
    0:06:45 not really even a coherent concept. And some of the critiques I take seriously in
    0:06:50 some ways, I’m using it in an everyday sense because I think it’s useful to use
    0:06:55 in the everyday sense because I think it is naming something real. So one way to
    0:07:00 think about it is the flash beam of thought. That’s a common trope, right?
    0:07:04 There’s a William James description of attention that everyone who writes about
    0:07:07 attention quotes because it’s so good, which is withdrawal from certain things
    0:07:11 to focus on others. If you think about what a stagehand of the spotlight does in
    0:07:17 a Broadway play, like I’m focusing on you right now. If I take a second, there’s a
    0:07:24 million forms of perceptual stimulus in my visual field right now. I could focus
    0:07:29 on those. I’m not. I’m focusing on you through an effort of conscious will. So
    0:07:32 that’s how we think about attention, the ability to focus basically, willfully
    0:07:37 focus. But then there’s other dimensions of that. So there’s conscious
    0:07:41 attention, voluntary attention, then there’s involuntary attention. Like right
    0:07:46 now, if someone, I have a door to my studio right now, if someone busted in
    0:07:52 here and opened that door, I couldn’t not look. It would literally be impossible.
    0:07:57 Before I had any conscious will over it, before I made any decision, no matter how
    0:08:04 disciplined I am, pre-consciously, a system would fire that would wrench my
    0:08:09 attention towards that door going open. So that’s involuntary
    0:08:13 attention, right? And then the third aspect I talk about is social attention,
    0:08:18 which I think has its own kind of particular weight and depth, which is
    0:08:22 it’s not just that we could pay attention to things in the world. We can pay
    0:08:27 attention to people, and crucially, people can pay attention to us. We can be on
    0:08:31 the receiving end of attention, which is another thing that makes it so
    0:08:36 psychologically and socially and emotionally rich. Well, you call it the
    0:08:39 substance of life, attention. I mean, is that just kind of another way of saying
    0:08:43 it’s really everything or the most important thing, certainly one of the
    0:08:48 most important things that we have? I think it’s the most important thing, and I
    0:08:52 think to go back to William James, one of James’s philosophical preoccupations is
    0:08:59 free will, whether we have it, what it means to have it. And to him, attention is
    0:09:05 indistinguishable from will, because that ability to focus is the essence of
    0:09:14 will. And for me, if you’re not a religious person, so you don’t think that
    0:09:20 the kind of meaning of your existence is imbued by some higher power or some sort
    0:09:28 of spiritual essence, in a secular sense, what we get is one life. And what we do
    0:09:34 during that one life is we go around through the world and this one body and
    0:09:39 brain we have, peering out at it, and from moment to moment paying attention to
    0:09:46 this or that. And what we pay attention to in the end adds up to a life. And I
    0:09:50 don’t think there’s any way to, it’s elemental in that sense. And I don’t think
    0:09:54 there’s any way to detach what your experience of life is from this faculty.
    0:10:00 We do sort of become what we pay attention to. And given how important it
    0:10:04 is, it is kind of nuts. I’ve said this a bunch on the show, and I’ll probably say
    0:10:09 it a bunch more, but it’s wild how thoughtlessly we give it away every day.
    0:10:13 And I have no doubt, almost no doubt, that when we’re all at the end of our
    0:10:17 lives, our biggest regret, certainly one of our biggest regrets, will be that we
    0:10:22 gave our attention away to the wrong things. Yes. And I think there’s a
    0:10:27 few reasons for that. One is this aspect of compelled attention, right? So we have
    0:10:33 these biological inheritances that are very deep that have produced a faculty
    0:10:38 that’s there to like warn us of danger, right? Or to do all kinds of
    0:10:43 things that may be evolutionarily necessary. So that faculty’s always there.
    0:10:46 So we’re always being sort of drawn towards certain things, whether we kind
    0:10:50 of consciously will it or not, you know, the lurid, the prurient, like this whole
    0:10:53 category of things, we have a whole set of words to describe things that draw our
    0:10:58 attention, even though we don’t necessarily want to go there. So we’re
    0:11:02 always fighting that. And then there’s the fact that we have a hard time sitting
    0:11:06 with our own thoughts. So there’s, there’s these sort of two sides of this
    0:11:09 coin: stuff is always trying to take our attention, but we’re always trying to put
    0:11:14 it somewhere. And this is an experience of modernity. I think it was
    0:11:18 really interesting, in researching the book, talking to and reading anthropologists
    0:11:25 who work with hunter gatherers, basically, people that live outside of fully
    0:11:30 outside of what we call modernity, even outside of like, modernity circa, you
    0:11:35 know, 1000 or the Roman Empire, right? They’re hunter gatherers. Don’t have words
    0:11:40 for boredom. Don’t really talk about being bored, literally, like, in an
    0:11:44 aboriginal indigenous tribe, like the word boredom has to be imported
    0:11:48 from English to the Warlpiri, because they don’t have a word for it. So at some
    0:11:54 level, this isn’t an elemental human inheritance, but it is constitutive of
    0:11:57 modernity in some ways, being bored. So you’ve got these two things, there’s stuff
    0:12:00 always trying to take our attention. And we’re then also always trying to get
    0:12:04 it away, because if something isn’t taking it, talk about, you know, sitting at
    0:12:11 the breakfast table as a 10 year old, just desperate to read something and
    0:12:16 reading the back of the cereal box, like, please, don’t like, you must give me
    0:12:21 something to, for my mind to chew on, or it’s gonna chew on itself. Yeah, there’s
    0:12:25 that famous Pascal quote, right, that like, all of man’s problems stem from his
    0:12:29 inability to sit quietly in a room. Exactly. It’s true. And it’s amazing to
    0:12:34 encounter that quote now, right. I mean, he wrote that in 1650, I think,
    0:12:38 somewhere around thereabouts. It’s funny because of how much we think of this as
    0:12:45 a contemporary conundrum, right? Yeah. That it’s born of, of the smartphone. And
    0:12:49 one of the things that I think was so enjoyable about working on this book and
    0:12:54 thinking about it is that the conditions of contemporary life, which I think are
    0:13:00 distinct in many ways, end up being, drawing an arrow to like the core of the
    0:13:05 human conundrum. So you end up kind of wrestling with these deep things that
    0:13:11 manifest in different ways under different social or technological conditions,
    0:13:16 but fundamentally come back to like, living with our own conscious mind in
    0:13:22 the world. Well, you make it really interesting. And as far as I know, novel
    0:13:29 argument about the transition we’re experiencing now, comparing it to the
    0:13:35 emergence of wage labor in the industrial revolution. And you make the case that
    0:13:41 the modern attention economy does to attention something very similar to what
    0:13:49 industrial capitalism did to labor. So lay that out for me. Yeah. So labor is the
    0:13:55 product of a specific set of legal, market, social institutions that produce
    0:14:01 this thing called a wage and a laborer, even though humans are doing stuff as soon
    0:14:05 as they get to the planet, right? Effort, toil, whatever you want to call it exists
    0:14:11 prior to that. Labor is turning it into a commodity. And there’s a bunch of weird
    0:14:17 things about that. And Marx is, I’m not a Marxist personally, but I think his
    0:14:24 observations here are quite prophetic. There’s something weird about it, you
    0:14:28 know, like, first of all, just the lived experience of the difference between a
    0:14:32 guy who runs a shoe shop who’s a cobbler, which exists prior to industrial
    0:14:36 capitalism, where like, you’re making the, you’re making the whole shoe, you know,
    0:14:39 first you’re putting, you’re cutting the sole, then you’re putting the upper on,
    0:14:41 then you’re putting it together. In the end, you got this thing, it’s a shoe. And
    0:14:45 now you own it, and then I sell it to you, Sean. You pay me money, now you own it.
    0:14:50 Okay. You go from that to, I work in a shoe factory 12 hours a day where I just
    0:14:56 stamp soles all day. I’m completely alienated, like, it is external to me the
    0:15:02 shoes I make. I don’t actually own them in a market sense. And also, like, it’s a
    0:15:08 much different experience of life. This thing has been taken from me in some
    0:15:12 deep sense. Like, I don’t want to stamp soles all day. That’s, that’s like, that
    0:15:17 kind of sucks. And maybe making shoes kind of sucks too, but it sucks in a more
    0:15:23 interesting way. That’s more mine. So you have this extraction of this thing that’s
    0:15:26 so essential to you. And not only extraction of things that are essential to you,
    0:15:32 the other thing that’s weird about it is, in the grand scheme, labor in the
    0:15:38 aggregate is necessary for all of industrial capitalism. So it’s incredibly
    0:15:44 valuable in the aggregate. But each individual slice of it is essentially
    0:15:49 valueless. You’re like, this is all I got. I got this one body, and I go and stamp
    0:15:53 soles 12 hours a day, and I get nothing for it. But that’s it. That’s, from my
    0:15:57 perspective, that’s all I got, right? So all of these attributes are there for
    0:16:05 attention, right? Attention pre-exists before its marketization, right? It now
    0:16:10 has a value out in the world. It’s now being extracted at scale. In the aggregate,
    0:16:17 it’s wildly valuable. Google, Meta, right? All their money comes from this. I
    0:16:21 argue in the book, Amazon, to a certain extent, is really an attention company. So
    0:16:26 the aggregate is wildly valuable. Individually, like, they’re paying tiny
    0:16:30 slivers of cents for your attention in any moment. The amount of advertising you
    0:16:34 get shoved in a day, the amount of content you get shoved in a day through
    0:16:38 these algorithms, I don’t know, maybe it’s like cost someone somewhere in the
    0:16:43 aggregate 20 bucks. But to you, it’s like, that’s all you got. That’s all you have is
    0:16:47 what you’re paying attention to in any moment. So that same sense of extraction,
    0:16:53 right? A thing in us, it gets named and commodified. A set of institutions take
    0:16:58 it from us, assign it a market value. Karl Polanyi, who’s a sort of socialist
    0:17:04 economic thinker, calls these fictitious commodities, right? Like, there are certain
    0:17:08 commodities that exist in the market. And then there’s certain commodities like
    0:17:14 labor, attention, Polanyi argues land. They’re not, like, made for market
    0:17:18 production. They’re just out in the world. And yet they get turned into a
    0:17:24 commodity. And that, and the, it requires a reorientation of the world, of all
    0:17:30 social relations in some ways, to make them function as commodities. So attention
    0:17:35 is the most important resource in the world now. And a key argument in the
    0:17:41 book is that this is very different from previous eras built around resources
    0:17:46 like land or capital or coal or whatever. What is the most significant
    0:17:48 difference here for you?
    0:17:52 The argument I make in the book is that what is commonly referred to
    0:17:56 as the attention age, and you could decide when you want to start that, the
    0:18:03 1980s, the 1990s, the 70s, is truly the information age, that you have a switch
    0:18:08 from physical market production to non-material market production, information
    0:18:14 economy, claims adjusters, coders, podcasters, like you and I, right? All doing
    0:18:19 these things that don’t amount to the physical refashioning of the world. And
    0:18:23 in that world, we think of it as like information being the defining feature of
    0:18:28 it. But information is limitless. Information, there’s just tons of
    0:18:33 information. The thing that’s scarce and valuable is attention. So everyone’s
    0:18:38 got to fight over that. And the more information there is, the lower the
    0:18:41 barriers are to get in front of someone’s face, the more competitive it
    0:18:46 becomes. And I think that we’re in a position now as more and more of the
    0:18:50 world moves from sort of industrial modes of production to post-industrial
    0:18:54 modes of production, that it’s just necessarily the case that under those
    0:18:59 conditions, the one thing that’s left that’s scarce, that’s finite, that’s the
    0:19:01 most valuable is our attention.
    0:19:08 And I love the point you make in the book that unlike coal or land, which is
    0:19:12 outside of us, right, that this resource, attention, is in our minds. It’s in
    0:19:17 our heads. And so that involves cracking into our minds, as you put it.
    0:19:23 Yeah. Now it’s like traffic or air travel. Like, it’s a thing that we all
    0:19:27 just experience as a bummer. Yeah. That you just talk about, like, doesn’t it
    0:19:32 suck that, you know, we can’t pay attention. The phones are always going
    0:19:32 off.
    0:19:40 I am constantly making noises about what tech is doing to us on the show and to
    0:19:45 basically everybody in my life to their great annoyance. But I don’t have a
    0:19:54 compelling response necessarily to the arguments that no one’s forced to stare
    0:19:59 at their phones all day. We’re choosing this. We want this. And that’s not
    0:20:04 exactly wrong, but I also think our creaturely vulnerabilities are so
    0:20:09 exploitable. And even though we’re not being forced in the literal sense, I’m
    0:20:13 also not sure we’re really free in any meaningful or recognizable way.
    0:20:16 Well, I mean, I think that’s, I think that’s the deepest question, right? I
    0:23:20 mean, I don’t think I can resolve the free will question. Come on. You’re
    0:23:24 Christopher Hayes, you’ve got your own podcast. Come on. But, but I, but I think you’re
    0:20:29 right. I mean, I do think it implicates, it implicates our freedom in a profound
    0:20:33 and deep way. I mean, when you get that notification on your phone, and again, I
    0:23:36 want to be very upfront here. I was joking with my wife that, like, I feel like
    0:20:39 I’ve written a recovery memoir and I’m still drinking, like people are going to
    0:20:43 go to me and like, well, here’s how you do it. It’s like, I’m still fighting all
    0:20:46 this stuff. I’m, you know, I’m not, I’m not great about it. So I don’t want
    0:20:51 anyone to think that I’m on some elevated plane here. Like I’m in the muck with
    0:20:55 everyone. Okay. When you get that notification on the screen time
    0:21:01 notification, that like, this was your average screen time for the week. That
    0:21:07 is a profound moment of like, who am I and what is my will? And we fail the test
    0:21:13 every day. I’m like, what are you talking about with that number? That number is
    0:21:18 shocking. The saddest part of my week. Who am I? The saddest part of my week,
    0:21:22 every week on Sunday morning between nine and 10, I get the notification from my
    0:21:27 phone about the average amount of screen time this week. And it’s horrifying.
    0:21:32 It’s a horrifying number. But it’s a horrifying number also in that deep way
    0:21:38 of like, what does it say about you? It’s wild. Again, I fuck it. I guess I’ll just
    0:21:43 go full philosophy seminar here. But if we no longer have meaningful conscious
    0:21:47 control over our attention, at some point, we do reach a level of passivity
    0:21:52 that makes us more of an object than a person. Yes. And that has profound
    0:21:56 implications for, for instance, democratic theory. Yes. I mean, and when, and
    0:22:00 when you, and these are, it’s interesting because there was a round of these
    0:22:05 conversations, particularly in the 20s and 30s, the sort of collision of mass
    0:22:10 media, mass propaganda, mass advertising, and, and, and, and industrial
    0:22:14 democracy all coming together. And these debates that happened during that period
    0:22:19 of time, where everyone’s sort of trying to deal with this exact same question
    0:22:26 that we’re now dealing with, which is, can people be subjects in a meaningful
    0:22:31 sense under these conditions of like, mass media? Like if everyone is just
    0:22:38 listening to the same propaganda all day on their radios, in what sense do we
    0:22:42 have individual subjects with free wills making decisions about self
    0:22:47 governance? You know, and this is Lippmann. This is Lippmann’s big experience,
    0:22:52 right? He’s the chief propagandist to get us into World War One. And he, and
    0:22:55 again, I think it was much easier to manipulate public opinion then, to be
    0:23:00 honest. But he does it, and he’s like, oh my God, that was way too easy. What
    0:23:05 does it mean about democracy if you can just propagandize a whole population?
    0:23:10 And we have a different set of questions now that are, in some
    0:23:14 fascinating way, sort of the converse, right? That was all about massness. It
    0:23:20 was like, everyone’s listening to the same thing. So it, it’s subsuming the
    0:23:25 individual. And we’re watching fascism as this sort of, this sort of the mob
    0:23:28 basically come to life. And the mob is all getting the same propaganda. The mob
    0:23:34 is acting as one. We’re now seeing this like weird hyper individuation, which
    0:23:39 like, no one seems, it sees exactly the same content all day. And what is that
    0:23:45 radical individuation and sort of self selection do to the, you know, the
    0:23:46 Democratic project?
    0:23:51 I love that you went here because this is where I wanted to go. Well, this is
    0:23:56 what what your books about, I mean, in a lot of ways it is. Yeah. And to the
    0:24:00 point you’re making here and in the book, if we also lack the capacity to
    0:24:05 pay attention together, what the hell does that mean for democracy? I mean,
    0:24:09 democracy on some level is a shared culture. So if mass culture isn’t
    0:24:15 possible anymore, is democracy? I mean, there’s a few things I say. One is, I
    0:24:18 do want to be, I want to always in this book, and I try very hard to sort of
    0:24:24 resist the temptation, dehistoricize everything. Like, you know, as I say in
    0:24:28 the book, like, they didn’t need Facebook and Salem to like, start having
    0:24:33 viral rumors that so and so was a witch. Like people, people are very good at
    0:24:38 spreading disinformation, just analog style, which is like the core of the
    0:24:43 human condition. And like, you know, that’s, that’s our lot. And, you know,
    0:24:46 democracy is incredibly fallible with a bunch of fallible people. So I just
    0:24:53 want to say that. But yes, I think there is a profound question about what
    0:24:56 this is doing to our democracy. And particularly because, as I write in the
    0:25:02 book, and this is really key and it’s something that I live every day,
    0:25:11 attention is not a moral faculty. It doesn’t, it is distinct from what we
    0:25:18 think is important. You know, Lippmann in Public Opinion writes about this, he
    0:25:22 writes about a lot of things. You know, he says, you know, he's talking about the,
    0:25:25 the, he’s talking about Versailles actually, right? So talking about the end
    0:25:28 of the war and the reparations, he says, Americans have an incredible interest
    0:25:33 in this, but they’re not interested in it. Like, it, we have a, he’s like the same
    0:25:38 way the child has an enormous interest in his father’s business that he will
    0:25:44 inherit, but is not interested in it. So this problem is old, but I think it’s
    0:25:52 so sheer right now that overcoming the compelled, the sirens call the, the, the
    0:25:57 sort of lowest common denominator, tabloid casino effect of everything in a
    0:26:00 very competitive attention environment where we’re driven towards the lowest
    0:26:06 common denominator, we’re driven towards what compels it, malforms the public
    0:26:15 collective ability to reason collectively, to think of issues independent of what
    0:26:18 just sustains our attention from moment to moment, because what sustains our
    0:26:22 attention from moment to moment is distinct from what is important. And we
    0:26:26 all know that we, everyone understands that. And yet it’s very hard to
    0:26:30 counteract sort of what’s being done to us through the technologies.
    0:26:34 And of course, look, the problem isn’t just that we’re losing control over what
    0:26:39 we pay attention to. We’re also losing the capacity to pay attention for more
    0:26:43 than 10 seconds. You know, I mean, you talk about the, the Lincoln Douglas
    0:26:47 debates in the book, we talk about it in ours as well, you know, and it really
    0:26:51 is striking how much more sophisticated the language was back then.
    0:26:52 It’s wild.
    0:26:56 It’s wild. And people had the capacity to pay attention to it for so long. And
    0:27:03 there’s just no question that more people think and speak in soundbites now,
    0:27:05 because that’s how we consume information. I mean, maybe it started with
    0:27:10 the telegraph and radio and TV, but it’s ratcheted up to a whole other level
    0:27:14 with digital tech. We are a meme culture now. And if you live in a meme
    0:27:18 culture, you’re going to have a meme politics in a citizenry that can only
    0:27:21 communicate at the level of memes. I don’t know what you do with that.
    0:27:24 Yes, no, you’re right. I mean, and yes, and your discussion, I think your
    0:27:29 discussion on Lincoln Douglas actually was what sent me originally back to
    0:27:33 read them. I also have no doubt that if those people attending the Lincoln
    0:27:38 Douglas debates could go home and stream CSI Toledo or whatever, they would.
    0:27:43 Dude, all I mean, go back and like people, that is, again, this is one
    0:27:46 of these challenges with this whole discourse is like, what’s distinct?
    0:27:52 What’s old? Like, go read, all Marx did is just fight with people online,
    0:27:56 essentially, for what his day was. Like, that’s all he spent his whole life.
    0:28:01 Like, he was a compulsive poster. He’s constantly having 15 different
    0:28:06 factional fights. People always forget the Communist Manifesto is so funny.
    0:28:09 It’s basically, it’s like 15 pages of like, you know, all this stuff.
    0:28:12 People know workers of the world unite. And then there’s an addendum that’s
    0:28:17 like why every other factional tendency in the broad anti-capitalist movement
    0:28:21 is wrong, like goes through each one like this one’s wrong for this reasons.
    0:28:24 And then there’s like, there’s like this like weird formation of kind of
    0:28:28 monarchist right wing Catholics were also anti-bourgeois anti-capitalist.
    0:28:31 They’re wrong for this reason. And literally, just like, it’s just like
    0:28:36 a set of fights he’s picking with every different person.
    0:28:41 So some of this, again, this is the thing that I say all the time.
    0:28:47 Democracy is a technology for managing the conflict endemic to human affairs.
    0:28:51 It’s the best technology we have come up with for managing conflict
    0:28:56 endemic to human affairs, but conflict is endemic to human affairs.
    0:29:01 So that, that doesn’t go away. You know, people are going to disagree
    0:29:05 and fight with each other. And the question of how we manage that
    0:29:09 is the question of how we collectively govern.
    0:29:13 And I do think that like all of us having our brains stripped to the studs
    0:29:16 is not helpful in that enterprise.
    0:29:19 What a hot take there, Chris.
    0:29:33 Support for the gray area comes from Blue Nile.
    0:29:36 Popping the question is a memory you’ll hold on to forever,
    0:29:40 especially if the answer is yes. Actually buying the ring, though,
    0:29:43 that’s an experience most of us would prefer to forget.
    0:29:46 Navigating the world of less than stellar salespeople
    0:29:50 and eye-popping prices isn’t a lot of fun.
    0:29:53 Luckily, Blue Nile offers a new way to buy that perfect ring.
    0:29:56 At Blue Nile, they say you can find your dream engagement ring at a price
    0:29:59 you’ll never find at a traditional jeweler.
    0:30:02 And according to the company, they’re committed to ensuring that the highest
    0:30:06 ethical standards are observed when sourcing diamonds and jewelry.
    0:30:10 Plus, because Blue Nile has a 100% satisfaction guarantee
    0:30:14 with free shipping and returns, you can make sure the ring you pick
    0:30:18 is the one. And you know it will last because they offer free service
    0:30:24 and repair for life. Right now, get $50 off your purchase of $500 or more
    0:30:27 with code GRAYAREA at bluenile.com.
    0:30:35 That’s $50 off with code GRAYAREA at bluenile.com, bluenile.com.
    0:30:40 Support for the gray area comes from Shopify.
    0:30:44 So it’s a new year, 2025, and you might be thinking
    0:30:46 how am I going to make this year different?
    0:30:48 How am I going to build something for myself?
    0:30:52 Maybe you’re dying to be your own boss and see if you can turn that business
    0:30:54 idea you’ve been kicking around into a reality.
    0:30:57 But don’t quite know how to make it happen.
    0:30:59 Well, Shopify wants to help.
    0:31:03 Shopify makes it simple to create your brand, get it open for business,
    0:31:04 and get your first sale.
    0:31:08 You can get your store up and running easily with thousands of customizable
    0:31:11 templates, no coding or design skills required.
    0:31:13 All you need to do is drag and drop.
    0:31:17 Plus, their powerful social media tools let you connect all your channels
    0:31:21 and create shoppable posts and help you sell wherever people are scrolling.
    0:31:25 Established in 2025 has a nice ring to it, doesn’t it?
    0:31:28 You can sign up for your $1 per month trial period
    0:31:31 at shopify.com/vox, all lowercase.
    0:31:35 You can go to shopify.com/vox to start selling with Shopify today.
    0:31:37 Shopify.com/vox.
    0:31:43 Support for the gray area comes from Upway.
    0:31:45 If you’re tired of feeling stuck in traffic every day,
    0:31:48 there might be a better way to adventure on an e-bike.
    0:31:52 Imagine cruising past traffic, tackling hills with ease,
    0:31:57 and exploring new trails, all without breaking a sweat or your wallet.
    0:32:01 At upway.co you can find e-bikes from top tier brands like Specialized,
    0:32:03 Cannondale, and Aventon.
    0:32:05 At up to 60% off retail.
    0:32:07 Perfect for your next weekend adventure.
    0:32:11 Whether you’re looking for a rugged mountain bike or a sleek city cruiser,
    0:32:13 there’s a ride for everyone.
    0:32:18 And right now you can use code GRAYAREA150 to get $150 off
    0:32:21 your first e-bike purchase of $1,000 or more.
    0:32:42 You know, we’re talking about TV and, of course, we all know what you do.
    0:32:45 You’re the host of a cable news show.
    0:32:49 And you grapple with some of these questions in a really interesting way in the book.
    0:32:56 You have a point of view as a journalist, as a TV host.
    0:33:01 You want to inform and presumably persuade your fellow citizens,
    0:33:03 but you also work in TV.
    0:33:05 You work in the attention industry.
    0:33:09 And the logic of that industry and the logic of that medium
    0:33:11 is constantly imposing itself on you.
    0:33:13 So how do you navigate this?
    0:33:18 How do you play the attention game without compromising yourself?
    0:33:19 It’s really hard.
    0:33:23 It’s what I spend most of my life thinking about, most of my working life.
    0:33:27 I mean, it was the rudest awakening when I moved to Primetime,
    0:33:29 partly because the first TV show I had, which was on weekend mornings,
    0:33:33 I just didn’t think about attentional imperatives at all.
    0:33:37 And I was just like, wouldn’t it be cool to do a two-hour sort of like seminar
    0:33:39 about 80 topics at a roundtable?
    0:33:41 And then it did well.
    0:33:42 It rated pretty well.
    0:33:44 And it was like, oh, well, then.
    0:33:46 And then I tried to do that at 8 p.m.
    0:33:51 after people had just gotten home from like a day teaching third grade
    0:33:55 or a shift in the hospice.
    0:33:59 And it didn’t really work.
    0:34:04 Partly because I think people just started to have different attentional capacity
    0:34:08 at 8 p.m. on a weeknight than they do at 9 a.m. on Saturday morning.
    0:34:10 Like, you’re pretty clear.
    0:34:11 You can sit and think a little.
    0:34:19 So I had to deal with those attentional imperatives and I always have to.
    0:34:23 I mean, the thing about attention, I say, is that it’s mere.
    0:34:25 It’s always necessary and never sufficient.
    0:34:29 That’s what’s so fascinating about it.
    0:34:31 You always need it to do anything else.
    0:34:35 Like, in a relationship, it’s necessary, but it’s not sufficient.
    0:34:37 Like, what you want in a relationship is love.
    0:34:41 But you need attention to get love.
    0:34:44 Like, you need your spouse to pay you attention and listen to you
    0:34:46 and they need you to do the same to them.
    0:34:49 But if all you’re doing is paying attention
    0:34:51 and sometimes people get into toxic relationships
    0:34:53 where they’re paying negative attention to each other
    0:34:56 and they’re fighting with each other in this desperate attempt to get that,
    0:34:58 it’s not enough.
    0:35:02 So that’s the same about the conundrum I have, right?
    0:35:05 It’s necessary, but not sufficient.
    0:35:10 I need to keep people’s attention as a means to the end
    0:35:14 of doing something that I think improves civic life
    0:35:16 to be as highfalutin as possible.
    0:35:21 When I first started in journalism, I was more of a,
    0:35:24 I guess you would call it a take writer.
    0:35:26 And I did some cable hits.
    0:35:30 And it didn’t go well, in part because
    0:35:34 I just didn’t understand how performative it was,
    0:35:35 especially when you’re in the guest room.
    0:35:39 You know, I wanted to be deliberate and make arguments,
    0:35:43 but that’s hard to do when you’ve got a few minutes, maybe.
    0:35:44 It’s entertainment, right?
    0:35:47 And so you have to capture and hold attention.
    0:35:51 And that incentivizes a certain style of communication.
    0:35:53 So I kind of just stopped doing TV.
    0:35:55 If I did it again, it would go better
    0:35:57 because I understand that world now
    0:35:59 and I can perform if I need to.
    0:36:04 But I didn’t think it brought out the best version of me.
    0:36:06 Yeah, I don’t know if it brings out the best version of me
    0:36:07 either, to be totally honest.
    0:36:10 I mean, one thing that you mentioned there that I think is
    0:36:15 part of this discussion is just time and the speed.
    0:36:15 That’s right.
    0:36:19 People don’t realize how the pace at which they talk
    0:36:20 and how compressed it is on television.
    0:36:23 And actually, this is a thing I kind of love
    0:36:24 about the kind of podcast resurgence.
    0:36:28 And to my point about not everything is terrible,
    0:36:29 like Lex Fridman is a great example.
    0:36:33 He’s a podcaster who has a very, very popular podcast.
    0:36:36 I listen to him sometimes, some of them I love,
    0:36:38 some of them I’m not that crazy about.
    0:36:42 But he’s very deliberate and he’s very slow
    0:36:43 and it would never work on television.
    0:36:47 And I love the fact that it does work in the medium
    0:36:48 he’s working in.
    0:36:50 But one thing about TV, for people that haven’t done it, is,
    0:36:53 if you’ve ever had the experience of going to a batting cage,
    0:36:58 and putting it up to like 70, 80, 90, like professional,
    0:37:01 and you’re standing there and the ball is just past you
    0:37:04 before your muscles even twitch.
    0:37:08 You’re just like, whoa, that ball got on me very fast.
    0:37:12 That’s how TV feels when you, if you’re not used to it.
    0:37:15 It’s just, it’s like trying to hit major league pitching.
    0:37:19 All of a sudden, everything is moving way faster
    0:37:22 than it does in normal conversation, in normal thing.
    0:37:26 In anything you do normally, it’s happening way, way, way, way faster.
    0:37:31 I will say, and it’s not just because you’re a friend of the show.
    0:37:33 I think you do it as well as it can be done.
    0:37:35 Well, thanks, I appreciate that.
    0:37:38 All right, let’s back up a second.
    0:37:43 Because I do want to ask, and it’s something you ask in the book.
    0:37:47 You point out, every time we have these periods of change,
    0:37:51 we do have to pause and ask, what’s really new here?
    0:37:52 What’s not?
    0:37:55 What’s really harmful and what isn’t?
    0:37:57 As you say, people freaked out about comic books, right?
    0:38:00 And that was clearly ridiculous in retrospect.
    0:38:04 But people also freaked out about cigarettes or worried about cigarettes,
    0:38:06 which was clearly wise in retrospect.
    0:38:12 So how do we know the attention age is cigarettes and not comic books?
    0:38:14 It’s a great question.
    0:38:18 I think there’s a few ways to answer this question.
    0:38:25 So one, I think, is on the sort of Jonathan Haidt who wrote The Anxious Generation question of,
    0:38:31 what does the empirical research say about what this is doing to us, right?
    0:38:34 In the case of tobacco, we just acquired a huge body of evidence.
    0:38:35 This is terrible for our health.
    0:38:39 Even though, as I cite in the book, there were people going back to the 17th century,
    0:38:43 16th century, who were like, boy, this sure seems like an awful thing to do.
    0:38:46 You light this stuff on fire and you put the smoke in your lungs.
    0:38:48 I don’t think that’s going to work out well.
    0:38:53 So I think, in some ways, the empirical question, while important, like,
    0:38:55 is it making us more depressed?
    0:39:01 A very difficult causal question to resolve, as all causal questions are.
    0:39:08 Is also distinct from the deeper philosophical thing, which is just like,
    0:39:09 are we, is this good?
    0:39:10 Do we like this?
    0:39:14 Like, is this forming my soul well?
    0:39:16 And I don’t need data to tell me that.
    0:39:19 That’s a human question.
    0:39:25 That’s, in some ways, why the book is really, to a certain extent, a work of philosophy.
    0:39:30 You could tell me, you could come back and be like, actually, none of the empirical data,
    0:39:33 like, it doesn’t cause more anxiety.
    0:39:34 It doesn’t cause depression.
    0:39:35 You know, fine.
    0:39:37 That might be true.
    0:39:43 But the bigger question is like, our experience of modernity is an experience
    0:39:47 of an ever-quickening pace and new forms of alienation
    0:39:50 that we then have to wrestle with as people.
    0:39:55 And whatever the data says in the end, we all got to live in this world and in this environment,
    0:40:00 which I think a lot of us, understandably, are not enjoying.
    0:40:12 Hey, whatcha doing?
    0:40:17 Programming our thermostat to 17 degrees when we’re out at work or asleep.
    0:40:22 We’re taking control of our energy use this winter with some easy energy saving tips I got from FortisBC.
    0:40:24 Ooh, conserve energy and save money?
    0:40:27 Maybe to buy those matching winter jackets?
    0:40:28 Uh, no.
    0:40:31 We’re also getting that whole matching outfit thing under control.
    0:40:37 Discover low and no-cost energy saving tips at fortisbc.com/energysavingtips.
    0:40:39 Matching tracksuits?
    0:40:39 Please, no.
    0:40:43 TD Direct Investing offers live support.
    0:40:47 So whether you’re a newbie or a seasoned pro, you can make your investing steps count.
    0:40:53 And if you’re like me and think a TFSA stands for Total Fun Savings Adventure,
    0:40:55 maybe reach out to TD Direct Investing.
    0:40:59 It’s Today Explained.
    0:41:02 I’m Noelle King with Miles Bryan.
    0:41:04 Senior reporter and producer for the program.
    0:41:04 Hello.
    0:41:06 Hi, you went to public school, right, Miles?
    0:41:08 Yes, go South High Tigers.
    0:41:10 What do you remember about school lunch?
    0:41:15 Oh, I remember sad lasagna, shrink-wrapped in little containers.
    0:41:16 I remember avoiding it.
    0:41:18 Do you remember the nugs, the chicken nuggets?
    0:41:23 Yeah, if I had to eat school lunch, that was a pretty good option.
    0:41:24 I actually liked them.
    0:41:28 But in addition to being very tasty, those nugs were very processed.
    0:41:32 And at the moment, America has got processed foods in its crosshairs.
    0:41:36 It’s true, we are collectively very down on processed food right now,
    0:41:40 none more so than Health and Human Services Secretary nominee,
    0:41:42 Robert F. Kennedy, Jr.
    0:41:45 I’ll get processed food out of school lunch immediately.
    0:41:50 About half the school lunch program goes to processed food.
    0:41:55 Can the man who once saved a dead bear cub for a snack fix school lunches?
    0:42:00 Today Explained, every weekday, wherever you get your podcasts.
    0:42:17 Well, the final chapter of the book is titled Reclaiming Our Minds.
    0:42:20 So does that mean you have a blueprint for how to unfuck
    0:42:23 ourselves in the world?
    0:42:26 I need a 10-point, money-back guarantee plan.
    0:42:27 I know, I’m bad at this, you know.
    0:42:29 It’s the worst part of writing a book.
    0:42:33 I know, this is my favorite last chapter I’ve written because I actually do think,
    0:42:36 I do think there’s some concrete stuff here.
    0:42:42 So the individual stuff, I think, you know, people are doing all the things they’re doing,
    0:42:46 mindfulness, placing their phones in boxes, you know, schools, for instance.
    0:42:50 I think like schools, it’s crazy to me that schools have only started taking kids’ phones
    0:42:52 at the beginning of class, totally crazy.
    0:42:53 Insane.
    0:42:54 Insane, like what do we do?
    0:43:00 Like also, have you ever watched, have you gone to a conference recently or any kind
    0:43:02 of adult meeting where people can have their phones?
    0:43:05 Like no one’s paying attention.
    0:43:08 You just take them for all of that stuff.
    0:43:15 So individually, like, you know, taking long walks without listening to a podcast and
    0:43:19 letting yourself be alone with your thoughts, like cultivating that, forcing yourself to do that,
    0:43:21 even if it’s 20 minutes a day.
    0:43:26 I’m gonna do 20 minutes where I take a walk by myself and I think, and I just sit with
    0:43:26 my own mind.
    0:43:27 I really think that’s useful.
    0:43:31 Not just like an individual thing.
    0:43:33 And there’s a million different individual things.
    0:43:41 Hobbies, habits, things we do that are neither work nor the phone, being with other people.
    0:43:44 Then there’s like social stuff.
    0:43:47 And here’s where I do think the food stuff is really important and interesting.
    0:43:53 A bunch of people in the 60s started, for specific ideological reasons, rebelling against
    0:43:58 a whole bunch of aspects of industrial food production.
    0:44:04 People that started opening up whole food stores, not like the brand name, but like whole grain
    0:44:11 stores, health food stores, natural food stores, people starting green markets, farmer’s markets
    0:44:15 in the early 1970s, Alice Waters and sort of farm to table stuff.
    0:44:23 All this was like a rebellion against basically like the slop people were eating,
    0:44:29 the Chef Boyardee, Jell-O-Mold, Peak TV Dinner, 1970s cuisine.
    0:44:30 People were just like, “I don’t like this.”
    0:44:37 Like, there’s an empirical question about is that stuff good for you and how much is it
    0:44:38 causing obesity?
    0:44:41 But then there’s also a question like, “I don’t like this.”
    0:44:51 And that at the time seemed fringe and bespoke and avant-garde.
    0:45:01 It was on to something and has become an entire alternate universe of food production.
    0:45:05 Now some of it co-opted, you know, by big agra.
    0:45:08 We haven’t like defeated corn syrup, for instance, in America.
    0:45:11 But it is so different.
    0:45:15 The food landscape, the way we think about food and talk about food
    0:45:18 between now and like the 1970s.
    0:45:21 And that is the product of activism.
    0:45:24 It’s the product of like free spirits.
    0:45:26 It’s the product of entrepreneurs.
    0:45:29 I think you’re going to see something coalesce around attention now.
    0:45:35 And again, this is like all this stuff feels like kind of precious and bespoke.
    0:45:41 But like jogging and fitness were precious and bespoke at some point.
    0:45:43 Like jogging was like a weird avant-garde thing.
    0:45:48 That like is a sort of silly thing.
    0:45:51 George W. Bush lost his first congressional campaign when he moved back to Texas
    0:45:53 because there was an ad by his opponent of him jogging.
    0:45:55 He was like, “Get a load of this, dude.”
    0:46:01 So I think there are going to be social movements.
    0:46:06 And there’s some interesting folks around, you know,
    0:46:09 the Strother School of Radical Attention, which is here in New York.
    0:46:11 You may have seen the name D. Graham Burnett.
    0:46:15 He did a podcast with Ezra Klein and he’s going to have a book about this.
    0:46:20 And a whole bunch of people around, there’s like this sort of secret society
    0:46:24 they have that was profiled in New Yorker, thinking about this,
    0:46:27 rebelling against it in a very similar kind of back to the land way, right?
    0:46:34 Like born of a kind of spiritual ideological set of principled commitments to like rebelling against this.
    0:46:40 Well, as you point out, you know, in the 19th century, the labor movement
    0:46:45 basically arrived at two big regulatory responses, right?
    0:46:50 A ban on child labor and limitations on total hours worked.
    0:46:56 What are, what could be the equivalent regulations today?
    0:46:58 I mean, I think that’s an interesting place to start.
    0:47:02 So I think first of all, regulating attention and regulating the extraction of attention
    0:47:05 is just an area that we need to explore.
    0:47:11 I mean, there’s a lot of controversy about cutting teenagers off from social media.
    0:47:13 A lot of people on the left think it’s bad precisely around
    0:47:17 kids having access to LGBTQ information.
    0:47:19 And I totally hear that.
    0:47:25 Also, they think there’s sort of toxic ways in which like the particulars of a bill can empower,
    0:47:27 you know, right wing attorneys general to do bad stuff.
    0:47:28 And I totally hear that too.
    0:47:31 I think as a general principle, the idea that
    0:47:37 companies should not be buying and selling the attention of 14 year olds is just obviously true.
    0:47:42 And a huge part of that too, this goes hand in hand.
    0:47:45 So when I talk about the sort of social movement before I even get to regulation,
    0:47:50 non-commercial spaces for connection.
    0:47:54 Just the way that like we have non-commercial public space, I can meet you in Prospect Park.
    0:47:55 We can walk on the street.
    0:47:57 We don’t just exist in a mall.
    0:48:02 So one big part of it too, before we even get to the regulatory part of it,
    0:48:08 and this is why I’m saying this, is we need to build non-commercial space.
    0:48:15 Like all of digital life has been completely taken over by commercial spaces that are trying
    0:48:16 to buy and sell your attention.
    0:48:20 And then the regulatory question, I think is a deep one.
    0:48:23 Like, first of all, there’s constitutional issues because of speech.
    0:48:28 But I think if you think about it in terms of regulating the attention,
    0:48:32 like an app just can’t take more than an hour of your attention today.
    0:48:35 I don’t know.
    0:48:37 Maybe we pass the law and do that.
    0:48:42 Like that seems crazy at some level, but is it?
    0:48:47 And so I think we need to be thinking about regulating attention.
    0:48:49 I think that, and part of that is breaking up the big tech firms,
    0:48:50 which are too big and things like that.
    0:48:58 But more specifically, like this does feel like a place for governments to do something.
    0:49:03 Your book is rightly grounded in political economy,
    0:49:05 because that’s the driver of a lot of this.
    0:49:08 And it’s just very hard to imagine meaningful solutions
    0:49:12 that don’t involve a serious rethinking at that level.
    0:49:13 Yeah.
    0:49:17 I mean, I think there’s a deeper question about the form of the general form of capitalism
    0:49:21 and kind of gilded age oligopoly that we found ourselves in right now.
    0:49:24 And all these things are converging at the same point.
    0:49:29 I mean, Elon Musk is both kind of an allegory, but also very real.
    0:49:34 It’s sort of wild to me how much he just over the course writing the book
    0:49:37 became the full embodiment of everything the book says,
    0:49:42 both in his own personal compulsions, which he clearly can’t control.
    0:49:44 I mean, he’s very obviously addicted to posting.
    0:49:48 To his kind of, through his own personal brokenness,
    0:49:52 I think finding his way to understanding that attention is the most valuable resource,
    0:49:56 to iterating on Donald Trump’s key insights,
    0:49:59 to capture it and become the main character all the time,
    0:50:01 and then the power that that’s given him.
    0:50:06 It’s pretty dystopian, but it is playing out right in front of us.
    0:50:08 Do you have any final thoughts you want to add?
    0:50:11 I mean, if someone listens to this conversation,
    0:50:14 if they go and read your book after listening to this conversation,
    0:50:16 what do you hope they take away from it?
    0:50:18 Maybe more to the point, what do you hope they do?
    0:50:25 I do think there are parent groups that are working.
    0:50:26 There are a whole bunch of groups happening.
    0:50:30 You can go to the Strother School of Radical Attention, you can Google that.
    0:50:32 There are more and more grassroots groups.
    0:50:38 A lot of them have been associated with Jonathan Haidt’s book and kids, particularly teenagers.
    0:50:44 But one of the things I think is important is that there’s a little bit of an
    0:50:45 instinct to be like, this is a teenager problem.
    0:50:53 I sometimes think actually teenagers are better about this than boomers, for instance.
    0:50:59 But I think you should find other people
    0:51:02 and see if there are ways to plug into local people that feel the same way.
    0:51:08 And then I think also doing things like joining a book club.
    0:51:13 Collective ways that you manage attention together.
    0:51:18 Again, as I said, start subscribing to a physical newspaper,
    0:51:21 going for a walk 20 minutes just for your thoughts.
    0:51:27 These are small ways to begin to connect with other people, particularly
    0:51:31 around all of us kind of reconceptualizing this collectively.
    0:51:34 That’s a good place to end it.
    0:51:37 Once again, the book is called The Sirens’ Call:
    0:51:41 How Attention Became the World’s Most Endangered Resource.
    0:51:46 I legitimately love the book and I appreciate having a chance to read it.
    0:51:48 And I’m glad you wrote it.
    0:51:50 Chris Hayes, thanks buddy.
    0:51:50 Sean, that was great.
    0:51:51 Thank you so much.
    0:52:06 All right, I hope you enjoyed this episode.
    0:52:08 You know I did.
    0:52:12 And in case you’re wondering, my screen time was actually down this week.
    0:52:13 Was it down a lot?
    0:52:16 No, but it was down.
    0:52:18 And that’s a start.
    0:52:23 As always, we want to know what you think of the episode.
    0:52:26 So drop us a line at TheGrayArea@Vox.com.
    0:52:30 And please rate, review, subscribe to the podcast.
    0:52:31 That stuff really helps.
    0:52:38 This episode was produced by Beth Morrissey, edited by Jorge Just, engineered by Patrick Boyd,
    0:52:43 fact-checked by Kim Eggleston, and Alex Overington wrote our theme music.
    0:52:47 New episodes of The Gray Area drop on Mondays.
    0:52:49 Listen and subscribe.
    0:52:51 This show is part of Vox.
    0:52:55 Support Vox’s journalism by joining our membership program today.
    0:52:58 Go to vox.com/members to sign up.
    0:53:13 And if you decide to sign up because of this show, let us know.

    Where is your attention right now? Where was it a minute ago? A second ago? Where will it be a minute from now?

    One of the primary features of this age — the age of the internet and smartphones and algorithmic feeds — is that our attention is everywhere and nowhere at the same time.

    This is no accident. Our devices and apps are engineered to constantly alert us to things that are important and to things that are not. That’s because holding our attention is valuable. The time we spend reading, watching, and listening to content on our digital devices has been commodified, and that commodity is fueling the economy of the digital age.

    Today’s guest is Chris Hayes, the host of All In with Chris Hayes on MSNBC and author of The Sirens’ Call: How Attention Became the World’s Most Endangered Resource. Chris speaks with Sean about how the attention industry is changing our economy, our society, and ourselves.

    Host: Sean Illing (@SeanIlling).

    Guest: Chris Hayes, host of All In with Chris Hayes on MSNBC and author of The Sirens’ Call: How Attention Became the World’s Most Endangered Resource.

    Learn more about your ad choices. Visit podcastchoices.com/adchoices

  • 653: 3 Ways to Escape the Rat Race

    AI transcript
    0:00:03 Here are three ways to get out of the rat race.
    0:00:05 Now getting out of the rat race is simple,
    0:00:07 but not necessarily easy.
    0:00:10 All you need is monthly income from a source
    0:00:13 other than your job that exceeds your monthly expenses, right?
    0:00:15 Simple, but not always easy.
    0:00:16 I’m Nick Loper.
    0:00:17 You’re listening to The Side Hustle Show
    0:00:19 where we’ve been making day jobs optional since 2013.
    0:00:21 In this episode,
    0:00:22 I’m breaking down those three most common
    0:00:24 rat race escape routes,
    0:00:26 including the one at the end that got me out,
    0:00:27 the pros and cons of each
    0:00:30 and how to choose the right path for you.
    0:00:33 So remember that freedom equation is non-job income
    0:00:36 that is exceeding your monthly expenses.
    0:00:38 The three most common ways to generate that income
    0:00:40 are traditional investments,
    0:00:41 real estate and entrepreneurship.
    0:00:45 These are in contrast to and probably more realistic
    0:00:48 than the other paths that some people bank on
    0:00:50 like an unexpected inheritance,
    0:00:53 a lonely Nigerian prince or winning the lottery.
    0:00:56 But what’s maybe even more surprising is most people
    0:00:59 don’t really seem to have much of a plan at all.
    0:01:01 They’re just going through life day to day
    0:01:05 with the assumption and hope that someday they’ll retire,
    0:01:07 but it doesn’t really work that way.
    0:01:09 And if you’re not steering your own ship,
    0:01:12 I’m not sure you’re gonna ever get to where you wanna go.
    0:01:14 The first step in escaping the rat race
    0:01:17 is to figure out your actual monthly expenses.
    0:01:19 What is your lifestyle cost?
    0:01:22 This isn’t gonna be a lecture on extreme frugality,
    0:01:24 but at the very least spending with intention
    0:01:27 has gotta be a part of your rat race escape math.
    0:01:30 I mean, why set the bar unnecessarily high?
    0:01:33 And if you’ve never calculated how much you actually spend
    0:01:35 on a monthly basis, it’s worth taking a minute
    0:01:37 to figure that out.
    0:01:40 This was an exercise that my wife and I did last year.
    0:01:42 We kind of had this offsite retreat.
    0:01:45 We went through Monarch money and we were like,
    0:01:48 well, in our mind, we had a typical monthly budget,
    0:01:49 kind of like bare bones.
    0:01:51 Well, we know how much the mortgage is,
    0:01:53 we know how much we roughly spend on utilities
    0:01:55 and groceries, but why is the credit card bill
    0:01:56 always so much higher?
    0:01:58 And it’s like, well, we bought plane tickets
    0:02:01 or we had this repair charge.
    0:02:03 It was always higher than what we’d budgeted.
    0:02:06 So we looked at, well, how much do we actually spend
    0:02:07 on a monthly basis?
    0:02:09 And the number is gonna be different for everybody.
    0:02:12 It might be $3,000, it might be $10,000.
    0:02:15 But how much does your lifestyle cost?
    0:02:17 That’s the income that you need to generate.
    0:02:20 That’s your rat race freedom number.
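In code form, the expense audit Nick describes is just a sum. The line items and dollar amounts below are made-up placeholders, not numbers from the episode; swap in your own from a budgeting tool or your bank statements:

```python
# Hypothetical monthly expenses -- replace with your own real numbers.
monthly_expenses = {
    "mortgage": 2000,
    "utilities": 300,
    "groceries": 800,
    "insurance": 400,
    "travel_and_repairs": 700,  # the stuff that inflates the credit card bill
}

# Your rat race freedom number: the monthly non-job income you need
# before your day job becomes optional.
freedom_number = sum(monthly_expenses.values())
print(freedom_number)  # 4200
```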
    0:02:21 So how do real people achieve it?
    0:02:24 The first way is to save your way out
    0:02:26 with traditional investments.
    0:02:29 This is probably the most commonly prescribed path
    0:02:32 to retirement, whether early or not.
    0:02:36 And this is stocks, bonds, mutual funds, ETFs,
    0:02:38 stuff like that, paper assets.
    0:02:42 And this is how retirement has worked for generations, right?
    0:02:45 Amass a big enough nest egg during your working years
    0:02:49 and then slowly draw down those savings
    0:02:50 after you stop working.
    0:02:52 And the problem is, if you’re listening to this,
    0:02:53 you probably don’t wanna wait decades
    0:02:55 until you’ve saved enough.
    0:02:56 Now, the FIRE movement,
    0:02:59 the financial independence, retire early movement
    0:03:00 has an alternative for you
    0:03:03 and argues that retirement isn’t an age.
    0:03:05 You don’t have to wait till you’re 65.
    0:03:06 It’s a number.
    0:03:10 It is 25 times your annual expenses in savings.
    0:03:12 This is from the Trinity study.
    0:03:13 I think it was like late ’90s.
    0:03:16 They did back testing on a bunch of 30-year scenarios
    0:03:19 in the market and found that if your starting nest egg
    0:03:22 is 25 times your annual expenses,
    0:03:23 you’re unlikely to run out of money,
    0:03:24 at least 95% of the time.
    0:03:26 And this is not set in stone.
    0:03:28 Like if the market has a series of bad years,
    0:03:30 you could adjust your expenses downwards
    0:03:32 to hopefully make it last a little bit longer.
    0:03:34 So what does that look like in real numbers?
    0:03:37 If you’re spending $40,000 a year,
    0:03:40 you could theoretically leave the rat race behind
    0:03:43 once you got a million dollars in traditional investments.
    0:03:46 You live off dividends and share price appreciation
    0:03:47 for decades under that scenario.
    0:03:49 If you spend $100,000 a year,
    0:03:50 you need two and a half million.
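As a quick sanity check, the 25x rule of thumb is a one-liner. This reproduces the two examples from the episode; remember it's a back-tested rule of thumb, not a guarantee:

```python
def fire_number(annual_expenses, multiple=25):
    """Nest egg target from the 25x rule of thumb
    (equivalent to a 4% initial withdrawal rate)."""
    return multiple * annual_expenses

print(fire_number(40_000))   # 1000000 -- the $40k/year lifestyle
print(fire_number(100_000))  # 2500000 -- the $100k/year lifestyle
```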
    0:03:52 Now, there are a few advantages
    0:03:54 to this kind of traditional investment,
    0:03:56 this traditional path to retirement.
    0:03:58 One is that these so-called paper assets
    0:04:00 are accessible to just about everyone.
    0:04:02 You can even invest right from your phone
    0:04:04 with any number of different brokerage apps,
    0:04:07 stocks, bonds, mutual funds, ETFs.
    0:04:08 They’re highly liquid,
    0:04:11 meaning you can buy and sell them quickly if you need to.
    0:04:15 And over the long run, they’ve performed historically well.
    0:04:19 Like projecting seven to 9% annualized returns
    0:04:21 would be realistic there.
    0:04:24 Their biggest drawback is trying to get out of the rat race
    0:04:27 with traditional investing either takes a lot of time
    0:04:31 to let compounding do its thing or a lot of money.
    0:04:33 Now, despite enthusiasm from the FIRE community,
    0:04:35 which I would consider myself a part of,
    0:04:39 I like that they put a milestone, an end game,
    0:04:41 a goalpost, something to reach,
    0:04:42 something that’s hopefully attainable
    0:04:46 and doesn’t have to be related to your age more or less.
    0:04:49 But the truth is, unless you have a really wide margin,
    0:04:52 a lot of profitability in your personal finances,
    0:04:54 that’s the gap between what you earn and what you spend.
    0:04:58 There’s really no shortcut to building up that nest egg.
    0:05:00 Plus, if you have unexpected expenses
    0:05:02 that pop up during retirement,
    0:05:05 your assumptions around withdrawal rates
    0:05:06 can probably go out the window.
    0:05:08 For the traditional investing path,
    0:05:10 if you go way back in the archives,
    0:05:12 you’ll find episode 105.
    0:05:15 This is probably from like year two or three of the show.
    0:05:18 This is with Jeremy Jacobsen from Go Curry Cracker,
    0:05:20 who retired in his late 30s,
    0:05:24 thanks to this high level of personal profitability
    0:05:25 that we’re talking about.
    0:05:29 – We were roughly saving 70-ish percent of income
    0:05:30 for quite a while.
    0:05:33 And then I probably worked three years too long
    0:05:36 and was saving nearly 100% of income at that point.
    0:05:39 If you’re just living off the dividends and interest,
    0:05:41 if you’re saving that percentage of income,
    0:05:43 it really only takes about 10-ish years
    0:05:46 in order to build up enough net worth
    0:05:47 to fund your lifestyle forever.
    0:05:50 – What kept you working those extra three years?
    0:05:53 Was it just the, like, can we really do this?
    0:05:55 – There’s a little bit of that.
    0:05:56 You know, maybe I’d call it fear.
    0:05:58 Nobody does this.
    0:06:00 Can we, yeah.
    0:06:02 Like, I’ve read stuff, but thinking you can do it
    0:06:05 and actually doing it are two very different things.
    0:06:09 – That mindset shift from saving and investing
    0:06:12 and accumulation to all of a sudden drawing down,
    0:06:14 intentionally bringing your earned income to zero,
    0:06:16 that’s a complete 180.
    0:06:19 And even if the math and the models
    0:06:21 and the projections say you’re gonna be fine,
    0:06:24 I think it’s a lot harder to pull the trigger in reality
    0:06:26 and just toss your career aside.
    0:06:27 So one thing that we’re trying to do
    0:06:28 that we’ve seen some other friends do
    0:06:33 is as that nest egg grows, scale back some working hours.
    0:06:35 Maybe you don’t jump off the cliff.
    0:06:37 Maybe you kind of start repelling down, you know,
    0:06:39 one level at a time.
    0:06:41 Maybe that means negotiating working part time
    0:06:45 or only four days a week, or transitioning to a role
    0:06:48 that is less demanding and doesn’t require as much overtime.
    0:06:52 It’s reducing your income by baby steps
    0:06:54 rather than going cold turkey all at once.
    0:06:59 So who is this traditional investing route best for?
    0:07:02 I think this is the best way to escape the rat race
    0:07:07 for high earners who live a relatively inexpensive lifestyle.
    0:07:10 If you or you and your significant other
    0:07:13 bring in say $300,000 a year,
    0:07:16 but you only spend 50, this is a great option.
    0:07:18 Now, if you ignore taxes for a second,
    0:07:20 because that always throws a wrench in any
    0:07:23 on-the-air math, but you can see that if I’m profiting
    0:07:27 $250,000 a year
    0:07:29 as a household, it would only take five years
    0:07:32 to accumulate that one and a quarter million
    0:07:35 that I would need to support that $50,000 a year
    0:07:37 lifestyle in retirement.
    0:07:40 So that means if your work is tolerable,
    0:07:41 I think those five years are gonna fly by
    0:07:45 and that assumes you’re starting at $0 in savings today.
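That back-of-the-napkin math checks out with a short loop. As in the episode, this ignores taxes; the optional return parameter is there only to show that real compounding builds in a cushion:

```python
def years_to_target(target, annual_savings, annual_return=0.0):
    """Years of saving, starting from $0, until the balance hits `target`.
    annual_return=0.0 matches the simple on-air math."""
    balance, years = 0.0, 0
    while balance < target:
        balance = balance * (1 + annual_return) + annual_savings
        years += 1
    return years

# $300k household income minus $50k spending = $250k/year saved,
# chasing the 25 x $50k = $1.25M nest egg.
print(years_to_target(1_250_000, 250_000))  # 5
```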
    0:07:48 Now, on the other hand, if you make $50,000 a year
    0:07:50 and you spend 49 of it,
    0:07:52 traditional investing is never gonna get you
    0:07:53 out of the rat race.
    0:07:56 There’s simply not enough savings margin there,
    0:07:57 which brings us to option number two,
    0:08:00 which is to beat the rat race with real estate
    0:08:02 and that’s coming up right after this.
    0:08:05 Did you know there’s a disease running rampant
    0:08:07 alongside hustlers and new entrepreneurs?
    0:08:10 It’s called super hero syndrome.
    0:08:12 Symptoms include a feeling like you gotta do
    0:08:14 everything yourself, thinking you’re the only one
    0:08:16 who can do it right and struggling to let go
    0:08:17 of certain tasks.
    0:08:19 Does that sound familiar to anyone?
    0:08:21 But the good news is there is a cure.
    0:08:23 Our sponsor indeed can help you find the best candidates
    0:08:26 for the roles you need to fill and find them fast.
    0:08:29 Stop struggling to get your job posts seen
    0:08:30 on other job sites.
    0:08:34 Indeed, sponsored jobs help you stand out and hire fast.
    0:08:37 With sponsored jobs, your post jumps to the top of the page
    0:08:39 for your relevant candidates so you can reach
    0:08:40 the people you want faster than ever.
    0:08:43 Sponsored jobs posted directly on Indeed
    0:08:45 get 45% more applications.
    0:08:47 Don’t let super hero syndrome hold you back.
    0:08:50 That’s why for my next hire, I’m using Indeed.
    0:08:51 There’s no need to wait any longer.
    0:08:54 Speed up your hiring right now with Indeed.
    0:08:58 Side hustle show listeners get a $75 sponsored job credit
    0:08:59 to get your jobs more visibility
    0:09:03 at Indeed.com/SideHustleShow.
    0:09:06 Just go to Indeed.com/SideHustleShow right now
    0:09:08 and support our show by saying you heard about Indeed
    0:09:09 on this podcast.
    0:09:13 Indeed.com/SideHustleShow terms and conditions apply.
    0:09:16 Hiring? Indeed is all you need.
    0:09:19 If saving more and spending less is one of your top goals
    0:09:21 for 2025, you’re gonna wanna hear this.
    0:09:23 When I switched to premium wireless
    0:09:25 with our sponsor Mint Mobile,
    0:09:28 it added hundreds of dollars a year back to my bottom line.
    0:09:29 Why is that?
    0:09:31 Well, because Mint Mobile lets you maximize your savings
    0:09:33 with plans starting at 15 bucks a month
    0:09:35 when you purchase a three month plan.
    0:09:38 Even better, all their plans come with high speed data
    0:09:40 and unlimited talk and text delivered
    0:09:42 on the nation’s largest 5G network.
    0:09:45 You can use your own phone with any Mint Mobile plan
    0:09:47 and bring your existing phone number
    0:09:49 and all your existing contacts.
    0:09:50 They make it super easy.
    0:09:51 To get this new customer offer
    0:09:54 and your new three month unlimited wireless plan
    0:09:56 for just 15 bucks a month,
    0:09:59 go to MintMobile.com/SideHustle.
    0:10:02 That’s MintMobile.com/SideHustle.
    0:10:05 Cut your wireless bill to 15 bucks a month
    0:10:08 at MintMobile.com/SideHustle.
    0:10:10 $45 upfront payment required,
    0:10:12 equivalent to $15 per month.
    0:10:15 New customers on first three month plan only.
    0:10:18 Speeds slower above 40 gigabytes on unlimited plan.
    0:10:21 Additional taxes, fees and restrictions apply.
    0:10:23 See Mint Mobile for details.
    0:10:28 The second common rat race escape path is real estate.
    0:10:29 And for the sake of this episode,
    0:10:32 I’m gonna focus on rental property investing.
    0:10:34 Real estate comes in so many different flavors
    0:10:36 and strategies, many of which we’ve covered
    0:10:38 on the SideHustle show before,
    0:10:41 but we’re focusing on rental property investing in this one.
    0:10:44 So how real estate works to escape the rat race,
    0:10:46 it’s pretty easy to understand business model, right?
    0:10:48 You buy a house, you rent it out,
    0:10:50 and you pocket the difference between that rent
    0:10:51 and your monthly expenses.
    0:10:55 Your mortgage, your insurance, your maintenance costs, right?
    0:10:56 And lather, rinse and repeat
    0:10:59 until you got enough monthly cash flow to quit your job.
    0:11:00 This is what Dustin Heiner did.
    0:11:04 For him, it was around 26 different properties
    0:11:07 and around $15,000 a month in reliable cash flow.
    0:11:11 He retired at age 37 and supports his family
    0:11:14 off the income from that rental property empire.
    0:11:17 So his big thing is invest with that monthly cash flow
    0:11:21 in mind and then use it to start slowly chipping away
    0:11:23 at your own living expenses.
    0:11:26 Again, another argument for keeping those expenses low
    0:11:27 ’cause the lower they are,
    0:12:30 the fewer properties and the less cash flow
    0:11:31 that you’re gonna need.
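Dustin's rough figures sketch out like this. The $5,000/month lifestyle below is a made-up example, not a number from the episode, and real per-door cash flow varies widely by market:

```python
import math

def properties_needed(monthly_expenses, cash_flow_per_door):
    """How many cash-flowing rentals it takes to cover a lifestyle."""
    return math.ceil(monthly_expenses / cash_flow_per_door)

# Dustin's rough numbers: ~$15,000/month across ~26 properties
# is roughly $577/month per door after expenses.
per_door = 15_000 / 26
print(round(per_door))  # 577

# A hypothetical $5,000/month lifestyle at that per-door cash flow:
print(properties_needed(5_000, per_door))  # 9
```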
    0:11:33 Now rental property investing can accelerate your climb
    0:11:36 to financial independence in several important ways.
    0:11:39 First, you can take advantage of leverage.
    0:11:41 That’s borrowing money in contrast
    0:11:44 to traditional stock market investing
    0:11:45 that we talked about a minute ago
    0:11:49 where $20,000 buys you $20,000 worth of index funds.
    0:11:52 That same $20,000 could be used as a down payment
    0:11:55 to buy $100,000 or more worth of real estate.
    0:11:57 Then you can pay down that loan balance
    0:11:59 with the rental income that you receive
    0:12:01 over the next 30 years.
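A minimal sketch of why that leverage matters, using the episode's $20,000 figure and an assumed, purely illustrative 4% year of appreciation:

```python
cash = 20_000
property_price = 100_000  # $20k down controls the whole $100k asset
appreciation = 0.04       # hypothetical one-year appreciation rate

index_fund_gain = cash * appreciation            # gain on $20k of index funds
property_gain = property_price * appreciation    # gain accrues on the full price

print(index_fund_gain)  # 800.0
print(property_gain)    # 4000.0 -- 5x the gain on the same cash invested
```

The catch, of course, is that leverage cuts both ways: a price drop is magnified against your equity in exactly the same proportion.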
    0:12:03 The next big advantage of real estate
    0:12:04 is appreciation.
    0:12:06 As you know, houses tend to cost more today
    0:12:08 than they did a generation ago.
    0:12:10 By buying those properties,
    0:12:12 you can capture this appreciation when you sell
    0:12:15 or you can borrow against that equity
    0:12:18 in your houses to fund future acquisitions.
    0:12:19 This is another thing that Dustin talks about,
    0:12:22 kind of like recycling that initial down payment money
    0:12:25 as equity as future down payments.
    0:12:27 And third, being a landlord comes
    0:12:30 with a bunch of different tax advantages, tax benefits
    0:12:33 including the ability to write off your mortgage interest
    0:12:36 and even take depreciation on the buildings that you own.
    0:12:40 Finally, real estate can be a pretty passive income stream
    0:13:42 once you have your tenants
    0:13:44 and other relevant team members in place.
    0:12:46 Yes, there’s an up front time investment
    0:12:49 but no trading hours for dollars down the road.
    0:12:51 So what disadvantages should you be aware of?
    0:12:54 Well, home prices don’t fluctuate as wildly
    0:12:58 as the stock market but investing in physical assets,
    0:13:00 it does take more legwork and it also means
    0:13:02 that your cash isn’t as liquid.
    0:13:04 And by that I mean you can’t just push a button
    0:13:07 on your phone and sell a house when you need cash
    0:13:09 like you could do with an index fund.
    0:13:11 And although there are some creative ways
    0:13:14 to buy houses with no money down,
    0:13:16 like we talked about in our creative financing episode
    0:13:18 with Austin Miller, really fun episode,
    0:13:23 real estate is usually a takes-money-to-make-money option.
    0:13:25 And as a landlord, you’re also gonna face vacancies
    0:13:27 if the house sits empty,
    0:13:28 that erases any positive cash flow
    0:13:31 you were banking on that month, repairs and maintenance,
    0:13:34 roofs, windows, toilets, water heaters,
    0:13:35 nothing lasts forever, it all costs money.
    0:13:38 And if you’re the owner, it comes on you.
    0:13:41 There are unexpected expenses like our friends in California
    0:13:45 had to redo their foundation to the tune of like $90,000.
    0:13:47 There might be tenant issues that come up
    0:13:50 and why some humans think it’s acceptable
    0:13:51 to trash other people’s property,
    0:13:53 other people’s houses, is beyond me.
    0:13:56 But I’d be lying if I said it didn’t happen.
    0:13:59 On top of that, your local real estate market
    0:14:00 might not be a great place to invest.
    0:14:03 So you might be dealing with all of this remotely
    0:14:05 or through a third party management service.
    0:14:07 So who’s real estate investing best for?
    0:14:10 The people I see having the best success with real estate
    0:14:12 are those who take a long-term view
    0:14:15 and are committed to operating multiple properties.
    0:14:17 I don’t think one house is gonna get you there.
    0:14:19 Now, especially if you can buy multiple properties
    0:14:22 in one location, there are some economies of scale
    0:14:25 that might make life easier than having only one house
    0:14:28 or having houses in different cities across the country.
    0:14:29 The reason is that that way
    0:14:32 you can have one property management company
    0:14:34 or one general contractor
    0:14:36 as your go-to in that market,
    0:14:39 versus having properties spread out
    0:14:40 in different cities all over the place.
    0:14:41 Now, as your empire grows,
    0:14:44 you’re also better able to absorb a vacancy here or there
    0:14:47 or an unexpected expense for a property or two
    0:14:48 in any given month.
    0:14:50 But just like traditional investing,
    0:14:53 real estate can and does work to escape the rat race
    0:14:56 if you have the capital, the patience and the fortitude
    0:14:57 to stay the course.
    0:15:00 If the idea of accumulating a portfolio
    0:15:02 of cash flowing real estate appeals to you,
    0:15:05 check out episode 387 with Dustin Heiner.
    0:15:07 It’s a super inspiring episode.
    0:15:09 And one of my favorite clips is where Dustin talks
    0:15:11 about getting laid off from his government job,
    0:15:13 his supposedly safe government job
    0:15:16 and the identity shift that happened after that.
    0:15:18 – I bought maybe two or three properties
    0:15:19 and I was really enjoying it.
    0:15:21 But at the same time, I was working a great job.
    0:15:24 I was working for the county government.
    0:15:25 And then I’m working from Monday to Friday,
    0:15:28 just one week back after my fourth child was born
    0:15:32 on Friday at 3:30 in the afternoon,
    0:15:35 I get a call from my boss’s boss’s secretary,
    0:15:37 like the top dog, his secretary gave me a call and said,
    0:15:41 “Hey, Dustin, the boss needs to see you come to the office.”
    0:15:42 I said, “Okay.”
    0:15:44 And hung up the phone and I sat there for a second like,
    0:15:46 “What is, why are they calling me?”
    0:15:48 And then as I’m sitting there,
    0:15:51 I start to think what they could be calling about
    0:15:54 and oh my goodness, back before I left,
    0:15:56 I heard some rumors or some rumbling
    0:15:59 throughout the entire office about possible layoffs
    0:16:01 ’cause there wasn’t much money.
    0:16:03 And this was like 2009,
    0:16:04 you know, right after the crash,
    0:16:05 it eventually trickled down to the government.
    0:16:06 And me, working for the government,
    0:16:07 I’m like, I should be fine.
    0:16:08 I have plenty of seniority.
    0:16:09 I’m doing really well.
    0:16:11 I’ve always gotten raises.
    0:16:14 And so I get up and I start walking down the hallway
    0:16:15 to the boss’s office.
    0:16:17 It feels like it’s a mile long
    0:16:18 because I’m just thinking,
    0:16:21 “What am I gonna do if I get laid off?”
    0:16:24 And as I’m walking, my feet feel like lead bricks.
    0:16:27 Like I just, it’s hard to take that next step.
    0:16:29 And each time my heart started pumping a little more
    0:16:31 because I started realizing,
    0:16:33 “My goodness, I have four kids.
    0:16:34 How am I gonna feed them?
    0:16:36 How am I gonna put a roof over my head?”
    0:16:40 And I get to where my boss’s office is, his door is closed.
    0:16:42 I turn the corner and I see the secretary.
    0:16:45 Sheepishly, she looks at me and kind of grins
    0:16:47 and says, “Dustin, would you please have a seat?”
    0:16:51 She knows exactly what’s gonna happen, what is happening.
    0:16:51 I don’t.
    0:16:53 And she’s trying to console me
    0:16:54 just by her eyes and her smile.
    0:16:55 She can’t tell me.
    0:16:56 So I sit down.
    0:16:57 And as I’m sitting there,
    0:17:00 I’m feeling like a pit in my stomach thinking,
    0:17:01 “Oh my goodness, this is probably it.”
    0:17:03 And I started realizing or thinking,
    0:17:06 “Am I a failure as a husband?
    0:17:08 Am I a failure as a father?
    0:17:10 Even as a man, am I a failure?”
    0:17:11 And as I think more and more,
    0:17:13 it’s literally like 30 seconds or a minute.
    0:17:14 I’m just sitting there.
    0:17:16 I start to sweat on my forehead.
    0:17:18 My hands get all clammy
    0:17:20 and then opens the door to my boss’s office.
    0:17:23 And out walks a lady with a piece of paper.
    0:17:25 She’s noticeably distraught, almost crying,
    0:17:27 but she’s not really saying anything,
    0:17:28 holding this piece of paper and walking out
    0:17:29 and then my boss says, “Dustin,
    0:17:31 would you please come into my office?”
    0:17:34 And so I get up and go in and lo and behold,
    0:17:36 I get laid off.
    0:17:37 And who gets laid off from the government?
    0:17:38 Well, I did.
    0:17:40 I absolutely get laid off from the government.
    0:17:41 And so I take that piece of paper,
    0:17:44 I go back to my office and I realize two things.
    0:17:47 Well, number one, I realize that I need to provide
    0:17:48 for my family.
    0:17:51 And everything that I need to do from this point forward
    0:17:53 is to be able to provide for my family,
    0:17:54 my four kids, my wife.
    0:17:58 And so I was blessed within maybe like a week later,
    0:18:00 I was able to find another job in the county
    0:18:01 because I had a good reputation.
    0:18:03 So I got that. My job was to find a job,
    0:18:05 and I did that,
    0:18:07 which was the first goal.
    0:18:11 The second thing was I needed to never, ever
    0:18:15 let this happen to me again: outside forces
    0:18:18 keeping me from being able to provide for my family.
    0:18:20 So what I decided to do was that point,
    0:18:21 as I’m literally sitting in my desk
    0:18:22 right after I got laid off,
    0:18:26 the second thing I realized, I am now an investor.
    0:18:27 Even though I had two or three properties,
    0:18:29 it was just a side hustle.
    0:18:31 I realized I am now an investor.
    0:18:35 My job is now my side job,
    0:18:37 even though like 98% of my income comes from it.
    0:18:41 My value is in what I give myself.
    0:18:43 And so what we usually say and what I would always say
    0:18:45 if somebody says, “Hey, Dustin, what do you do?”
    0:18:47 Basically, what do you put value in?
    0:18:49 I would always say, I work for the county government,
    0:18:51 doing IT work of the county government.
    0:18:53 No longer did I ever say that after that.
    0:18:56 I said, “I am an investor in real estate rental properties.”
    0:19:00 So from there, I worked every single penny
    0:19:01 into another property.
    0:19:03 I was frugal.
    0:19:04 We only took one vacation a year,
    0:19:06 which was driving from California to Arizona
    0:19:07 to see the in-laws for Christmas.
    0:19:09 That was the only vacation we didn’t eat out.
    0:19:14 And so in making that transition, this was my goal.
    0:19:17 I said, “No longer am I ever gonna let this happen to me.”
    0:19:19 And so I strove every single day,
    0:19:21 every single week to get that next property
    0:19:23 and that next property and the next property.
    0:19:26 So, actually taking that leap, honestly,
    0:19:27 it was a little hard, even a lot hard,
    0:19:31 to leave that stable W2 job once I had it.
    0:19:34 But once I realized I am losing money here,
    0:19:36 my value is so much more than this.
    0:19:38 And I’ll be honest, now that I quit my job,
    0:19:42 it was so amazing to see how much more money
    0:19:43 I can make when I work for myself.
    0:19:46 So for everybody listening, that’s my process.
    0:19:49 I had to change my value in myself.
    0:19:51 No longer am I working for the government.
    0:19:54 No, I’m an investor with a side job.
    0:19:56 Same thing with you and your side hustle.
    0:19:57 Whatever your side hustle is,
    0:19:58 if you wanna turn that into your job
    0:20:00 and you wanna take that leap,
    0:20:03 literally change your vision and your value of yourself.
    0:20:05 And that’s what got me to where I am today.
    0:20:07 – Yeah, this is like the identity habit.
    0:20:09 This is a really powerful thing.
    0:20:12 That subtle shift from I’m a worker first
    0:20:13 to I’m an investor first.
    0:20:15 So I appreciate you sharing that.
    0:20:18 Again, that’s from episode 387.
    0:20:20 Now, to be fair, for every Dustin,
    0:20:22 for every evangelist for real estate,
    0:20:25 there’s at least an equal number of burnt out landlords
    0:20:28 who buy into the leverage and tax advantages
    0:20:29 and cash flow of real estate
    0:20:32 only to get chewed up and spit out along the way.
    0:20:35 No business is without risk and headaches, and real estate
    0:20:39 is one that often gets oversimplified and oversold.
    0:20:40 It can definitely work.
    0:20:42 It can be a great inflation hedge, great tax shelter,
    0:20:45 but it is real world inventory
    0:20:46 with humans involved.
    0:20:49 It’s a model that I got really excited about in college,
    0:20:51 even bought my first rental property,
    0:20:54 but perhaps didn’t have the intestinal fortitude
    0:20:56 to stick it out over the long run.
    0:20:58 And that’s one reason that I have shied away
    0:21:01 from direct investment in recent years,
    0:21:04 instead relying on alternatives like Fundrise
    0:21:06 where you can begin adding some real estate
    0:21:08 to your portfolio for as little as $10.
    0:21:13 I’ve been an affiliate of and an investor in Fundrise since 2015.
    0:21:15 Their model appealed to me as a way to benefit
    0:21:18 from real estate in a way that’s diversified,
    0:21:19 that’s totally hands-off.
    0:21:22 It does not come with the leverage benefits,
    0:21:23 at least directly, right?
    0:21:25 You know, $10 in is $10 in,
    0:21:28 but this is one that has appealed to me.
    0:21:29 Again, real estate, it’s a long-term game.
    0:21:31 So it might take eight to 10 years,
    0:21:32 like in Dustin’s case,
    0:21:34 to build up that portfolio to the point
    0:21:36 where it is exceeding your expenses
    0:21:37 and helping you escape the rat race.
    0:21:39 A friend of mine put it this way,
    0:21:42 like with traditional savings and investments
    0:21:43 done right and done well
    0:21:47 with a reasonably high personal profitability margin,
    0:21:50 it might take 15, 20 years to reach your FIRE number,
    0:21:52 which would still mean retiring
    0:21:55 or achieving financial independence way earlier than most,
    0:21:58 like in your early to mid ’40s, which is fantastic,
    0:21:59 but with real estate,
    0:22:02 it might take eight to 10 years of concerted effort
    0:22:04 buying a house every year or two,
    0:22:06 stacking leverage, stacking cash flow.
    0:22:10 And with our third and final rat race escape option,
    0:22:11 it might take three to five years.
    0:22:13 And that’s entrepreneurship.
    0:22:15 The third way to get out of the rat race
    0:22:17 is to build your own business.
    0:22:19 If you look at the Forbes 400 list
    0:22:21 of the richest people in the country,
    0:22:23 one thing should stand out to you.
    0:22:26 Most of them built their wealth through entrepreneurship.
    0:22:29 And even if you have no aspirations to build
    0:22:32 the next Amazon or Apple or Tesla or Facebook,
    0:22:35 like I don’t have those aspirations either,
    0:22:38 but building a business is a realistic way
    0:22:40 to break out of the nine to five grind.
    0:22:43 That’s how I was able to walk away from corporate America
    0:22:45 years before starting Side Hustle Nation.
    0:22:48 Entrepreneurship has helped probably thousands of friends,
    0:22:50 Side Hustle Show listeners,
    0:22:54 Side Hustle Nation readers do the same at this point.
    0:22:58 So how entrepreneurship works to escape the rat race
    0:23:00 is pretty simple.
    0:23:01 We tend to overcomplicate it,
    0:23:02 but I’m gonna try and break it down here.
    0:23:05 So a business is simply a system
    0:23:07 that solves a problem in exchange for money.
    0:23:10 It’s a problem solving machine.
    0:23:13 And the good news is we’re all natural born problem solvers.
    0:23:15 It’s what we do all day every day.
    0:23:18 That means to come up with a business idea,
    0:23:20 what you really need to come up with is a problem.
    0:23:22 So you can think of what frustrates you,
    0:23:25 what headaches or challenges that you’ve overcome,
    0:23:27 what other people complain to you about,
    0:23:29 because on the other side of those problems,
    0:23:31 there might be a business idea.
    0:23:34 Now the solution is usually gonna take one of three forms.
    0:23:37 First, a service that makes that problem go away.
    0:23:39 In the example of a dirty house,
    0:23:41 you can hire a cleaning service.
    0:23:44 Number two, a product that makes that problem go away
    0:23:45 if you’ve got a dirty house,
    0:23:48 you can go buy cleaning supplies and cleaning products.
    0:23:50 And number three is content
    0:23:51 that makes that problem go away.
    0:23:53 Got a dirty house? You can watch YouTube videos
    0:23:56 on how to organize and optimize your space, right?
    0:23:58 And when the money from your solution
    0:23:59 starts to exceed your living expenses,
    0:24:02 that’s when you say goodbye to the rat race.
    0:24:05 I break down each of these three business models in detail
    0:24:06 with lots of examples in my book,
    0:24:08 The Side Hustle: How to Turn Your Spare Time
    0:24:10 into $1,000 a Month or More.
    0:24:11 It’s free on Kindle.
    0:24:12 I’ll link it up in the show notes.
    0:24:14 It is due for a refresh or an update,
    0:24:16 which is on my to-do list for the year.
    0:24:18 And you’ll also find lots of side hustle ideas
    0:24:20 throughout the archives for this show.
    0:24:23 I did an episode earlier this month
    0:24:26 on seven different idea generating frameworks.
    0:24:27 You might find that helpful
    0:24:29 if you’re in the idea-seeking phase;
    0:24:32 that is episode 650 in the archives.
    0:24:34 So what’s so great about entrepreneurship?
    0:24:37 Building a business is unique among these three paths
    0:24:40 in that your primary investment
    0:24:41 is probably gonna be sweat equity.
    0:24:43 These days, you can get an enterprise off the ground
    0:24:45 for a very low startup cost.
    0:24:49 And thinking back to my own 15, 20 years here,
    0:24:52 just about everything I’ve started cost less than 500 bucks,
    0:24:56 at least for that initial validation and testing phase.
    0:24:59 On top of that, starting a business is a way to work
    0:25:00 on something that you care about.
    0:25:03 It’s bringing an idea into the world
    0:25:05 that’s exciting and rewarding
    0:25:10 in a way that collecting stock dividend payments just isn’t.
    0:25:13 And in contrast to the stock market or real estate market,
    0:25:15 you’ve got considerably more control
    0:25:18 over the success or failure of a business that you own
    0:25:20 and the speed at which that can happen.
    0:25:23 Plus, if you intentionally build something with scale,
    0:25:26 you’ll find entrepreneurship to be pretty time leveraged.
    0:25:28 By that, I mean your earning power
    0:25:31 or your effective hourly rate improves
    0:25:32 as the business grows.
    0:25:35 For example, Becky Beach put a lot of time,
    0:25:37 six years into her online business
    0:25:39 before getting up the nerve to quit her day job,
    0:25:42 but she built it intentionally with that leverage in mind.
    0:25:45 – I started getting 250,000 page views a month
    0:25:47 and lots of traffic to my printables
    0:25:49 and my sales funnels were doing so well,
    0:25:52 I was getting, like, up to $20,000 at the time.
    0:25:55 So I decided to quit my job and that was two years ago.
    0:25:57 Like it was hard at first because like I was living in fear,
    0:25:59 I didn’t think I could do it.
    0:26:01 You know, I thought my business would just end the next day
    0:26:03 or something if I quit.
    0:26:05 But I just went all in and told my boss,
    0:26:06 hey, I’m gonna be doing my own thing right now
    0:26:07 and I need to, like, quit.
    0:26:09 And at first he was like, oh, don’t quit.
    0:26:09 You know, it’s not a good idea.
    0:26:11 I don’t think that’s smart.
    0:26:13 So then I just went ahead and did it anyway.
    0:26:16 – And I haven’t looked back since.
    0:26:18 So that’s very exciting, and it’s really cool
    0:26:20 to build something up to that point,
    0:26:23 where you’re able to have that opportunity,
    0:26:24 to have that flexibility and say,
    0:26:27 look, I’ve got this other thing that’s working.
    0:26:28 I don’t need this job.
    0:26:30 My rule was five bad meetings or something,
    0:26:32 five bad days at work until I’m out of here,
    0:26:33 something like that.
    0:26:35 So I think that makes a ton of sense.
    0:26:39 So mombeach.com kind of plays in the mom blog space,
    0:26:40 the personal finance space,
    0:26:43 and talk to me about what’s ringing the cash register
    0:26:46 in terms of the digital products you mentioned,
    0:26:48 the printables, what’s going on over there
    0:26:50 in terms of how the site is earning revenue.
    0:26:53 – At first I was relying on ad income and affiliates alone.
    0:26:56 But when I started also making digital products,
    0:26:59 it just exploded and since making them with AI,
    0:27:00 it’s gone even further
    0:27:02 ’cause I’m able to crank out even more.
    0:27:04 The digital products are just doing so well.
    0:27:06 Like people will just visit my blog out of the blue,
    0:27:07 like I don’t even know them really.
    0:27:08 They just come in, they’re just like,
    0:27:10 they’re just internet randos.
    0:27:12 They come and purchase and I have all these sales funnels
    0:27:14 set up like freebie opt-ins
    0:27:15 where they sign up with their email.
    0:27:17 They’re redirected to a sales page
    0:27:19 that leads to my Shopify store.
    0:27:20 – Okay, that’s interesting.
    0:27:22 I have always thought of Shopify
    0:27:24 as a physical product, e-commerce platform,
    0:27:27 but you can use it for digital products as well.
    0:27:30 So that’s the visitor flow through SEO,
    0:27:32 through Pinterest, they come to your site,
    0:27:34 download some freebie and then the digital products
    0:27:38 are largely like an email-based upsell after the fact.
    0:27:40 – They are ’cause I get quite a lot of traffic
    0:27:42 and they come in, they sign up to my freebies
    0:27:44 and the freebies are like free printables.
    0:27:45 Like I have a budgeting planner
    0:27:47 and I got a home planner for people
    0:27:50 to organize their homes like specifically moms.
    0:27:52 And I just get like so many leads
    0:27:54 like every single day from these freebies.
    0:27:56 And then they’re directed to a sales page
    0:27:59 ’cause in ConvertKit you can actually make them
    0:28:01 go to a sales page after they sign up to the email.
    0:28:03 And I just put the sales page there
    0:28:05 and then they buy a product off my Shopify store
    0:28:07 ’cause you can actually put your cart
    0:28:09 in your Shopify store right on the sales page
    0:28:11 so they can click a link and get to the cart.
    0:28:12 – Okay, okay.
    0:28:14 So give me an example of like,
    0:28:17 let’s talk about this budgeting planner, for example.
    0:28:18 Talk to me about top of the funnels.
    0:28:20 How does somebody discover that?
    0:28:23 Is this like ranking in Google for those types of terms?
    0:28:25 – Yes, I make user-specific content
    0:28:27 people are searching for that solves problems.
    0:28:31 Say like a saving money post or a money-making post
    0:28:33 and people are searching for these problems on Google
    0:28:36 and I use long-tail keywords.
    0:28:38 And then I create the piece of content.
    0:28:40 I’ve also been utilizing ChatGPT lately
    0:28:41 to help me create content.
    0:28:44 Like I’ll use it to make like a blog outline.
    0:28:47 It makes it so much faster to create content now.
    0:28:49 – So I’m trying to frame an example of one of those posts,
    0:28:50 like how to save money,
    0:28:53 and here’s a list of ideas on how to solve this problem.
    0:28:55 And by the way, if you’re trying to save money
    0:28:57 you probably need this budgeting planner.
    0:28:58 Here’s my free template.
    0:29:01 And then after somebody puts in their email for that,
    0:29:04 boom, sales page for something more advanced
    0:29:06 or what’s on the sales page or what’s the digital product?
    0:29:08 – Well, for instance, one of my posts that’s new
    0:29:11 is a 30-day money-saving challenge.
    0:29:13 And then that post has the budgeting planner
    0:29:15 and then when they subscribe to the planner
    0:29:16 they get it sent to their email
    0:29:20 and they’re also directed to a budgeting spreadsheet.
    0:29:21 – The spreadsheet is for purchase.
    0:29:23 – Yes, like I’ll have the spreadsheet for purchase.
    0:29:26 I also sell spreadsheets in my Shopify store as well.
    0:29:28 – You can learn more about Becky and her business
    0:29:30 in episode 582.
    0:29:33 But the idea is creating something once
    0:29:34 that you can sell over and over again.
    0:29:36 That’s the leverage that’s built
    0:29:39 into a digital product business, a content business.
    0:29:41 And of course there are other business models too
    0:29:43 but with each of them I think it’s important
    0:29:46 to think about how it might go one to many.
    0:29:50 How you might be able to leverage your specific skills
    0:29:51 and expertise to build systems
    0:29:53 and serve lots of different customers.
    0:29:55 One book I might recommend on this topic
    0:29:58 is MJ DeMarco’s Millionaire Fastlane.
    0:30:01 If you can get past all this talk about fancy cars
    0:30:03 which didn’t really appeal to me at all,
    0:30:05 the underlying foundations and ideas in the book
    0:30:07 I think are really strong.
    0:30:09 That’s Millionaire Fastlane.
    0:30:10 With job security in question
    0:30:15 and this shift towards a more on-demand freelance workforce
    0:30:17 it’s hard for me to see the downsides
    0:30:19 in learning an entrepreneurial skill set
    0:30:22 but still the fact remains that half of small businesses
    0:30:24 fail in the first five years.
    0:30:26 For that reason, it’s important to start small,
    0:30:28 to minimize your expenses
    0:30:31 and to grow at a pace you’re comfortable with.
    0:30:34 And if that failure happens to you, if you’re in that 50%
    0:30:37 you can dust yourself off and start again.
    0:30:40 Building a business can be labor intensive
    0:30:42 and that’s why many entrepreneurs find themselves
    0:30:44 in the trap of working in the business
    0:30:45 rather than on it.
    0:30:48 They feel like, well, I just built myself a job
    0:30:51 only this one has an even more demanding boss
    0:30:53 that’s even harder to walk away from.
    0:30:55 So that’s the question that you have to ask,
    0:30:56 what if this works?
    0:30:58 If the business I’m starting works,
    0:31:00 well, what does success look like?
    0:31:03 And maybe you can find somebody who’s walked that path
    0:31:05 there three to five years ahead of you.
    0:31:07 Well, what does their day-to-day look like?
    0:31:08 Do they have the income that they desire?
    0:31:11 Do they have some freedom and flexibility in their life?
    0:31:14 Or are they still stuck working 60 hour weeks
    0:31:16 or they’re super stressed all the time?
    0:31:17 What’s the end game?
    0:31:18 And is that gonna be a win for you?
    0:31:20 If you build with intention from the start,
    0:31:22 I think it’s easier over the long run.
    0:31:25 Certain models are faster and easier
    0:31:26 to see initial results with,
    0:31:29 but can be harder to scale and remove yourself
    0:31:31 from delivery over time.
    0:31:34 I think freelancing is probably the prime example of this,
    0:31:35 freelancing your skills.
    0:31:37 Totally viable side hustle,
    0:31:40 one that I recommend all the time, but it can be tough.
    0:31:43 Not impossible, but hard to get out of trading time for money
    0:31:47 if clients are used to hiring your special skills
    0:31:48 and expertise.
    0:31:50 Again, hard to get out of, but not impossible
    0:31:52 if that’s an ultimate goal of yours.
    0:31:54 Some people just love doing the work and that’s totally fine.
    0:31:56 So who is this third path best for?
    0:31:59 Who is the entrepreneurship path best for?
    0:32:02 I believe it is the most realistic rat race escape path
    0:32:05 for most people, but especially for those
    0:32:08 who don’t have the quote unquote golden handcuffs
    0:32:11 of a great paying job that’s harder to walk away from
    0:32:14 and might be more apt to take path number one,
    0:32:17 the traditional saving and investment path.
    0:32:18 People who aren’t afraid of failure,
    0:32:19 you’re probably gonna get punched in the face
    0:32:21 along the way on this entrepreneurship path,
    0:32:26 and/or people who are a little impatient,
    0:32:28 since it’s definitely the fastest path if it works.
    0:32:30 And so for me it was a combination of both:
    0:32:33 well, I’m not afraid of failure and I’m impatient.
    0:32:34 I don’t wanna wait 20 years
    0:32:37 to do this traditional savings path.
    0:32:38 So entrepreneurship appealed to me
    0:32:40 ’cause I couldn’t fathom the reality
    0:32:43 of working a corporate job for the next 30 years.
    0:32:45 There had to be a better way.
    0:32:48 And there was and I think there is for you too.
    0:32:51 And of course, if the entrepreneurship path is for you,
    0:32:53 there are hundreds of side hustle show episodes
    0:32:54 to choose from.
    0:32:55 You can pretty much scroll through
    0:32:58 and pick the ones that sound most interesting to you.
    0:33:00 They’re all great.
    0:33:02 I learned so much from each and every guest.
    0:33:03 If there’s a specific topic
    0:33:05 that you’d like me to cover in the future,
    0:33:07 be sure to reach out and let me know.
    0:33:10 Nick at sidehustlenation.com is my direct email.
    0:33:12 And if you’re not sure where to start,
    0:33:14 I encourage you to hit up hustle.show.
    0:33:16 This is where you can answer a few short,
    0:33:17 multiple choice questions.
    0:33:18 You can do it on your phone
    0:33:21 and the system will build you a personalized playlist
    0:33:23 based on where you’re at, what you’re interested in,
    0:33:24 where you wanna go.
    0:33:25 Again, hustle.show
    0:33:29 for your custom curated side hustle show playlist.
    0:33:30 Big thanks to our sponsors
    0:33:32 for helping make this content free for everyone.
    0:33:35 As always, you can hit up sidehustlenation.com/deals
    0:33:38 for all the latest offers from our sponsors in one place.
    0:33:40 Thank you for supporting the advertisers
    0:33:41 that support the show.
    0:33:43 That is it for me.
    0:33:45 Thank you so much for tuning in.
    0:33:46 If you’re finding value in the show,
    0:33:48 the greatest compliment is to share with a friend,
    0:33:51 to fire off that text message to that friend
    0:33:52 who wants to get out of the rat race
    0:33:53 and help spread the word.
    0:33:55 Until next time, let’s go out there
    0:33:56 and make something happen
    0:33:58 and I’ll catch you in the next edition
    0:33:59 of the Side Hustle Show.

    Getting out of the rat race is simple, but not necessarily easy.

    To escape, all you need is monthly income — from non-job sources — that exceeds your monthly expenses.

    For example, if you spend $3,000 a month, you’ll need to bring in at least $3,000 (after taxes) outside of your day job.

    Simple, but not always easy.
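    The arithmetic above can be put into a few lines of code. This is just a rough sketch of the idea (the helper function and the growth numbers are hypothetical, not from the episode): you’ve escaped once non-job income covers expenses, and you can estimate how long that takes under a simplified assumption that you add a fixed amount of recurring monthly income each month.

```python
def months_to_escape(monthly_expenses, start_income, monthly_growth):
    """Estimate months until non-job income covers monthly expenses.

    Simplifying assumption: recurring non-job income grows by a fixed
    dollar amount every month. Real side hustles are lumpier than this.
    """
    if monthly_growth <= 0 and start_income < monthly_expenses:
        raise ValueError("income never catches up without growth")

    months = 0
    income = start_income
    while income < monthly_expenses:  # still in the rat race
        income += monthly_growth
        months += 1
    return months

# Spending $3,000/month, starting from $0 of side income,
# and stacking $100/month of new recurring income:
print(months_to_escape(3000, 0, 100))  # 30 months
```

With faster-compounding paths like real estate or entrepreneurship, the per-month growth number is what changes; the escape condition (income from non-job sources exceeding expenses) stays the same.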

    In this episode, I’ll break down:

    • the most common rat race “escape routes”
    • the pros and cons of each
    • how to choose the right path for you

    Ready? Let’s do it!

    Full Show Notes: 3 Ways to Get Out of the Rat Race

    New to the Show? Get your personalized money-making playlist here!

    Sponsors:

    Airbnb — Discover how much your home could be worth and find a professional co-host today!

    Mint Mobile — Cut your wireless bill to $15 a month!

    Indeed – Start hiring NOW with a $75 sponsored job credit to upgrade your job post!

    OpenPhone — Get 20% off of your first 6 months!

    Gusto — Get 3 months free of the leading payroll, benefits, and HR provider for modern small businesses!