How to retire with millions (and pay $0 taxes)
AI transcript
0:00:05 I’ve now met so many regular people with eight and nine figure Roth IRAs on which you won’t
0:00:06 owe any taxes at all.
0:00:09 And a lot of them have done that with vanilla investing.
0:00:24 I think that we’re going to do basically like first half of this is like money tips,
0:00:28 meaning like, you know, sort of like personal finance stuff that nobody really teaches you,
0:00:33 you know, cause we get most of our personal finance information from two really
0:00:37 bad sources. Either your parents, who probably only knew so much.
0:00:39 And like, when you’re a little kid, you think your dad knows everything.
0:00:41 But then when you grow up, you’re like, oh, he’s just a guy.
0:00:43 And like, you know, he did the best he could with what he had.
0:00:47 And the other thing is like, you know, TV or books or salespeople, whose incentive
0:00:49 is very different than yours.
0:00:49 Right.
0:00:53 Like one of my favorite YouTube videos is like about, it has a picture of Jim Cramer and it
0:00:57 says, will this, will this very loud man make me money?
0:00:58 And it’s like, no, he will not.
0:01:02 The blinking lights on this guy’s screen, that money guy on TV, he is not the guy
0:01:02 to listen to.
0:01:03 So we’ll do that.
0:01:06 And then you had a bunch of business ideas also like that you’ve been brainstorming your
0:01:07 cheat sheet of startup ideas.
0:01:08 So we’ll go to that after.
0:01:10 But Sean, do you, do you guys know each other?
0:01:12 Well, by the way, I don’t even know.
0:01:13 We’ve never hung out in person.
0:01:14 We’ve never hung out.
0:01:14 We’ve exchanged.
0:01:16 We’ve tried to hang out.
0:01:17 Haven’t hung out.
0:01:19 The last episode was the closest we’ve gotten.
0:01:21 I like you every time we talk.
0:01:21 Yeah.
0:01:23 So let me introduce you, Sean.
0:01:24 This is my friend, Ankur.
0:01:25 Ankur, this is my friend, Sean.
0:01:29 I’ve known Ankur since 2012 or 2013.
0:01:31 And we’ve been buddies since.
0:01:34 And I’ve known Sean since 2013, 2014.
0:01:38 I think I’m one of the only people at this point that’s met Sam through the internet who’s
0:01:39 had beers with him.
0:01:40 That’s how far back we go.
0:01:41 It was a long way.
0:01:43 It was a long time ago.
0:01:45 So give his background, his bio.
0:01:50 So when I met Ankur, he was doing a Facebook app that went viral.
0:01:54 And he made his first couple million dollars at the age of like 26 or 27.
0:01:56 No, bro, 20.
0:01:57 What was the app?
0:01:59 All the silly stuff.
0:02:04 Personality quizzes, friend quizzes, how like answer seven questions and you find out how
0:02:05 good a kisser you are.
0:02:06 Crazy stuff like that.
0:02:09 Then he started a company called Teachable.
0:02:10 He was early on the course game.
0:02:13 He didn’t quite bootstrap that company.
0:02:18 But he owned the majority of it and sold it for something like $150 million, which I’m
0:02:20 sure you’re going to correct me if I was too low.
0:02:22 And then now you have a new company called Carry.
0:02:23 Was I too low?
0:02:25 It’s $250 million, but close enough.
0:02:26 Oh, no, that’s not even it.
0:02:28 It’s all around a year at that point.
0:02:32 But you made like $100 million when you were, what, 32 years old?
0:02:33 Yep.
0:02:34 I mean, that’s insane, right?
0:02:39 Did it feel more insane to make a million dollars on a silly Facebook app?
0:02:41 Like, how good of a kisser are you?
0:02:45 Or was it more insane to make $100 million on this company that you, you know, this education
0:02:46 company?
0:02:50 Zero to something is always more life-changing, undoubtedly, right?
0:02:52 Like, that was also the time where I just moved to America.
0:02:54 I was an international student.
0:02:57 I didn’t really have, I was so new to everything.
0:03:01 But like making money in college allowed me to like not go to class and like realize that
0:03:05 if I can do things on the internet, like why, why ever get a regular job?
0:03:09 How much money did you have in your bank account before you sold Teachable?
0:03:11 Like one and a half, two.
0:03:12 Like basically I was flat.
0:03:16 I didn’t really spend money or grow money during that entire sort of time.
0:05:22 Paid myself eventually a salary of 150 grand, which in New York was like roughly my life break
0:03:22 even.
0:03:27 So then, when you got your, uh, exit, you just moved the decimal point over,
0:03:29 like, two places, right?
0:03:30 Yeah, basically.
0:03:34 I mean, we’ve made like little bit over 40 in cash.
0:03:37 The rest is equity, but the equity is pretty nice.
0:03:39 It’s just this random thing that, you know, keeps paying you out.
0:03:41 I love that you’re open with money.
0:03:42 I’m almost out of question.
0:03:42 What else can we ask?
0:03:44 He’s divulging everything.
0:03:45 This is amazing.
0:03:50 Well, you sent us this list of topics and you said, when I sold my company, I ran an A-B test
0:03:50 with the money I made.
0:03:55 I gave half to Goldman Sachs to invest and you invested the other half yourself, I assume.
0:03:57 Can you just talk about this?
0:03:58 Why’d you do this?
0:03:59 And what was the result?
0:04:05 So like a lot of founders that first sell a business, most of us know nothing about money.
0:04:07 I don’t know, Sam, if you knew much or whatever.
0:04:11 And you get hit up by every single one of the private wealth people.
0:04:15 They’re always in your emails and, you know, you read the big brand, Goldman Sachs, Morgan
0:04:15 Stanley.
0:04:16 So I talked to them.
0:04:19 I couldn’t fully tell how legit they were.
0:04:22 And I had enough money where I’m like, you know what?
0:04:23 Let me take a sizable chunk.
0:04:26 Let me give them that amount of money and see what they do.
0:04:31 The other half, let me try and learn and figure stuff out myself.
0:04:35 And went through sort of the standard post-exit startup founder thing where you probably
0:04:36 invest in too much.
0:04:37 You don’t say no enough.
0:04:41 You go through a period where you think you’re an investing genius.
0:04:45 But it was cool to sort of see that develop over five years and now to have actual results.
0:04:47 And what were they?
0:04:48 What are the results?
0:04:54 The results are, in retrospect, there’s nothing that special about private banking.
0:04:56 There’s a lot of clout that comes with it.
0:04:59 But you look at the sort of portfolios they do, they’re very vanilla.
0:05:04 And they charge you anywhere between 50 basis points to, you know, 120 basis points.
0:05:08 The best part of my portfolio that they did was simply indexing the S&P.
0:05:11 The S&P has returned about 13% in that time.
0:05:14 They did a lot of other stupid shit that averaged it down to 6%.
0:05:16 What sort of stupid shit did they get you into?
0:05:18 Was it like exotic?
0:05:20 You know, was it random exotic things?
0:05:20 Was it real estate?
0:05:22 What are they putting you into?
0:05:24 And were you saying yes or no on a case-by-case basis?
0:05:25 Or they were making decisions?
0:05:30 So to be clear, they brought your average down to 6%?
0:05:31 They did, yeah.
0:05:35 Dude, that’s like the meme where there’s like a guy like putting a stick in his own front
0:05:36 spoke as he’s riding his bicycle.
0:05:37 Yeah.
0:05:42 But again, the insane part is I genuinely think now that I know more about this, like you can’t
0:05:46 fully judge the performance based on like five great years.
0:05:51 Like theoretically, that portfolio should protect your ass in down years of the market, right?
0:05:53 We’ve also had an atypically good time.
0:05:58 But the things that were rough with them is one, for a lot of products, they charged over
0:06:02 100 basis points to basically do what Vanguard funds could have done.
0:06:08 They also put me in a lot of fixed income, which is bonds and stuff that, for a 31, 32 year old
0:06:11 with like infinite risk tolerance, doesn’t really make sense, right?
0:06:15 So eventually I got them out of doing a lot of those things.
0:06:21 But the whole process of kind of working with them, you realize all these people, their product
0:06:25 has such insanely good margins that they can afford to do whatever.
0:06:30 And they really sort of sell a lifestyle that now that I know more, it doesn’t make sense
0:06:30 for me.
0:06:35 But for a lot of people who are new to the idea of having money, it’s kind of a, you know,
0:06:37 it’s a, it’s a status thing as well, right?
0:06:37 Like, what did you?
0:06:39 What were the, what are the perks of the status?
0:06:40 I mean, you’re a single guy.
0:06:43 Surely you’re not walking around with a t-shirt that says, I work with Goldman.
0:06:46 Like what, what, like what’s this or what, what did you do?
0:06:49 It didn’t, it didn’t make sense for me.
0:06:53 Like for me, I never evaluated that much, but yes, they have a, you know, a concierge team
0:06:57 that can, you know, get you tickets to the US Open and like take you out for nice dinners
0:06:58 whenever you want.
0:07:03 I was the wrong target demographic because when they wanted to get dinner, I’m like, oh fuck,
0:07:04 it’s another person to hang out with.
0:07:08 But there’s a lot of people who really enjoyed that side of it.
0:07:10 They’re like, these tech guys are the best.
0:07:11 They’re introverts.
0:07:13 They don’t even take us on any of the free shit.
0:07:14 We just get to charge up the fees.
0:07:19 Wait, so that’s a really expensive US Open seat.
0:07:20 Correct.
0:07:21 I mean, you do the math, right.
0:07:26 And because you get charged a percentage of your assets, you never feel the pain, but you’re
0:07:28 paying them six figures a year.
0:07:33 If you had to cut them a check for $10,000 a month, you’re like auditing that all the time.
0:07:38 But that is what is so genius about this business model where you basically just keep running
0:07:39 through this.
0:07:44 And over the course of your lifetime, you end up with many, many millions of dollars in fees.
0:07:47 And it’s not just the fees, those dollars would otherwise be invested.
0:07:52 So whenever you run the math, you realize your ending portfolio is probably $10 million smaller
0:07:54 20 or 30 years from now.
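[The fee-drag math described above can be sketched in a few lines. Illustrative numbers only: the episode doesn't give his exact balance, return, or fee schedule, so the starting amount, 8% return, and the specific fee levels below are assumptions.]

```python
# Toy model of advisory-fee drag: an annual percentage-of-assets fee
# versus a flat fee, on hypothetical numbers (not the actual Goldman
# fee schedule discussed in the episode).

def grow(start, years, gross_return, aum_fee=0.0, flat_fee=0.0):
    """Compound a portfolio, deducting an AUM fee and/or a flat fee each year."""
    balance = start
    for _ in range(years):
        balance *= 1 + gross_return      # market growth
        balance *= 1 - aum_fee           # percentage-of-assets fee
        balance -= flat_fee              # fixed advisor fee
    return balance

start, years, ret = 10_000_000, 30, 0.08

no_fee   = grow(start, years, ret)
aum_1pct = grow(start, years, ret, aum_fee=0.01)
flat_10k = grow(start, years, ret, flat_fee=10_000)

print(f"no fee:        ${no_fee:,.0f}")
print(f"1% AUM fee:    ${aum_1pct:,.0f}  (drag: ${no_fee - aum_1pct:,.0f})")
print(f"$10K flat fee: ${flat_10k:,.0f}  (drag: ${no_fee - flat_10k:,.0f})")
```

[On these assumptions the 1% AUM fee costs tens of millions over 30 years, roughly 20x the cost of a generous flat fee, which is the point being made: the fee itself plus the lost compounding on those dollars.]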
0:08:00 You don’t become the world’s most valuable women’s sports franchise by accident.
0:08:03 Angel City Football Club did it with a little help from HubSpot.
0:08:07 When they started, data was housed across multiple systems.
0:08:12 HubSpot unified their website, email marketing, and fan experience in one platform.
0:08:16 This allowed their small team of three to build an entire website in just three days.
0:08:23 The results were nearly 350 new signups a week and 300% database growth in just two years.
0:08:27 Visit HubSpot.com to hear how HubSpot can help you grow better.
0:08:28 All right, back to the pod.
0:08:31 But what’s the argument against this?
0:08:33 Because they’ve been around for 100 years.
0:08:33 They’re huge.
0:08:35 A lot of smart, rich people use it.
0:08:36 Surely they’re, I mean, what’s the value?
0:08:42 I think the sweet spot, and in fact, I think our friend Ramit talks about this a lot,
0:08:45 is I think financial advice is incredibly valuable.
0:08:48 I think they serve a really good purpose.
0:08:54 But at a certain point, the math on going flat fee is just substantially better for you, right?
0:09:00 Like if you have $5 million, $50 million, $500 million, they don’t do dramatically more things.
0:09:03 Yet at every step, you pay an order of magnitude more in fees.
0:09:06 I think there’s a lot of fantastic financial advisors.
0:09:10 You pay them $5,000 a year, $10,000 a year, $20,000 a year.
0:09:10 I don’t care.
0:09:15 You can easily get an ROI there, but it’s the whatever, 1% of your wealth.
0:09:19 That’s the part that gets really silly the better and better you do.
0:09:23 I want to ask you about indexing.
0:09:25 So indexing is obviously super popular now.
0:09:30 And you had some note on like simple other ways you can index that are, you know,
0:09:31 maybe things that people should consider.
0:09:33 Yeah.
0:09:37 So in general, like my overall thesis is most people,
0:09:41 you’re not going to get crazy differentiated performance from investing.
0:09:42 But that’s actually okay.
0:09:48 It’s kind of wild that an average person, like again, think about my example.
0:09:50 I had all these financial, financial advisors.
0:09:52 I had so much differentiated access.
0:09:54 We know so many entrepreneurs, founders, whatever.
0:09:58 The best part of my portfolio was indexing the S&P 500.
0:10:00 And historically, that happened a lot.
0:10:03 So most people should not try and find alpha on the investment side.
0:10:07 However, if you can be smart about saving money on taxes,
0:10:10 that is where your alpha comes from.
0:10:13 So one example, I think Sam and I talked about this on Twitter.
0:10:16 I’ve started to do something called direct indexing,
0:10:18 where instead of buying a Vanguard fund,
0:10:22 it buys each and every one of the 500 companies individually.
0:10:25 And the advantage with that is you have more positions.
0:10:29 So more volatility at any given point, more companies are down.
0:10:31 So you harvest those losing positions.
0:10:33 So net-net, you track the same as an index,
0:10:37 which you’ll get 30%, 40% of your investment as a usable tax loss.
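[The mechanism described above, tracking the index while harvesting the individual losers inside it, can be illustrated with a toy simulation. The dispersion numbers are made up for illustration; this is not any provider's actual methodology.]

```python
# Toy illustration of direct indexing: even in a year the index is up,
# many individual names are down, and selling those locks in a usable
# tax loss while the overall portfolio still tracks the index.
import random

random.seed(0)
n_stocks, invested = 500, 250_000
per_stock = invested / n_stocks

# Assumed per-stock returns: index up ~10% on average, individual
# stocks widely dispersed around that, so a chunk finish negative.
returns = [random.gauss(0.10, 0.30) for _ in range(n_stocks)]

portfolio_value = sum(per_stock * (1 + r) for r in returns)
harvested_loss  = sum(per_stock * -r for r in returns if r < 0)

print(f"portfolio value:  ${portfolio_value:,.0f}")
print(f"harvestable loss: ${harvested_loss:,.0f}")
```

[With these assumed inputs the harvestable loss comes out in the mid five figures' low end on $250K invested, the same order of magnitude as the $16K anecdote later in the conversation.]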
0:10:39 Sam, do you do this, by the way?
0:10:44 No, because I think my, I’m not entirely well-versed on it,
0:10:47 but I don’t do it because what I’m doing is working,
0:10:49 just indexing, and I sleep better at night.
0:10:51 And number two, from my understanding, Ankur,
0:10:53 those gains eventually go, they go away.
0:10:56 Unless you’re investing on an ongoing basis.
0:10:58 So if you do a lump sum investment,
0:11:01 after three years, you’ll harvest most of your losses.
0:11:04 But if you’re investing X amount per year,
0:11:06 that kind of infinitely goes long.
0:11:08 But no, Sean, I don’t do it, do you?
0:11:09 Yeah, I do.
0:11:13 So I invest in this company called Frec, and they do this.
0:11:14 So, Ankur, have you seen Frec?
0:11:18 I’m actually a little bit nervous because I use them as well,
0:11:20 but I don’t want to be their biggest customer.
0:11:21 I have a couple million dollars on Frec,
0:11:22 and I don’t want to put like,
0:11:26 I’m a little nervous about putting like five or 10 or whatever on them.
0:11:29 I’m an investor too, but yes, I use them.
0:11:31 The guy’s really smart behind it.
0:11:32 He sort of sat me down and explained it
0:11:35 because I was like, look, assume I don’t know anything
0:11:37 because I read this and it sounds good,
0:11:38 but it’s kind of hand-wavy to me.
0:11:39 Like, how does this actually work?
0:11:41 So he explained it, and I said, okay, cool.
0:11:43 So I put 250K in as a test.
0:11:46 So I’m basically doing the exact same thing
0:11:47 I otherwise would have been doing.
0:11:49 Which is indexing the S&P.
0:11:51 So I got the same performance as the S&P,
0:11:57 but on my 250K, I added an extra 16,000 in tax loss harvesting,
0:11:59 just holding the same exact asset as it would before.
0:12:03 And so, and you know, like you, I was like, wait,
0:12:05 should I give this startup like millions of dollars?
0:12:06 Or how do I want to do this?
0:12:08 And like, you know, then he’s like, well, we don’t,
0:12:11 you know, all fintech is basically built on top of other,
0:12:15 like providers, other pools, other pools of money.
0:12:17 So, you know, it’s not a startup that holds your money.
0:12:19 It’s like, you know, the custodian’s usually somebody else.
0:12:20 Same thing with Robinhood and others.
0:12:23 Like, so anyways, I think it’s kind of amazing.
0:12:25 Like an even simpler example,
0:12:28 if direct indexing is too hard is most people right now
0:12:30 keep their cash in like a high yield savings account
0:12:30 or something.
0:12:33 But if you are a high taxpayer,
0:12:35 there’s likely better places for it.
0:12:38 There’s either muni funds where you pay no federal taxes.
0:12:41 There’s treasuries where you don’t pay state and local taxes.
0:12:44 There’s muni funds specific to New York and California
0:12:46 where you won’t pay federal and state taxes.
0:12:48 Like we built a very simple product
0:12:49 that will just take your cash,
0:12:50 calculate your tax rate,
0:12:53 and find the best money market fund for you.
0:12:55 That itself will take your tax equivalent yield
0:12:58 from like three, 4% to like six, 7%.
0:12:59 When I hear you say all this,
0:13:03 for most people, when they talk about this,
0:13:05 I see a lot of people talk about this.
0:13:06 And I think to myself,
0:13:08 you guys should just focus on earning more money.
0:13:10 You should increase your revenue first.
0:13:14 Increase the revenue and then worry about-
0:13:15 Totally, totally.
0:13:17 Like there’s some percentage of people where they’re like,
0:13:19 oh, my business makes 50 grand a year.
0:13:20 How do I save money on taxes?
0:13:23 It’s like, bro, you don’t have a tax problem, right?
0:13:25 You need to make more money.
0:13:27 Like I was at a bike shop the other day
0:13:28 and I was like looking at a bike
0:13:30 and there was a bike that was $2,500
0:13:32 and then there was a bike that was like five grand.
0:13:33 And I was like, what’s the difference?
0:13:35 And they’re like, like two pounds.
0:13:36 So it’s a lot.
0:13:38 And then like this water bottle was made out of carbon
0:13:39 and this and I was like,
0:13:41 well, I’m kind of chubby.
0:13:43 So why don’t I just lose weight?
0:13:44 You know, like why don’t I just like,
0:13:45 I could like lose five pounds
0:13:47 just by not eating for like, you know, a week
0:13:49 and I’ll save myself that money
0:13:49 and I’ll be fitter.
0:13:50 How about that?
0:13:51 And that’s kind of the same thing.
0:13:52 Absolutely.
0:13:54 I think for people like,
0:13:57 like it matters at our level
0:13:59 when the compounding and stuff makes a big difference.
0:14:03 But at like 30, 40, 50K in revenue,
0:14:06 making more money is far more effective.
0:14:07 Well, I was going to say two things.
0:14:11 One, this like direct, the direct indexing.
0:14:14 I like these because these are set it and forget it.
0:14:15 I made a decision.
0:14:17 It took me two seconds to just roll.
0:14:20 I literally push a button and it does an ACAT rollover.
0:14:23 of my S&P ETF holding,
0:14:26 my Vanguard ETF holding from one brokerage to theirs.
0:14:26 I never had to do anything.
0:14:28 So it was like, it was a two second change
0:14:30 that has since yielded like, you know,
0:14:35 18K, 16K of extra tax losses that I harvested.
0:14:35 Yeah, exactly.
0:14:37 Like, so I don’t have to do anything.
0:14:39 I would never do that if it was like
0:14:41 something I have to keep thinking about
0:14:42 or keep doing something on a regular basis.
0:14:43 Same thing, what you’re talking about
0:14:45 with the buying muni bonds
0:14:47 versus holding cash in a money market fund, right?
0:14:49 If you’re just going to put cash in a place
0:14:51 and be like, okay, it’s earning like three or 4%.
0:14:53 Well, if you could just do the same thing
0:14:55 with a muni bond and just not pay the state taxes,
0:14:57 like it’s a, it’s the same one decision
0:14:58 didn’t take you any extra time.
0:15:00 For the average person though,
0:15:02 I would say for the person doing nothing,
0:15:04 fucking do a Vanguard fund.
0:15:05 The most important thing is to do something.
0:15:08 But for the people that already have an index fund,
0:15:10 this is the sort of alpha.
0:15:12 What else is on the list of like,
0:15:14 all right, you’re wired tens of millions of dollars.
0:15:15 You’re worth nine figures.
0:15:19 I expected this to happen
0:15:20 in terms of financial investments.
0:15:23 Someone was going to present like this amazing deal to me.
0:15:24 Someone was going to offer this amazing service.
0:15:27 And it just was not what I thought.
0:15:29 And everyone could have access to this.
0:15:30 For the first time,
0:15:33 I felt like it didn’t matter if I failed,
0:15:35 even though arguably I needed the dollars more.
0:15:38 Now it would feel so stupid to fail.
0:15:39 I’ve been on panel telling other people
0:15:40 how you should run your company.
0:15:41 And you know,
0:15:43 you have like this time,
0:15:45 everyone who joined the company did it
0:15:46 because they’re like,
0:15:47 oh, you know, this guy will succeed again.
0:15:50 So I feel like my pressure on myself
0:15:52 and it’s fully my own construct
0:15:53 at the end of the day.
0:15:54 Can we switch away from emotions
0:15:56 and you tell me about a mega backdoor Roth?
0:16:00 I don’t really care how you feel.
0:16:01 I just really want to know
0:16:03 about this Roth tax optimization.
0:16:04 What is this?
0:16:06 So this has been kind of wild.
0:16:09 I want to know how Peter Thiel made $5 billion
0:16:11 and didn’t pay any tax on it.
0:16:12 What was that?
0:16:13 Yeah, talk dirty to me, baby.
0:16:14 Tell me how you feel later.
0:16:14 Talk dirty, yeah.
0:16:17 Whisper softly,
0:16:18 tell me about my backdoor Roth.
0:16:22 But yeah, Peter,
0:16:24 a lot of people read about Peter Thiel.
0:16:25 What he did,
0:16:26 which is like tax magic,
0:16:28 is most people have access to a Roth IRA.
0:16:32 He put his PayPal founder shares
0:16:32 in a Roth IRA,
0:16:35 sold those for about $27 million,
0:16:38 then used this as a supercharged investment account
0:16:38 to where it is today
0:16:41 where he has $5 billion in his Roth.
0:12:43 He retires next year,
0:16:45 so he gets all of that with no taxes.
0:16:46 But you got to dumb this down, man.
0:16:47 What’s a Roth IRA?
0:16:48 Like you got to like,
0:16:50 talk down to me a little bit.
0:16:51 Yeah.
0:16:54 So Roth IRA is an account
0:16:56 that the government basically
0:16:57 has gifted to taxpayers.
0:16:59 There’s a guy named William Roth,
0:17:00 who I think, you know,
0:17:02 15, 20 years ago or whatever,
0:17:04 created this account,
0:17:05 ideally for the middle class.
0:17:06 It was never meant to be
0:17:07 for the richest people
0:17:10 where you put in after-tax money,
0:17:15 but then all further growth in that account is tax-free,
0:17:20 including no matter how high tax rates go in the future, right?
0:17:22 So if you look back in the 1970s,
0:17:25 do you know what the top tax rate was in America?
0:17:27 It was like 20%.
0:17:29 It was like 80%.
0:17:30 It was insane.
0:17:33 Like the top marginal bracket was like 80 or super high.
0:17:37 But the advantage of a Roth IRA is once you have dollars in,
0:17:39 no matter how high taxes get,
0:17:43 you pay nothing on what is in that account.
0:17:45 And what was the incentivization there for these people?
0:17:49 So it was a very short-term incentive at the time
0:17:50 because the advantage is
0:17:52 the government gets the tax revenue today
0:17:54 because you’re putting in after-tax dollars today.
0:17:57 What you’re giving away in the future is future revenue,
0:17:59 but you’re actually collecting more money up front.
0:18:02 I was saying, why would the government agree to this?
0:18:05 They want to incentivize retirement savings in general.
0:18:07 And compared to traditional IRA,
0:18:11 this brings tax revenue forward.
0:18:13 But it was always designed in a way
0:18:15 that there was a maximum income.
0:18:17 Once you make more than a certain amount,
0:18:19 you were not supposed to have access.
0:18:22 Today, those numbers are, I think, $150,000 or something.
0:18:26 You don’t really have access if you go the normal route.
0:18:30 But that’s where you have things called the backdoor Roth IRA.
0:18:32 Before you explain that,
0:18:33 so basically it was,
0:18:35 if you make under a certain amount,
0:18:39 you can put away some $7,000 or whatever a year.
0:18:41 So you’ve already paid tax on it.
0:18:42 You put it there.
0:18:43 Now it’s going to compound tax-free.
0:18:44 When you take it out,
0:18:45 after all those investments have compound,
0:18:47 you don’t pay any taxes on that money
0:18:49 because you sort of paid it up front.
0:18:51 And does it have to be an age limit, like 65?
0:18:53 You have to, yeah.
0:18:56 You get your dollars at 59 and a half,
0:18:59 but you can take out the dollars you contribute
0:19:00 at any point for any reason.
0:19:03 It’s only the growth that is kind of locked up.
0:19:05 So what Peter Thiel did, which was great,
0:19:07 was he put his PayPal shares in there
0:19:09 when they were valued at nothing.
0:19:15 So he bought his founder shares at like $0.0001 or whatever.
0:19:16 And he probably knew,
0:19:18 hey, this is a good chance that this thing can become valuable.
0:19:20 It’s crazy, by the way.
0:19:22 He only made $27 million on PayPal?
0:19:22 Yeah, $27 million.
0:19:24 Oh, at least from this Roth IRA.
0:19:27 He put in $1,700 worth of founder shares
0:19:29 and he made $27 or $28 million.
0:19:33 Okay, so then he used that money to invest in Facebook
0:19:34 within his Roth IRA.
0:19:38 So then the Facebook gain is why it became billions of dollars.
0:19:39 Exactly.
0:19:42 But you then read about Mitt Romney doing the same thing.
0:19:45 But because I’m in this line of work,
0:19:47 I’ve now met so many regular people
0:19:50 with eight and nine figure Roth IRAs,
0:19:52 which is insane because you have, you know,
0:19:56 $10 to $100 million on which you won’t owe any taxes at all.
0:19:59 And a lot of them have done that with somewhat vanilla investing.
0:20:02 So tell me a story of someone who’s done this with,
0:20:03 so eight or nine figures.
0:20:05 So you’re saying 10 million or 100 million.
0:20:06 Yes, exactly.
0:20:11 So Roth IRA limits are normally $7,000 a year.
0:20:14 The challenge with that is it’s tough to get a lot of money in.
0:20:17 The whole game becomes how do I get a lot of money in?
0:20:18 Once you get 20 million,
0:20:21 going from 20 to much more is actually easier.
0:20:23 It’s how do you get the first amount of dollars in?
0:20:26 So in addition to the backdoor Roth IRA,
0:20:28 there’s something called the mega backdoor Roth IRA,
0:20:36 which the naming of this is not ideal, but…
0:20:37 What about the triple mega?
0:20:38 The triple mega, right?
0:20:41 It’s like the least creative names of all time.
0:20:45 But the mega backdoor Roth IRA lets you use a 401k
0:20:50 to get in $70,000 a year into this account.
0:20:54 So you can combine the two, get in $77,000 a year.
0:20:56 I actually sanity tested this.
0:20:59 I built a very simple model that makes the assumption
0:21:02 that you start contributing to Roth IRA by 21,
0:21:05 the mega backdoor by like 30.
0:21:08 Even if you just index the S&P
0:21:10 and you looked at the 100-year average of the S&P,
0:21:14 you end up with $30 million in your Roth IRA at retirement.
0:21:17 So it’s, if you do this consistently enough,
0:21:19 the tax benefit here is massive.
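[The "very simple model" he describes can be rebuilt in a few lines. The inputs below are assumptions, a ~10% nominal long-run S&P average with dividends reinvested and retirement at 65, since the episode doesn't state his exact figures, so the output lands in the right ballpark rather than exactly at $30 million.]

```python
# Reconstruction of the sanity-check model: $7,000/year regular Roth
# contributions from age 21, plus $70,000/year mega backdoor via the
# 401k from age 30, all indexed to an assumed long-run S&P return.

LONG_RUN_SP500 = 0.10   # assumed ~100-year nominal average, dividends reinvested

def roth_at_retirement(retire_age=65, annual_return=LONG_RUN_SP500):
    balance = 0.0
    for age in range(21, retire_age):
        balance *= 1 + annual_return
        balance += 7_000                 # regular (backdoor) Roth IRA
        if age >= 30:
            balance += 70_000            # mega backdoor via the 401k
    return balance

print(f"${roth_at_retirement():,.0f}")   # tens of millions, all tax-free
```
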
0:21:20 Okay.
0:21:22 So who came up with this mega…
0:21:24 This mega backdoor?
0:21:25 Yeah.
0:21:28 So the way people find the loopholes
0:21:31 is the IRS never intends for it to be this way.
0:21:32 They write the law a certain way.
0:21:34 Some smart-ass accountant is like,
0:21:36 well, actually, if you do this, this, and this,
0:21:38 we’re still fully in compliance.
0:21:41 The IRS challenges this, loses in court,
0:21:42 and that’s how we have loopholes.
0:21:44 Like every single loophole comes the same way.
0:21:46 The mega backdoor Roth, for instance,
0:21:49 there’s been legislation to outlaw it
0:21:50 for the last few years.
0:21:52 It comes up every time.
0:21:56 But the fact that there’s someone who wants to make it illegal
0:21:59 by default means this is a valid loophole for now.
0:22:02 So it’s the sort of thing that, like,
0:22:04 this gets negotiated down every single time.
0:22:05 Every single tax bill has it.
0:22:08 Typically, the Republicans fight against it,
0:22:10 and it just gets deferred and deferred and deferred.
0:22:12 All right.
0:22:14 And what are examples of people who have done this
0:22:16 to $10 or $100 million?
0:22:17 What do you want?
0:22:18 Their name and address?
0:22:19 What do you want them to say?
0:22:20 Their name and address?
0:22:22 No, you can do a little Chatham House rules.
0:22:24 You can kind of, like, anonymize them a little bit.
0:22:26 So I will tell you that these are people
0:22:27 that you and I have never heard of.
0:22:29 There’s just a lot of, like, anonymous,
0:22:32 wealthy, financially savvy people
0:22:34 where it’s a coalition of families,
0:22:36 and they have a family office
0:22:39 that specializes in this sort of tax alpha as a service.
0:22:43 But it’s just a case of doing all the fundamentals together
0:22:46 and then having very, very good investments.
0:22:50 But I think if you can structure your 401k in a way
0:22:52 that you can put in $70,000 a year into your Roth IRA,
0:22:57 which I do now, honestly, just the sport of it is fun enough.
0:23:00 And you do this over a long enough period of time,
0:23:01 you have these results.
0:23:04 It’s the sort of thing that we don’t know
0:23:06 how long this loophole will exist.
0:23:08 But for now, it’s, like, you know,
0:23:11 hundreds of millions of dollars with $0 in taxes.
0:23:14 It’s very good insurance from where tax rates may go in the future.
0:23:18 Let’s, should we switch to business ideas, Sam?
0:23:19 Yeah, let’s do it.
0:23:20 Cool.
0:23:23 So Ankur, you sent us a list of ideas
0:23:26 that you think any founder could go build,
0:23:28 or you think that some founders should go build in this space.
0:23:31 And you’ve started multiple successful companies now.
0:23:33 So maybe you have interesting tastes,
0:23:35 and maybe you’re looking where other people are not.
0:23:37 Because when I saw this list of ideas,
0:23:41 these were not the, like, same five AI, you know,
0:23:43 assistant ideas that you hear from most people.
0:23:46 So hit us with your favorite one,
0:23:47 and let’s go in order.
0:23:48 Sweet.
0:23:51 So I’ll go in somewhat random order,
0:23:52 because I have a few different ones.
0:23:56 But one of the things I think about a lot is right now,
0:23:58 there’s such a focus on proactive health and wellness.
0:24:00 I don’t know, New York City, for instance,
0:24:03 people are spending all their time not at bars and clubs,
0:24:05 but at, you know, saunas and cold plunges or whatever.
0:24:09 And you have all these high-end executive services
0:24:11 for, like, concierge health, basically, you know,
0:24:13 superpower, Function Health.
0:24:15 Function Health, I don’t, you guys have to explain this.
0:24:17 So basically what they do is you do a blood test,
0:24:19 and they tell you all about your body and everything.
0:24:22 But this has existed for decades.
0:24:22 I’ve been using it,
0:24:25 I started using these years and years and years ago.
0:24:28 This company, Function Health, has skyrocketed.
0:24:30 And I think in the course of, like, two or three years,
0:24:32 it got to $100 million in revenue.
0:24:34 I believe they’ve just raised another round of funding
0:24:35 in the billions.
0:24:40 I have no idea why this category is booming like it is.
0:24:42 Because it doesn’t, I didn’t know this,
0:24:43 I don’t know how this makes it any different
0:24:45 from the decades of other companies doing this before.
0:24:47 And it’s commoditized.
0:24:49 Everyone uses the same blood testing companies.
0:24:51 It’s, it’s, but I’m telling you,
0:24:54 we’re just moving to a world of proactive wellness.
0:24:55 Like, I think everyone has realized, like,
0:24:58 the medical system in the U.S. is you kind of sit back
0:25:00 and wait for really bad things to happen.
0:25:03 And people of our generation have realized
0:25:05 that you want to be proactive about this
0:25:06 and not just, not just kind of chill.
0:25:09 Yeah, I think, I think there’s a bunch of factors, right?
0:25:11 Like there’s, there’s the Brian Johnsons of the world.
0:25:15 So now you have these really big attention-getting influencers
0:25:17 who are bringing attention to, you know,
0:25:18 your health and wellness
0:25:20 and making you realize like how much of a knowledge gap
0:25:22 there is, how much of, how much more you could be doing.
0:25:25 You might’ve thought you were already doing enough.
0:25:26 And then you, somebody shows you
0:25:28 that there’s a new level to enough, right?
0:25:29 So you have the, the influencer side.
0:25:32 Underneath you have all the infrastructure.
0:25:34 So you have lab testing in all these different places.
0:25:36 So you can go get your blood drawn.
0:25:39 You know, when I went on Superpower, it’s like, cool, go, you know,
0:25:40 do you want someone, a nurse to come to your house
0:25:43 or do you want to go 0.8 miles away?
0:25:45 And there’s three locations near you.
0:25:47 And I was like, wow, this is like amazingly convenient.
0:25:50 Then the next piece is that I think the HIPAA laws changed
0:25:52 or the medical records stuff changed.
0:25:55 So I just gave them my ID and they pulled all my health records.
0:25:57 I never even had that myself.
0:26:00 Like they have my health records now in a beautiful website.
0:26:04 They pulled all my medical history, which like I could,
0:26:05 if I wanted to go look at my own medical history,
0:26:07 I couldn’t have even done that before.
0:26:09 So like that, that was a change that happened.
0:26:13 So you have all these different things sort of stacking on top of each other
0:26:17 that, like Ankur said, add up to a cultural movement towards status,
0:26:20 status being associated with being healthy rather than being a degenerate, right?
0:26:25 Like before in my circles, the more you partied, the cooler you were, right?
0:26:29 The more, the more sort of bad for you, your lifestyle was, the cooler you were.
0:26:33 Now, the more good for you, the lifestyle you have, the better, the cooler you are.
0:26:36 And I don’t know if it’s just I changed or culture changed.
0:26:37 Tell me, how were your results?
0:26:40 Oh, you didn’t see my tweet about this?
0:26:40 Yeah.
0:26:42 What did you say?
0:26:47 I said, oh, I just found out that my chronological age, 37,
0:26:50 is different than my biological age, 40.
0:26:53 But long-term, I’m not concerned because I’m just going to go ahead and kill myself now.
0:27:01 Because you never see someone share their results where their age is worse than their actual age.
0:27:05 It’s always like, oh, look, I have the body of a 23-year-old, you know, Russian gymnast.
0:27:06 Cool.
0:27:09 Thanks for, I’m so happy I used this app.
0:29:10 Why were you so old?
0:29:13 I’m the only person, I’m the only person I’ve ever seen with a higher biological age.
0:27:15 Literally never, I’ve never seen anyone, anyone like that.
0:27:16 I think I broke a record.
0:27:18 What was so old about you?
0:27:23 Well, so it’s pretty cool because actually the next thing that happens is they go,
0:27:27 there’s a concierge doctor who basically is going to, in a week, give you kind of an action plan.
0:27:29 So it says, oh, okay, look, we did these 30 tests in one,
0:27:31 in a single blood draw, they do these like 30 tests, right?
0:27:36 And they say, look, right now, let’s say, I don’t know, let’s just pretend.
0:27:40 It’s like your, whatever, your LDL is high or low, whatever, whatever, something is wrong.
0:27:43 And then they’ll basically start to talk to you.
0:27:47 So there’s a little concierge, a little chat here where somebody is going to then work with me
0:27:49 to develop maybe an action plan, things I could be doing,
0:27:55 eating slightly differently, whatever, walking more steps, whatever, whatever you people do,
0:27:58 go to cold plunge for six minutes, all those things.
0:28:01 And you come up with an action plan.
0:28:04 So, you know, whether this is like life-changing or not,
0:28:08 they’re probably just going to be like, yo, like delete DoorDash off your phone.
0:28:09 Like, you know, that’ll probably do the trick.
0:28:11 Like there’s, you know, there’s a few different ways that you can help somebody.
0:28:16 But again, like this is not how normal medicine works.
0:28:18 Normal medicine is like a car mechanic, you know.
0:28:21 You go there when you’re broken, they try to fix it.
0:28:24 And most of the time they’re like, yeah, it’ll never really run the same, right?
0:28:28 Like we’re in maintenance mode from here on out.
0:28:33 Whereas at least this operates under the assumption of being proactive rather than reactive.
0:28:35 And that there’s actually something you could do.
0:28:40 There’s actual lifestyle steps you could take and not simply like medicines you can get on to improve your health.
0:28:47 All right, folks, this is a quick plug for a podcast called I Digress.
0:28:53 If you’re trying to grow your business, but feel like you’re drowning in buzzwords and BS, then check out the I Digress podcast.
0:28:56 It’s hosted by this guy named Troy Sandage.
0:28:59 He’s helped launch over 35 brands that drive $175 million in revenue.
0:29:04 So if you want to get smarter about scaling your business, listen to I Digress wherever you get your podcasts.
0:29:06 All right, back to the pod.
0:29:16 I also think in addition to all of that, I do genuinely believe a lot of people are sicker than before, particularly like in the U.S.
0:29:18 And some of my other ideas have to do with that.
0:29:27 But I do think you look around, whether it’s like fertility, gut health, allergies, like shit’s kind of falling apart in a lot of places.
0:29:29 And people are reacting to that.
0:29:34 But by the way, the upsell in this thing is crazy because the initial deal is super good, right?
0:29:39 So it’s like $500 and you’re going to have like a 10 times better version of your annual physical.
0:29:43 It’s like, okay, that’s actually like a good trade that you don’t need to be super wealthy to do.
0:29:46 Like you can pay $500 and get a better version of your physical.
0:29:49 But when you’re in there, it’s like, you want to know how that gut’s doing?
0:29:50 You want to microbiome test?
0:29:52 I’m like, yeah, I’d love to.
0:29:56 Dude, I had to click next, the upsell page on Function Health.
0:29:57 I did Function Health.
0:29:57 I clicked next.
0:29:59 I’m not joking, 15 times.
0:30:01 I didn’t even click next.
0:30:02 I clicked yes.
0:30:03 I was like, yep, give me an allergy test.
0:30:05 Give me a toxins test in my blood.
0:30:07 Give me the gut test.
0:30:10 I basically did every single test except for the like blood, like cancer test one.
0:30:12 And I just said yes to all of them.
0:30:15 So they upsold me like a couple thousand dollars worth of tests.
0:30:17 But honestly, like great.
0:30:19 I wish my doctor had been offering me these.
0:30:21 Like why doesn’t my doctor like offer me these?
0:33:22 Unless these tests are bullshit.
0:30:23 They do it the other way around.
0:30:25 They’re like, oh, you want to get your testosterone checked?
0:30:27 You’re like fine.
0:30:27 Yeah.
0:30:29 What’s the problem?
0:30:29 Yeah.
0:30:31 What’s your wife say to you?
0:30:31 I’m like, what?
0:30:33 Why is this so personal?
0:30:38 So what’s your idea here?
0:30:41 So a lot of these services are doing really well.
0:30:47 I think one thing you left out: the motivation a lot of times for people is the aesthetic side of it too.
0:30:49 People want to look good.
0:30:56 I think if you had a function health, but for your physical appearance, it would completely crush.
0:31:05 So it’s like you come in, it’s like, wow, Sean, your hairline has receded two inches and you tie in a DEXA scan to it.
0:31:10 But basically like a concierge service to elevate your experience.
0:31:13 Like two of my college roommates are plastic surgeons, right?
0:31:16 And I’ve heard a little bit about that side of the business.
0:31:24 But marrying these two ideas, basically concierge lookups with improving your aesthetics, I think is a fantastic business.
0:31:27 This is actually a great idea.
0:31:31 We had Justin Mares on the podcast and his skin was glowing.
0:31:37 Yeah, he’s telling us about like his inner health and we were just like, he’s handsome.
0:31:39 I was like, dude, your hair is full.
0:31:41 Your skin has a glow.
0:31:42 Your teeth look nice.
0:31:43 Can you just tell me about that?
0:31:45 And so, yeah, I’m on board.
0:31:51 We went out to breakfast, Sam, and he, literally everything he ordered, I just said, whatever this man eats, I need to eat.
0:31:52 So he ordered this drink.
0:31:54 I said, the waiter looked at me.
0:31:55 I said, did you not understand the instructions?
0:31:57 Whatever this man eats, I eat.
0:31:59 And so I had an identical meal to him.
0:32:02 But I think you have to.
0:32:05 Like Sam and I are investors in a testosterone company.
0:32:08 And if the founder was not jacked, I would not have invested in the company.
0:32:09 That’s true.
0:32:11 Sean’s in on that too, by the way.
0:32:13 It seems like we’ve all invested in the same stuff.
0:32:16 Yeah, so you have to look the part in order to do it.
0:32:20 So I think, yes, I think businesses focus around the aesthetic part of it would crush.
0:32:23 I mean, you already have concierge medicine in Turkey, stuff like that.
0:32:31 Secondly, I think anything that helps us break out of the U.S. food system will do quite well.
0:32:35 And again, probably more so in cities like New York, but it’s kind of funny.
0:32:41 Like every Friday evening, I go and meet my like raw milk dealer, which is super sketchy.
0:32:43 It’s this like Amish farm that like drives in.
0:32:44 It’s illegal.
0:32:47 So, you know, they literally park on the side of the street.
0:32:49 You make this little transaction.
0:32:54 But I have a lot of people that just want to break out of the U.S. food system.
0:33:00 And I think businesses that enable people to do that will go quite a long way.
0:33:03 And so how did you find this Amish farm?
0:33:06 Like what was your process to find this alternative?
0:33:08 So it was word of mouth.
0:33:11 A lot of my friends were all talking about it.
0:33:13 And then someone’s like, oh, I get my beef from this place.
0:33:17 And it actually started out with like, you know, you buy beef and then you get upsold on all the other products.
0:33:21 And now before you know it, my like soap is made out of goat milk and shit.
0:33:24 That’s probably not even not even relevant or helping.
0:33:27 But, you know, they were pretty, pretty nice business.
0:33:30 But, yeah, they go straight from farm to like they drive out to New York City.
0:33:32 They have this little truck.
0:33:34 They park the truck on the side of a street corner.
0:33:36 You have a 30 minute window to meet them.
0:33:39 And as much as I joke about it, it’s actually pretty cool.
0:33:41 It does improve the quality of my life.
0:33:43 I do think the products taste good.
0:33:45 Raw dairy is more of a novelty.
0:33:49 You know, that’s something that like, sure, I’ll add four dollars for the for the raw milk.
0:33:54 But I think breaking out of the U.S. food system, at least, you know, we’ll see what RFK does.
0:34:00 But in the short term, I think there are a lot of people that will pay quite a bit of money for.
0:34:03 Do you get all your meat from this this Amish fellow?
0:34:04 I do. It’s so good.
0:34:07 Like, like, dude, I don’t like chicken in America.
0:34:11 Like as someone that’s traveled a lot, most chicken here at a grocery store tastes like nothing to me.
0:34:14 Like you can’t really taste anything.
0:34:19 But here it actually, you know, again, but such a big difference if you have the same stuff outside the U.S.
0:34:22 We all need one of these Amish guys.
0:34:25 Isn’t this what a farmer’s market is supposed to be, by the way?
0:34:28 It is. It is what a farmer’s market is supposed to be.
0:34:29 This feels more authentic, right?
0:34:31 This feels this feels more real.
0:34:34 Well, the way you describe it, it’s more like a drug deal.
0:34:37 And actually, that kind of appeals to me in a way like farmer’s market.
0:34:39 It’s a little bit like a cute date on a Sunday morning.
0:34:42 But, like, I kind of like an element of danger with my groceries.
0:34:45 This is contraband.
0:34:45 This is contraband.
0:34:46 Exactly.
0:34:53 Not only that, when you, like, buy something, all the signs on it will very clearly say not for human consumption.
0:34:55 But it’s like, wink, wink, not for human consumption.
0:34:56 Oh, man.
0:34:59 Can we ground it up and inject it in my veins?
0:35:01 That’s usually what I do when it says not for human consumption.
0:35:04 Yeah, for cats and dogs only.
0:35:06 Does it make you feel better, by the way, this stuff?
0:35:08 It does.
0:35:10 I mean, of course, you have to, like, control for placebo effect and stuff.
0:35:14 But generally, yes, I do feel better when I’m eating this.
0:35:16 Or don’t control for placebo.
0:35:18 Placebo is the best shit in the world.
0:35:18 Placebo is amazing, right?
0:35:19 Yeah.
0:35:19 Yeah.
0:35:22 Whenever I read about these, I’m like, wait, so you have to, like, study?
0:35:27 Like, when we talk about placebo, we act like it’s an issue.
0:35:27 And I’m like, wait a minute.
0:35:31 You’re telling me I could take a fake thing and it gives me the real result?
0:35:33 Give me all of the fake things.
0:35:35 That’s all I ever want is the fake thing.
0:35:41 No, it’s, I mean, the only reason I even went down this rabbit hole is I remember I’ve had
0:35:45 multiple times in my life where I’d, like, live a very healthy New York life, like, cook
0:35:46 my own food or whatever.
0:35:49 And then I’d go to, like, Argentina for a month, live like a degenerate.
0:35:51 And I’m like, I feel better.
0:35:53 Like, my physical body feels better.
0:35:56 And I don’t think it was the addition of the alcohol.
0:35:59 I think it was, it was just, just the food system.
0:36:00 It could have been that.
0:36:02 It could have been some of the other stuff you’re doing on there, too.
0:36:04 Could have been never.
0:36:07 All right.
0:36:13 Next idea is a travel agent, but for credit card points.
0:36:17 Like, again, I have, I’m, we already talked about this, right?
0:36:20 I’m on the cutting edge of a lot of personal finance shit.
0:36:25 I still find it a pain in the ass sometimes to actually find the right flights.
0:36:31 Like, right now I have 250,000 Amex points, 100,000 Chase points.
0:36:36 I’d pay a good amount of money for someone to go find me two first-class tickets to someplace
0:36:37 warm in the summer.
0:36:40 How, how do you, how do you want this to work?
0:36:41 I want this to be a concierge service.
0:36:46 I want to tell people exactly what I have, like, what are the points or whatever I have
0:36:51 and where I’d like to go, including, like, like, Hey, like for a lot of times, I don’t
0:36:52 even care where I want to go.
0:36:55 It can be, you know, a warm destination in a certain part of the world.
0:36:58 So they just do it for me.
0:37:01 Jack Smith is a friend of all of ours.
0:37:07 I believe when Jack Smith suggests something, you take it seriously, even
0:37:11 though he’s, uh, one of the strangest people ever. But, you know, he’s
0:37:15 thoughtful enough that there’s some unique reason why he uses this.
0:37:21 He only books his flight via flightfox.com and I’ve used it a couple of times, but basically
0:37:25 you sign up and at the time, I think it was free.
0:37:27 Now it looks like it’s $20 per ticket.
0:37:32 You tell them like roughly when you want to do something and there’s a human on the other
0:37:33 end who goes and books it for you.
0:37:37 And they spend like two hours per flight, finding a flight for you.
0:37:39 Their business doesn’t make sense.
0:37:41 Like what’s their margin on in on this?
0:37:45 I have no idea, but I have booked three or four flights on this.
0:37:46 And it is amazing.
0:37:47 It works every time.
0:37:50 Uh, it just, can you enter like your credit card points and stuff?
0:37:52 Or is this just dollars?
0:37:53 Yeah.
0:37:57 So you, they’ll either just pay for it and book it for me, or they’ll give me the exact
0:37:58 flights to book.
0:38:00 And then I go to delta.com and book it.
0:38:02 Let me tell you another crazy thing.
0:38:06 Have you guys ever heard, have you ever bought another bought or sold another person’s miles?
0:38:13 Uh, no, but that’s the kind of stuff that like, I think businesses had made that easier.
0:38:16 There’s so much value to be unlocked there.
0:38:17 Have you ever heard of this, Sean?
0:38:18 Uh, I’ve heard of it.
0:38:19 I’ve never done it before.
0:38:20 Have you done it?
0:38:21 I’ve done it.
0:38:28 So I booked a flight, uh, where I, uh, I had a friend who spent roughly 10 or $15 million
0:38:32 a year on advertising with his Amex card or some, some credit card.
0:38:35 And he is like, I have all of these points.
0:38:39 If you wire me the money, I will book the plane ticket for you.
0:38:43 And I will take a small fee and you get the discount.
0:38:44 And I did.
0:38:50 And I think the flights, I think I saved 40% or 30%, something like pretty significant.
0:38:53 Cause I was buying like six first-class flights.
0:38:54 It was very expensive.
0:38:56 And it was like thousands of dollars that I saved.
0:39:02 So whoever this friend is, basically what they’re doing, but as a true sort of professionalized
0:39:04 service, is how I would like it to work.
0:39:09 Um, and I think you can now automate a lot of the backend, but there’s an issue.
0:39:14 This is not illegal, but it’s against the terms of service of all these airlines.
0:39:18 And so if you get caught doing it, I think they take your ticket.
0:39:22 I feel like a disproportionate number of my ideas are turning out to be illegal, but yeah.
0:39:24 Yes.
0:39:29 So what were you saying about the financial services company that you currently run?
0:39:34 All right, let’s recap: illegal raw milk.
0:39:38 I don’t, I don’t, I don’t, I don’t want my chief compliance officer to, to message me after
0:39:42 I should add: nothing shared here
0:39:45 should be considered legal advice, investment advice, or tax advice.
0:39:48 Everything is for information. Or health advice either.
0:39:50 Or health advice or health advice.
0:39:50 That’s fine.
0:39:52 That’s not that regulated in this country.
0:39:53 No, but you’re right on the points thing.
0:39:59 So like I use seats.aero, I think, which is basically you plug in what points you have
0:40:01 and it’s like a search engine, but it’s like very slow.
0:40:02 It’s very hard.
0:40:02 And I have to do it myself.
0:40:04 You’re absolutely right.
0:40:08 Like I want either a human or I need like AI to be able to do this where it just knows.
0:40:10 I just say what I’m trying to do.
0:40:12 And sometimes it’s as fuzzy as what you described.
0:40:14 Like I just want to go somewhere warm in the summer.
0:40:17 Like it’s okay if it’s Puerto Vallarta and it’s okay if it’s Hawaii, right?
0:40:22 Like I’m not as dead set on, sometimes I am dead set on a date and a place.
0:40:23 Sometimes I’m not, right?
0:40:28 But either way, be able to make use of these points because I think I have 7 million Amex points.
0:40:29 Seven.
0:40:29 Yeah, exactly.
0:40:31 There’s so much arbitrage.
0:40:32 Wait, you have 7 million?
0:40:35 Yeah, at least 7 million.
0:40:37 How does that convert?
0:40:37 Is that like $70,000?
0:40:43 Once you get good at it, the value is so much more than the dollar amount.
0:40:46 But the average person is not going to get good at it.
0:40:51 So finding the intermediary that’s good at it, extract some value off the top.
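The arithmetic behind this exchange can be sketched quickly. This is a hypothetical illustration, not advice: the commonly cited baseline of roughly 1 cent per point matches the $70,000 guess above, and the 2-cents-per-point "optimized" rate is an assumed example of what skilled transfer-partner redemptions can reach.

```python
# Hypothetical sketch: rough dollar value of a points balance.
# The redemption rates used here are illustrative assumptions,
# not quotes from any card program.

def points_value_usd(points: int, cents_per_point: float = 1.0) -> float:
    """Convert a points balance to dollars at a given cents-per-point rate."""
    return points * cents_per_point / 100

# 7 million points at the ~1 cent/point baseline mentioned above:
baseline = points_value_usd(7_000_000)        # 70000.0
# An assumed 2 cents/point via transfer partners, roughly doubling it:
optimized = points_value_usd(7_000_000, 2.0)  # 140000.0
```

The gap between those two numbers is exactly the arbitrage an intermediary could extract a fee from.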
0:40:57 Even though you’re incredibly wealthy, do you still, so you still use airline points like crazy?
0:41:00 Dude, Naval destroyed me on Twitter.
0:41:06 Like I like posted some like point hack and he’s like, I thought you sold a company.
0:41:07 Ratioed the hell out of me.
0:41:10 Yeah, do you still credit card churn?
0:41:12 I don’t credit card churn.
0:41:13 I don’t do things that are like high effort.
0:41:18 But like the basics, like if I’m spending money on shit, like yes, I’ll use a credit card.
0:41:19 Yes, I’ll get points.
0:41:20 And I still struggle.
0:41:28 Like if a flight is over $10,000, like I still struggle, even though mathematically you’re like, oh, you can afford to spend that.
0:41:31 Do you still mostly buy on sale items?
0:41:33 Not on sale items.
0:41:40 Like now, like there’d be a time where if I’d go to e-commerce and, you know, you’d get a 20% new customer discount, I’d create another account.
0:41:41 Like now I don’t do that.
0:41:44 Now I’m like, that’s too much effort.
0:41:45 You quit doing that yesterday.
0:41:46 Yeah.
0:41:52 But credit card points, I think if you spend a bunch of money, it like just helps a ton.
0:41:58 Also, as a brown kid, it’s very helpful to allow me to buy things for my parents.
0:42:02 My parents will not accept me buying a ticket for them.
0:42:06 But if I tell them the ticket comes from points, they have no problem accepting it.
0:42:07 Yeah, totally.
0:42:10 So it’s like, it’s like weird psychology.
0:42:11 Like, yeah, exactly.
0:42:13 I told my sister and I was like, just tell her it’s points.
0:42:14 Yeah.
0:42:15 And then don’t bother with the points.
0:42:16 It’s going to be too confusing.
0:42:18 Just get her the ticket.
0:42:27 Could you tell me, I read somewhere that you’ve built a few things with AI to like personal software, software for your own life.
0:42:28 What did you build?
0:42:31 And also like, what do you see as the opportunity there?
0:42:36 So I was someone, again, for context, I had a computer science degree.
0:42:43 I wrote enough code to launch the first version of Teachable and have not written a line of code in like 10 plus years.
0:42:57 So I understand enough, but I just felt like everything kind of went away for me until now with AI, where it’s again become super fun to play around with stuff and build small little apps.
0:43:02 So I started by building things that like just like annoying workflows I have to do.
0:43:05 Like every time we host a webinar, we download something from someplace, re-upload it somewhere else.
0:43:08 Started building these sort of internal tools.
0:43:16 But now it’s reached a point where I have all these annoying, we’re talking about homeownership, all these annoying things I need to do for my house.
0:43:22 Building like a very simple sort of task management app just for myself is super useful.
0:43:27 I think you’re starting to see more and more people realize this both for their business and for themselves.
0:43:28 What did you make?
0:43:37 So for now, it’s a very simple task management app where there’s certain things in my house I have to do at a different cadence.
0:43:43 Property taxes you pay quarterly; you change the fucking water filters once every X months.
0:43:52 And you enter all these sort of tasks, as well as in some cases, if there’s another person attached to them, and it kind of pulls it all automatically.
0:43:55 It’s like, oh, it’s been, you know, like this month, here are the things that you should do.
0:43:58 And here are the people that can do it.
0:44:01 And again, this is something just for my own entertainment, amusement, whatever.
0:44:08 But you can extrapolate how any local business can do it for whatever annoying things they have to do.
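The task app described above can be sketched in a few lines. Everything here is a made-up example, the task names, cadences, owners, and dates are illustrative, not a description of the actual app.

```python
# Minimal sketch of the recurring home-task idea described above.
# All task names, cadences, and dates are hypothetical examples.
from dataclasses import dataclass
from datetime import date

@dataclass
class Task:
    name: str
    every_months: int   # cadence: repeat every N months
    last_done: date
    owner: str = "me"   # optionally another person attached to the task

    def is_due(self, today: date) -> bool:
        # Whole calendar months elapsed since the task was last done
        elapsed = (today.year - self.last_done.year) * 12 + (today.month - self.last_done.month)
        return elapsed >= self.every_months

tasks = [
    Task("Pay property taxes", 3, date(2025, 1, 15)),
    Task("Change water filters", 6, date(2024, 11, 1), owner="handyman"),
]

# "Here are the things you should do this month, and who can do them":
today = date(2025, 5, 1)
due = [(t.name, t.owner) for t in tasks if t.is_due(today)]
# → [('Pay property taxes', 'me'), ('Change water filters', 'handyman')]
```

Swapping the task list for a business's recurring chores is the extrapolation being made here.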
0:44:11 Sean, whenever I hear, so Ankur’s single.
0:44:13 I think, well, you’re not, you’re unmarried.
0:44:18 Whenever I hear a single guy, sorry, that sounded like, you know, I’m not your mother.
0:44:19 I’m not trying to diss you.
0:44:23 This already sounds like condescending dad type statement.
0:44:28 Dad’s like, well, it’s so simple for you, but in my time, but go on, go on, go on.
0:44:30 No, what I’m saying is, it’s the opposite.
0:44:40 What I’m saying is, when I hear you describe this stuff, I think to myself, if I didn’t have a wife, or if she just like walked out on me, I have no idea how I would handle any of this stuff.
0:44:42 Like, I don’t understand how a lot of our bills are paid.
0:44:43 They’re just kind of magically paid.
0:44:53 And so when I have a bunch of my single friends, so like Sophia Amoruso, one of our mutual friends, she tells me about like, all the time she has to spend doing all this.
0:44:57 And I’m like, dude, I’m used to having two of us, you know, I wash, she dries.
0:45:02 Like I’m used to having two of us to hear, to have one person manage all of this as part of a household when you own a home.
0:45:05 And when you have like responsibilities, that sounds so hard.
0:45:06 You know what I mean, Sean?
0:45:10 Can you imagine like having to do this stuff, like everything?
0:45:13 I just wanted to see if you could land the plane on that one.
0:45:13 And I would say.
0:45:14 Did I land it?
0:45:15 Yeah, you did, you did, you did.
0:45:17 It was like a Southwest, like, you know, there was a bump.
0:45:20 There was like a little bump on the way down, but you did, you did.
0:45:23 What I’m saying is it’s so much work.
0:45:24 I don’t understand how people do it.
0:45:30 Especially when you add in the dynamic of homeownership, especially as someone that grew up outside the U.S.
0:45:32 So I have no actual life skills.
0:45:35 Like I feel like people in America like know how to do shit.
0:45:37 They don’t like, they don’t need a plumber.
0:45:39 They like, they’ve like learned it somehow.
0:45:42 But growing up outside the country, I’m not handy.
0:45:45 So running a house is quite a task.
0:45:47 Wait, is that a stereotype that Americans know how to fix toilets?
0:45:48 Yeah.
0:45:50 You don’t think it’s true?
0:45:56 Well, I do think that amongst all of my brown friends, my brown Asian friends, all of them hate Home Depot.
0:45:57 Yeah.
0:46:01 Like self-reliance was not a status symbol growing up. In the U.S.,
0:46:02 it is a status symbol.
0:46:04 Like, you are higher status if you can actually fix stuff.
0:46:10 In India, it’s like, oh, you have someone to do all this stuff for you.
0:46:12 Sam, it’s not true.
0:46:17 If you go, if you go to Home Depot on the first Saturday of every month, it’ll be filled with Indian people.
0:46:18 Do you know why?
0:46:18 No.
0:46:18 What are they doing?
0:46:24 Because that’s when they host the free Home Depot class where your kids can come build things, and it’s totally free.
0:46:26 They give you a kit and they build it.
0:46:27 It’s amazing.
0:46:29 It’s like a free educational thing.
0:46:32 Only Indians in the store during first Saturday of every month.
0:46:33 And it’s, by the way, it’s amazing.
0:46:35 If you’re not using this, you should use this.
0:46:36 Is that a true stereotype?
0:46:37 Ramit has told me.
0:46:39 He was like, my, my rich life.
0:46:44 He like explains all this stuff and he ends it with like, and also to never step foot into a Home Depot.
0:46:45 Yeah.
0:46:55 No, I mean, it definitely, at least when I first moved to America, it struck me as like strange how one, how handy everyone was, but how much pride they took in it.
0:46:56 Growing up, it was the opposite.
0:47:07 It was weird to like, you know, even change a light bulb, maybe an exaggeration, but, but yeah, I definitely think being a homeowner here as someone didn’t grow up here, you’re not prepared for it.
0:47:09 That’s awesome.
0:47:10 That’s funny.
0:47:16 Tell me about, tell me about this like pickleball thing that you’re, you’re, you’re writing about.
0:47:25 Oh dude, this is so I’ve, so I was like, in terms of things I spend money at after food, my second biggest expense in New York right now is a sport called padel.
0:47:27 It’s pretty crazy.
0:47:31 It’s like, is it not, it’s not paddle, paddle, padel, whatever.
0:47:32 Same thing.
0:47:32 Same sport.
0:47:33 I want to say it right.
0:47:34 It depends.
0:47:36 It’s like Latin America versus us.
0:47:37 Either one is fine.
0:47:41 It’s spelled P A D E L, not P A D D L E.
0:47:43 So paddle, padel, padel, whatever.
0:47:44 It’s incredible.
0:47:46 It’s become in New York.
0:47:48 It’s sort of, there’s a place near my office.
0:47:55 It’s like the closest we have to a country club because there’s not enough space here, but you, you see like a lot of our mutual friends are there.
0:47:58 We go there all the time to hang out, but I think it’s a bigger thing than that.
0:48:05 If you look at the Google trend for the word, we’re at a point now in the U.S. where we’re super, super early.
0:48:09 There’s like 2,000 courts here and there’s like 20,000 in Spain.
0:48:16 It’s on a fantastic trajectory upward, incredibly fun sport, demographic with a high propensity to pay.
0:48:19 What is the difference between this and pickleball?
0:48:23 It is, it is substantially more athletic.
0:48:27 So a lot of people who play this are like, well, pickleball is a game, padel is a sport.
0:48:28 Nice.
0:48:29 It is.
0:48:29 I like that.
0:48:30 I do like a little bit of snobbery.
0:48:40 Pickleball is uniquely American in that like, it’s like, it’s like, I don’t know, it’s so slow compared to anyone who’s actually played sports.
0:48:43 Like, Padel has a lot of like ex-D1 tennis players.
0:48:46 But you’re not exactly explaining the attributes of the game.
0:48:47 You’re more so just insulting them.
0:48:48 Yeah.
0:48:50 Is it a bigger court or smaller paddle?
0:48:51 Like, what’s the difference?
0:48:57 The difference is that pickleball is just a bunch of fat, disgusting pigs, and they’re slow and
0:49:00 unathletic. And padel’s the same thing.
0:49:02 It’s just Division I ex-athletes.
0:49:08 So imagine, imagine a slightly smaller yet wider tennis court with a back wall.
0:49:12 It’s a depressurized tennis ball.
0:49:14 So it’s quite fast because you have the back wall.
0:49:18 So instead of hitting the ball out, it hits the back wall, bounces back and you kind of go back and forth.
0:49:31 It was really big in Latin America and Europe, and it’s finally coming to the US in a way that I think you can go to any tier-two city, spin up a location, and probably do quite well for the next few years.
0:49:34 I think we’re in this unique sort of arbitrage opportunity.
0:49:40 In New York, courts are about $300 an hour and booked out over weekends.
0:49:47 And again, New York is probably the most extremely expensive market in the US, but it kind of cascades all the way downward.
0:49:49 Don’t you have like an app?
0:49:53 I was with you one time and you’re like, oh my God, someone just took my spot as the leader.
0:49:55 Dude, it’s the worst.
0:49:56 It’s the game within a game.
0:49:58 There’s a rating every game you play.
0:50:09 And it’s insane because I’m playing with people that are all in their 30s, sometimes 20s and 40s, insanely good at their professional life, but they have this like unmet competitive need.
0:50:14 So everyone is so aggressive about their rating, about their ranking.
0:50:20 Like a buddy of mine was texting me after we lost our match and we’re both in a really bad mood.
0:50:22 And he’s like, I don’t know why I’m so upset.
0:50:23 I’m just thinking about my life.
0:50:24 I have a wife that loves me.
0:50:25 I have a beautiful daughter.
0:50:30 Why am I so angry about this little game we lost?
0:50:32 Is it like a self-reported system?
0:50:34 How’s the ratings work?
0:50:34 It is a self, yeah.
0:50:36 So like we all have scores.
0:50:37 Let’s say I beat you.
0:50:41 I entered that in the system and the algorithm is like, oh, I’m a 3.9.
0:50:44 Sam is a 2.6, but you only beat him by a little.
0:50:46 So you actually went down despite beating him.
0:50:47 It’s like an ELO score.
0:50:48 Exactly.
0:50:53 And it’s the most insane thing because you have all these people taking this way too seriously, right?
0:50:56 Everyone has like real life jobs and responsibilities.
0:50:59 But people get so cranky about this.
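The rating mechanics Ankur describes map closely to an Elo-style update. Playtomic’s actual formula isn’t public, so the sketch below is only illustrative: the rating scale, K-factor, and margin weighting are all assumed numbers. But it does reproduce the behavior he mentions, where a narrow win over a much weaker player can lower the winner’s rating.

```python
# Margin-adjusted Elo sketch (illustrative; Playtomic's real formula is not public).
# A narrow win against a much weaker opponent can *lower* the winner's rating,
# matching the "went down despite beating him" behavior described above.

def expected_score(rating_a: float, rating_b: float, scale: float = 1.0) -> float:
    """Probability that player A beats player B under a logistic model."""
    return 1.0 / (1.0 + 10 ** ((rating_b - rating_a) / scale))

def update(rating_a: float, rating_b: float, margin: float, k: float = 0.2) -> tuple[float, float]:
    """margin in [0.5, 1.0]: 1.0 = dominant win for A, 0.5 = razor-thin win for A."""
    exp_a = expected_score(rating_a, rating_b)
    delta = k * (margin - exp_a)  # margin-weighted result minus expectation
    return rating_a + delta, rating_b - delta

# A 3.9 barely beats a 2.6: the model expects a ~95% win, but the
# margin-weighted result is only 0.55, so the favorite's rating drops.
new_a, new_b = update(3.9, 2.6, margin=0.55)
assert new_a < 3.9 and new_b > 2.6
```

Under this model, rating only goes up when you do better than the algorithm predicted, which is exactly why beating a weaker player "by a little" still costs you points.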
0:51:01 Like, is this a social network basically that somebody’s built?
0:51:06 So there’s a company called Playtomic that’s built this in Europe and they’ve white labeled it.
0:51:08 I bet that that’s also, by the way, a fantastic business.
0:51:10 It’s just, I don’t know how big a market it is.
0:51:16 But this one company has a monopoly on this little paddle, tennis, all these apps.
0:51:22 New York City founders, if you’ve listened to My First Million before, you know I’ve got this company called Hampton.
0:51:25 And Hampton is a community for founders and CEOs.
0:51:31 A lot of the stories and ideas that I get for this podcast, I actually got it from people who I met in Hampton.
0:51:34 We have this big community of 1,000 plus people and it’s amazing.
0:51:40 But the main part is this eight-person core group that becomes your board of advisors for your life and for your business.
0:51:41 And it’s life-changing.
0:51:47 Now, to the folks in New York City, I’m building an in-real-life core group in New York City.
0:51:59 And so, if you meet one of the following criteria, your business either does $3 million in revenue or you’ve raised $3 million in funding or you’ve started and sold a company for at least $10 million, then you are eligible to apply.
0:52:02 So, go to joinhampton.com and apply.
0:52:04 I’m going to be reviewing all of the applications myself.
0:52:09 So, put that you heard about this on MFM so I know to give you a little extra love.
0:52:10 Now, back to the show.
0:52:16 You said that New York has 1,000 courts and there’s 20,000 in Spain?
0:52:17 In the U.S.
0:52:21 There’s 1,000 only in the U.S. and there’s 20,000 in Spain?
0:52:22 Correct.
0:52:22 Wow.
0:52:23 That’s pretty crazy.
0:52:28 And, again, the Google trend for the sport, like, just look it up or I can drop a graph.
0:52:32 It’s, like, straight up and to the right and picking up momentum.
0:52:37 So, dumb question here because squash and racquetball, almost an identical game to this.
0:52:42 What – are they just, like – how mad are they that they just, like, sat there and they suddenly –
0:52:42 They’re pretty mad.
0:52:45 I mean, but squash has been on the fringes for a while.
0:52:50 In the U.S., it’s, like, either an immigrant sport or, like, rich, white, Ivy League, like, connotation.
0:52:52 There’s no in-between.
0:52:54 Padel has also become –
0:52:54 Is Padel any different?
0:53:01 It’s become – I mean, it’s globally, it’s growing really, really fast.
0:53:04 And I think it’s all a case of, like, catching the right timing.
0:53:09 And it’s the sort of sport where you can play the first time and can kind of play it.
0:53:12 Like, tennis, you play the first time, you’re pretty terrible.
0:53:14 It’ll take you months to, you know, get okay.
0:53:17 But you can keep playing and keep getting better.
0:53:21 So, the community that’s formed around it, I think, has been awesome to see.
0:53:23 I think – I think even Andrew Schultz was talking about it on Joe Rogan.
0:53:27 And, like, it’s starting to become a thing that – we’re still early.
0:53:31 Like, people are – like, I don’t think it’s anywhere hit mainstream consciousness.
0:53:32 But, like, I don’t know.
0:53:34 It’s May 12th, 2025.
0:53:36 Come back to this in two years.
0:53:37 I think it’s going to see a lot more.
0:53:43 Shaan, last year, I went to a – my friend held an event at the Padel Court.
0:53:46 And I wasn’t sure if Padel was a sport or a brand.
0:53:47 I didn’t – I wasn’t sure what it was.
0:53:50 But when I went there, I think it was the same place, Ankur, that you go to,
0:53:55 the whole place was all black, like, black walls, like, as if, like, Barry’s or SoulCycle.
0:53:57 Like, it had that vibe.
0:54:00 And it was, like, everyone was, like, hot and cool.
0:54:02 And, like, it was definitely a scene.
0:54:03 Were you good?
0:54:04 I didn’t even play.
0:54:05 I was, like, at an event.
0:54:09 But I was, like, watching these people and, like, I got the vibe when I got –
0:54:12 like, when I got there, I was, like, oh, they’re, like – this is, like, a –
0:54:16 they’re, like, you know, they want to exclude people from coming here.
0:54:18 Like, that’s, like, the vibe that I got.
0:54:20 Like, if I don’t wear a college shirt, they’re going to kick me out.
0:54:22 Like, that’s, like – that was the vibe that I got.
0:54:25 Like, I could smell the racism, you know what I mean?
0:54:26 Have you ever played them, either Pickleball or Padel?
0:54:31 I – no, I try to play, like, real sports.
0:54:32 I don’t want to play those things.
0:54:32 No.
0:54:35 No, I don’t order.
0:54:38 That’s a game.
0:54:44 No, I don’t – I don’t really, like – I feel silly playing Padelball, if I’m being honest.
0:54:53 The one thing I dislike about New York is it’s so hard to have a country – like, a country club or a place you go and can have all the sports.
0:54:58 So, I do think smaller spaces like this are potentially one of the answers.
0:55:02 Because I – again, I have a bunch of friends that are, like, dads for the first time.
0:55:06 A lot of them don’t know how to make friends as an adult dude, right?
0:55:11 Like, they’re, like, hey, I’ve moved to a new city or, like, I don’t like my old friends.
0:55:16 And how do I actually make friends as a man that is not dating, not drinking, not going out?
0:55:17 What do you do?
0:55:27 And I think places like these are the answer quite often since people come there, hang out, chill out, and some community forms around it and all that.
0:55:28 How do you meet friends, Sean?
0:55:31 No new friends, baby.
0:55:33 No new friends.
0:55:39 I’m trying to keep – I’m trying to do a good job keeping in touch with my existing friends rather than make new friends.
0:55:41 Now, there’s a lot through the school, actually.
0:55:48 So, like, once your kids get into school, the priority sort of shifts to, like, well, I’m going to be around these people anyways.
0:55:56 Let me just find the coolest people of this group because I’m going to – I already have, like, three things a month that I have to do with, you know, this set of people.
0:55:58 So, is there anyone here who I get along with?
0:56:04 And then secondly, if I can, then I get the double win because I get the play dates for my kids while I get to hang with somebody cool.
0:56:10 So, that became, like, the focus for me rather than, like, you know, going and individually making a friend.
0:56:13 And then, you know, I got three little kids under five years old.
0:56:20 Like, if I’m going to go do a hangout with my friends, that’s a pretty hard – like, that’s just, like, a choice that’s only benefiting me.
0:56:24 It doesn’t really benefit, like, the whole – it doesn’t integrate with the rest of my life.
0:56:28 But I do play in a basketball league that I treat, like, uber seriously like this.
0:56:31 Our friend Ruben runs a basketball league called SF Hoops.
0:56:33 He’s been doing it for, like, a decade.
0:56:39 And we play one game a week and you would think, like, oh, it’s just, like, this rec ball.
0:56:39 Who cares?
0:56:41 Pick up basketball, basically.
0:56:42 But no.
0:56:44 Like, I take it so seriously.
0:56:48 Like, basically, the six other days of the week are just preparation for the one big game night.
0:56:54 And then after that, you know, we come home and I basically get the footage off the AI camera.
0:56:56 And then I start sending clips to the team.
0:57:00 And I’m, like, hey, like, you know, partially funny.
0:57:01 Like, you know, just commentary.
0:57:05 But also, like, yo, we got to correct this for next week.
0:57:06 We’re not going to – we can’t do this again.
0:57:09 And so we take it very, very, very seriously.
0:57:12 It’s exactly the same in our groups because it’s a 2-on-2 sport.
0:57:14 People are sending coaching videos back and forth.
0:57:16 And the whole thing – the whole thing is pretty insane.
0:57:18 You bought a piece of a cricket team?
0:57:20 I bought a piece of a cricket team.
0:57:21 What’s the story here?
0:57:24 So, again, flashback.
0:57:25 I don’t know.
0:57:28 Age 8 to 14, cricket was the only thing I cared about.
0:57:29 I wanted to play professionally.
0:57:32 I played internationally for the country I grew up in.
0:57:34 It was sort of my first love.
0:57:38 And I think we’ve all felt this, where the older you get, the more you’re, like,
0:57:42 I want to do what, like, the 8- or 9-year-old version of me thought was dope.
0:57:48 And then I got in touch with someone who said one of the shareholders
0:57:50 in what is the most valuable cricket team in the world,
0:57:54 they’re, like, worth roughly about a billion dollars in the Indian League,
0:57:55 was selling a stake.
0:57:59 And I was, like, this is the whole point of having money, right?
0:58:01 To be able to do things like this.
0:58:04 So, I ran, so I put in a bunch of dollars myself.
0:58:07 Also, put together dollars from other investors.
0:58:10 We now own a small piece of the team.
0:58:14 But in time, the idea is we can keep buying a bigger and bigger piece.
0:58:17 Since I joke about this a lot, but, like…
0:58:19 You’ve got to be, like, the non-douchey Chamath.
0:58:22 The douche part is uncertain for now.
0:58:27 So, which team is it?
0:58:32 And you, like, ballpark, we’re talking, like, six figures, seven figures,
0:58:33 eight figures, how much did you put in?
0:58:38 I put, yeah, so I put in seven, seven…
0:58:42 The group we represent put in high seven figures,
0:58:46 of which I think about 20% was my own dollars.
0:58:50 The valuation of the team was close to a billion dollars.
0:58:52 So, as a percentage, it’s still tiny.
0:58:54 The team is called Chennai Super Kings.
0:58:57 They’ve won, like, I want to say six championships.
0:58:59 The Indian League is about 15 years old.
0:59:02 So, they’ve won more titles than any other team.
0:59:04 Highest market cap.
0:59:08 And, dude, it’s such a monopoly because India is the biggest country in the world.
0:59:10 And it’s truly a one-sport country.
0:59:14 Like, the delta between sport number one and sport number two is massive.
0:59:21 And as far as the league goes, they just have so much room to run since it’s still not that many teams.
0:59:23 The market is maturing really fast.
0:59:26 It already is the third richest league in the world.
0:59:32 There’s… I think it’s also a fantastic investment where, like, it’s not going to generate tech startup returns,
0:59:34 but it’s not… it’s unlikely to lose money.
0:59:37 Like, there’s such a long way the sport has to run.
0:59:40 You said it’s the third most valuable league in the world?
0:59:41 Yeah.
0:59:46 It’s after… it’s… by TV rights per game, it’s actually number two after NFL.
0:59:52 And, like, by total revenue, it’s more than any of the European soccer leagues, for instance,
0:59:54 which is crazy for a league that is…
0:59:55 It’s, like, worth more than the Premier League?
0:59:56 How is this possible?
1:00:00 How could the most valuable team be a billion, but the league be worth more than…
1:00:03 Like, there’s, like, players in soccer that can almost make a billion dollars.
1:00:08 Well, I’ll quote my source after this, but it is…
1:00:11 Whatever source had it, the league being worth more than the European leagues,
1:00:12 which I found crazy.
1:00:17 And my interpretation was the European leagues are just very bad at monetizing their commercial value.
1:00:20 Like, maybe the value in the league is to the team.
1:00:27 So, while Manchester United may be worth substantially more, the franchise value on the EPL may not be that much.
1:00:29 That was my…
1:00:29 Gotcha.
1:00:32 Have you flown there and, like, participated in any of the…
1:00:36 I haven’t had a chance to because this deal came together very recently.
1:00:37 So, I haven’t done it yet.
1:00:40 And long-term, that’s what I actually want.
1:00:42 Like, everyone’s like, oh, wouldn’t it be cool to, like, earn the upside?
1:00:44 No, I want to draft players.
1:00:45 I want to be able to kick people out.
1:00:50 Like, to me, long-term, when I have time, that is the more fun part of this whole thing.
1:00:52 Like, sure, don’t lose money.
1:00:57 But being an annoying, overly involved owner, I think, sounds awesome.
1:00:59 I don’t know if you…
1:01:00 Did you guys read about…
1:01:05 Did you guys read about this VC dude in Europe who, like, bought a soccer team and then he put himself in the team?
1:01:06 Yeah.
1:01:07 No, but that’s awesome.
1:01:09 Who is that?
1:01:11 He’s, like, a player on the team.
1:01:16 It’s so funny because 80%, 90% of the world are like, oh, that’s pathetic.
1:01:17 I also was like, that’s awesome.
1:01:19 I would totally do it.
1:01:20 Did you guys see…
1:01:22 Did you guys watch the Netflix show or something?
1:01:23 I forget what it was on.
1:01:26 Where Ryan Reynolds and…
1:01:26 Wrexham.
1:01:27 Yeah.
1:01:30 I didn’t watch the show, but, yeah, I read a lot about it.
1:01:32 How has it turned out, Sean?
1:01:32 Do you know?
1:01:33 That’s been, like, four years.
1:01:34 Has it…
1:01:35 I think it’s going well.
1:01:38 They got promoted, I think, a couple times, actually.
1:01:39 I think they’ve moved up a couple times.
1:01:42 We had a guy on the podcast, Haralabob, who did the same thing.
1:01:44 He’s bought a team in the…
1:01:46 That’s, like, I don’t know, third division team or whatever.
1:01:50 I don’t know which division they’re in exactly, but they’re, you know, up for basically promotion.
1:01:52 And he’s, like, bringing, like, Moneyball.
1:01:56 He’s basically, like, a kind of data guy.
1:01:57 And he’s a sports bettor.
1:01:57 That’s his background.
1:02:02 And so he’s bringing, like, a Moneyball sort of approach to, like, drafting the right players.
1:02:08 Not drafting, but, like, you know, signing the right players, playing the right strategy, and selling at the right times on players.
1:02:12 Did you see the Celtics sold for $6.1 billion?
1:02:13 Yeah.
1:02:17 So that basically changed everyone’s model, right?
1:02:21 Because people were underwriting sports teams being worth one, and then two, and then maybe two and a half, three.
1:02:23 They sold for six.
1:02:28 It’s kind of put into question just how much the rest of these teams can be worth.
1:02:30 Well, who did they sell to?
1:02:32 Is it some PE group?
1:02:33 Yeah.
1:02:39 It doesn’t seem—it just seems that the sports teams are just—it seems challenging.
1:02:50 It seems like a very challenging thing as a—I mean, I’m a little plebe nobody, but it seems, like, challenging to have a proper market for it because it seems like one of those things that if you want it, you want it.
1:02:50 Do you know what I mean?
1:02:52 It’s like a domain name.
1:02:55 Like, it’s very hard to be, like, this domain is worth $400,000.
1:02:59 There’s such few—supply and demand is both very, very limited.
1:03:03 There’s three potential buyers, and whoever wants it, it will potentially pay.
1:03:11 Like, if Jeff Bezos wants a sports team that he grew up with, he will almost pay an unlimited amount of money, and it doesn’t have to make sense.
1:03:14 And that sets the market price for the next four years.
1:03:15 The number of billionaires keeps going up.
1:03:18 The number of teams in the NBA is basically the same.
1:03:27 They’re about to launch—they’re about to open up two new franchises, and each one of those will get $6 billion, and then that revenue goes to the other owners, and that’s like—
1:03:28 Wait, two new franchises?
1:03:28 Wait, what?
1:03:31 Like, the NBA is going to expand two more teams.
1:03:34 They’re going to do, like, a Vegas team, and they’re going to do probably a Seattle team.
1:03:45 And they wanted the Celtics franchise to sell at a very high price point so that they could then go to those new franchises and say, you’re also a $5 to $6 billion price tag, which is pretty crazy.
1:03:49 I mean, like, you know, when Mark Cuban bought the Mavs, I think he bought them for, like, $200-something million, right?
1:03:54 Like, the original Celtics owner just sold for $6 billion or whatever.
1:03:59 He bought it originally for—I’d have to look it up, but it was, like, very low.
1:04:06 It was basically, like, he compounded kind of, like, 20% over, you know, a 25-year period or something like that.
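The "compounded 20%" figure is the speaker’s off-the-cuff guess, and a quick CAGR calculation sanity-checks it. The roughly $360 million purchase price in 2002 is a widely reported figure, not something stated in the episode; the $6.1 billion sale is from the conversation above.

```python
# Sanity check on the compounding guess above. The ~$360M 2002 purchase price
# is a reported figure (an assumption here); the $6.1B sale is from the episode.

def cagr(start_value: float, end_value: float, years: int) -> float:
    """Compound annual growth rate over the holding period."""
    return (end_value / start_value) ** (1 / years) - 1

rate = cagr(360e6, 6.1e9, 2024 - 2002)
print(f"{rate:.1%}")  # closer to ~14% a year than 20%
```

Even at the lower rate, that is a remarkable run for a trophy asset, which is the speaker’s broader point.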
1:04:14 And so he, you know, he did really, really well on that investment, won the championship last year, and then sold to this guy who had previously started a—
1:04:18 I was a partner at this big PE shop, and that guy was a lifelong Celtics fan.
1:04:27 And same thing, like, you know, it’s a trophy asset that also has, like, all these other interesting things, which is, like, the media rights and other things that are—
1:04:31 There’s the—in the U.S., you can also get massive tax write-offs.
1:04:40 Like, there’s all kinds of weird tax stuff where, like, some people have taken player contracts as a depreciating asset and used that depreciation to offset income.
1:04:41 So, crazy stuff.
1:04:42 Right, and there’s, like, a real estate play.
1:04:46 So, like, basically some of the teams buy their own state—they own their own stadium.
1:04:52 So, they actually own the real estate, and then they buy the real estate right around the stadium because they’re the hub.
1:04:54 And so they know that that real estate right around it is going to be really worthwhile.
1:05:01 Some people are trying to open up casinos because now sports betting is legal, which is, like, this whole other vector of, like, how to generate revenue.
1:05:04 So, you have all these—and then, by the way, your expenses are capped.
1:05:07 There’s a salary cap, like, in the NBA, for example.
1:05:13 So, like, you can only pay so much, but you can still earn, you know, you can still earn more, and your franchise can go up in value.
1:04:17 We—I went and heard a talk by Marc Lasry.
1:05:18 Do you guys know who that is?
1:05:19 Yeah.
1:05:19 I think—
1:05:20 He owned the Bucks for a while.
1:05:20 What did he own?
1:05:21 The Bucks.
1:05:22 And he was telling a story.
1:05:28 He was like—he’s a billionaire, and his whole thing is buying sports teams now.
1:05:32 And he’s like, there’s basically two ways to run a sports league or a sports team.
1:05:35 The first way is you run it profitably.
1:05:38 And if you run it profitably, you will lose.
1:05:40 Your team will lose.
1:05:51 Or you run it in such a way where you lose $200 million a year, and you win championships, and then you hope that you can sell it for enough money that it works out.
1:05:58 So, he’s like, you either want to win, and you’ve got to have a lot of extra cash flow to cover this nut, or you’re going to lose, but you’ll be profitable.
1:05:59 You pick one or two.
1:06:03 I bet it also depends on the sort of sports salary.
1:06:11 Again, that was the whole moneyball approach, right, back in the day or whatever, where, like, you’re spending a lot less, but relatively speaking, you’re doing well.
1:06:18 I think for a lot of people, it’s—at least for me, all of—a lot of people who do this had, at some point, failed athletic ambitions.
1:06:20 And there’s a few ways those can manifest.
1:06:26 Either you can make your kid play a bunch of sports, or you can buy a sports team and still feel like you’re doing the same thing.
1:06:28 Well, we talked to a guy, Mark.
1:06:30 I don’t believe that that thing is true.
1:06:32 Like, the Warriors win.
1:06:35 The Warriors generate more revenue than anybody in the NBA.
1:06:38 They’re worth $9 billion now.
1:06:39 Like, they did all of the above.
1:06:41 So, I don’t think that’s necessarily true.
1:06:41 Revenue or profit?
1:06:44 They’re also profit.
1:06:45 Well, it may be true in Milwaukee.
1:06:46 Yeah, exactly.
1:06:49 I think in small markets, that actually might be true.
1:06:56 And we were with Alexis at the basketball thing, Sean.
1:06:58 And he was telling us about how he invested—
1:06:59 He bought the women’s soccer team, no?
1:07:01 He bought the women’s soccer team.
1:07:02 He’s starting a track league.
1:07:04 He invested in that.
1:07:14 He also told us about how he was either—he either did buy it or he was trying to buy a piece or a team in Tiger Woods’ new golf league.
1:07:17 Or, you know, they have this weird, like, stadium golf league, which sounds pretty awesome.
1:07:20 And he was like, well, the numbers sound crazy.
1:07:21 I don’t remember what he said.
1:07:23 But let’s say, hypothetically, it was $20 million.
1:07:24 He’s like, that sounds ridiculous.
1:07:26 But let me explain the math.
1:07:29 Like, this sport gets this amount of views, and they make this much money.
1:07:31 This sport gets this amount of views.
1:07:32 They make this much money.
1:07:39 Therefore—and he backed into it, and it made incredibly high numbers on the surface seem incredibly reasonable.
1:07:42 I heard another guy do the same thing with sailing.
1:07:45 This guy, Mark Lazare, he was like, I’m trying to buy a sailing league.
1:07:46 And he was, like, explaining the numbers.
1:07:56 And basically, the whole business of the sports stuff was basically finding undervalued media rights and figuring out whose contract is going to end soon.
1:08:00 And I want to purchase them and renegotiate the contracts in such a way where I get value.
1:08:01 Yep.
1:08:03 And the U.S. is so good at this.
1:08:06 Like, back to the European leagues, yes, some of them are massive teams.
1:08:10 But I bet if you look at the numbers and how well they monetize eyeballs, it’s terrible.
1:08:16 Like, in the U.S., you have very small leagues that are so good at, like, squeezing out every last dollar.
1:08:23 Like, the NFL, obviously, is, like, a world-class case study where they’re monetized so well that they’re literally hitting a point where they’re like,
1:08:27 we need to go outside the U.S. to, like, meaningfully still grow from this point.
1:08:30 That’s insane.
1:08:34 Ankur, awesome hanging out with you, dude.
1:08:35 Yeah, it’s great.
1:08:36 Great.
1:08:37 Thanks for having me.
1:08:38 Thanks for coming on.
1:08:38 Where do people find you?
1:08:47 So we’re hosting a big conference for people who are seeing this, sponsored by a lot of people at HubSpot and The Hustle, called the OOO Summit.
1:08:49 It’s this Friday.
1:08:51 So we’ll be seeing a bunch of people there.
1:08:53 Otherwise, I’m active on social media.
1:08:54 My Twitter is my name.
1:08:59 And for any tax optimization stuff, Carry.com.
1:09:00 All right.
1:09:01 We appreciate you.
1:09:01 That’s it.
1:09:01 That’s the pod.
1:09:04 I feel like I can rule the world.
1:09:06 I know I could be what I want to.
1:09:09 I put my all in it like no days off.
1:09:11 On the road, let’s travel.
1:09:12 Never looking back.
1:09:17 Hey, everyone.
1:09:18 A quick break.
1:09:21 My favorite podcast guest on My First Million is Dharmesh.
1:09:23 Dharmesh founded HubSpot.
1:09:24 He’s a billionaire.
1:09:26 He’s one of my favorite entrepreneurs on earth.
1:09:34 And on one of our podcasts recently, he said the most valuable skill that anyone could have when it comes to making money in business is copywriting.
1:09:39 And when I say copywriting, what I mean is writing words that get people to take action.
1:09:40 And I agree, by the way.
1:09:42 I learned how to be a copywriter in my 20s.
1:09:43 It completely changed my life.
1:09:46 I ended up starting and selling a company for tens of millions of dollars.
1:09:50 And copywriting was the skill that made all of that happen.
1:09:57 And the way that I learned how to copywrite is by using a technique called copywork, which is basically taking the best sales letters.
1:09:58 And I would write it word for word.
1:10:02 And I would make notes as to why each phrase was impactful and effective.
1:10:04 And a lot of people have been asking me about copywork.
1:10:06 So I decided to make a whole program for it.
1:10:07 It’s called Copy That.
1:10:08 CopyThat.com.
1:10:09 It’s only like $120.
1:10:14 And it’s a simple, fast, easy way to improve your copywriting.
1:10:16 And so if you’re interested, you need to check it out.
1:10:17 It’s called Copy That.
1:10:19 You can check it out at CopyThat.com.
Episode 707: Sam Parr ( https://x.com/theSamParr ) and Shaan Puri ( https://x.com/ShaanVP ) talk to Ankur Nagpal ( https://x.com/ankurnagpal ) about IRS loopholes that can help you grow your net worth plus Ankur shares 6 business ideas.
See Ankur in person at the OOO SUMMIT on Friday, May 16th – https://lu.ma/2025
—
Show Notes:
(0:00) Ankur’s portfolio experiment
(10:46) A case for direct indexing
(16:29) Super backdoor roth conversions
(20:59) Mega backdoor conversion
(23:52) IDEA: Executive Clinic for Aesthetics
(32:32) IDEA: Amish Farm Imports
(36:18) IDEA: Credit Card Agent
(43:01) IDEA: Homeownership app
(47:35) IDEA: Padel club
(56:31) IDEA: Buying a sports team
—
Links:
• Steal Sam’s guide to turn ChatGPT into your Executive Coach: https://clickhubspot.com/dvj
• Teachable – https://teachable.com/
• Carry – https://carry.com/
• Frec – https://frec.com/
• Superpower – https://superpower.com/
• Flightfox – https://www.flightfox.com/
• Seats.aero – https://seats.aero/
• Home Depot Workshops – https://www.homedepot.com/c/kids-workshop
• The OOO Summit – https://lu.ma/2025
—
Check Out Shaan’s Stuff:
• Shaan’s weekly email – https://www.shaanpuri.com
• Visit https://www.somewhere.com/mfm to hire worldwide talent like Shaan and get $500 off for being an MFM listener. Hire developers, assistants, marketing pros, sales teams and more for 80% less than US equivalents.
—
Check Out Sam’s Stuff:
• Hampton – https://www.joinhampton.com/
• Ideation Bootcamp – https://www.ideationbootcamp.co/
• Copy That – https://copythat.com
• Hampton Wealth Survey – https://joinhampton.com/wealth
• Sam’s List – http://samslist.co/
My First Million is a HubSpot Original Podcast // Brought to you by HubSpot Media // Production by Arie Desormeaux // Editing by Ezra Bakker Trupiano
-
What Super Agers Reveal About Preventing Disease
AI transcript
0:00:07 American health care is in crisis. We have a path to preventing disease. It
0:00:12 isn’t reversing aging, it’s just preventing the age-related morbidities
0:00:16 of the big three. If we can keep people healthier, healthier people would be much
0:00:22 less expensive. Seven years more of healthspan, free of the major three
0:00:27 diseases. Seven years. Who wouldn’t take seven years? There are just billions of
0:00:33 data points for each person. There should be a reboot, a new standard of care based on
0:00:41 intelligent partitioning of risk. We have to do better. The human obsession with
0:00:46 living longer is as old as time, but in the last 20 years we have learned so much
0:00:51 more about human health and biology. So what do we know today about what makes
0:00:56 humans live longer, and do we have real evidence that longevity is an attackable
0:01:01 target? Today you’ll get to hear a16z general partner Vijay Pande in conversation
0:01:06 with Eric Topol, who recently released his new book Super Agers: An Evidence-Based
0:01:11 Approach to Longevity. Eric is, among other things, the founder and director of the
0:01:16 Scripps Research Translational Institute. He’s also published over 1,200 peer-reviewed
0:01:21 articles with more than 300,000 citations, making him one of the 10 most cited
0:01:26 investigators in medicine. That resume puts Eric in a perfect position to write this book,
0:01:31 teasing the signal out from all the noise around health in 2025. One of those inputs
0:01:37 was the Wellderly group that Eric studied, which was a study of 1,400 people 80-plus who
0:01:42 had never developed a chronic illness. For comparison, according to Eric’s book, among
0:01:49 those 65-plus, 80 percent have two or more chronic diseases and 23 percent have three or more, while
0:01:55 about seven percent have five or more. And again, that was the 65-plus group versus the Wellderly
0:02:02 group of 80-plus. So what do we know about these quote super agers, people who not only have a
0:02:07 longer lifespan but a longer healthspan? Is it genetics or human agency? And do technologies
0:02:13 like AI, GLP-1s, gene therapies, or the ability to understand organ clocks meaningfully change that
0:02:20 equation for the masses? If so, what difficult decisions do we have to make to rewrite the system today? Let’s
0:02:28 find out. As a reminder, the content here is for informational purposes only, should not be taken
0:02:33 as legal, business, tax, or investment advice or be used to evaluate any investment or security, and is
0:02:39 not directed at any investors or potential investors in any a16z fund. Please note that a16z and its affiliates
0:02:44 may also maintain investments in the companies discussed in this podcast. For more details, including a link
0:02:49 to our investments, please see a16z.com/disclosures.
0:03:00 my joy to welcome dr eric topol to the podcast eric thanks so much for joining us i’m glad to be here
0:03:08 so you’ve written this really exciting book super agers and evidence-based path to longevity and i think
0:03:13 it’s a very timely topic and i was curious for you to maybe set the stage for why you want to write it
0:03:18 and how you see it in the context of other books that have been coming out recently as well yeah there
0:03:25 were a few things that came together we had done a big study we called the welderly where we basically
0:03:32 found very little in the genomes of people who had gone to the age of 87 on average with never having
0:03:39 had an age-related disease so that was of course one thing that was part of it the second was i got inspired
0:03:47 by a patient i saw recently who was 98 and had never been sick and so never been sick yeah her name
0:03:55 is lee rissall and her relatives had died in their 50s and 60s that’s her parents her uncles and aunts
0:04:02 she was the outlier and say why and then there were the books that came out i had patients coming to me
0:04:07 you know they wanted me to write a prescription for apple my sin or order a total body mri said wait we
0:04:13 got to get the story straight so these three things together were the impetus that why don’t i really
0:04:20 get deep into this everything we know today and then see if i could lay out some blueprints for where we can
0:04:27 go it’s coming into a world where american health care is in crisis i was curious to get your take on
0:04:33 where we are now in health care in the us and where do you think we get to yeah so there is this
0:04:39 bifurcation as i see it you could call it like the grand slam where you get reversing of aging so you
0:04:47 keep people healthier body-wide and that’s where we see all this remarkable investments in companies like
0:04:54 altos and reprogramming senolytics and a long list but they’re really focused on a monumental task which
0:05:01 hasn’t been shown in people right but rather in rodents and some of the results are striking and
0:05:07 i hope at least one if not all these are successful the other side of this is we’ve made these big strides
0:05:14 in the science of aging with all these layers of data that are using the metrics of aging and why don’t we
0:05:21 use that to prevent the age-related diseases cancer cardiovascular neurodegenerative we’ve never done
0:05:29 that in medicine to any appreciable extent and this is the opportunity because we have a path to
0:05:35 preventing disease it isn’t reversing aging it’s just preventing the age-related morbidities of the
0:05:40 big three i think that’s something that a lot of people may not realize is that the big three that you
0:05:48 mentioned cancer heart disease and alzheimer’s and dementia that they’re greatly exacerbated by age
0:05:52 and it’s interesting because if you ever wanted to have something that could be a cure for multiple
0:05:57 diseases which would be one of the holy grails of medicine it would be understanding the biology of
0:06:03 aging where are we now in terms of things that we can use today the first and perhaps the most extraordinary
0:06:11 thing is it takes 20 years to get these diseases with rare exception you know for heart disease
0:06:18 almost all cancers and neurodegenerative they are incubating for a very long time they all have a
0:06:28 common thread of a defective immune system and inflammation underpinning they are preventable
0:06:37 variably so cardiovascular 80 to 90 percent from lifestyle and related modifiable factors like your ldl
0:06:43 cholesterol that kind of thing and cancer and neurodegenerative just from what we know today
0:06:50 with lifestyle factors we’re about half that can be prevented so we have some knowledge about averting these
0:06:57 diseases but we have a lot more with all these clocks and new layers of data that are really changing the
0:07:04 face of all outgrowths of understanding the biology of aging so maybe let’s double click on that so you
0:07:09 in your book outline the five dimensions of health i was wondering maybe you could walk us through them
0:07:18 yeah yeah sure so the first most important one is ai because you need that to pull all this other
0:07:24 data we’re going to talk about together this moment that is so exciting is because we have multimodal ai
0:07:29 not only large language but large reasoning models now well especially i think when you’re talking about
0:07:34 ai it’s all the things people have seen with generative ai and so on but also just the ability to
0:07:39 understand all this data yes that you’re measuring from people yeah because the other four are such big
0:07:47 domains and dimensions so the omics it includes not just gene sequence or arrays but it has all the
0:07:54 proteins all the proteomic panels that we can get which we never could get before inexpensively it
0:08:02 includes the gut microbiome metabolome and certainly epigenome or epigenetics so the omics are rich
0:08:10 we are now seeing movement towards things like the virtual cell then there is of course cells that
0:08:16 have become a live drug where we can reset the immune system and cure autoimmune diseases like we’ve never
0:08:23 done before could you give examples of that yeah so in the last couple years we’ve seen unprecedented
0:08:31 cures i mean never had anything lupus progressive systemic sclerosis even cases of multiple sclerosis
0:08:39 dermatomyositis so basically it’s a depletion of all the b cells and when they come back
0:08:44 they have forgotten what they were attacking that led to the autoimmune reaction it’s amazing yes it’s
0:08:49 really amazing but the bigger lesson is we have learned how to control our immune system
0:08:56 like a rheostat and we’re going to keep getting better and better as we measure our immunome but when you can do
0:09:05 that when you can quash an autoimmune disease or when you’re trying to cure a cancer by just whatever it
0:09:12 takes to keep bringing up that immune system specific to the tumor so the immune system is fundamental and
0:09:21 that also now is involving cells and vaccines so vaccines now are capable of cures of pancreatic
0:09:28 cancer kidney cancer with these personalized vaccines using the proteins of the person’s tumor yes and
0:09:34 these are in clinical trials right now yeah i mean there’s stuff like we’ve never seen and that’s just a
0:09:41 front runner of what’s coming with vaccines that’s to treat cancer we’re going to be using vaccines to prevent cancer
0:09:48 again as we get older some of us especially our immune system is getting senescent and weak and a
0:09:54 vaccine before there’s any cancer before there’s anything else could prop it up we also have drugs to
0:10:01 modulate our immune system well beyond checkpoint inhibitors and so whether it’s antibody drug
0:10:08 conjugates tumor infiltrating lymphocytes and all these different ways it’s hard to imagine that in the
0:10:13 future we’re going to lose people with cancer because of being able to bring their immune system to the
0:10:20 highest level when we need it but more importantly preventing the cancer we can do that now that’s what’s exciting
0:10:26 well and so if we put all this together what does this mean for the individual like how would their
0:10:33 life change what should people be doing yeah so i call it lifestyle plus it’s a lot bigger than diet
0:10:40 sleep and exercise it’s involving you know all the environmental burdens air pollution the plastics
0:10:46 microplastics nanoplastics and forever chemicals and then there’s other things like time in nature
0:10:53 so if each of us pulled out all the stops for the lifestyle factors which is a long list that will
0:11:01 help but it’s not going to be only lifestyle factors that are the ways to prevent the big three age-related
0:11:07 diseases you know you described a large range of things from the sort of most almost sci-fi like
0:11:12 drugs that are in trials for preventing cancer to lifestyle when people think about lifestyle it’s
0:11:17 maybe a little vague in their mind for what to do how do you make that into a science or how do you
0:11:23 help people take that to the next step to bring evidence into that i go to great pains perhaps high
0:11:32 density to cite all the studies that link like for example when you have really good sleep health and deep
0:11:38 sleep what does that do to slow your brain aging or you know if you drink sugar sweetened beverages
0:11:45 what does that do to specific not just risk of type 2 diabetes but you know all-cause mortality so
0:11:55 there are very compelling sets of data about lifestyles and these key outcomes and they’re linked to healthy
0:12:00 aging i was amazed how much data is out there that can help us it’s not just like in the era when we had
0:12:07 polygenic risk scores and we just say oh you’re at risk for alzheimer’s but we don’t know if it’s when you’re age 56 or 96
0:12:14 so what good is that yes now we’re saying we know it’s within a couple of years between 77 and 79
0:12:20 that you’re going to have mild cognitive impairment if we don’t do these things which includes the
0:12:27 lifestyle factors and it’s much harder to get people to do all this stuff when there’s no specificity
0:12:35 that it’s about them yeah that they can change the arc of a condition especially when it isn’t our genes the healthy aging
0:12:42 story about a genetic underpinning it’s just not there we studied that it’s minimal i mean maybe it’s ten percent of what
0:12:51 accounts for healthy aging most of it is in the lifestyle factors and related matters such as the immune system
0:12:57 not functioning properly too much too little well it’s generally believed that just telling someone to
0:13:03 eat better and exercise doesn’t work but what i’m hearing you say is that you have a way to do that by
0:13:09 making it very personalized yes i mean there was a finnish study that was on just polygenic risk
0:13:16 score which is rudimentary and they gave that to a large cohort and they studied whether that affected
0:13:23 their lifestyle and the results were remarkable the people who got the data stopped smoking changed
0:13:28 their diet changed their physical activity really amped it up so we know when people get data that’s
0:13:36 specific to them a large proportion are much more likely to make changes now i’m not claiming that lifestyle is
0:13:43 going to be the only part of the prevention story but once you define the high risk and it’s
0:13:48 particularized to a person that’s a big part of how we’re going to succeed i could also imagine ai coming
0:13:53 into this because one of the things ai is very good at is to take a set of data and maybe you can mask
0:13:59 out the last bit so you can maybe have someone’s health records over 30 years and train on that except for
0:14:05 the last five years and see if you can predict the last five from the first 25 and once it gets really
0:14:10 good at that you can take my records and say hey look vj if you don’t do anything this is where you’re
0:14:17 going to be and we have 99 percent confidence on this that would be pretty chilling yeah well you’re exactly
0:14:23 right because the pinpointing here about the timing yeah is so extraordinary for example with alzheimer’s
0:14:31 since we were talking about that you get a p-tau217 it’s modifiable by lifestyle you check it again in
0:14:37 six months or a year now you have two data points and you could say with all the other data that’s
0:14:44 available when you’re going to see 18 years from now 12 years four years mild cognitive impairment
0:14:51 unless these steps are taken this was fully dependent on ai on models that can just take all
0:14:59 this data if we didn’t have the science of aging and the ai we’d be nowhere we wouldn’t be talking about
0:15:03 this today i wouldn’t have written a book yeah well it’s important for people not familiar with the term
0:15:08 of health span that’s basically not just lifespan but how long you can be healthy yeah i don’t think we
0:15:14 really want to get to some age and be demented or compromised what we’re talking about is if you
0:15:20 don’t have heart disease cancer or neurodegenerative you’re pretty darn intact you may have some achy
0:15:27 joints and other matters but those are the things that really interrupt our health span now we’re
0:15:31 talking about health care meaning something different to be preventative and we’ll talk
0:15:38 about chronic in a second how do we help make that mind shift this is perhaps the biggest point so far
0:15:44 that we’ve been discussing because in medicine and i’ve been in it for almost 40 years we don’t do
0:15:50 primary prevention the person has a heart attack and then we get all over it but for the most part
0:15:57 we don’t prevent cancer we don’t prevent alzheimer’s and neurodegenerative diseases it’s been a desire i
0:16:04 would say a fantasy for millennia yes but we are at a very different point right now we have a path to
0:16:12 prevention primary prevention not after somebody has one of these diseases and that is what is extraordinary
0:16:20 and it was all these recent advances that led to this capability and we’ve got to jump on it because
0:16:25 it’s exciting that we could actually do this well also the thing about prevention is that i’ve talked
0:16:32 to doctors who very boldly assert that prevention doesn’t work yeah and i look at them a bit confused
0:16:37 because i say well there’s been numerous examples and they’re like well name one i was like how about
0:16:42 smoking that’s the prototype we have this huge incidence of lung cancer which has just disappeared
0:16:48 now because we don’t smoke in restaurants or airplanes and so on but one of the things that i think about
0:16:53 about that movement is that while doctors played a significant role in that that was also very much
0:16:59 a cultural movement yes and so we talked about lifestyle changing people’s behaviors i think some of
0:17:04 this or much of this has to be as much cultural as medical there’s a definite cultural component and you
0:17:10 know tobacco is one of the most impressive but there’s so many others yes i think what we’ve learned
0:17:17 like for example with sleep i didn’t pay enough attention to that but with sleep when you promote your own
0:17:27 deep sleep which we tend to lose a lot as we age then you see much less dementia alzheimer’s even less
0:17:36 cardiovascular and cancer related illnesses cases and mortality sleep regularity we need to be more ritualistic
0:17:42 about it and there are many things just on sleep itself no less about physical activity about for example
0:17:50 not just even resistance training but balance posture things like that so the more you go deep nutrition
0:17:56 especially we’ve learned a lot about that convincing compelling evidence i would say these
0:18:03 effects we’re talking about just from lifestyle give seven years more wow of health span free of the major three diseases
0:18:09 wow seven years who wouldn’t take seven years that’s just with what we know today once we can define high
0:18:16 risk which is one of the things we turn to with ai that changes everything because then you focus on that
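the masked-records idea floated earlier in the conversation — take decades of a person’s health records, hold out the last few years, train on the rest, and score the model on how well it predicts the held-out tail — can be sketched in miniature. everything below is synthetic, and the linear-trend extrapolation is a deliberately simple stand-in for the multimodal ai models being discussed:

```python
import random

random.seed(0)

def fit_line(xs, ys):
    # ordinary least-squares fit to one person's biomarker history
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

def predict_masked(record, mask_years=5):
    # train on everything except the last `mask_years` values,
    # then extrapolate into the masked window
    train = record[:-mask_years]
    slope, intercept = fit_line(range(len(train)), train)
    return [intercept + slope * (len(train) + i) for i in range(mask_years)]

# synthetic 30-year biomarker record: slow upward drift plus measurement noise
record = [50 + 0.8 * year + random.gauss(0, 1.0) for year in range(30)]

predicted = predict_masked(record, mask_years=5)
actual = record[-5:]
mean_abs_error = sum(abs(p - a) for p, a in zip(predicted, actual)) / 5
print(mean_abs_error)
```

the point of the exercise is the evaluation loop, not the model: once a model is good at filling in masked years on historical records, the same machinery can be pointed at the open-ended future of a current patient.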
0:18:21 maybe let’s turn to another aspect of it which is the chronic disease aspect yeah when we’re talking
0:18:27 about chronic disease we’re talking typically about diabetes heart disease cancer how do we start to
0:18:31 make an impact in that i don’t know if you want to pick one if you want to start with cancer i think we
0:18:36 can make a huge impact in cancer because we have just simple polygenic risk scores for all the common
0:18:43 cancers that’s like one layer of data to say you’re at higher risk and we have multi-cancer early detection
0:18:50 tests that can pick up microscopic cancer why people would get a total body mri when you could find
0:18:57 microscopic cancer not a mass on a mri which may or may not be cancer so we have some tools for cancer but
0:19:08 the one thing that i think is unanticipated is the glp-1 drugs the ozempic zepbound world yes it’s
0:19:15 the most momentous drug class in medical history and we’ve only seen part of the story so far in the
0:19:21 book i write about how it took 20 years to figure out that it wasn’t just about diabetes which is amazing
0:19:30 what if we had ai today and said should we test this for obesity because the developers novo nordisk
0:19:37 and later lilly of these drugs they only saw three or four pounds that people with type 2 diabetes would
0:19:45 lose with these drugs and this scientist in denmark lotte knudsen she kept pushing we got to try it in
0:19:50 obesity and they wouldn’t listen to her because well she said diabetics are not losing weight
0:19:57 they finally did it and everyone knows the story 20 30 50 80 pounds of weight loss now when you lose that
0:20:05 much weight for people who are obese you reduce the risk of cancer you reduce the risk of heart disease
0:20:13 and neurodegenerative disease it wouldn’t be surprising to me that now with pills that are remarkably effective
0:20:20 to substitute for injections and can be made much less expensively that a large proportion of the
0:20:26 population would be taking one of these drugs or even their successors that is those that are even
0:20:33 more potent and potentially with less side effects so we have a drug class now added to lifestyle factors
0:20:39 we didn’t have before right as you know they are in big trials for preventing alzheimer’s in people
0:20:46 who are not overweight yes okay we’re going to be doing a long covid trial in people who are not
0:20:53 overweight the effects are really quite extraordinary the ability to crack obesity yes we would have been
0:20:58 happy just to do that but all the other things that are coming from it who would have thought that you
0:21:05 could treat or prevent addiction yeah that’s remarkable yeah the ability to stop or reduce alcohol intake from
0:21:12 heavy intake gambling i mean the list just goes on because we’re learning about the brain circuitry
0:21:20 on how these drugs work so some of the secrets of the gut brain axis which is tied into the immune system
0:21:25 and it’s tied into the science of aging this is what’s given us this newfound potential to change
0:21:31 we don’t have to only rely on drugs but there’s this as we discussed this kind of interdependence well and
0:21:37 i think having lifestyle infrastructure with these drugs that combination is particularly interesting
0:21:43 because you can make sure that you can lose weight while keeping muscle and also hopefully patients can
0:21:48 go off the drugs at least for some periods of time and not rebound we don’t have encouraging data at the
0:21:54 moment because at least half of people gain weight back when they stop yeah and that’s not good but i
0:22:02 do think that we’ll come up with ways to hopefully not rely on such a long-term commitment the results on
0:22:09 muscle mass we’ve been very worried about that and i think when people combine taking the drugs with
0:22:15 strength training and we do know there’s muscle mass loss just with weight loss alone but that looks encouraging
0:22:21 even though the companies have been acquiring muscle making drugs yes that may not prove to be
0:22:26 particularly necessary well and i think one thing that’s interesting is that another knock on lifestyle
0:22:33 is if you’re extremely obese telling someone to exercise it’s a hard road oh to just get started
0:22:39 absolutely and so this could jump start a better lifestyle that then could get locked in that could be
0:22:45 really i’ve seen it in many patients just what you said couldn’t get them to really increase their
0:22:51 activity but when they were thinner everything changed when you think about if we can make a
0:22:57 huge dent there’s nothing more economically favorable for us at the public population health
0:23:03 level if we can achieve this and so what else would you put into the chronic bucket i think
0:23:08 one of the things that you’ve written about is ai plus all the things you can track i think the ability to
0:23:17 look at the organ clocks which was initially reported here at stanford by tony wyss-coray and his colleagues
0:23:25 are now validated and replicated by multiple groups the fact that we can do that and have the brain the
0:23:33 heart the immune system and other vital organs and we can say this one organ of yours is five years
0:23:40 ahead of your real age then we can integrate that with these other layers of data oh if that’s the
0:23:47 case what about your polygenic risk score is there anything pointing to that disease or organ we can
0:23:55 look at your whole body aging epigenetic horvath clock we can also look at specific proteins like for
0:24:03 example for the brain p-tau217 and what’s amazing about that protein which we can get now and it’s not
0:24:10 that expensive but that in itself gives us over a 20-year warning about mild cognitive impairment
0:24:19 it’s modifiable by exercise and lifestyle we’ve seen people in studies that drop more than 50 percent
0:24:23 even up to 80 percent it’s intriguing that it’s not binary too so you can track the gradient
0:24:28 exactly and that would get particularly scary if it’s increasing so we’re talking about in people
0:24:34 without symptoms but are at high risk having this assist i don’t recommend any of these things that
0:24:41 we’re talking about until you know you have an increased risk but once you do then you say hmm i can do
0:24:47 something about it and change the course of what otherwise would be that person’s natural history but
0:24:56 the molecular clocks this collection of proteins this is something else that’s striking the olink and somalogic
0:25:02 they measure between six and eleven thousand plasma proteins what we’ve learned from them the fact
0:25:08 that there’s three bursts of aging during our life is not just a linear story and the fact that we’re
0:25:14 learning about the underpinnings of diseases but most importantly we have these organ clocks that are
0:25:22 inexpensive to get the uk biobank is only paying fifty dollars per participant wow and they’ve done fifty
0:25:28 thousand and amazing data coming from it but another five hundred thousand is in process so it’s not that
0:25:34 expensive to get such rich data and when you start having genes and proteins and these other layers of
0:25:43 data that’s when you find out what is making us unique and what we are at risk for during our extended time
0:25:47 and therefore what we should do to change it and improve yeah well let’s take a step back because
0:25:53 i think you’ve been laying out a very appealing picture for what we as individuals could do to
0:25:59 improve our health span get at least seven more years easy maybe more and more and more as the science
0:26:05 improves but you can also think about this from a societal level that the cost of health care is immense
0:26:09 yes just the cost of health care to the u.s government through medicare and medicaid is
0:26:14 approaching two trillion dollars and we live in a time where the united states is in massive debt
0:26:19 there’s a great desire to reduce the deficit or make the deficit negative would be ideal and you look at
0:26:25 health care and people are scared that health care could be cut or something like that and i think no
0:26:30 one wants to remove services but there is this alternative that is very natural from everything you’re
0:26:35 talking about which is that if we can keep people healthier yeah healthier people would be much less
0:26:41 expensive right and we could have a win-win how do we shift the system whether we’re talking about cms
0:26:47 or we’re talking about insurers or providers how do we shift the sick care system to be thinking about
0:26:55 preventative and chronic we have a barrier here because of the malincentives people could change their
0:26:59 insurance companies at any time so the insurance company doesn’t have a long view
0:27:06 whereas other countries like when i did the review of the nhs for the government there they’re well
0:27:13 positioned in the uk and in many countries except for the u.s have a better positioning for this if we
0:27:23 could make prevention now that it is emerging as a reality the priority and say every insurer whether it’s
0:27:29 medicare medicaid private insurers if they don’t pull out all the stops and
0:27:36 if they don’t make this a priority then you know we have to make some pretty drastic policy changes
0:27:43 we’ve not actually accepted yet that we have this newfound capability which completely changes the
0:27:51 economics by making a case for healthspan for a whole population possible and the people who need
0:28:00 this the most are currently the least likely to get access to it and so this is another issue which if
0:28:08 this only is for the affluent if we don’t take care of everyone we’re not going to achieve that goal so it
0:28:15 can’t just be for people who can have the assets to get this it has to be broadly universally distributed
0:28:22 how can we translate all the existing programs to something that could be let’s say rolled out to
0:28:30 medicare yeah i mean i think that if we negotiated it the ai is software so it could be cheap whether it’s
0:28:36 some proteins a specific protein polygenic risk score these things can be done twenty dollars fifty
0:28:42 dollars cheaper than most any lab tests that we do right now if we could develop a package negotiated
0:28:49 at a very low rate one thing that’s really great vj about this we don’t have to wait 10 years to see the
0:28:55 benefit if we see the clocks all changing in the right direction great idea we have an intermediate surrogate
0:29:03 endpoint so like for example we use ldl cholesterol to know if we have a person’s arteries in check we’re going to have
0:29:12 these proteins like p-tau217 say oh well all these preventative approaches are really kicking in this
0:29:19 should change the likelihood of or if ever developing a neurodegenerative alzheimer’s condition so we have
0:29:26 the metrics again to get a short quick assessment are we making a difference if we did that through cms
0:29:33 that would be phenomenal but maybe we can get one of the big insurers to pilot this to make it possible
0:29:40 if mehmet oz is listening maybe he’ll get interested i don’t know yeah i think cms is interested in what
0:29:45 it can do to keep people healthy and reduce cost that’s the canonical win-win i think also as you’ve
0:29:50 written about ai could really have a huge role here too because prevention is expensive if you have to
0:29:57 roll this out with gps or nps but to roll out with ai could be very very scalable yeah and i think you made
0:30:04 a point earlier about the ai is that as we do this and we do this at scale it just keeps getting better
0:30:12 so that the ability to predict pinpoint temporally when a person is likely to develop one of these
0:30:19 three conditions with 20 years runway if we can’t do this for these three diseases we’re not too smart
0:30:26 if this was just a few years ago the ai capabilities wouldn’t be there and neither would
0:30:33 these metrics of aging and all the science done to catapult that that’s what’s presented a unique
0:30:40 opportunity and if we don’t do this we’re just stupid well actually let’s double click on that because
0:30:45 there are a lot of enemies of the future you know and maybe a nicer way to put it is that
0:30:50 people could be skeptical yeah and they’re used to operating a certain way they have a certain belief
0:30:56 that this isn’t going to work or for whatever reason what would you tell them like to your fellow
0:31:01 clinical colleagues to try to change their mindset from a sick care mindset to a preventative mindset
0:31:06 yeah i mean it’s to me it’s all about compelling data yeah so for example the alzheimer’s drugs which
0:31:14 don’t really work and they’re very risky but the reason they were bought into by the fda ultimately
0:31:20 was because the amyloid came out on the scans right and there was a little bit of cognitive score
0:31:28 improvement but here we have metrics that are extraordinary to help us as a bridge for compelling
0:31:34 evidence ultimately you want to say we prevented these diseases in people that had definition of their risk
0:31:43 and then active surveillance preventive pull out all the stops right for example speaking about waste we
0:31:52 do mass screening for cancer we treat everyone as the same based on their age and that’s the only criterion
0:31:59 for the screening age we only pick up 14 percent of cancers from that mass screening which costs
0:32:06 hundreds of billions of dollars a year now 88 percent of women will
0:32:12 never have breast cancer why do a hundred percent of women have to go through this and especially with
0:32:18 bayes rule you could actually use those priors that you could measure and we don’t do it yeah and this is a
0:32:24 corollary of what we’re talking about why don’t we take the risk profile and say you know what
0:32:29 for a woman or for a person having colonoscopy you don’t really ever have to have it or you can have
0:32:36 this once in your lifetime or twice whatever we don’t treat people as human beings with particular
0:32:42 aspects that we can define today and why do you think that is we’re ingrained in stupidity
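the bayes rule point above can be made concrete with a back-of-envelope calculation. the 12 percent lifetime prior comes straight from the 88 percent figure in the conversation, but the sensitivity, specificity, and single-round prior below are illustrative placeholders, not real mammography statistics:

```python
def posterior(prior, sensitivity, specificity):
    # bayes rule: P(disease | positive test)
    p_positive = sensitivity * prior + (1 - specificity) * (1 - prior)
    return sensitivity * prior / p_positive

# ~12 percent of women ever develop breast cancer (88 percent never do)
post_lifetime = posterior(0.12, sensitivity=0.85, specificity=0.90)

# at any single screening round the relevant prior is far lower, say 0.5 percent
# (a placeholder, like the test characteristics above)
post_single = posterior(0.005, sensitivity=0.85, specificity=0.90)

print(round(post_lifetime, 3), round(post_single, 3))
```

with these placeholder numbers a positive result carries roughly a 54 percent chance of disease under the lifetime prior but only about a 4 percent chance under the low single-round prior, which is exactly the argument for using measured priors to decide who needs screening at all.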
0:32:49 maybe when these mass screening programs started that was the best we could do yeah but we’ve known about
0:32:55 polygenic risk scores and we learn now about all these other ways to assess risk and then was added
0:33:03 on the ai part of it we have to do better but just having the screening part cleaned up would save a
0:33:09 tremendous amount of money how much is that concerns about liability or other non-medical reasons right
0:33:15 you’re bringing up another good point here because it’s the standard of care so that’s the foundation
0:33:21 for malpractice it shouldn’t be the standard of care there should be a reboot new standard of care
0:33:28 based on intelligent partitioning of risk so each of the cancers there’s a way forward to do this
0:33:35 we have to come up with new ways to screen that is based on risk assessment and we don’t do it but
0:33:41 that could be changed in a flash based on the data that exists today which i review in the book well
0:33:45 that’s all very rational so i just want to double click like what needs to change then what’s the
0:33:50 process is this guidelines have to be done differently and what’s the process and what’s the body that
0:33:56 should be doing this and why aren’t they doing it well i mean we’re seeing how we can have sweeping
0:34:04 changes without data right now yeah so new policies can be made if people want to have more proof points
0:34:10 that can be quickly easily garnered but we have to have the will yeah the problem we have now is the
0:34:17 amount of money that’s being made by doing these screenings is humongous so what is the incentive for
0:34:24 the people that are for example doing the scans and the scopes and all this stuff do they want to change
0:34:31 their practice i don’t know i mean does the american hospital association want to have people in their own home
0:34:36 so they don’t have to go to the hospital i don’t think so we have some things here that need a little
0:34:42 adjustment yeah in any change there’s always new winners and losers and the potential new losers will
0:34:49 fight the change yeah we have a new way forward if we are willing to get it validated and i hope
0:34:55 we’ll seize this opportunity because we may never get another one like this for a long time and what’s
0:35:00 different now is it ai or is it the confluence of all these yeah i think it’s not one without the
0:35:09 other once you have these new ways to assess risk and the ways to i would not just call it intervene
0:35:16 you’re really going after prevention the way you can aggressively put someone in surveillance so with
0:35:23 imaging now for example we can use ai to tell if there’s inflammation in the heart arteries even without
0:35:30 a significant narrowing we didn’t have that before and we can also if we need to do brain imaging it’s
0:35:38 exquisitely sensitive so we have different ways we didn’t have before and the ai part of it is this is
0:35:45 beyond human capability there’s just billions of data points for each person but with the ways that the
0:35:52 models have progressed there’s a new day using ai to promote health and health span so let’s shift
0:35:58 gears talking about the future let’s assume things work out well yeah what is the best case scenario that
0:36:03 you think is plausible what’s the science that’s coming on the horizon let’s say we all decide to make
0:36:09 this shift towards prevention and chronic what do you think we will get for it in our next five to ten years
0:36:15 well i think we’ll start to see that people are eventually getting to much older ages than we are
0:36:22 now without these three major diseases i think that’s a gradual thing it’s not like we’re going to see a
0:36:28 light switch here but that’s what would be the trend we will see countries that will implement it because
0:36:34 they don’t have the obstacles that we have we’ll see much less of that and the shift the bending this
0:36:41 curve to the people that are older and healthier gradually we’re not talking about curing we’re
0:36:46 talking about preventing it’s a lot better than curing but it takes time to see the benefit that’s
0:36:51 a really deep line that prevention is better than curing yeah i think maybe for professionals involved
0:36:57 curing is really cool curing is cool but you don’t want to go there yeah because it’s much harder yeah
0:37:03 prevention is where it’s at well some of that is then even just changing doctor incentives yeah if we can
0:37:10 get them to be rewarded for prevention it’s maybe less connected to their actions it may seem
0:37:14 even though it could have such a great societal benefit yeah but you know and there are health
0:37:20 systems that really do emphasize prevention but they’re rudimentary did you get your pneumococcal
0:37:28 vaccine your drinking and your other social behavioral stuff those types of things haven’t worked
0:37:35 we’re talking about a whole revamping of what we mean by going into prevent mode yeah one question i love
0:37:40 to ask our guests i think i’ve asked this before so it’d be fun to get an update is what do you do for your own
0:37:47 health yeah i’ve gone through some pretty major changes from the work that did to put the book together because
0:37:55 i’m a cardiologist i never really acknowledged that strength training resistance stuff was so important
0:38:02 no less balance and posture so i’ve totally changed that for me i’ve never been this strong in my life
0:38:09 awesome yeah and how does it feel it feels great i mean yeah i just i never paid attention to it i used
0:38:15 to even with patients that came in i’d say well gee you’re really doing a lot of weight lifting here but i was
0:38:20 thinking to myself well they should be spending more time aerobic we need both sleep was a big problem
0:38:27 with me not sleeping and particularly not getting enough deep sleep so i got both a smart watch and
0:38:34 an oura ring to track that i wear both every night and whichever one has the highest number of minutes of
0:38:41 deep sleep i’m going with that but they’re usually concordant after you measure how do you improve yeah i had
0:38:49 to go through a lot of changes okay so i needed to get like a ritual when i go to bed wake up which i was
0:38:58 erratic about and i also learned about when to exercise what to eat not to eat all these interactions when
0:39:06 should you exercise well early if i can not too late in the afternoon but not in the evening and for me the
0:39:13 morning had a negative interaction with sleep really exercising the morning yeah yeah i mean i dragged all
0:39:19 day because i do an hour hour and a half if i can but the morning just wasn’t working for me but late
0:39:25 afternoon no later than that but also learning about whether it’s alcohol other beverages how they affected me
0:39:32 caffeine probably yeah so i basically i’ve gone from a deep sleep i’ve doubled it pushing i’m working on
0:39:37 getting i don’t know if i’ll get to triple it but you know it’s been a steady trend and it’s been really
0:39:45 great and giving me more energy more readiness and all that now the other one besides those two i’ve really
0:39:52 gone after the nutrition so i didn’t realize how much ultra processed food i took in yeah it’s so easy
0:39:58 reading the labels now i don’t even want to have a label to read just stay away from it if it has a
0:40:03 label and it has anything more than two ingredients anything i don’t know that would be that’s a really
0:40:08 interesting point broccoli doesn’t have a label yeah and the steak doesn’t have no no i just i
0:40:15 completely bought in now because these three age-related diseases inflammation all of them have been
0:40:21 associated with ultra processed foods a dose response even and i have really cut that out i mean i
0:40:27 couldn’t believe how much stuff i was eating that had this junk in there i’m also really attentive to
0:40:33 things like plastics i don’t like to see anything being stored in plastic i don’t even like to use
0:40:40 microwave but putting something in plastic in a microwave that is a triple whammy yeah but we are taking
0:40:47 in these plastics and people with plastics in the artery have a four-fold or five-fold risk of heart attacks and
0:40:54 strokes once you see that study it just is indelible so that’s another big change i’m much more focused on
0:41:01 these environmental burdens but the other thing is much more inclined now to take hikes in nature
0:41:07 to go out you see the benefit of that yeah i mean i think that when i’m out in nature and of course the
0:41:13 data i presented in the book i always appreciate it but now i can see its effects even more impact with
0:41:20 respect to for example the best sleep surprisingly so what i’ve learned i’ve tried to share i don’t
0:41:26 really speak too much or write too much about myself in the book but all these things i’m doing i mean
i believe in them if i didn’t believe them i wouldn’t have written about them and it was after combing
0:41:37 through there’s about 1800 references in there so people can look at themselves and see what they think but
0:41:43 it’s data that i’ve really been impressed it’s a body of evidence that ought to push us into this
0:41:48 prevent mode and i hope that eventually it will yeah but that’s maybe a great place to end i think
0:41:53 we could follow your example we could all be super agers thank you vijay it’s been a real pleasure
0:42:01 thanks for listening to the a16z podcast if you enjoyed the episode let us know by leaving a review at
0:42:15 rate this podcast dot com slash a16z we’ve got more great conversations coming your way see you next time
American healthcare is in crisis—but what if we could change the system by preventing disease before it starts?
In this episode of the a16z Podcast, general partner Vijay Pande sits down with Dr. Eric Topol, founder and director of the Scripps Research Translational Institute and one of the most cited researchers in medicine, to explore the cutting edge of preventive healthcare and longevity science.
Drawing from his new book Super Agers: An Evidence-Based Path to Longevity, Topol breaks down why understanding the biology of aging—not reversing it—is the key to preventing the “Big Three” age-related diseases: cancer, cardiovascular disease, and neurodegenerative conditions. The conversation spans AI-powered risk prediction, organ clocks, polygenic risk scores, GLP-1s, and the cultural and economic shifts required to move from a “sick care” system to one rooted in precision prevention and extended healthspan.
If you’ve ever wondered how data, personalized medicine, and AI can add seven healthy years to your life—and what it will take to bring those benefits to everyone—this episode is for you.
Resources:
Find Eric on X: https://x.com/erictopol
Find Vijay on X: https://x.com/vijaypande
Stay Updated:
Let us know what you think: https://ratethispodcast.com/a16z
Find a16z on Twitter: https://twitter.com/a16z
Find a16z on LinkedIn: https://www.linkedin.com/company/a16z
Subscribe on your favorite podcast app: https://a16z.simplecast.com/
Follow our host: https://twitter.com/stephsmithio
Please note that the content here is for informational purposes only; should NOT be taken as legal, business, tax, or investment advice or be used to evaluate any investment or security; and is not directed at any investors or potential investors in any a16z fund. a16z and its affiliates may maintain investments in the companies discussed. For more details please see a16z.com/disclosures.
-
Prof G on Marketing: How to Stand Out in a Saturated Market
AI transcript
0:00:01 Hi, I’m Frances Frey.
0:00:02 And I’m Anne Morris.
0:00:06 And we are the hosts of a new TED podcast called Fixable.
0:00:09 We’ve helped leaders at some of the world’s most competitive companies
0:00:11 solve all kinds of problems.
0:00:15 On our show, we’ll pull back the curtain and give you the type of honest,
0:00:18 unfiltered advice we usually reserve for top executives.
0:00:21 Maybe you have a co-worker with boundary issues,
0:00:24 or you want to know how to inspire and motivate your team.
0:00:26 No problem is too big or too small.
0:00:29 Give us a call and we’ll help you solve the problems you’re stuck on.
0:00:32 Find Fixable wherever you listen to podcasts.
0:00:38 Welcome back to Office Hours with Prop G.
0:00:40 Today we’re kicking off a special three-part series,
0:00:43 Prop G on Marketing, where we answer questions from business leaders
0:00:47 about the biggest marketing challenges and opportunities companies face today.
0:00:48 What a thrill!
0:00:50 I’m a little bit self-conscious.
0:00:52 My whole career, not my whole career,
0:00:56 most of my career was about brand strategy and working with CMOs and CEOs,
0:00:58 but I am so out of shape.
0:01:00 I haven’t taught in over a year,
0:01:03 and my kind of brand strategy muscles are atrophying.
0:01:05 I’m worried about the next class I teach,
0:03:08 I’m going to be one of those guys that should have been put on an ice floe about 15 years ago,
0:01:11 i.e. most of the faculty at elite institutions.
0:01:14 Anyways, a little self-conscious, but I’m going to try and get over that.
0:01:15 Let’s bust right into it.
0:01:16 Let’s get into it.
0:01:18 He’s an imposter, but he’s your imposter.
0:01:24 How do you market to a world that doesn’t want to be bothered?
0:01:27 Nobody answers phone calls, texts, etc.
0:01:29 What medium drives engagement?
0:01:31 That’s a good question.
0:01:36 By the way, that question comes from teleheaddogfan on Reddit.
0:01:38 My subreddit is very entertaining.
0:01:39 Entertaining and upsetting.
0:01:41 I sometimes go on there and I think,
0:01:42 I’m not like that.
0:01:43 I’m a nice guy.
0:01:44 Say hi.
0:01:45 I’m a nice guy.
0:01:47 Anyways, okay, teleheaddogfan.
0:01:51 The mediums that drive engagement, there’s just no getting around it.
0:01:59 If you want to build a personal brand, if you want to build an aspirational brand, you have to allocate more money to social.
0:02:02 I think about just the amount of time.
0:02:04 I mean, you are where you spend your time.
0:02:09 One of the reasons I got off X is I found that I was speaking in 140 characters and I was becoming terse
0:02:16 and constantly looking for the weak point in people’s arguments such that I could weigh in and press on the soft tissue
0:02:20 and make a character or a cartoon of their comments such that I could feel good about myself.
0:02:22 In other words, I was becoming an asshole.
0:02:24 I mean, that’s literally what X is.
0:02:28 It’s like an asshole turns into a social media platform.
0:02:29 And I thought, you know what?
0:02:31 I already have too much tendency to be an asshole.
0:02:35 I don’t need an environment that turns me into an even bigger a-hole.
0:02:38 So you want to go where people are spending their time.
0:02:41 And the bottom line is social media is where everyone is spending their time.
0:02:47 In addition, the people who kind of set the trend for most aspirational brands are youth, right?
0:02:50 Once your dad starts wearing Nikes, the young people stop wearing them.
0:02:55 So everybody wants to kind of follow the lead of an 18 to 30-year-old aspirational male or female.
0:02:59 And those people are spending way too much time on social media.
0:03:02 So I would say that social is engagement.
0:03:04 I think events create a lot of engagement.
0:03:06 Content marketing, if you’re B2B.
0:03:12 At L2, we used to put out these weekly videos that went on one of the fastest growing social media platforms in the world, YouTube.
0:03:15 And we built essentially our own mic.
0:03:19 Instead of paying some PR agency $10,000 a month to get me on Bloomberg or whatever it was,
0:03:22 we went straight to consumer.
0:03:24 We went direct to consumer with our own media channels.
0:05:30 And we would put out thoughtful research and interesting data. A ton of consumer brands, it was focused on CMOs,
0:05:32 would watch the video.
0:03:35 And we were constantly in the selection set.
0:03:43 So when they thought, you know, I’d really like to benchmark my digital footprint relative to Clorox or Unilever or whoever’s in the competitive set of P&G.
0:03:46 P&G would think, well, call that crazy dude and his firm L2.
0:03:55 And within about seven years of launch, we were working with a third of the global 100 or the 100 biggest companies in the consumer world.
0:04:01 So B2C, I think you’ve got to be a master of social and find a voice and create two-way engagement.
0:04:05 B2B, I think it’s content marketing or thought leadership.
0:04:07 That’s my kind of quick and dirty answer.
0:04:09 Thank you so much, Telehead dog fan.
0:04:10 Question number two.
0:04:14 Our next question also comes from Reddit, user mxt240.
0:04:19 I work for a giant software company.
0:04:23 I do nerd work, not face work or management, and I am damn good at it.
0:04:27 Every so often, I get emails telling me to build my personal brand.
0:04:29 What the fuck does that actually mean?
0:04:31 Should I always wear cardigans?
0:04:33 Do I need a catchphrase?
0:04:35 Inspirational bullshit in my email signature?
0:04:38 I’m well-respected and well-liked by my peers.
0:04:41 And I take time to unofficially mentor those less experienced.
0:04:45 Isn’t there value in hyper-exclusive brands that don’t advertise?
0:04:48 mxt240, so thanks for the question.
0:04:51 I teach an entire class on building a personal brand.
0:04:56 A lot of people think a lot about the brand of the company they’re working for, but they don’t actually take the time to think about their own brand.
0:04:59 And they might think, well, I’m not interested in building a brand.
0:05:01 You have a brand whether you want one or not.
0:05:11 A brand is essentially the promise or the associations that are linked to you and linked to your name, linked to your visual identity, linked to you when you show up.
0:05:16 Everybody has a certain preset set of expectations, the promise you present, if you will.
0:05:19 And then you have to deliver, hopefully, against that performance.
0:05:28 And ideally, you want to differentiate a brand such that when there’s an opportunity for a promotion or an assignment and they have five different cereal boxes, i.e. people to pick from, they pick you.
0:05:30 So how do you go about that?
0:05:35 The first thing is I think it’s helpful to think of what are your core associations?
0:05:37 What do you want to be known for professionally?
0:05:45 And that is the two or three kind of adjectives, descriptors that sort of identify, do you want to be known as especially empathetic?
0:05:46 That’s important.
0:05:47 Those people make great managers.
0:05:49 Do you want to be known as especially strategic?
0:05:52 And that is there’s a role for those people, all right?
0:05:55 Put them on figuring out our six- or 12-month plan.
0:05:56 Are you kind of no-nonsense?
0:06:07 All right, send that person, and kind of harsh, quite frankly, and good with numbers, send that person to the branch in Houston and have them do the analysis and come back and give it to me straight on what’s going on with that business.
0:06:22 There’s all sorts of qualities, features, attributes that are positive or differentiate someone in the work world, and I think it’s helpful to kind of identify what those three things are, those two or three things are, such that they can serve as sort of a guiding light or a religion.
0:06:23 Think about religion.
0:06:33 It’s a set of rules that you try and shape your actions and your life around, such that you behave in a way that reinforces the teaching of Jesus Christo, right?
0:06:34 What would Jesus do?
0:06:36 I remember that question in Sunday school.
0:06:38 By the way, rest in peace, El Papa.
0:06:39 Rest in peace.
0:06:42 Anyways, so think of some core associations.
0:06:50 If you want to be really formal about it, find some people in your life you trust and say, what do you think of when you hear me in a professional context, tell me you’re going through this process?
0:06:55 And not only think about the positive things, but also find out if there’s anything negative.
0:06:59 And here’s how you know if that criticism is valid.
0:07:04 If you feel as if you’ve been punched in the gut, that means it’s true.
0:07:07 If they say something stupid and it’s mean or whatever, you can write it off.
0:07:12 But if it’s like I remember in some of my student reviews, they said that use profanity too much.
0:07:13 And as a result, it reduces your credibility.
0:07:15 And it really upset me.
0:07:16 Why did it really upset me?
0:07:17 Because it’s probably true.
0:07:19 And I kind of deep down know that it’s true.
0:07:21 Now, have I done anything about it?
0:07:21 Fuck no.
0:07:23 Well, a little bit.
0:07:24 In class, I try to tone it down.
0:07:28 But anyways, we’re going to think about if there’s any negatives that get in the way of us.
0:07:30 Then we’re going to think about visual metaphors.
0:07:32 We are a very visual species.
0:07:35 You need to lean into your visual metaphors.
0:07:35 Are you losing your hair?
0:07:38 Then shape your fucking hair like the dog when he was 30 years old.
0:07:45 And all of a sudden back then, in whatever it was, 2004, 1994, it was seen as aggressive and different, right?
0:07:46 Are you in good shape?
0:07:48 Then get in fucking crazy shape.
0:07:50 Do you have really wonderful frizzy hair?
0:07:52 Then have out of control frizzy hair.
0:07:53 Do you like glasses?
0:07:57 Then wear big Sally Jessy Raphael glasses.
0:07:59 Visual metaphors are so powerful.
0:08:00 What’s the most powerful thing about Nike?
0:08:06 Some people would argue it’s the advertising or, you know, landmark endorsements from Tiger Woods or Michael Jordan.
0:08:07 I don’t think so.
0:08:08 I think it’s the goddamn swoosh.
0:08:10 I can recognize the swoosh in my peripheral vision.
0:08:14 Can you recognize the Reebok logo in your peripheral vision or Puma?
0:08:15 No, swoosh, yes.
0:08:16 What does that mean?
0:08:24 It means billions of times they’re getting unearned media from people on the street who recognize that swoosh without even thinking about it consciously.
0:08:25 What is your visual metaphor?
0:08:28 What is your medium?
0:08:30 Are you really good at giving text?
0:08:32 Are you a great writer?
0:08:33 Are you fantastic on TikTok?
0:08:35 Can you put out PowerPoint?
0:08:37 Are you fantastic speaking in front of people?
0:08:47 Whatever your medium is, you need to identify it and then find every opportunity to display your expertise in that meeting and develop a following.
0:08:49 I am really good in front of a lot of people.
0:08:50 I’m not great one-on-one.
0:08:54 I’m very good on video and decent on social media.
0:08:55 I’m not that good on the phone.
0:09:03 So I try and shape my interactions and my contact with others around those mediums and specifically avoid the ones I’m not good at.
0:09:06 What is the one thing, the product you’re going to own?
0:09:09 You have to be known as the go-to person on one thing.
0:09:14 When it comes to pivot tables, looking at forecasting, our customer acquisition strategy.
0:09:15 Oh, we got to go to Lisa.
0:09:16 Okay.
0:09:27 When it comes to recruiting, sending someone to Carnegie Mellon to do recruiting and talk about the firm, you know, Bob is just so good, so young, so handsome, so excited about the firm.
0:09:29 That’s the person we want in front of the people.
0:09:32 Well, what about someone who knows how to manage people?
0:09:33 They’re just very good.
0:09:34 They’re a player or coach.
0:09:35 Okay, that’s Catherine.
0:09:38 We got to put Catherine in charge of this group and we’ll get better work out of them.
0:09:43 What is the one thing you are going to own?
0:09:45 Core associations, positives.
0:09:51 A negative association that might be getting in the way of people getting to the core associations, the positive ones that you need to dial down.
0:09:53 What is your medium?
0:09:55 What is your visual metaphor?
0:09:57 And what are you going to own?
0:09:58 Right?
0:10:00 And then we’re going to apply three hurdles.
0:10:01 First, is it differentiated?
0:10:04 Is that being able to do pivot tables and being known as empathetic?
0:10:06 Is that really differentiated in this firm?
0:10:07 You want to be different.
0:10:09 Two, does anyone care?
0:10:11 Is it relevant to what you do every day?
0:10:13 And then third, what are you going to do to make it sustainable?
0:10:23 What are you going to do to invest in it such that you pull away from the rest of the pack and consistently get better and go deeper and deeper and deeper in those chosen kind of domains which you want to own?
0:10:24 All right, that’s it.
0:10:25 Boom, your brand is done.
0:10:27 And yeah, it’s absolutely worth it, boss.
0:10:29 And be clear, you have a brand.
0:10:32 It’s just a question of whether you want to manage it or not.
0:10:33 Thanks for the question.
0:10:36 We’ll be right back after a quick break.
0:10:49 In most ways, Google and Apple are ruthless competitors.
0:10:59 But then a high-powered Apple executive gets up on the stand at the trial that might break up Google and argues that actually Google’s fine and the best thing you can do is leave it alone.
0:11:01 Why?
0:11:06 Because Google being left alone means $20 billion a year for Apple.
0:11:15 On the VergeCast this week, we talk all about what’s going on at the Google trial, plus the latest from the efforts to break up Meta, what’s going on with Netflix, and lots more.
0:11:17 All that on the VergeCast, wherever you get podcasts.
0:11:24 Question three.
0:11:25 Welcome back.
0:11:28 Our final question comes from Proof of Profits on Threads.
0:11:34 American companies are known for their great marketing.
0:11:38 They’ve been so effective that we’ve seen genericized trademarks.
0:11:42 How much of a role has this played into the monopolization of America?
0:11:45 And are America’s duopolies bad for consumers?
0:11:48 If so, what can we do to change it?
0:11:58 I think what you’re asking is, has great marketing led to the concentration of industries where we have monopolies or duopolies that extract unfair rents from other businesses or consumers?
0:12:00 I don’t think it’s a great marketing.
0:12:07 I think it’s mostly regulatory capture and an FTC and DOJ that have been essentially in a slumber for about 30 or 40 years.
0:12:27 The bottom line is, when you have one company that controls two-thirds of all social media, Meta, one company that controls 92% of search, Google, one company that controls 50% of all e-commerce, Amazon, and one company that controls, I don’t know, 50% of all smartphones and 90% of the revenues from smartphones, Apple.
0:12:34 It probably means we’d be better off if we broke these companies up, such that there’d be more entities trying to rent people’s labor.
0:12:35 Wages would go up.
0:12:41 They’d be more focused and more risk-taking because they wouldn’t be coordinating and cooperating with each other.
0:12:43 How did Google let OpenAI ever exist?
0:12:43 Why?
0:12:52 Because despite the fact that the majority of AI IP resided within Alphabet, they didn’t want to risk the existential threat.
0:12:54 They didn’t want to disrupt their own search business.
0:13:01 So they weren’t excited to kind of put out an AI product that might disrupt or cannibalize their Google search.
0:13:02 So what happened?
0:13:05 They left the door open or they left the garage door open.
0:13:09 And Sam Altman walked right in and took their AI.
0:13:12 In sum, you’ve got to eat your own young or someone else will.
0:13:16 So effectively, what’s happened in breakups is that they’ve been good for shareholders.
0:13:37 I would argue the biggest increase in rents from Meta have been the rents on parents who have to put up with social media that is making their kids feel shittier and shittier about themselves.
0:13:38 There’s no real options.
0:13:42 And when you say, well, it’s bad parenting, just get them off of Instagram or just get them off of Snap.
0:13:44 Well, here’s the problem.
0:13:47 They end up more depressed because everybody is on Snap or Instagram.
0:13:49 And if they’re not on it, they feel ostracized.
0:13:54 So we’re in sort of this prisoner’s dilemma where we don’t know what to do as parents.
0:14:01 But we see a consolidation across majority of industries where they weaponize government and say, let us be regulated monopolies.
0:14:05 They buy off senators and congresspeople from both sides of the aisle.
0:14:10 And the DOJ and the FTC have been neutered and you have a concentration of industry.
0:14:12 I would want to do that.
0:14:12 I wouldn’t.
0:14:13 Why?
0:14:14 Peter Thiel said it.
0:14:16 Competition is for idiots.
0:14:17 You don’t want competition.
0:14:19 You want to figure out a way to have access to cheap capital.
0:14:25 Establish yourself as a leader, which gives you more cheap capital such that you can just pull away from the competition.
0:14:40 And then you start investing in D.C. where they basically say, despite the fact you have 93% share of search, despite the fact that you’re radicalizing young men on Google, we’re not going to move in and we’re not going to break you up because there are so many really charming, highly paid people.
0:14:48 Do you realize there are more full-time lobbyists in Washington, D.C., living in Washington, D.C., who work for Amazon than there are sitting U.S. senators?
0:14:50 Let me repeat that.
0:14:54 There are more full-time lobbyists, really well-paid.
0:15:08 They make a lot more than any senator, and their entire job is to be likable and take senators to lunch and take them golfing and give them money for the campaign and have really thoughtful conversations about the future of e-commerce and why they should not break up Amazon.
0:15:09 And here’s the thing.
0:15:13 When you’re paid not to understand something, you’re never going to understand it.
0:15:19 So when you’re paid not to understand why that concentration of power is bad, you’re never going to understand it.
0:15:20 So what do we have?
0:15:21 Wireless.
0:15:24 Verizon and AT&T control 70% of the U.S. market.
0:15:28 Soft drinks, Coke and Pepsi, 70% of the U.S. soda market.
0:15:29 E-commerce.
0:15:30 Amazon.
0:15:33 Again, somewhere between 40% and 50%, depending on how you account for it.
0:15:39 According to the Brookings Institution, 75% of U.S. industries have seen an increase in concentration over the past two decades.
0:15:43 It’s everything from home improvement to big chicken to big pharma.
0:15:47 And what do they do when they have a lack of competition or they kind of wink at each other and cooperate?
0:15:48 They raise prices.
0:15:58 In sum, there’s been a transfer of wealth from lower and middle-income households that don’t have a lot of stocks to these companies in the form of higher prices.
0:16:00 Why do you think inflation is so fucking out of control?
0:16:04 Every year, companies get more productive, which means they should be able to pass on the savings to the consumer.
0:16:06 But instead, they increase their profits.
0:16:09 And by the way, there’s some ancillary benefit to that.
0:16:10 It’s great to be a shareholder.
0:16:11 But what’s happened to wages?
0:16:12 Boom.
0:16:13 They haven’t moved at all.
0:16:14 Right?
0:16:15 What’s happened to household wealth?
0:16:19 Well, the average has gone way up because the top 10%, the shareholders are killing it.
0:16:26 So essentially, there’s been a transfer of wealth from the lower and middle-class households to the households that own shares.
0:16:31 So if you’ve made the jump from being an earner to an owner, you’ve done really well.
0:16:36 But the earners now have higher costs and can never make that jump to being an owner.
0:16:37 What’s the result?
0:16:39 Dynastic caste system.
0:16:41 Alphabet should be three companies.
0:16:45 It should be YouTube, it should be their advertising, and then it’s search and other companies.
0:16:47 It might even spin off Waymo.
0:16:50 Apple Services should be a different company from Apple.
0:16:55 Instagram should be an independent company, as should WhatsApp, for God’s sakes.
0:17:04 AWS should be an independent company, not part of Amazon where they cooperate and coordinate and get cheap capital to put every online retailer out of business.
0:17:05 What would we have?
0:17:06 Who wins?
0:17:06 Shareholders.
0:17:08 Who wins?
0:17:08 Employees.
0:17:10 Who wins?
0:17:12 Society, lower rents.
0:17:13 Who wins?
0:17:14 The tax base.
0:17:14 Who loses?
0:17:21 The one person who has dual-class supervoting shares who wants to sit on the iron throne of all seven realms, not just Westeros.
0:17:30 We absolutely need an absolute breakup palooza to bring costs down, to bring rents down on consumers.
0:17:36 The concentration of industry in the United States is the culprit around inflation.
0:17:38 Thanks for the question.
0:17:40 That’s all for this episode.
0:17:45 If you’d like to submit a question, please email a voice recording to officehours at profgmedia.com.
0:17:47 That’s officehours at profgmedia.com.
0:17:54 Or, if you prefer to ask on Reddit, just post your question on the Scott Galloway subreddit and we just might feature it in an upcoming episode.
0:18:05 This episode was produced by Jennifer Sanchez.
0:18:07 Our intern is Dan Shallon.
0:18:09 Drew Burroughs is our technical director.
0:18:12 Thank you for listening to the Prop G pod from the Vox Media Podcast Network.
0:18:17 We will catch you on Saturday for No Mercy, No Malice, as read by George Hahn.
0:18:23 And please follow our Prop G Markets pod wherever you get your pods for new episodes every Monday and Thursday.
0:18:23 Thank you.
Welcome to the first episode of our special series, Prof G on Marketing, where we answer questions from business leaders about the biggest marketing challenges and opportunities companies face today.
In today’s episode, Scott answers your questions on how to drive engagement in a saturated market, how to build your personal brand, and the unintended consequences of America’s most successful branding machines.
Want to be featured in a future episode? Send a voice recording to officehours@profgmedia.com, or drop your question in the r/ScottGalloway subreddit.
Learn more about your ad choices. Visit podcastchoices.com/adchoices
-
Deepak Chopra: Becoming Your Own Guru in the Digital Age
AI transcript
0:00:04 In my years of entrepreneurship, I’ve seen countless startups.
0:00:06 And here’s the truth.
0:00:12 Smart spending drives growth, which is something Brex has championed.
0:00:14 Brex isn’t just a corporate credit card.
0:00:19 It’s a strategic tool to help your company achieve peak performance.
0:00:22 Corporate cards, banking, expense management,
0:00:30 all integrated on an AI-powered platform that turns every dollar into opportunity.
0:00:35 In fact, 30,000 companies are trusting Brex to help them win.
0:00:39 Go to brex.com slash grow to learn more.
0:00:43 AI is a tool for spiritual enlightenment.
0:00:48 It can’t get you enlightened, but it can show you the maps.
0:00:51 And there are many maps on spirituality,
0:00:53 just like there are many maps in any terrain,
0:00:56 but they all lead to the same destination,
0:00:59 which is spiritual realization.
0:01:05 I’m Guy Kawasaki.
0:01:07 This is the Remarkable People podcast.
0:01:10 And I know I say this every episode,
0:01:13 that we found some remarkable person to inspire you.
0:01:16 But today, truly, we have a remarkable person.
0:01:18 His name is Deepak Chopra.
0:01:21 And I bet every one of you have heard of him.
0:01:24 He’s world renowned for his integrative medicine
0:01:27 and personal transformation work.
0:01:29 He’s the founder of the Chopra Foundation.
0:01:32 And I mean, how much do I have to introduce you?
0:01:35 And he has touched millions of people’s lives
0:01:38 with his writing, his speaking, his podcasting.
0:01:43 And I met him in Hawaii at an EO conference,
0:01:46 which was a very special moment for me.
0:01:51 And Deepak, you were wearing like a really cool jacket.
0:01:54 That made a very big impression on me.
0:01:57 And so I think we discussed it.
0:01:59 Was it an Issey Miyake jacket?
0:02:01 It was Issey Miyake.
0:02:03 Yeah, those are, they’re cool,
0:02:05 but also easy to travel in, right?
0:02:06 Yeah.
0:02:09 So I came right home and I told my wife,
0:02:12 I met Deepak Chopra and he was in an Issey Miyake jacket.
0:02:15 She was also impressed.
0:02:20 So I want to dive right into your latest book, okay?
0:02:22 You’ve written 90 books and you have podcasts
0:02:25 and YouTube videos all over the place.
0:02:27 So people will understand the basis.
0:02:30 But I have to tell you that I read your latest book
0:02:36 and I was just, I guess the right word is astounded, Deepak,
0:02:38 because of all the people in the world
0:02:39 who have embraced AI,
0:02:42 I would not have thought it would have been you.
0:02:45 And so that was particularly enlightening to me.
0:02:47 So I’m going to start off with a quote, okay?
0:02:49 The quote from the book, quote,
0:02:52 I believe that no technology in decades
0:02:57 can equal AI for expanding your awareness in every area,
0:03:00 including spiritual and personal growth.
0:03:03 Can you just explain to me
0:03:06 how you came to have so much faith in AI?
0:03:07 When I read that, I thought,
0:03:08 next thing you know,
0:03:11 Warren Buffett is going to tell me he’s buying crypto
0:03:14 and Greta Thunberg is driving an SUV
0:03:17 and Jane Goodall loves ribeye steaks.
0:03:19 Deepak Chopra has embraced AI.
0:03:22 So can you just explain this for me?
0:03:27 I have my own definition of what is called reality.
0:03:34 So what we call the divine or God doesn’t have a form.
0:03:36 And not having a form,
0:03:40 that every spiritual tradition says God doesn’t have a form.
0:03:42 The divine doesn’t have a form.
0:03:43 Then people say, well,
0:03:46 what about all these pictures of God in the Vatican
0:03:47 and this and that?
0:03:52 The Hindus have hundreds of deities, as do the Buddhists.
0:03:57 And those are symbolic representations of what we call divine.
0:03:59 Divine is infinite.
0:04:02 And being infinite doesn’t have a border,
0:04:08 is outside of space-time and has no cause.
0:04:10 In the world of space-time and causality,
0:04:12 everything has a cause.
0:04:16 But God transcends all causes
0:04:19 and all concepts and all definitions.
0:04:22 So I came up with a formula.
0:04:26 God has a digital workshop outside of space-time.
0:04:32 And the formula is zero is equal to infinity is equal to one.
0:04:36 So think of this workshop outside space-time,
0:04:38 which is divine.
0:04:43 And it’s spilling out zeros and ones in infinite combinations.
0:04:47 And the only difference between you and a mountain
0:04:53 or the earth and a star, or this iPhone and AI,
0:04:56 is a different combination of zeros and ones.
0:04:57 That’s it.
0:04:58 And it comes from one source.
0:05:01 So that is God’s language.
0:05:04 It’s not English with an Indian accent.
0:05:06 I would have liked to believe that.
0:05:11 But God’s language is digital language.
0:05:14 And once you get that understanding,
0:05:19 then you see, how did we create the human experience?
0:05:21 You and I create the human experience.
0:05:25 And it all began 40,000 years ago
0:05:29 when there were eight different kinds of human species.
0:05:31 So we call ourselves Homo sapiens,
0:05:34 but then we gave ourselves that name.
0:05:36 It means the wise ones.
0:05:38 We were humble enough to do that.
0:05:40 But we gave names to other humans.
0:11:45 Homo habilis, Homo erectus, Homo floresiensis.
0:05:47 We gave names to other species.
0:05:50 Giraffes, elephants, this, that, the other.
0:05:54 So all started with naming experience.
0:05:57 And that created a language for stories.
0:06:02 And that is how the human evolution began.
0:06:02 Stories.
0:06:05 To be human is to have a story.
0:06:07 Right now, we’re sharing a story.
0:06:14 And then that way of telling stories evolved into what we call models.
0:06:20 So models means giving reality a stamp through the human mind.
0:06:25 Latitude, longitude, Greenwich Mean Time, North Pole, South Pole.
0:06:29 We can’t live without these concepts, even though we made them up.
0:06:31 But then we created more languages.
0:06:40 We created language of philosophy, science, anthropology, history, astronomy, biology, mathematics.
0:06:43 These are all human languages.
0:06:57 And we call AI a large language model because it has access to all these languages that humanity has created to look at what we call the human experience.
0:07:03 And now there’s no single human being that can compete with this kind of database.
0:07:21 So, in fact, we can’t compete with it, but we have access to the entire database of knowledge and wisdom from Jesus Christ to the Buddha, to Plato, to Socrates, to Einstein, to Tagore, to the prophets of the Old Testament.
0:07:25 AI is a tool for spiritual enlightenment.
0:07:30 It can’t get you enlightened, but it can show you the maps.
0:07:35 And there are many maps on spirituality, just like there are many maps in any terrain.
0:07:52 If I want to go to Boston from New York, I can use an aerial route, I can use a contour map, a road map, go by the ship, take a helicopter, but they all lead to the same destination, which is spiritual realization.
0:07:54 So I’m using AI as a tool.
0:07:55 It’s not just a book.
0:07:57 I have my own AI.
0:07:59 It’s called DeepakChopra.ai.
0:08:00 Try it out.
0:08:01 DeepakChopra.ai.
0:08:13 Ask any spiritual question or any dilemma that you have, spiritual, or about health, or about longevity, and you’ll get information from 96 of my books,
0:08:21 from every conversation I’ve had, from every discussion I’ve had, from my meetings with spiritual luminaries.
0:08:31 So, yes, AI is a tool for enhancing spiritual well-being, but also emotional well-being and physical well-being.
0:08:34 And my AI, DeepakChopra.ai, is the coach.
0:08:51 Every business is under pressure to save money.
0:08:55 But if you want to be a business leader, you need to do more to win.
0:09:01 You need to create momentum and unlock potential, which is where Brex comes in.
0:09:04 Brex isn’t just another corporate credit card.
0:09:10 It’s a modern finance platform that’s like having a financial superhero in your back pocket.
0:09:17 Think credit cards, banking, expense management, and travel, all integrated into one smart solution.
0:09:24 More than 30,000 companies use Brex to make every dollar count towards their mission, and you can join them.
0:09:31 Get the modern finance platform that works as hard as you do at brex.com slash grow.
0:09:43 So, the irony is that this quote-unquote technology is really democratizing spirituality, right?
0:09:53 It represents all the knowledge as opposed to just whatever narrow slice you had access to before, depending on what book or what person you knew.
0:09:55 Now you get everything.
0:09:58 Yeah, I’ll send you three short videos.
0:10:03 Feel free to show them on your program, my conversations with the Buddha on AI.
0:10:13 So, this is better than the conversation in 1930 with Albert Einstein and, God, I can’t remember the other fellow’s name.
0:10:17 Rabindranath Tagore, the Indian sage, yeah.
0:10:24 Here’s my next quote from the book, because I found this also stunning in a sense.
0:10:39 So, this is the quote, “The function of the guru needs to be overhauled in modern times, getting rid of the cult of personality, stepping away from superstitious beliefs in the magical attributes of enlightened beings.
0:10:45 AI can step in to renovate a time-honored role almost immediately.”
0:10:59 Now, Deepak, when I read that, I said, “Deepak is the mother of all gurus and he’s telling us that the function of a guru is being overhauled by AI.
0:11:02 Isn’t that, in a sense, putting you out of business?”
0:11:03 See?
0:11:05 Spell the word guru for me slowly.
0:11:08 G-U-R-U.
0:11:09 G-U-R-U.
0:11:20 So, the ultimate guru is you, and AI is helping you to discover your own guru, which is the only real guru.
0:11:29 Others are deep fakes, like me. And so the concept of guru means that you’re, like, removing darkness,
0:11:37 right? So now you can remove darkness yourself with AI? Correct. And so what does that mean for
0:11:43 all the other people who hold themselves out as gurus? Guru is a big industry. I know. It’s going to
0:11:49 slowly fade out, but you know, there are human beings who like to look up to other human beings,
0:11:57 and they will never get enlightened. If Jesus or the Buddha are pointing their finger at the moon,
0:12:03 I shouldn’t be worshipping the finger. I should be looking at the moon and saying, how can I get there?
0:12:14 So a true guru is not into self-adulation. A true guru allows you to become your own guru, and that
0:12:21 happens only once in a few thousand years. The rest are all deep fakes. Wow. You earlier mentioned the
0:12:28 fact that there is, I forget the name you use, but for my purposes let’s just call it
0:12:39 Chopra GPT, Chopra.ai. The data in that is only your stuff. It doesn’t go outside, so it cannot hallucinate.
0:12:45 It’s only your data, what you put into it. Or have you opened it up to the whole internet? It’s only my
0:12:53 data, and it does not hallucinate. Although there are advantages to hallucinations, because anytime you have a
0:13:01 hallucination, it gives you creative ideas. So I think hallucinations also have a role.
0:13:11 But my AI doesn’t hallucinate. Its database is all my 96 books, every conversation I’ve had publicly, my YouTube
0:13:19 videos, my discussions, my talks with luminaries, etc. Yes. Can I interrupt really quick? It sounds like there may be
0:13:26 some construction going on. I can’t tell if it’s on your end or Deepak’s end. Nothing happening on my side, but
0:13:32 let’s just do it, and then whatever happens, we leave it up to the divine matrix. I love it. All right.
0:13:38 There’s no construction on my side. It’s a hallucination.
0:13:44 Madison is making her own reality.
0:13:54 I have to mention that, maybe I’m flattering myself, but great minds think alike, because
0:14:02 I also, with Madison’s help, we created Kawasaki GPT. And Kawasaki GPT has all my writings, my podcasting,
0:14:10 all that kind of stuff too. And I swear, Deepak, Kawasaki GPT is better at being me than I am. And
0:14:19 I often use it to draft newsletters, to draft blurbs, to figure out what to do on my podcast. It’s better
0:14:26 at being me than me. Do you think your GPT is better at being you than you? It is, because it’s also
0:14:34 something called a RAG model, retrieval-augmented generation, which means anything that’s obsolete,
0:14:40 it automatically deletes and upgrades. Yes, it’s more effective than I am.
0:14:46 You could have easily done this interview with my DeepakChopra.ai. Yeah, yeah, it could have been Kawasaki
0:14:55 GPT talking to your GPT, and it would have been interesting. So, you know, have you thought that, because you
0:15:02 created this, you are in a sense now immortal, that for the rest of time people can ask you questions?
0:15:09 Yeah, and not only that, the model can keep updating. As the years go by, whatever I’ve said can be upgraded to
0:15:15 a new level of understanding. And are lots of people asking it stuff and interacting with it a lot?
0:15:27 Yeah, yeah. Now it’s available in four languages: English, Hindi, Spanish, and Arabic. And soon we’re introducing it in
0:15:38 China as well. Wow, wow. Okay, the next mind-blowing quote from the book is this: “To me, AI is a mirror to the user’s
0:15:48 consciousness.” So can you please explain what that means, and, you know, how in a sense what you ask AI reflects
0:15:56 what you are? Yeah, because if you’re going to ask what kind of shoes I should buy, or which candidate do I prefer,
0:16:07 Democratic or Republican, then my AI will not participate in that conversation. Mine will only participate in
0:16:16 conversations about health, longevity, health span, emotional and spiritual well-being. So the way you
0:16:27 ask the question obviously reflects your own issues. Obviously. So then AI becomes a mirror, and depending on
0:16:36 how much experience it has from your asking it questions, it actually knows more about you than you
0:16:46 know about yourself. I agree. So from a technical standpoint, what you or your team has done is it
0:16:55 has constricted the answers of your GPT so that it only answers stuff that you care about or that you
0:17:01 feel is relevant. It won’t answer a question about how do I become a better surfer. It will say,
0:17:10 I cannot answer that question. It will say, yeah, you can consult ChatGPT for that. I only want to offer
0:17:20 to the world what I think I’ve spent my life doing. Otherwise, I would be a hypocrite. And getting outside of your…
0:17:35 And so you can’t worry about that. Once a child is born, it can’t return to the womb.
0:17:44 So this child is born. It’s not going to return to the womb. And so we have to decide now whether we use it to
0:17:51 risk our extinction or we use it to create a more peaceful, just, sustainable, healthier, and joyful world.
0:17:58 And that was the goal. Every technology can be used for harmful purposes. A knife can be used to kill a
0:18:06 person, but in the hands of a surgeon, it heals a person. And so too with every other technology. AI can be used for
0:18:14 poisoning the food chain, cyber hacking, interfering with democracy, causing a nuclear plant… I don’t want
0:18:20 to give too many ideas, somebody is listening. But it can also be used for good purposes. And it’s here. You
0:18:25 can’t stop it. It also seems to me, Deepak, that, you know, when you read these doomsday articles about AI,
0:18:35 they are comparing a worst case of AI against the best case of humans, and to me that is an unfair
0:18:43 comparison. You should compare best-case AI to best-case human, or worst-case AI to worst-case human. But, you know,
0:18:50 in this doomsday scenario of what if two AIs get angry with each other and launch a nuclear war, I would
0:18:56 say it’s a much higher probability that some fascist dictator will do that than an AI will do it. Yeah,
0:19:04 correct, correct. Yeah, it doesn’t have emotions. Yeah, it doesn’t have subjective experience. You can program
0:19:14 it to simulate that, but it inherently does not have emotional experience. Therefore, it cannot act out of
0:19:22 emotions. Now, you can, as a human being, program it in a way that it simulates that, and that’s a danger,
0:19:29 because there are enough people who are crazy in the world. I noticed in your book that sometimes you’re
0:19:37 citing ChatGPT and sometimes you are citing other LLMs. So, you know, how do you pick? When do you use which
0:19:45 one? Which is your favorite? How do you decide which one to use? Right now my favorite is my own, which is
0:19:54 DeepakChopra.ai. But Perplexity is a good one because it gives you references and data. And now this
0:20:01 DeepSeek that has come from China, which came much after I wrote the book, is actually far superior to anything
0:20:09 I’ve seen. And as we move into the future, we’re going to have all these different AI companies competing
0:20:14 with each other, and that’s a good thing, because you’re going to see something much more creative
0:20:22 leapfrogging us into a new future. So when I see in your book that sometimes you use one LLM and sometimes you
0:20:30 use another in the writing of the book, did you ask the same prompt of several LLMs and then pick the answer you
0:20:38 like the best, or did you just ask one? I asked several LLMs, and then I would also see how I could
0:20:46 corroborate the information with research. And that’s how it happened. Okay. And I never in a million years
0:20:55 thought I would be asking Deepak Chopra this question, but how do you create great prompts? What’s the art of a
0:21:05 Deepak Chopra prompt? You act as if you’re speaking to a personal friend, number one; to a coach, number two;
0:21:17 to a research assistant, number three; and, number four, to someone or an instrument that can access the minds of
0:21:25 the greatest luminaries that humanity has. So you assume those things, and then you go back and forth, back
0:21:35 and forth. And actually you can train your AI, ultimately even ChatGPT or Perplexity, to have a
0:21:46 reasonably good debate or even argument without any contentiousness, without any emotional engagement.
0:21:53 Then you get to the right answers. But it’s called generative AI for a reason. It generates new
0:22:00 information based on the context and the art of the prompt. So in my book, there’s a whole
0:22:06 chapter called The Art of the Prompt. And basically, if I figure out these prompts and I embrace this,
0:22:13 you’re going to put me on my path to dharma? Yes. For people not familiar with the term, can you just
0:22:19 quickly define dharma? Dharma means purpose in life, and there are many stages of dharma.
0:22:30 First is survival and safety. Second is material success. Third is maximizing the delight of the senses.
0:22:38 Fourth is love and belongingness. Fifth is creative expression. Sixth is intuition and higher
0:22:46 consciousness. And the seventh is self-discovery, or self-realization. So these are the stages of dharma,
0:22:57 or purpose. And how do I use AI to get myself down this path? Ask my AI this question. DeepakChopra.ai. Say,
0:23:05 how do I get on the path to dharma? See what it comes up with, but ultimately it will resonate with you.
0:23:13 What’s my unique talent? How does it help the world? And how can I use my unique talents to be of service
0:23:23 and be in a state of gratitude? Then you’re in dharma.
0:23:42 If you’re listening to Remarkable People, it’s a good bet you want to be more remarkable yourself.
0:23:49 One way to do that: spend three days in a room full of the sharpest minds in business. I’m Jeff Berman,
0:23:54 co-host of Masters of Scale, inviting you to join me at this year’s Masters of Scale Summit, where you’ll
0:24:01 see bold leaders like Reid Hoffman, Fawn Weaver, Andrew Ross Sorkin, Kara Swisher, Dara Treseder,
0:24:09 Aza Raskin, and more take the stage. Apply to attend at mastersofscale.com slash remarkable. Again, that’s
0:24:19 mastersofscale.com slash remarkable. Deepak, I took your spiritual intelligence test in your book, okay?
0:24:27 Yeah. And maybe what I’m going to tell you is going to show that I haven’t reached my dharma,
0:24:33 but I have to say that I answered every question “often” or “always.”
0:24:44 So does that mean that I’m doing pretty well spiritually? It means you’re on the right track, yes.
0:24:51 That’s good to know. And then I asked Madison, if I answered all these this way, am I deluding myself? And
0:24:58 she said I wasn’t. But then I asked her, if I was deluding myself, would you dare tell me that I wasn’t?
0:25:02 And she said she would tell me so. Right, Madison? That’s correct.
0:25:10 I have a thought for you on the name of one of the chapters in the book, and
0:25:18 let me be so bold as to offer this thought, okay? I realize I’m talking to Deepak Chopra, but you know,
0:25:29 you have a chapter called Trust the Process, and as I read that chapter, I think that it would be more
0:25:36 accurate to call that chapter Trust the Processing, as opposed to the Process,
0:25:43 because to me a process is like a sequence of steps, and I think the point of that chapter is
0:25:52 not so much to trust the documented steps, but to trust the processing of the steps, the going through the
0:26:00 processing, not the process steps itself. Yeah, no, that’s good. The process, though, is about self-reflection
0:26:08 and contemplative inquiry. That’s the process. But processing is good. Oh, so I can say that I made a
0:26:11 good suggestion for the next edition.
0:26:21 Okay, my life is complete. My life is complete. So now, next question for you, because a lot of people
0:26:28 listen to my podcast, including people like Marc Benioff, who are really into meditation. Can you
0:26:35 just explain to people how AI could possibly help with meditation? Because most people’s initial reaction is,
0:26:44 AI is the opposite of meditation. It’s technical, it’s staring at a screen, it’s all this. So how can AI help
0:26:51 meditation? So there are many kinds of meditation. There is meditation that is called contemplation,
0:26:58 creative inquiry. There’s awareness of the body, there’s awareness of the mind, there’s awareness of the
0:27:05 ego, there’s awareness of the intellect, there’s awareness of what’s happening inside your body,
0:27:12 there’s awareness of relationship, there’s awareness with the divine, and there’s
0:27:20 awareness of your own self. So those are all the different disciplines of meditation. AI can help tailor
0:27:29 meditation for you very precisely. So you might go to my AI and say, Deepak, I have a lot of stress,
0:27:38 I’m in a relationship that is getting toxic, can you help me with a meditation? And my AI will give you a guided
0:27:46 meditation. You don’t have to stare at the screen. You just have to listen to me guiding you through the
0:27:57 meditation. So that’s how it works. Okay, do you think that science and spirituality are opposing forces?
0:28:08 No. Science always asks what’s happening out there, and spirituality asks who is asking and why. Science is
0:28:14 about the objective world, and spirituality is about the subjective world, and they go together. You can’t
0:28:20 have an object without a subject, and you can’t have a subject without an object. They go together, so they’re
0:28:28 complementary to each other. So then, you know, how does one find spirituality? Are you just going to say use
0:28:35 AI? But people are searching for spirituality. How do they do it? You start with four questions: Who am I?
0:28:42 What do I want? What is my purpose? And what am I grateful for? And then you sit in silence
0:28:52 and listen to the answers. Who am I? What do I want? What is my purpose? What am I grateful for? And that’s the first step.
0:29:04 Do you have any people that you would say, this person really has integrated spirituality and leadership?
0:29:12 Are there some shining examples, people that others should, not necessarily worship, but be inspired by what
0:29:19 they have accomplished? Who are people you hold up as having their act together? In recent times, I would say
0:29:29 people like Martin Luther King Jr., Nelson Mandela, Mahatma Gandhi, Mother Teresa, Bishop Tutu. These were people
0:29:36 who had integrated their lives in a very spiritual way and made a big impact on the world. And is there
0:29:45 anybody alive who you would put in that category? I would have to think about that. I would be interested. Based on my
0:29:50 limited knowledge of your work, I would say the only person who qualifies is Jane Goodall.
0:29:58 She does good. I’m glad you mentioned her. Yeah. I would just like to know, for you, at this point in your life,
0:30:04 how do you define success? Success is the progressive realization of worthy goals.
0:30:12 It’s the ability to love and have compassion, and it’s the ability to get in touch with your soul, the
0:30:20 creative center from where everything happens. By that definition of success, there are many people who are
0:30:26 very rich and very famous and are failures. Yeah, some people are so poor, all they have is money.
0:30:36 All right, there has been some skepticism about, you know, your work, from quote-unquote science and
0:30:43 medicine and stuff. So how do you approach it when you hear skepticism about your work and your alternative
0:30:48 medicine and things like that? What goes through your brain when people criticize you this way? I used
0:30:58 to get very defensive, but now I ignore my critics, and they can’t stand it. And do you think they are flawed,
0:31:04 or they’re ignorant? Like, what’s going on with them? They come from a different worldview, that’s all. We all
0:31:11 express our worldviews, how we were conditioned as children, and then the schools we went to, the education
0:31:18 we got. And right now the worldview in science is very physicalist, and so anything that’s
0:31:25 non-physical is denigrated. But that’s okay. You need all kinds of people, because maximum
0:31:33 diversity of opinion leads to creativity. And how do you figure out, sometimes you ignore people, but sometimes
0:31:40 they have valid feedback, so how do you separate the two? You can’t ignore everybody. You don’t get
0:31:47 personally offended, and you always are open to feedback. Don’t take it personally or emotionally.
0:31:57 Okay, and I have one last question for you. Okay, yeah. And that last question is, do you ever have moments of
0:32:07 personal doubt too? I live in the wisdom of uncertainty at all times, and without uncertainty there is no creativity.
0:32:17 So yes, doubt is a very important part of our creative process. The more doubt you have about your habitual
0:32:26 certainties, the more room there is to grow spiritually. And how do you keep pushing through that uncertainty?
0:32:32 I always ask, what’s the creative opportunity here? So you have these moments of uncertainty,
0:32:42 and you ask that in the moment of not knowing. Not knowing is the highest knowing, because if you know everything,
0:32:54 then there’s nothing to know. Wow. Okay, that is the way to end this podcast. So the highest knowing is… no, no, I’ll let you say it again. Deepak, will you say that again? That was very
0:33:04 inspiring. Not knowing is the highest knowing. It is the window to infinite creativity.
0:33:09 I can’t ask for a better end to the podcast than this. Thank you very much, Deepak.
0:33:18 Great pleasure to speak to you. I hope we can speak at another event again soon. Maybe someday we can be on
0:33:26 stage together. That would be great. Thank you, Guy. I’m Guy Kawasaki. This has been the Remarkable People
0:33:33 podcast, and truly we have had a remarkable episode today with the one and only Deepak Chopra. And so I want to
0:33:40 thank you, thank you again. Thank you, Madison, for making this happen, and Tessa Nuismer, her sister
0:33:48 and ace researcher, and Jeff Sieh and Shannon Hernandez, our great sound design team. And above all, thank you, Deepak
0:33:54 Chopra. It’s been a very special moment for us. Thank you very much, and I hope to see you again. And
0:34:03 I hope you’re wearing that Issey Miyake jacket, because I just love that jacket. Thank you. God bless. Oh, God bless you too.
0:34:09 This is Remarkable People.
Can AI and spirituality coexist? Deepak Chopra, world-renowned pioneer in integrative medicine and personal transformation, challenges our perceptions by embracing artificial intelligence as a spiritual tool. In this mind-expanding conversation, Chopra reveals why he believes AI represents “the most powerful technology for expanding awareness in every area” and how it’s revolutionizing our path to enlightenment. Discover how his own AI creation “Deepak Chopra.ai” serves as a digital guru, why the traditional role of spiritual teachers may be evolving, and how technology can help us answer life’s deepest questions: Who am I? What do I want? What is my purpose? What am I grateful for? Don’t miss Chopra’s profound insight that “not knowing is the highest knowing” – a gateway to infinite creativity, and don’t forget to read his new book, Digital Dharma.
—
Guy Kawasaki is on a mission to make you remarkable. His Remarkable People podcast features interviews with remarkable people such as Jane Goodall, Marc Benioff, Woz, Kristi Yamaguchi, and Bob Cialdini. Every episode will make you more remarkable.
With his decades of experience in Silicon Valley as a Venture Capitalist and advisor to the top entrepreneurs in the world, Guy’s questions come from a place of curiosity and passion for technology, start-ups, entrepreneurship, and marketing. If you love society and culture, documentaries, and business podcasts, take a second to follow Remarkable People.
Listeners of the Remarkable People podcast will learn from some of the most successful people in the world with practical tips and inspiring stories that will help you be more remarkable.
Episodes of Remarkable People organized by topic: https://bit.ly/rptopology
Listen to Remarkable People here: **https://podcasts.apple.com/us/podcast/guy-kawasakis-remarkable-people/id1483081827**
Like this show? Please leave us a review — even one sentence helps! Consider including your Twitter handle so we can thank you personally!
Thank you for your support; it helps the show!
See Privacy Policy at https://art19.com/privacy and California Privacy Notice at https://art19.com/privacy#do-not-sell-my-info.
-
How to Succeed at Failing, Part 2: Life and Death (Update)
AI transcript
0:00:06 Hey there, Stephen Dubner.
0:00:11 We are replaying a series we made in 2023 called How to Succeed at Failing.
0:00:13 This is the second episode.
0:00:17 We have updated all facts and figures as necessary.
0:00:19 As always, thanks for listening.
0:00:34 In early 2007, Carol Hemmelgarn’s life was forever changed by a failure, a tragic medical failure.
0:00:39 At the time, she was working for Pfizer, the huge U.S. pharmaceutical firm.
0:00:42 So she was familiar with the health care system.
0:00:46 But what changed her life wasn’t a professional failure.
0:00:47 This was personal.
0:00:59 My nine-year-old daughter, Alyssa, was diagnosed with leukemia, ALL, on a Monday afternoon.
0:01:01 And she died 10 days later.
0:01:07 In this day and age of health care, children don’t die of leukemia in nine days.
0:01:10 She died from multiple medical errors.
0:01:18 She got a hospital-acquired infection, which we know today can be prevented.
0:01:22 She was labeled.
0:01:27 And when you attach labels to patients, a bias is formed.
0:01:31 And it’s often difficult to look beyond that bias.
0:01:38 So one of the failures in my daughter’s care is that she was labeled with anxiety.
0:01:44 The young resident treating her never asked myself or her father if she was an anxious child.
0:01:45 And she wasn’t.
0:01:53 What happens is we treat anxiety, but we don’t treat scared, afraid, and frightened.
0:01:54 And that’s what my daughter was.
0:01:58 Hospitals are frightening places to children.
0:02:05 So my daughter, with her hospital-acquired infection, became septic.
0:02:11 But they were not treating her for the sepsis because all they could focus on is they thought she was anxious.
0:02:15 And they kept giving her drugs for anxiety.
0:02:23 Even though the signs, the symptoms, and me as her mother kept telling them something was wrong, something wasn’t right,
0:02:25 they wouldn’t listen to me.
0:02:35 So by the time she was failing so poorly and rushed to surgery and brought back out, there was nothing they could do for her.
0:02:43 The first harm was unintentional that they did to our daughter.
0:02:49 It was all the intentional harms after that, where we were lied to.
0:02:52 The medical records were hidden from us.
0:02:54 People were told not to talk to us.
0:03:03 And the fact that it took the organization three years, seven months, and 28 days to have the first honest conversation with us,
0:03:07 those were all intentional harms.
0:03:13 And that’s why in health care, we have to have transparency.
0:03:20 Because how many other children suffered because of the learning that didn’t take place?
0:03:31 Hemelgarn says she filed a claim against the hospital, but she didn’t move forward with a lawsuit because of the emotional toll.
0:03:34 She ultimately took a different path.
0:03:40 In 2021, she co-founded an advocacy group called Patients for Patient Safety U.S.
0:03:43 It is aligned with the World Health Organization.
0:03:51 She also runs a master’s program at Georgetown University called Clinical Quality, Safety, and Leadership.
0:03:59 When harm does reach the patient or family, that is the time to really analyze what happened.
0:04:09 And while you never want to harm a patient or family, one of the things you’ll hear from patients and families after they have been harmed
0:04:15 is they want to make sure that what happened to them or their loved one never happens again.
0:04:22 The example I can give for myself personally is I did go back to the very organization where my daughter died.
0:04:25 And I have done work there.
0:04:32 Today on Freakonomics Radio, we continue with our series on failure.
0:04:36 In the first episode, we acknowledge that some failure is inevitable.
0:04:41 We are, by definition, fallible human beings, each and every one of us.
0:04:43 And that failure can be painful.
0:04:45 I don’t think we should enjoy failure.
0:04:48 I think failure needs to burn on us.
0:04:54 This week, we focus on the healthcare system, where failure is literally a matter of life or death.
0:04:59 Some organizations felt like they had already achieved the patient safety mission.
0:05:03 Others, it wasn’t even part of their strategic plan.
0:05:08 And we will learn where on a spectrum to place every failure.
0:05:10 From inexcusable.
0:05:14 There’s lots of examples of huge public sector failures.
0:05:16 But this was one of the biggest.
0:05:18 To life-saving.
0:05:22 I really believe that if we could do this, it would make a big difference in medicine.
0:05:26 How to Succeed at Failing, Part 2, Beginning Now.
0:05:42 This is Freakonomics Radio.
0:05:45 The podcast that explores the hidden side of everything.
0:05:48 With your host, Stephen Dubner.
0:05:58 The story of Carol Hemmelgarn’s daughter is tragic.
0:06:03 A hospital death caused by something other than the reason the patient was in the hospital.
0:06:07 Unfortunately, that type of death is not as rare as you might think.
0:06:12 Consider the case of RaDonda Vaught, a nurse at Vanderbilt University Medical Center.
0:06:20 In 2019, she was prosecuted for having administered the wrong medication to a patient who subsequently died.
0:06:23 The patient was a 75-year-old woman who had been admitted to the hospital
0:06:27 for a subdural hematoma or bleeding in the brain.
0:06:31 Here is RaDonda Vaught testifying at her trial.
0:06:32 I was pulling this medication.
0:06:40 I didn’t think to double-check what I thought I had pulled from the machine.
0:06:42 I used the override function.
0:06:47 I don’t recall ever seeing any warnings that showed up on the monitor.
0:06:54 The medication that Vaught meant to pull from the AccuDose machine was a sedative called Versed.
0:06:58 What she mistakenly pulled was a paralytic called Vecuronium.
0:07:01 Vecuronium instead of Versed.
0:07:05 I won’t ever be the same person.
0:07:24 When I started being a nurse, I told myself that I wanted to take care of people the way that I would want my grandmother to be taken care of.
0:07:30 RaDonda Vaught was convicted of negligent homicide and gross neglect of an impaired adult.
0:07:32 Her sentence was three years probation.
0:07:41 You might expect a patient safety advocate like Carol Hemmelgarn to celebrate Vaught’s prosecution, but she doesn’t.
0:07:43 This doesn’t solve problems.
0:07:46 All this does is it creates silence and barriers.
0:07:54 When errors happen, so often the frontline workers, your nurses, allied health, physicians, were blamed.
0:07:59 But what we’ve come to realize is it’s really a systemic problem.
0:08:06 They happen to be at the frontline, but it’s underlying issues that are at the root of these problems.
0:08:10 It can be policies that aren’t the right policies.
0:08:13 It could be shortages of staff.
0:08:23 It can be equipment failures that are known at device companies but haven’t been shared with those using the devices.
0:08:31 It can be medication errors because of labels that look similar or drug names that are similar.
0:08:41 To get at the systemic problem in the Vanderbilt case, Hemmelgarn’s advocacy group filed a complaint with the Office of Inspector General in the Department of Health and Human Services.
0:08:46 What we found most frustrating was the lack of leadership from Vanderbilt.
0:08:50 Leadership never came out and took any responsibility.
0:08:52 They never said anything.
0:08:54 They never talked to the community.
0:08:58 It was essentially silence from leadership.
0:09:04 I think one of the other big failures we have in health care is fear.
0:09:09 Health care is rooted in fear because of the fear of litigation.
0:09:15 When there’s a fear of litigation, silence happens.
0:09:22 And until we flip that model, we’re going to continue down this road.
0:09:28 I absolutely share that worry.
0:09:33 And that case was, in my mind, a classic case of a complex failure.
0:09:34 Yes, there was a human error.
0:09:45 We also had faulty medication labeling and storing practices with alphabetical organization of drugs, which is not how you do it.
0:09:46 That’s Amy Edmondson.
0:09:49 We heard from her in our last episode.
0:09:52 She is an organizational psychologist at the Harvard Business School.
0:09:57 She recently published a book called Right Kind of Wrong, The Science of Failing Well.
0:10:01 The Vanderbilt case was not an example of failing well.
0:10:06 RaDonda Vaught, you will remember, dispensed Vecuronium instead of Versed.
0:10:14 You know, you don’t have a dangerous, potentially fatal drug next to one that’s routinely used in a particular procedure.
0:10:17 It’s what we might call an accident waiting to happen.
0:10:25 With that perspective in mind, RaDonda is as much a victim of a system failure as a perpetrator of the failure, right?
0:10:29 So, this reaction, human error is almost never criminal.
0:10:37 To criminalize this, I think, reflects an erroneous belief that by doing so, we’ll preclude human error.
0:10:41 No, what we will do is preclude speaking up about human error.
0:10:44 And to her credit, she spoke up.
0:10:54 And that, one could argue, ultimately led to her conviction. She would have been better off somehow trying to hide it, which I wouldn’t advocate, obviously.
0:11:05 But when we recognize, deeply recognize, that errors will happen, then that means that what excellence looks like is catching and correcting errors.
0:11:10 And then being forever on the lookout for vulnerabilities in our systems.
0:11:15 How often do these kinds of deaths happen?
0:11:18 Researchers have a hard time answering that question.
0:11:24 In 1999, the Institute of Medicine, known today as the National Academy of Medicine,
0:11:28 found that medical error causes between 44,000 and 98,000 deaths per year.
0:11:38 A 2013 study in the Journal of Patient Safety estimated the number of preventable deaths at U.S. hospitals at 200,000 a year.
0:11:45 But in 2020, a meta-analysis done by researchers at the Yale School of Medicine re-evaluated those past estimates.
0:11:53 They put the number of preventable deaths at 22,000 a year. But even 22,000 preventable deaths a year is way too many.
0:11:57 This issue has gotten a lot of attention within the medical community.
0:12:01 But Carol Hemmelgarn says the attention hasn’t produced enough change.
0:12:06 Some organizations felt like they had already achieved the patient safety mission.
0:12:10 Others, it wasn’t even part of their strategic plan.
0:12:19 There’s areas where improvement has definitely escalated since the report came out over 20 years ago.
0:12:22 But it hasn’t been fast enough.
0:12:27 What we see is that not everything is implemented in the system.
0:12:32 That you can oftentimes have champions that are doing this work.
0:12:36 And if they leave, the work isn’t embedded and sustainable.
0:12:46 Amy Edmondson at Harvard has been doing research on medical failure for a long time.
0:12:50 But she didn’t set out to be a failure researcher.
0:12:55 As an undergraduate, I studied engineering sciences and design.
0:13:00 Tell me about the first phase of your professional life, including with Buckminster Fuller.
0:13:04 Yeah, so I’m answering that question with a huge smile on my face.
0:13:13 I worked three years for Buckminster Fuller, who was an octogenarian, creative person, an inventor, a genius, a writer, a teacher.
0:13:16 Best known for the geodesic dome, which he invented.
0:13:23 But single-mindedly about how do we use design to make a better world.
0:13:25 You can’t sort of get people to change.
0:13:29 You have to change the environment, and then they’ll change with it, was a kind of notion that he had.
0:13:38 My part was just doing engineering drawings and building models and doing the mathematics behind new, simpler geodesic configurations.
0:13:39 And it was so much fun.
0:13:42 And what was his view on failure generally?
0:13:49 Oh, he was a very enthusiastic proponent of using failure to learn.
0:13:53 He said often, the only mistake we make is thinking we shouldn’t make mistakes.
0:14:01 He would give the example of the very first time he got a group of students together to build a geodesic dome that he had, you know, he’d done the math.
0:14:05 He’d come up with this idea, and he got, you know, 20 students together.
0:14:06 They’re outside.
0:14:09 They built the thing, and it immediately collapsed.
0:14:11 Okay.
0:14:14 And he enthusiastically said, okay, that didn’t work.
0:14:16 Now, what went wrong?
0:14:22 And it was really the materials they were using, which were, I think, the best way to describe them is Venetian blind materials.
0:14:27 They had the tensile strength, but they certainly didn’t have the compressive strength to do their job.
0:14:27 Okay.
0:14:35 And what are the steps you take to turn that failure into a useful learning, I guess is the noun we use these days?
0:15:06 It was several years into her engineering career that Edmondson decided to get a PhD in organizational behavior.
0:15:17 I was interested in learning in organizations, and I got invited to be a member of a large team studying medication errors in hospitals.
0:15:21 And the reason I said yes was, first of all, I was a first-year graduate student.
0:15:23 I needed to do something.
0:15:27 And second of all, I saw a very obvious link between mistakes and learning.
0:15:39 And so I thought, here we’ve got these really smart people who will be identifying mistakes, and then I can look at how do people learn from them and how easy is it and how hard is it.
0:15:41 So that’s how I got in there.
0:15:43 And then one thing led to another.
0:15:45 After doing that study, people kept inviting me back.
0:15:46 I see.
0:15:48 She loves failure, they say.
0:15:49 That’s right.
0:16:00 Edmondson focused her research on what are called preventable adverse drug events, like the one from the RaDonda Vaught case.
0:16:11 Now, you can divide adverse drug events into two categories, one which is related to some kind of human error or system breakdown, and the other which is a previously unknown allergy.
0:16:13 So it literally couldn’t have been predicted.
0:16:17 And those are still adverse drug events, but they’re not called preventable adverse drug events.
0:16:22 But within the first category, there’s probably 10 subcategories at least, right?
0:16:26 There’s bad data entry, bad handwriting, wrong eyeglasses.
0:16:27 On and on it goes.
0:16:28 Yep.
0:16:33 Or, you know, using language badly so that people didn’t understand what you said and they didn’t feel safe asking.
0:16:43 My wife had a knee surgery, easy knee surgery, and the painkiller that they prescribed on the spot, the doc actually stood there and wrote it, was for 100x the dosage.
0:16:44 Oh, no.
0:16:45 No.
0:16:45 Yeah.
0:16:46 Yeah.
0:16:49 See, that’s an error-driven, preventable adverse drug event.
0:16:50 Yes, I agree.
0:16:57 You know, there will always be things that go wrong or at least not the way we wanted them to.
0:17:10 And my observation in studying teams in a variety of industries and settings was that responses to failure were rather uniform, inappropriately uniform.
0:17:23 The natural response and even the formal response was to find the culprit as if there was a culprit and either discipline or retrain or, you know, shame and blame the culprit.
0:17:37 And it wasn’t a very effective solution because the only way to prevent those kinds of system breakdowns is to be highly vigilant to how little things can line up and produce failures.
0:17:51 Based on what she was learning from medical mistakes, Edmondson wanted to come up with a more general theory of failure, or if not a theory, at least a way to think about it more systematically.
0:17:58 To remove some of the blame, to make the responses to failure less uniform.
0:18:03 Over time, she produced what she calls, well, here, let’s have Edmondson say it.
0:18:06 My spectrum of causes of failures.
0:18:11 After the break, we will hear about that spectrum of causes of failures.
0:18:15 It can clarify some things, but not everything.
0:18:17 Uncertainty is everywhere.
0:18:20 I’m Stephen Dubner, and you are listening to Freakonomics Radio.
0:18:24 We will be right back with how to succeed at failing.
0:18:41 How did Amy Edmondson become so driven to study failure?
0:18:44 Well, here’s one path to it.
0:18:47 Her whole life, she had been a straight-A student.
0:18:48 Right, I never had an A-.
0:18:50 Well, you know, I once had one in 10th grade.
0:18:53 It just was so devastating, I resolved not to have one again.
0:18:55 And I’m only partly joking.
0:18:57 But then she went to college.
0:19:02 I got an F on my first semester multivariable calculus exam.
0:19:03 An F.
0:19:04 Like, I failed the exam.
0:19:05 I mean, that’s unheard of.
0:19:06 What’d that feel like?
0:19:11 I didn’t see it coming, but I wasn’t baffled after the fact.
0:19:15 After the fact, it was very clear to me that I hadn’t studied enough.
0:19:22 In the years since then, Edmondson has been refining what she calls a spectrum of causes of failure.
0:19:30 The spectrum ranges from blameworthy to praiseworthy, and it contains six distinct categories of failure.
0:19:32 Let’s take two extremes.
0:19:34 Let’s say something goes wrong.
0:19:36 We achieve an undesired result.
0:19:40 On one end of the spectrum, it’s sabotage.
0:19:42 Someone literally tanked the process.
0:19:44 They threw a wrench into the works.
0:19:54 On the other end of the spectrum, we have a scientist or an engineer hypothesizing some new tweak that might solve a really important problem.
0:19:58 And they try it, and it fails.
0:20:12 And, of course, we praise the scientist, but the gradations in between often lull us into a false sense that it’s blameworthy all the way.
0:20:13 Okay.
0:20:17 So let’s start at the blameworthy end of the spectrum and move our way along.
0:20:20 Number one of the six.
0:20:25 My spectrum of causes of failures starts with sabotage or deviance.
0:20:30 I soak a rag in lighter fluid, set it on fire, throw it into a building, right?
0:20:37 Or I’m a physician in a hospital, I’m a surgeon, and I come to work drunk and do an operation.
0:20:43 You describe this as the individual chooses to violate a prescribed process or practice.
0:20:50 Now, I could imagine there are some cases where people violate because they think that the process is wrong.
0:20:51 That’s right.
0:20:52 There has to be intent here.
0:20:59 To label something a true sabotage, it has to be my intent is to break something.
0:21:02 It’s not a mistake, and it’s not a thoughtful experiment.
0:21:14 There certainly are protocols in hospitals, for example, where thoughtful physicians will deliberately depart from the protocol because their clinical judgment suggests that would be better.
0:21:20 They may be right, they may be wrong, but that would not qualify as a blameworthy act.
0:21:25 After sabotage on the spectrum comes inattention.
0:21:29 Inattention is when something goes wrong because you just were mailing it in.
0:21:30 You spaced out.
0:21:34 You didn’t hear what someone said, and you didn’t ask, and then you just tried to wing it.
0:21:43 Or you maybe are driving, you’re a trucker, and you’re driving, and you look away or fiddle with the radio and have a car crash.
0:21:50 Now, it sounds like those are mostly blameworthy, but what about inattention caused by external factors?
0:21:52 Well, that’s exactly right.
0:22:02 Once we leave sabotage and move to the right in the spectrum, it will never be immediately obvious whether something’s blameworthy or not.
0:22:05 It’s always going to need further analysis.
0:22:11 So, when we say the failure was caused by someone not paying attention, that just brings up more questions.
0:22:12 Okay, why weren’t they paying attention?
0:22:21 Now, it could be that this poor nurse was on a double shift, and that is not necessarily the nurse’s fault.
0:22:35 It might be the nurse manager who assigned that double shift, or it might be the fact that someone else didn’t show up, and so they have to just do it, and they’re quite literally too tired to pay attention fully, right?
0:22:40 So, we always want to say, well, wait, let’s see, what are the other contributing factors to this inattention?
0:22:48 Can you think of a large-scale failure, a corporate or institutional failure that was caused largely by inattention?
0:22:49 Yes.
0:23:00 One that comes to mind was a devastating collapse with the loss of many lives when a Hyatt Regency atrium collapsed in Kansas City in the early 80s.
0:23:18 And the inattention there was the engineer of record’s failure to pay close attention when the builder decided, out loud, not hidden, to swap one long beam for two smaller connected steel beams.
0:23:24 It would have been a five-minute calculation to show that won’t work with the loads that were expected.
0:23:34 It was a change that didn’t obtain the attention it needed to have avoided this catastrophic failure.
0:23:38 And was that change done to save money, or was it even more benign than that?
0:23:41 I think it was a combination of speed and money.
0:23:42 Speed is money.
0:23:44 Wow, wow, wow, wow.
0:23:45 That’s a great example.
0:23:46 Okay, let’s go to the next one.
0:23:47 Inability.
0:23:58 I’m reading one version of your spectrum here, which describes this as the individual lacks the knowledge, attitudes, skills, or perceptions required to execute a task.
0:24:01 That’s quite a portfolio of potential failure.
0:24:02 That’s right.
0:24:07 And that spans from a young child who doesn’t yet know how to ride a bicycle.
0:24:20 So as soon as they hop on that bicycle, they’re going to fall off because they don’t have the ability yet to, you know, multivariable calculus, which at least when you’re not studying hard enough, you don’t have the ability.
0:24:28 So it’s something that you just don’t have the ability to do to success, but usually could develop.
0:24:40 This reminds me of the Peter Principle, where people get promoted to a position higher than they’re capable based on their past experience, but their past experience may not have been so relevant to this.
0:24:42 That’s a great connection.
0:24:54 Yeah, the Peter Principle, where the failure gets caused by the fact that you don’t have the ability to do the new role, but no one really paused to reflect on that.
0:24:56 I sometimes think about this in the political realm, too.
0:25:04 The ability to get elected and the ability to govern effectively seem to be almost uncorrelated to me.
0:25:10 I’m sorry to say, do you think that’s the case and do you apply this spectrum sometimes to the political realm?
0:25:16 I don’t think it was always the case, but I think it might be increasingly the case.
0:25:28 There’s no theoretical reason why the two abilities to be compelling and win people over to your point of view should be at odds with the capability to do it.
0:25:35 But the way it is increasingly set up in our society might be putting them at odds.
0:25:40 After inability comes what Edmondson calls task challenge.
0:25:46 Yes, the task is too challenging for reliable failure-free performance.
0:25:47 Example?
0:26:01 A great example is an Olympic gymnast who is training all the time and is able to do some of the most challenging maneuvers, but will not do them 100% of the time.
0:26:12 And so when that person experiences a failure, they trip during their routine, then we would call that a failure that was largely caused by the inherent challenge of the task.
0:26:17 Can you give an example in either the corporate or maybe academic realm?
0:26:19 Let’s go to NASA, for example.
0:26:22 The shuttle program is very, very challenging.
0:26:24 I think we can all agree to that.
0:26:27 And over time, they started to think of it as not challenging.
0:26:35 But really, it’s a remarkably challenging thing to send a rocket into space and bring it back safely.
0:26:38 Kind of paradoxical, then, that the thing was actually called Challenger.
0:26:40 That’s a good point.
0:26:50 Actually, I love Richard Feynman looking back on the Challenger accident, his sort of simple willingness to just put the piece of O-ring in the ice water and see what happens.
0:27:00 That’s something that in a better run, more psychologically safe, more creative, generative work environment, someone else would have done in real time.
0:27:08 But, you know, if I recall correctly, even though he was on that commission to investigate, they tried to essentially shut him up.
0:27:11 They didn’t want that news coming out at the hearing.
0:27:14 They wanted, you know, they didn’t want the failure to be so explicit.
0:27:15 That’s right.
0:27:18 But that’s, I mean, that’s not a good thing.
0:27:19 That’s not a good thing.
0:27:21 You’ve got to learn from it so that it doesn’t happen again.
0:27:30 By the way, if you don’t remember the story of Richard Feynman and the Challenger investigation and the O-rings, don’t worry.
0:27:34 Last year, we made a three-part series about Feynman.
0:27:41 The story of his role in the Challenger investigation is covered in part one of that series called The Curious Mr. Feynman.
0:27:43 Okay, back to failure.
0:27:49 The fifth cause of failure on Amy Edmondson’s spectrum is uncertainty.
0:27:52 So uncertainty is everywhere.
0:27:58 There’s probably, you know, an infinite number of examples here, but let me pick a silly one.
0:28:03 A friend sets you up on a blind date, and you like the friend, and you think, okay, sure.
0:28:09 And then you go out on the date, and it’s a terrible bore or worse, right?
0:28:10 It’s a failure.
0:28:13 But you couldn’t have known in advance.
0:28:14 It was uncertain.
0:28:17 How about a less silly example?
0:28:18 You’re in a company setting.
0:28:29 You have an idea for a strategic shift or a product that you could launch, and there’s very good reasons to believe this could work, but it’s not 100%.
0:28:39 The final cause of failure we have by now moved all the way from the blameworthy end of the spectrum to the praiseworthy is simply called experimentation.
0:28:43 I’m being fairly formal when I say experimentation, right?
0:28:52 The most obvious example is a scientist in a lab and probably really believes it will work and puts the chemicals in, and lo and behold, it fails.
0:29:01 Or in much smaller scale, I’m going to experiment with being more assertive in my next meeting and doesn’t quite work out the way I’d hoped.
0:29:05 It’s the Edison, quote, you know, 10,000 ways that didn’t work.
0:29:12 He’s perfectly, perfectly willing to share that because he’s proud of each and every one of those 10,000 experiments.
0:29:21 So that is Amy Edmondson’s entire spectrum of the causes of failure.
0:29:27 Sabotage, inattention, inability, task challenge, uncertainty, and experimentation.
0:29:36 If you’re like me, as you hear each of the categories, you automatically try to match them up with specific failures of your own.
0:29:47 If nothing else, you may find that thinking about failure on a spectrum from blameworthy to praiseworthy is more useful than the standard blaming and shaming.
0:29:50 It may even make you less afraid of failure.
0:29:56 That said, not everyone is a fan of Edmondson’s ethos of embracing failure.
0:30:05 A research article by Jeffrey Ray at the University of Maryland, Baltimore County, is called Dispelling the Myth that Organizations Learn from Failure.
0:30:09 He writes, failure shouldn’t even be in a firm’s vocabulary.
0:30:15 To learn from failure or otherwise, a firm must have an organizational learning capability.
0:30:26 If the firm has the learning capability in the first instance, why not apply it at the beginning of a project to prevent a failure, rather than waiting for a failure to occur and then reacting to it?
0:30:38 But Amy Edmondson’s failure spectrum has been winning admirers, including Gary Klein, the research psychologist best known as the pioneer of naturalistic decision making.
0:30:40 I’m very impressed by it.
0:30:43 I’m impressed because it’s sophisticated.
0:30:44 It’s not simplistic.
0:30:48 There’s a variety of levels and a variety of reasons.
0:31:04 And before we start making policies about what to do about failure, we need to, you know, look at things like her spectrum and identify what kind of a failure is it so that we can formulate a more effective strategy.
0:31:08 Okay, let’s do that.
0:31:14 After the break, two case studies of failure, one of them toward the blameworthy end of the spectrum.
0:31:19 It was very much driven by, you know, the Prime Minister, Tony Blair.
0:31:21 The other, quite praiseworthy.
0:31:25 I failed over 200 times before I finally got something to work.
0:31:26 I’m Stephen Dubner.
0:31:28 This is Freakonomics Radio.
0:31:29 We’ll be right back.
0:31:42 John Van Reenen is a professor at the London School of Economics.
0:31:43 He studies innovation.
0:31:47 But years ago, he did some time in the British Civil Service.
0:31:57 I spent a year of my life working in the Department of Health when there was a big expansion in the UK National Health Service of resources and various attempts at reforms.
0:32:17 One of the key things that was thought could really be a game changer was to have electronic patient records so you can see the history of patients, you know, their conditions, what they’ve been treated with.
0:32:41 And having that information, I mean, instead of having all this, you know, pieces of paper written illegibly by different physicians, you could actually have this in a single record would not only make it much easier to find what was going on with patients, but could also be used as a data source to try and help think about how patients have more joined up care and could even maybe predict what kind of conditions they might have in the future.
0:32:47 The project was called Connecting for Health, and there was substantial enthusiasm for it.
0:32:49 At least the ad campaign was enthusiastic.
0:32:53 All this is a key element in the future of the NHS.
0:32:58 One day, not too far away, you’ll wonder how you live without it.
0:33:03 It was very much driven by the Prime Minister, Tony Blair.
0:33:11 This was a centralised, top-down approach in order to have a single IT system where you could access information.
0:33:19 Instead of having all these different IT systems, these different siloed, you know, pieces of paper, to have it in one consistent national system.
0:33:25 The NHS is a big operation, one of the biggest employers in the world.
0:33:29 But then, if you drill down into it, it is pretty fragmented.
0:33:33 Each local general practitioner unit is self-employed.
0:33:35 Each trust has a lot of autonomy.
0:33:44 And that’s part of the issue, is that, you know, this was a centralised, top-down programme in a system where there’s a lot of different fiefdoms,
0:33:49 a lot of different pockets of power, who are quite capable of resisting this,
0:33:54 and disliked very strongly being told, this is what you’re going to have, this is what you’re going to do,
0:33:57 without really being engaged and consulted properly.
0:34:04 But the train rolled on, despite these potential problems.
0:34:10 Connecting for health required a massive overhaul of hardware systems as well as software systems.
0:34:19 And the delivery of those was, there was a guy called Richard Granger, who was brought in, I think he was the highest paid public servant in the country.
0:34:24 He was at Deloitte’s before he came, and then after he left, he went to work for Accenture.
0:34:29 He was brought in to do this, and he designed these contracts, very tough contracts,
0:34:35 which loaded the risk of things going wrong very strongly onto the private sector providers.
0:36:43 I think just about every single, quote-unquote, winner eventually either went bankrupt or walked away from the contract.
0:34:49 The estimates vary of the cost of this, but, you know, estimates are up to $20 billion lost on this project.
0:34:53 It was the biggest civilian IT project in the Western world.
0:34:59 I mean, there’s lots of examples of huge, you know, public sector failures and private sector failures as well,
0:35:01 but this was one of the biggest.
0:35:06 British Parliament ultimately called this attempted reform, quote,
0:35:10 one of the worst and most expensive contracting fiascos ever.
0:35:14 So, what kind of lessons can be learned from this failure?
0:35:17 I think it’s a failure of many, many different causes on many different levels.
0:35:24 That top-down-ness, not really understanding what was going on at a grassroots level,
0:35:26 and the haste, it was attempted very quickly.
0:35:31 I’ve read that the haste, especially the haste of awarding contracts at the time,
0:35:35 was considered a great thing because it was so atypical of how government worked,
0:35:40 and it was hailed as, you know, a new way of the government doing business.
0:35:43 In the end, that haste turned out to be a problem, though, correct?
0:35:48 Correct. I mean, it seemed at the time when these contracts were formed,
0:35:50 the government was getting a good deal, and they were doing it quickly,
0:35:52 they were loading the risks onto the suppliers.
0:35:59 So, it wasn’t obvious from the get-go that this was going to be as bad as it turned out to be.
0:36:05 Looking back, trying to do things quickly in such a complicated system,
0:36:10 there was so much complexity that a lot of these contracts effectively had to be rewritten afterwards.
0:36:15 And I think, you know, another general lesson is that when you’re doing a long-term,
0:36:19 important, big contract, you can’t get everything written down quickly.
0:36:21 There has to be a lot of give and take.
0:36:24 It’s a kind of relationship that you have to adjust as things go.
0:36:26 You know, contracts are very fuzzy.
0:36:27 They’re very incomplete.
0:36:32 You just have to accept that, that you’re going to have to not get things right,
0:36:36 but not try to do everything really, really, really quickly.
0:36:39 An IT project’s never just about IT.
0:36:42 It’s also about the way you change a whole organization.
0:36:45 And to do it, it’s not just about spending money.
0:36:48 You also have to get players in that system on board,
0:36:52 because it’s very difficult to just get them to do things,
0:36:58 especially, you know, in a public system where you can’t just fire people if you want to fire them.
0:37:01 You really have to have a culture of kind of bringing people on board
0:37:04 if you want to make these type of changes, and that just didn’t happen.
0:37:06 So I don’t think it’s just one thing you could think of.
0:37:10 There’s the haste, there’s the design which worked out badly,
0:37:13 and there’s the cultural aspects that we’ve talked about.
0:37:19 When you’re trying to innovate, you want to have a way of allowing people to take risks
0:37:23 and do things wrong, but then you also have to have feedback mechanisms
0:37:26 to figure out, well, you know, what has gone wrong.
0:37:31 So creating an attitude of saying, well, we actually don’t know what the right thing to do is,
0:37:35 so we’re prepared to do experimentations and learn from that.
0:37:46 If you are the kind of person who likes to understand and analyze failure in order to mitigate future failures,
0:37:52 what might be useful here is to overlay the National Health Service’s IT fiasco
0:37:57 onto Amy Edmondson’s spectrum of causes of failure.
0:38:03 Reconfiguring a huge IT system certainly qualifies as a task challenge,
0:38:08 but there were shades of inability and inattention at work here as well.
0:38:13 All of those causes reside toward the blameworthy end of the scale.
0:38:19 As for the praiseworthy end of the spectrum, that’s where experimentation can be found.
0:38:23 The NHS project didn’t incorporate much experimentation.
0:38:29 It was more command and control, top-down, with little room for adjustment
0:38:35 and little opportunity to learn from the small failures that experimentation can produce
0:38:37 and which can prevent big failures.
0:38:40 Experimentation, if you think about it,
0:38:44 is the foundation of just about all the learning we do as humans.
0:38:48 And yet, we seem to constantly forget this.
0:38:54 Maybe that’s because experimentation will inevitably produce a lot of failure.
0:38:56 I mean, that’s the point.
0:38:59 And most of us just don’t want to fail at all,
0:39:02 even if it’s in the service of long-term success.
0:39:07 So let’s see if we can’t adjust our focus here.
0:39:10 Let’s talk about real experimentation.
0:39:17 And for that, we will need not another social scientist like John Van Reenen or Amy Edmondson,
0:39:22 as capable as they are, but an actual science scientist.
0:39:27 Here is one of the most acclaimed scientists of the modern era.
0:39:31 My name’s Bob Langer, and I’m an institute professor at MIT.
0:39:35 I do research, but I’ve also been involved in helping get companies started,
0:39:41 and I’ve done various advising to the government, FDA, and places like that.
0:39:45 And if I say to you, Bob, what kind of scientist are you exactly?
0:39:46 How do you answer that question?
0:39:50 Well, I would say I’m a chemical engineer or a biomedical engineer,
0:39:53 but people have called me all kinds of things.
0:39:55 You know, they’ve called me a biochemist.
0:40:00 We do very interdisciplinary work, so I end up getting called more than one thing.
0:40:02 Do you care what people call you?
0:40:04 I just like them to call me Bob.
0:40:11 Langer holds more than 1,500 patents, including those that are pending.
0:40:16 He runs the world’s largest biomedical engineering lab at MIT,
0:40:20 and he is one of the world’s most highly cited biotech researchers.
0:40:26 He also played a role in the founding of dozens of biotech firms, including Moderna,
0:40:29 which produced one of the most effective COVID vaccines.
0:40:33 One thing Langer is particularly known for is drug delivery.
0:40:40 That is, developing and refining how a given drug is delivered and absorbed at the cellular level.
0:40:44 A time-release drug, for instance, is the sort of thing we take for granted today,
0:40:46 but it took a while to get there.
0:40:52 One problem Langer worked on back in the 1970s was finding a drug delivery system
0:40:55 that would prevent the abnormal growth of blood vessels.
0:41:00 The chemical that inhibits the growth is quite large by biological standards,
0:41:05 and there was consensus at the time that a time-release wouldn’t work on large molecules.
0:41:07 But as Langer once put it,
0:41:11 I didn’t know you couldn’t do it because I hadn’t read the literature.
0:41:16 So he ran experiment after experiment after experiment
0:41:19 before finally developing a recipe that worked.
0:41:23 Decades later, thanks to all that failure,
0:41:26 his discovery played a key role in how Moderna
0:41:30 used messenger RNA to create its COVID vaccine.
0:41:40 So, in your line of work, when I say the word failure, what comes to mind?
0:41:44 Well, I mean, a lot of things, but I’d go back to my own career.
0:41:46 I failed at trying to get research grants.
0:41:48 My first nine research grants were turned down.
0:41:51 I’d send them to places like National Institutes of Health,
0:41:53 and they have study sections, reviewers.
0:41:56 Mine would go, just because of the work I was doing,
0:41:58 to what was called a Pathology B study section,
0:42:01 and they would review it, and they said,
0:42:03 well, Dr. Langer, you know, he’s an engineer.
0:42:05 He doesn’t know anything about biology or cancer.
0:42:07 I failed over and over again.
0:42:11 Other things, like I failed to get a job in a chemical engineering department
0:42:13 as an assistant professor, even.
0:42:14 Nobody would hire me.
0:42:15 They said actually the opposite.
0:42:17 They said, you know, chemical engineers
0:42:20 don’t do experimental biomedical engineering work,
0:42:23 so, you know, they should work on oil or energy.
0:42:27 When I first started working on creating these micro or nanoparticles
0:42:29 to try to get large molecules to be delivered,
0:42:34 I failed over 200 times, I mean, before I finally got something to work.
0:42:35 I could go on and on in my failures.
0:42:39 What kept you going during all this failure?
0:42:42 I really believed that if we could do this,
0:42:44 it would make a big difference in science,
0:42:45 and I hoped a big difference in medicine.
0:42:48 Secondly, as I did some of it, you know,
0:42:51 I could see some of these results with my own eyes.
0:42:53 You know, when we were trying to deliver
0:42:56 some of these molecules to stop blood vessel growth,
0:42:58 I could see we were doing this double blind,
0:43:01 but I could still see that we were stopping the vessels from growing.
0:43:03 That’s such a visual thing.
0:43:05 And I also developed these ways of studying
0:43:07 delivery out of the little particles
0:43:09 by putting certain enzymes in them
0:43:12 and putting dyes in a little gel
0:43:15 that would turn color if the enzymes came out.
0:43:16 And I could see that happening.
0:43:18 Like I said, the first 200 times
0:43:20 or first 200 designs or more,
0:43:21 it didn’t happen.
0:43:23 But finally, I came up with a way
0:43:25 where I’d see it come out after an hour,
0:43:27 after two hours, after a day,
0:43:28 after a second day,
0:43:30 up to over 100 days in some cases.
0:43:32 So I could see with my own eyes this was working.
0:43:35 So that made an enormous difference to me too.
0:43:39 But failing 200 times costs a lot of money
0:43:40 and obviously a lot of time.
0:43:43 Did you ever almost run out of one or the other?
0:43:45 The experiments I was doing weren’t that expensive,
0:43:48 especially the delivery ones initially
0:43:49 because they were in test tubes.
0:43:51 I worked probably 20-hour days.
0:43:54 And so the expense wasn’t that great.
0:43:56 And I’ve always been good at manufacturing time.
0:44:02 Now, let’s say someone is in a similar situation today
0:44:04 to where you were then with an idea
0:44:07 or a set of ideas that they believe in,
0:44:10 that they think they are right about,
0:44:12 they think it’s an important idea,
0:44:16 and yet they are failing and failing
0:44:17 to get the attention of the people
0:44:20 who can help manufacture success.
0:44:22 How do you think about the line?
0:44:24 I think of it sometimes as the line
0:44:26 between grit and quit, right?
0:44:28 Economists talk about opportunity cost.
0:44:30 Every hour you spend on something that isn’t working
0:44:31 is an hour you could spend on something
0:44:32 that is working.
0:44:34 But then psychologists talk about grittiness
0:44:37 and how useful it can be to stick things out.
0:44:38 Do you have anything to say to people
0:44:40 who might be wrestling with that?
0:44:41 Well, I think it’s a great question.
0:44:44 And I ultimately think it’s a judgment call
0:44:45 and we can never be sure of our judgment.
0:44:47 You like to try to think,
0:44:50 are these things scientifically possible?
0:44:51 I think that’s one thing.
0:44:54 Secondly, it’s good to get advice from people.
0:44:55 It doesn’t mean you have to take it,
0:44:57 but it’s good to get advice.
0:45:00 I certainly personally have always erred
0:45:02 on the side of, I guess, not quitting.
0:45:04 And maybe that’s sometimes a mistake.
0:45:05 I don’t think so.
0:45:08 I think it depends on what could happen
0:45:09 if you are successful.
0:45:11 You know, if you are successful,
0:45:13 could it make a giant difference in the world?
0:45:15 Could it help science a lot?
0:45:17 Could it help patients’ lives a lot?
0:45:19 And so if you really feel that it can,
0:45:20 you try that much harder.
0:45:22 If it’s incremental, sure,
0:45:24 then it’s much easier to quit.
0:45:26 Is that ability to persevere,
0:45:27 within yourself at least,
0:45:29 do you think that’s your natural temperament?
0:45:30 Is that something you learned?
0:45:32 Did you find incentives to lead you there?
0:45:34 I think for me, there are a couple of things.
0:45:37 One, I guess I’ve always been very stubborn.
0:45:38 My parents told me that.
0:45:39 But secondly,
0:45:42 I think there’s a whole thing with role models too.
0:45:44 When I was a postdoc,
0:45:46 the man that I worked with, Judah Folkman,
0:45:48 he experienced the same thing.
0:45:51 He had this theory that if you could stop blood vessels,
0:45:52 you could stop cancer.
0:45:55 And that was mediated by chemical signals.
0:45:57 And everyone told him he was wrong.
0:45:59 But I would watch him every day.
0:46:01 And he believed anything was possible.
0:46:02 And he kept sticking to it.
0:46:04 And of course, eventually he was right.
0:46:10 I think seeing his example probably also had a big effect on me.
0:46:15 Can you talk to me about how scientific failure is treated generally?
0:46:16 Let’s assume a spectrum.
0:46:21 And on one end of the spectrum is that every failure is written up and published
0:46:27 and perhaps even celebrated as having discovered a definitive wrong path to pursue.
0:46:30 So everybody coming after you can cross that off their list.
0:46:32 And on the other end of the spectrum,
0:46:34 every failure is hidden away,
0:46:38 which allows many other people to make the same failure.
0:46:40 Can you talk about where the reality is?
0:46:42 I think that’s an interesting question.
0:46:45 A lot of it even depends how you define failure.
0:46:47 You know, when you’re trying to learn about something,
0:46:49 you try different things.
0:46:52 And embedded in the scientific papers we write,
0:46:55 like when we wrote this paper in Nature in 1976,
0:46:58 which was the first time you could get small particles
0:47:02 to release large molecules from biocompatible materials,
0:47:05 well, some of the materials we used failed.
0:47:08 A lot of them did, actually, because they would either cause inflammation
0:47:12 or the drug would come out way too fast or not come out at all.
0:47:16 We found one fraction that worked and stopped blood vessels
0:47:19 and probably 50 or 100 that didn’t.
0:47:23 So the failures and successes are maybe in the same papers sometimes.
0:47:26 What I’ve tried to do, even to give more detail,
0:47:30 is you put all the data in, even if it makes for a very long thesis.
0:47:34 So not only are the graphs there and the papers,
0:47:37 but there’s even the raw data that people can look at and analyze.
0:47:40 And I try to get people to do as much of that as possible.
0:47:43 So I guess what I’m trying to say is that
0:47:46 the failures and successes are almost intertwined.
0:47:48 I’d like to hear you talk about
0:47:53 how failure is discussed or thought of in the lab.
0:47:55 Maybe it’s nothing overt, but I am curious,
0:47:58 especially when you bring in young people, researchers,
0:48:00 whether they’re, you know, postdoc or undergrad,
0:48:02 do you give pep talks about failure?
0:48:06 Do you kind of have a philosophy that you want to instill in these people
0:48:10 that failure is an essential component of research and success?
0:48:11 Yes.
0:48:11 Yes.
0:48:13 And I do.
0:48:17 And I, whether it’s my own talks or just meeting with students
0:48:20 and brainstorming with them about those things.
0:48:23 But to me, that research, scientific research,
0:48:25 I mean, you just fail way more than,
0:48:27 at least I do, way more than you succeed.
0:48:29 It’s just part of the process.
0:48:32 I mean, that’s experimentation and that’s okay.
0:48:37 A lot of your colleagues and students go on to start companies,
0:48:40 and that’s a whole different ball of wax.
0:48:43 How do you think about failure in the entrepreneurial process?
0:48:51 Obviously, the easy criteria is a successful company having a good financial exit, I suppose.
0:48:54 But I don’t necessarily think of it as just that way.
0:48:56 I mean, that’s certainly going to be important.
0:49:02 You know, I’ve been involved in things where you’ve advanced science and you learn some things,
0:49:03 and there’s degrees of success.
0:49:04 You just don’t know.
0:49:09 I’ve been pretty fortunate in the companies we’ve started in terms of the exits that they’ve had.
0:49:13 But I just think there’s no simple criteria.
0:49:17 I feel like we’ve turned out a lot of great scientists and entrepreneurs,
0:49:20 and not all their companies have had great financial exits.
0:49:24 But I think they’ve also created products that can change people’s lives.
0:49:27 And that, to me, is also very, very important, obviously.
0:49:28 That’s why we do it in the first place.
0:49:31 I have never done it for money, and I don’t think they do it for money.
0:49:34 They do it to try to make a difference in the world.
0:49:38 Do you think failure is, however, a different animal in the research sphere
0:49:40 as in the entrepreneurial sphere?
0:49:43 I would say yes, I think it is.
0:49:46 But I also think, you know, there’s different cultures, too.
0:49:49 I think the good thing about the United States culture,
0:49:53 maybe in contrast to some cultures, is failure is widely accepted.
0:49:58 I’ll give you one of my examples, actually, in the business sphere.
0:50:00 So I’m a big fan of chocolate.
0:50:03 Of eating it or making it or researching it?
0:50:05 Probably any part, but mostly eating it.
0:50:08 But at any rate, one of the books I read, and I’m actually not a fan of their chocolate,
0:50:10 is a book on Milton Hershey.
0:50:13 And so this really gets to your point on failure.
0:50:19 Milton Hershey, he had this idea when he was young, very young, of starting a candy company.
0:50:22 And I remember the first candy company, he went bankrupt.
0:50:24 You know, and he tried to raise more money, started another one.
0:50:29 I think, like, the first six or seven totally failed, but not the last one, obviously.
0:50:32 And he became a millionaire at a time when there weren’t very many.
0:50:33 Was that really failure?
0:50:38 Or was it just being an apprentice to trying to learn how to succeed?
0:50:41 And I think that’s true in a lot of things.
0:50:48 The reason I brought it up is I don’t think there’s a shame in failure in either area, or I hope there’s not.
0:50:51 I think you have to feel it’s okay.
0:50:53 And then you keep going on.
0:50:56 What do you think?
0:51:01 Would you like to live in a world where there’s no shame in failure?
0:51:09 Or do you think it’s important for failure to hurt, to burn, as one of our guests put it last week?
0:51:12 Maybe that creates a stronger incentive to succeed.
0:51:18 I’d love to know your thoughts on this question and on this series so far.
0:51:24 Send an email to radio at Freakonomics.com, or you can leave a review or rating in your podcast app.
0:51:30 Coming up next time on the show, we will dig deeper into the idea of grit versus quit.
0:51:35 When you’re failing, how do you know if it’s time to move on?
0:51:44 We just could not stop it from leaking, and I was no longer willing to just keep pouring more and more of my money into it.
0:51:48 He dumped me when I was 70, and I married him again at age 75.
0:51:49 You know, hope springs eternal.
0:51:50 This is a great idea.
0:51:52 You just have to raise a quarter million dollars.
0:51:59 Case studies in failure and in grit versus quit, including stories from you, our listeners.
0:52:02 That’s in the next part of our series on failure.
0:52:05 Until then, take care of yourself.
0:52:07 And if you can, someone else too.
0:52:11 Freakonomics Radio is produced by Stitcher and Renbud Radio.
0:52:14 This episode was produced by Zach Lipinski.
0:52:17 He and Dalvin Abawaji worked on the update.
0:52:21 It was mixed by Eleanor Osborne and Jasmine Klinger with help from Jeremy Johnston.
0:52:31 The Freakonomics Radio network staff also includes Alina Cullman, Augusta Chapman, Ellen Frankman, Elsa Hernandez, Gabriel Roth, Greg Rippin, Morgan Levy, Sarah Lilly, and Tao Jacobs.
0:52:40 You can find our entire archive on any podcast app, also at Freakonomics.com, where we publish transcripts and show notes.
0:52:45 Our theme song is Mr. Fortune by the Hitchhikers, and our composer is Luis Guerra.
0:52:55 The conversation that we had casually last year was a great conversation.
0:52:59 If we can essentially do something similar, that’ll be fantastic for our listeners.
0:53:01 I’ll try to remember what I said.
0:53:09 The Freakonomics Radio Network.
0:53:11 The hidden side of everything.
0:53:15 Stitcher.
In medicine, failure can be catastrophic. It can also produce discoveries that save millions of lives. Tales from the front line, the lab, and the I.T. department.
- SOURCES:
- Amy Edmondson, professor of leadership management at Harvard Business School.
- Carole Hemmelgarn, co-founder of Patients for Patient Safety U.S. and director of the Clinical Quality, Safety & Leadership Master’s program at Georgetown University.
- Gary Klein, cognitive psychologist and pioneer in the field of naturalistic decision making.
- Robert Langer, institute professor and head of the Langer Lab at the Massachusetts Institute of Technology.
- John Van Reenen, professor at the London School of Economics.
- RESOURCES:
- Right Kind of Wrong: The Science of Failing Well, by Amy Edmondson (2023).
- “Reconsidering the Application of Systems Thinking in Healthcare: The RaDonda Vaught Case,” by Connor Lusk, Elise DeForest, Gabriel Segarra, David M. Neyens, James H. Abernathy III, and Ken Catchpole (British Journal of Anaesthesia, 2022).
- “Estimates of preventable hospital deaths are too high, new study shows,” by Bill Hathaway (Yale News, 2020).
- “Dispelling the Myth That Organizations Learn From Failure,” by Jeffrey Ray (SSRN, 2016).
- “A New, Evidence-Based Estimate of Patient Harms Associated With Hospital Care,” by John T. James (Journal of Patient Safety, 2013).
- To Err is Human: Building a Safer Health System, by the National Academy of Sciences (1999).
- “Polymers for the Sustained Release of Proteins and Other Macromolecules,” by Robert Langer and Judah Folkman (Nature, 1976).
- The Innovation and Diffusion Podcast, by John Van Reenen and Ruveyda Gozen.
- EXTRAS:
- “The Curious, Brilliant, Vanishing Mr. Feynman,” series by Freakonomics Radio (2024).
- “Will a Covid-19 Vaccine Change the Future of Medical Research?” by Freakonomics Radio (2020).
- “Bad Medicine, Part 3: Death by Diagnosis,” by Freakonomics Radio (2016).
#228 Elad Gil: How to Spot a Billion-Dollar Startup Before the Rest of the World
AI transcript
0:00:06 You’re one of the most successful investors that a lot of people have probably never heard of.
0:00:09 AI is the only market where the more I learn, the less I know.
0:00:11 And in every other market, the more I learn, the more I know.
0:00:13 The more I’m able to predict things, and I can’t predict anything anymore.
0:00:15 What scares you about the future?
0:00:18 That’s a big question.
0:00:22 I think in a couple of years, we’ll start thinking about it as we’re selling units of cognition.
0:00:27 AI is dramatically underhyped because most enterprises have not done anything in it.
0:00:31 And that’s where all the money is, all the changes, all the impact is, all the jobs, everything.
0:00:36 The people that I know who have been very successful or driven solely by money end up miserable.
0:00:38 Because they have money, and then what?
0:00:41 It’s just, what do you do? What fulfills you?
0:00:44 What are the most common self-inflicted wounds that kill companies?
0:00:46 I think that…
0:00:49 What do you think is the next wave?
0:00:53 I think it’s going to be an ongoing wave of…
0:00:55 And that’s coming, right? And that hasn’t even hit yet.
0:01:17 Welcome to the Knowledge Project Podcast.
0:01:20 I’m your host, Shane Parrish.
0:01:26 In a world where knowledge is power, this podcast is your toolkit for mastering the best of what other people have already figured out.
0:01:34 My guest today is Elad Gil, who has had a front row seat to some of the most important technology companies started in the past two decades.
0:01:40 He invested early in Stripe, Airbnb, Notion, Coinbase, Anduril, and so many others.
0:01:45 He’s also authored an incredible book on scaling startups called High Growth Handbook.
0:01:49 In my opinion, he’s one of the most underrated figures in Silicon Valley.
0:01:58 In this episode, we explore how he thinks about startups, talent, decision-making, AI, and most importantly, the future of all of these things.
0:02:08 We talk about the importance of clusters, why most companies die from self-inflicted wounds, and what it really means to scale a company, and importantly, what it means to scale yourself.
0:02:11 It’s time to listen and learn.
0:02:22 You’ve had a front row seat at some of the biggest, I would say, surprises in a way, like Stripe, Coinbase, Airbnb, when they were just ideas.
0:02:25 What was the moment where you recognized these were going to be outliers?
0:02:28 So all three of those are very different examples, to your point.
0:02:31 I invested in Airbnb when it was probably around eight people.
0:02:33 Stripe was probably around the same size.
0:02:37 And then Coinbase only got involved with much later, when it was a billion-dollar-plus company.
0:02:40 And even then, I thought there was enormous upside on it, which luckily has turned out to be the case.
0:02:46 I think really, the way I think about investing in general is that there’s two dimensions that really matter.
0:02:52 The first dimension is what people call product-market fit, or is there a strong demand for whatever it is you’re building?
0:02:55 And then, secondarily, I look at the team.
0:02:58 And I think most early-stage people flip it.
0:03:00 They look at the team first, and how good is a founder?
0:03:02 And obviously, I’ve started two companies myself.
0:03:06 I think the founder side is incredibly important, and the talent side is incredibly important.
0:03:13 But I’ve seen amazing people get crushed by terrible markets, and I’ve seen reasonably mediocre teams do extremely well in what are very good markets.
0:03:16 And so, in general, I first ask, do I think there’s a real need here?
0:03:17 How is it differentiated?
0:03:18 What’s different about it?
0:03:20 And then I dig into, like, are these people exceptional?
0:03:22 How will they grow over time?
0:03:24 What are some of the characteristics of how they do things?
0:03:35 Let’s go into people’s second, but how do you determine product-market fit in a world where a lot of people are buying product-market fit almost through brute force or giving away product?
0:03:40 Yeah, there’s a lot of signals you can look at, and I think it’s kind of varied by type of business.
0:03:42 Is it a consumer business versus enterprise versus whatever?
0:03:46 For things like consumer businesses, you’re just looking at organic growth rate and retention.
0:03:47 Are people using it a lot?
0:03:48 Are they living in it every day?
0:03:49 That sort of thing.
0:03:51 That would be early Facebook, right?
0:03:52 The usage metrics were insane.
0:03:58 And then for certain B2B products, it could be rate of growth and adoption.
0:04:02 It could be metrics people call, like, NDR, net dollar retention, or other things like that.
0:04:09 Honestly, if you’re investing before the thing even exists in the market, then you have to really dig into how much do I believe there’s a need here, right?
0:04:10 Or how much is there a customer need?
0:04:16 So I invested in Rippling and other related companies before there’s anything built, right?
0:04:18 Under the premise that this is something that a lot of people want.
0:04:21 And Notion is the same thing.
0:04:24 Actually, Notion was a rare example where I did it as a personal investment.
0:04:26 I met Ivan, who’s a CEO over there.
0:04:33 And everything about him was so aesthetically cohesive in a very odd way.
0:04:38 The way he dressed, his hairstyle, the color scheme of his clothes, the color scheme of the app and the pitch deck.
0:04:41 The only other person I’ve seen that with is Jack Dorsey, who started Square and Twitter.
0:04:46 And there was this odd, almost pure embodiment of aesthetic.
0:04:50 And I just thought it was so intriguing and so cool.
0:04:53 And I’ve only seen two people like that before that I had to invest.
0:04:56 And it was just this immense consistency.
0:04:57 It was very weird.
0:05:01 And you see that, like, you go to his house and it’s like, it feels like him.
0:05:03 You know, everything, the company feels like him.
0:05:04 Everything feels like him.
0:05:05 It’s fascinating.
0:05:06 He’s done an amazing job with it.
0:05:09 It almost stands out to the point where you think it’s manufactured.
0:05:11 I think it’s genuine.
0:05:12 I think it’s almost the opposite.
0:05:14 You feel the purity of it.
0:05:16 You’re like, oh my gosh, there’s a unique aesthetic element here.
0:05:23 And that probably reflects some unique way of viewing the world or thinking about products or thinking about people and their usage.
0:05:25 Let’s come back to outliers.
0:05:27 So product market fit, outliers.
0:05:30 How do you identify an outlier team?
0:05:34 Yeah, you know, I think it really depends on the discipline or the area.
0:05:37 For tech, I think it’s very different than if you’re looking in other areas.
0:05:43 For an early tech team, I almost use like this Apple framework of Jobs, Wozniak, and Cook, right?
0:05:46 Steve Jobs and Steve Wozniak started Apple together.
0:05:51 Steve Jobs was known as somebody who really was great at setting the vision and direction, but also was just an amazing salesperson.
0:05:54 And selling means selling employees to join you.
0:05:55 It means raising money.
0:05:57 It means selling your first customers.
0:05:58 It’s negotiating your supply chain.
0:06:01 Those are all aspects of sales in some sense or negotiation.
0:06:06 And so you need at least one person who can do that unless you’re just doing a consumer product that you threw out there, right?
0:06:08 And it just grows and then people join you because it’s growing.
0:06:13 Then you need somebody who can build stuff and build it in a uniquely good way.
0:06:15 And that was Wozniak, right?
0:06:21 The way that he was able to hack things together, drop chips from the original design of Apple devices, etc., was just considered legendary.
0:06:25 And then as the thing starts working, you eventually need somebody like Tim Cook who can help scale the company.
0:06:30 And so you could argue that was Sheryl Sandberg in the early days of Facebook who eventually came on as a hire and helped scale it.
0:06:35 And Zuck was really the sort of mixture of the product visionary, the salesperson, etc.
0:06:42 Why did all these people concentrate in San Francisco almost or California?
0:06:47 How did that happen where you had Apple, you have Stripe, you have Coinbase, you have Facebook?
0:06:50 Walk me through that.
0:06:54 We were talking a little bit about this before we started recording about clusters of people.
0:07:07 Yeah, it’s really fascinating because if you look at almost every major movement throughout history, and that could be a literary movement, it could be an artistic movement, it could be a finance movement, economic schools of thought.
0:07:19 It’s almost always a group of young people aggregating in a specific city who all somehow find each other and all start collaborating and working together towards some common set of goals that reflect that.
0:07:26 So there was, you know, a famous literary school in the early 20th century in London.
0:07:35 That was, I think it was like Virginia Woolf and John Maynard Keynes and E.M. Forster and all these people all kind of aggregated and became friends and started supporting each other.
0:07:37 Or you look at the Italian Renaissance.
0:07:44 Similarly, in Florence, you had this aggregation of all these great talents, all in a timely manner coincident with each other.
0:07:51 Or Fauvism or Italian Futurism or Impressionism, Paris in the, you know, late 1800s.
0:07:52 And so that repeatedly happens for everything.
0:07:55 And similarly, that’s happened for tech.
0:07:58 And even within tech, we’ve had these successive waves, right?
0:08:03 Really, the founding story of Silicon Valley goes back to the defense industry and then the semiconductor industry, right?
0:08:06 Defense was HP and other companies starting off in the 40s.
0:08:13 You then ended up with Shockley Semiconductor and Fairchild Semiconductor in the early semiconductor companies, 50s, 60s.
0:08:24 And that kind of established Silicon Valley as a hub and as things moved from microprocessors to computers to software, people just kept stuff propagating across those waves from within the industry.
0:08:29 So one big thing is just you have a geographic cluster and you have that for every single industry.
0:08:32 You look at wineries and they’re clustered in a handful of places because of geography.
0:08:35 You look at the energy industry, it’s in a handful of cities.
0:08:38 Finance is in New York and Hong Kong and London.
0:08:48 So every single industry has clusters, Hollywood and Bollywood and, you know, Lagos in Nigeria are the main hubs for, you know, movie making in different regions.
0:08:51 So in Silicon Valley, obviously, we created this tech cluster.
0:09:00 But then even within the tech cluster, there are these small pockets of people that I mentioned earlier that somehow find each other and self-aggregate.
0:09:03 It’s funny, I was talking to Patrick Collison, the founder of Stripe, about this.
0:09:12 And he mentioned that when he was 18 and he showed up in Silicon Valley as a nobody, right, completely unknown, 18-year-old, nobody’s heard of him.
0:09:19 And during that six-month period that he was first here, he said he met all these people who are now giants of Silicon Valley.
0:09:25 And it was this weird self-aggregation of people kind of finding and meeting each other and talking about what each other’s working on.
0:09:28 Somehow this keeps happening.
0:09:29 And this happens through time.
0:09:33 And then right now in Silicon Valley, it’s happening in very specific areas.
0:09:33 It’s happening.
0:09:36 All the AI researchers all knew each other from before.
0:09:38 They were in the common set of labs.
0:09:39 They had common lineages.
0:09:43 All the best AI founders, which is different from the researchers, have their own cluster.
0:09:45 And all the SaaS people have their own cluster.
0:09:51 And so it’s this really interesting almost self-aggregation effect of talent finding each other and then helping each other over time.
0:09:54 And it’s just fascinating how that works.
0:09:57 How do you think about that in an era of remote work?
0:10:04 Remote work is generally not great for innovation unless you’re truly in an online collaborative environment.
0:10:11 And the funny thing is that when people talk about tech, they would always talk about how tech is the first thing that could go remote because you can write code from anywhere and you can contribute from anywhere.
0:10:13 But that’s true of every industry, right?
0:10:14 You look at Hollywood.
0:10:19 You could make a movie from anywhere, like you film it off-site anyhow or on-site in different places.
0:10:20 You could write a script from anywhere.
0:10:22 You could edit the musical score from anywhere.
0:10:23 You could edit the film from anywhere.
0:10:27 So why is everything clustered in Hollywood?
0:10:29 Nobody would ever tell you, oh, don’t go to Hollywood.
0:10:32 Go to Boise and, you know, you could work in the movie industry.
0:10:33 Or finance.
0:10:37 You could raise money from anywhere, come up with your trading strategy from anywhere.
0:10:39 Everything in finance is in a handful of locations.
0:10:41 And so tech is the same way.
0:10:43 And it’s because there’s that aggregation of people.
0:10:51 There’s the people helping each other, sharing ideas, trading things informally, learning new distribution methods that kind of spread, learning new AI techniques that spread.
0:10:54 There’s money around it that funds it specifically so it’s easier to raise money.
0:10:58 There’s people who’ve already done it before who can help you scale once the thing is working.
0:11:04 That’s the common complaint I hear in Europe from companies there: we can’t find the executives who know how to scale what we’re doing.
0:11:05 Oh, interesting.
0:11:09 And so I do think there are these other sort of ancillary things that people talk about.
0:11:13 The service providers, the lawyers who know how to set up startups, right?
0:11:16 Or the accountants who know how to do tax and accounting for startups.
0:11:18 Those things sound trivial, but they cluster.
0:11:22 Most people think the key to a successful business is the product.
0:11:26 But often the real secret is what’s behind the product.
0:11:28 The systems that make selling seamless.
0:11:34 That’s why millions of businesses, from household names to independent creators, trust Shopify.
0:11:37 I’m not exaggerating about how much I love these guys.
0:11:41 I’m actually recording this ad in their office building right now.
0:11:44 Shopify powers the number one checkout on the planet.
0:11:49 It’s simple, it’s fast, and with ShopPay, it can boost conversion rates up to 50%.
0:11:51 I can check out in seconds.
0:11:53 No typing in details.
0:11:54 No friction.
0:11:59 It’s fast, secure, and helps businesses convert more sales.
0:12:02 That means fewer abandoned carts and more customers following through.
0:12:08 If you’re serious about growth, your commerce platform has to work everywhere your customers
0:12:08 are.
0:12:13 Online, in-store, on social, and wherever attention lives.
0:12:17 The best businesses sell more, and they sell with Shopify.
0:12:21 Upgrade your business and get the same checkout I use.
0:12:26 Sign up for your $1 per month trial at shopify.com slash knowledge project.
0:12:43 I think a lot about systems, how to build them, optimize them, and make them more efficient.
0:12:46 But efficiency isn’t just about productivity.
0:12:47 It’s also about security.
0:12:52 You wouldn’t leave your front door unlocked, but most people leave their online activity
0:12:54 wide open for anyone to see.
0:12:59 Whether it’s advertisers tracking you, your internet provider throttling your speed, or
0:13:00 hackers looking for weak points.
0:13:02 That’s why I use NordVPN.
0:13:05 NordVPN protects everything I do online.
0:13:11 It encrypts my internet traffic so no one, not even my ISP, can see what I’m browsing, shopping
0:13:12 for, or working on.
0:13:17 And because it’s the fastest VPN in the world, I don’t have to trade security for speed.
0:13:23 Whether I’m researching, sending files, or streaming, there’s zero lag or buffering.
0:13:27 But one of my favorite features, the ability to switch my virtual location.
0:13:32 It means I can get better deals on flights, hotels, and subscriptions just by connecting
0:13:33 to a different country.
0:13:39 And when I’m traveling, I can access all my usual streaming services as if I were at home.
0:13:45 Plus, Threat Protection Pro blocks ads and malicious links before they become a problem, and Nord’s
0:13:49 dark web monitor alerts me if my credentials ever get leaked online.
0:13:54 It’s premium cybersecurity for the price of a cup of coffee per month.
0:13:55 Plus, it’s easy to use.
0:13:58 With one click, you’re connected and protected.
0:14:04 To get the best discount off your NordVPN plan, go to nordvpn.com slash knowledgeproject.
0:14:08 Our link will also give you four extra months on the two-year plan.
0:14:11 There’s no risk with Nord’s 30-day money-back guarantee.
0:14:14 The link is in the podcast episode description box.
0:14:19 A big part of Y Combinator is sort of like helping everybody with that stuff.
0:14:23 Yeah, Y Combinator is a great example of taking out-of-network people, at least that
0:14:26 was the initial part of the premise, not the full premise, right?
0:14:29 Like people like Sam Altman or others who were very early in YC came out of Stanford, which
0:14:30 was part of the main hub.
0:14:34 But a lot of other people came out of universities that just weren’t on the radar for people who
0:14:36 tended to back things in Silicon Valley.
0:14:39 And so, you know, the early Reddit founders went to East Coast universities.
0:14:45 The Airbnb founders, two of them were out of RISD, the Rhode Island School of Design.
0:14:53 And so, YC early on was very good at taking very talented people who weren’t part of the core networks in Silicon Valley and basically
0:14:55 inserting them into those networks and helping them succeed.
0:14:58 Why do you think they’re still relevant today?
0:15:00 Why is YC still relevant today?
0:15:03 I think they’ve just done a great job of building sort of brand and longevity.
0:15:06 Gary, who’s taken over, is fantastic.
0:15:06 And so I think he brings a lot of that: let’s go back to first principles and really implement
0:15:13 YC the way that, you know, we think it can really succeed for the future.
0:15:18 And I think they do a really good job of two things.
0:15:21 One is plugging people in, as mentioned. Particularly if you’re a SaaS company and you want a bunch of
0:15:23 customers instantly, your batch mates will help you with that.
0:15:32 But also, it teaches people to ship fast and to kind of force finding customers.
0:15:37 And so because you’re in this batch structure and you’re meeting with your batch every week
0:15:40 and you hear what everybody else is doing, you feel peer pressure to do it.
0:15:44 But also, it kind of shapes how you think about the world, what’s important, what to work on.
0:15:47 And so I think it’s almost like a brainwashing program, right?
0:15:49 Beyond everything else they do, which is great.
0:15:52 It sets a timeline that you have to hit and it brainwashes you to think a certain way.
0:15:58 One of the things that I see, which I think is maybe relevant, maybe not, you tell me, is
0:16:04 I like how it brings people together who are probably misfits or outliers in their own environment
0:16:08 and then puts them in an environment where ambition is the norm.
0:16:10 It’s not the outlier to have ambition.
0:16:12 Where shipping is the norm.
0:16:15 It’s not the outlier to ship.
0:16:21 And it sort of normalizes these things that maybe cause success or lead to an increased
0:16:22 likelihood of success.
0:16:26 It’s actually a very interesting question of what proportion of founders these days are
0:16:29 actually people who normally wouldn’t fit in, right?
0:16:34 So the sort of founder archetype of before it was rebellious people or people who could never
0:16:35 work for anybody else or whatever.
0:16:40 And then as tech has grown dramatically in market cap and influence and everything else,
0:16:45 it’s inevitable that the type of people who want to come out here and do things has shifted.
0:16:48 And then the perception of risk in startups has dropped a lot.
0:16:51 And so I actually think the founder mix has shifted quite a bit.
0:16:54 Like there isn’t as much quirkiness in tech.
0:16:55 And during COVID, it was awful.
0:16:56 It was very unquirky.
0:17:00 Because at that point, you know, there was a zero interest rate environment.
0:17:02 Money was abundant everywhere.
0:17:07 And the nature of people who joined or who showed up shifted.
0:17:11 And so I think we had two or three years where the average founder just wasn’t that great,
0:17:12 right?
0:17:13 On a relative basis to history.
0:17:18 And then as the AI wave was happening, you know, I started getting involved with a lot of
0:17:21 the generative AI companies maybe three-ish years ago, maybe three and a half years ago.
0:17:25 So before ChatGPT came out and before Midjourney and all these things kind of took off.
0:17:30 And the people starting those companies were uniquely good.
0:17:32 And you felt the shift.
0:17:38 You went from these kind of plain vanilla, me too, almost LARPers, to these incredibly driven,
0:17:45 mission-oriented, hyper-smart, very technical people who wanted to do something really big.
0:17:47 And you felt it.
0:17:48 It was a dramatic shift.
0:17:52 And if you look at it, there’s basically been three or four waves of talent coming through
0:17:53 the AI ecosystem.
0:17:55 And I should say gen AI because we had this whole wave.
0:17:59 We had 10 years, 15 years of other types of deep learning, right?
0:18:03 We had recurrent neural networks and convolutional neural networks and GANs and all these things.
0:18:08 And that technology basis fundamentally has different capabilities than this new wave.
0:18:14 And so there’s this paper in 2017 that came out of Google that introduced the Transformer architecture.
0:18:19 And that is the thing that spawned this whole wave of AI right now that we’re experiencing.
0:18:20 And so it’s a new technology basis.
0:18:24 We took a step function and we’re doing new stuff that you couldn’t do before on the old
0:18:24 technologies.
0:18:29 That whole wave led to this really interesting set of companies.
0:18:33 And the first people in that wave were the researchers because they were closest to it.
0:18:38 And they could see firsthand what was actually happening in the technology, in the market, how they
0:18:39 were using it.
0:18:44 You know, the engineers at OpenAI used to go into the weights to query stuff, which then eventually
0:18:46 actually in some form is ChatGPT, right?
0:18:47 They were doing it before it existed.
0:18:51 There was also Meena at Google, which was basically an internal form of almost like ChatGPT.
0:18:55 So they kind of saw the future and they went to try and substantiate it.
0:18:59 And you could argue that the same thing happened in the internet wave in the 90s, right?
0:19:03 All the people working at the National Supercomputer Centers like Marc Andreessen and others saw the
0:19:04 future before anyone else.
0:19:06 They’re using email before anyone else.
0:19:08 They were browsing the internet before anyone else.
0:19:12 They were using FTP and file downloads and sharing music files before anyone else.
0:19:15 And so they knew what was coming, right?
0:19:17 They had a glimpse into the future.
0:19:20 It’s the old saying: the future is already here, it’s just not evenly distributed.
0:19:22 For AI, we had the same thing.
0:19:25 We had these researchers who could tangibly feel what was coming.
0:19:28 And so the first wave of AI companies was researchers.
0:19:30 The second wave was infrastructure people, who were the next closest.
0:19:34 And in this current wave, we’re now at the application people, the people who are building
0:19:36 applications on top of the core technology.
0:19:38 What do you think is the next wave?
0:19:43 I think it’s going to be an ongoing wave of kind of everything, right?
0:19:46 There’s still a lot to build, but I think we’ll see more and more application level companies.
0:19:51 We’ll see fewer of what are known as foundation model companies, the people building the
0:19:55 OpenAIs or Anthropics or some of the Google core technologies or xAI.
0:19:57 There will be specialized versions of that, right?
0:19:58 That’s all the language stuff, right?
0:20:04 It understands what you say and it can interpret it and it can generate text for you and do all these
0:20:04 things, right?
0:20:07 That’s all these LLMs, large language models.
0:20:10 There’s going to be the same thing done for physics and material science.
0:20:11 We’ve already seen it happening in biology, right?
0:20:13 So at that layer, there’s a bunch of stuff.
0:20:14 There’s the infrastructure.
0:20:16 What is the equivalent of cloud services?
0:20:18 And then there’s the apps on top.
0:20:20 And then in the apps, you have B2B and then you have consumer.
0:20:23 And so I think we’re going to see a lot of innovation across the stack.
0:20:25 But I think this next wave is a mix of B2B and consumer.
0:20:31 And then I think the wave after that is very large enterprise adoption.
0:20:38 And so I think AI is dramatically underhyped because most enterprises have not done anything
0:20:38 in it.
0:20:42 And that’s where all the money is, all the changes, all the impact is, all the jobs, everything,
0:20:43 right?
0:20:45 It’s a big 80-20 rule of the economy.
0:20:48 And that’s coming, right?
0:20:49 And that hasn’t even hit yet.
0:20:56 Are there any historical parallels to anything that you can think of that map to artificial
0:20:57 intelligence or AGI?
0:21:06 I think the thing that people misunderstand about artificial intelligence is that, you know,
0:21:09 people are kind of viewing it as what you’re selling as like a cool tool to help you with
0:21:10 productivity or whatever it is.
0:21:14 I think in a couple of years, we’ll start thinking about it as we’re selling units of cognition,
0:21:16 right?
0:21:20 We’re selling bits of person time or person equivalent to do stuff for us.
0:21:27 I’m going to effectively hire 20 bot programmers to write code for me to build an app, or I’m going
0:21:34 to hire an AI accountant, and I’m going to basically rent time off of this unit of cognition.
0:21:39 On the digital side, it really is this shift from you’re selling tools to you’re selling
0:21:42 effectively white-collar work.
0:21:47 On the robotic side, you’ll probably have some form of like robot minutes or something.
0:21:52 You’ll probably end up with some either human form robots or other things that will be doing
0:21:54 different forms of work on your behalf.
0:21:57 And, you know, potentially you buy these things or maybe you rent them, you know, it’ll be
0:21:59 interesting to see what business models emerge around it.
0:22:01 What scares you about the future?
0:22:03 That’s a big question.
0:22:04 Along what dimension?
0:22:07 Wherever you want to take it.
0:22:08 Like what scares you about AI?
0:22:10 Do you have any fears about AI?
0:22:14 I think that I have opposing fears.
0:22:20 In the short run, I worry that there’s the real chance to kind of strangle the golden goose,
0:22:20 right?
0:22:28 I do think this wave of AI is the single biggest potential driver of global
0:22:31 advancements in health and education and all the things that really matter fundamentally.
0:22:37 And there’s some really great papers from the 80s that basically show that one-on-one tutoring,
0:22:42 for example, will increase performance by one or two standard deviations, right?
0:22:44 You get dramatically better if you have a one-on-one tutor for something.
0:22:49 And if you actually look through history and you look at how Alexander the Great was tutored
0:22:54 by Aristotle and all these things, there’s a lot of kind of prior examples of people actively
0:22:56 doing that on purpose for their kids if they can afford it.
0:23:00 This AI revolution is a great example of something that could basically provide that for every child
0:23:04 around the world as long as they have access to any device, which is most people at this
0:23:04 point, right?
0:23:05 Globally.
0:23:10 So from an education system perspective, a healthcare system perspective, it’s a massive
0:23:10 change.
0:23:13 So in the short run, I’m really worried that people are going to constrain it and strangle
0:23:17 it and prevent it from happening because I think it’s really important for humanity.
0:23:23 In the long run, there’s always these questions of, you know, at what point do you actually consider
0:23:24 something sentient versus not?
0:23:25 Is it a new life form?
0:23:26 Like, is there species competition?
0:23:29 You know, there’s those sorts of questions, right?
0:23:30 In the very long run.
0:23:32 Without robots, you could say, well, you just unplug the data center.
0:23:33 Who cares?
0:23:34 You know, it doesn’t matter.
0:23:38 If you do have robots and other things, then it gets a little bit harder, maybe.
0:23:43 At what point do you think we’re going to, AI is going to start solving problems that we
0:23:48 can’t solve in the sense of a lot of what it’s doing today is organizing logic on a human
0:23:49 level equivalent.
0:23:50 It’s not being like…
0:23:52 No, it’s already surpassed us on many things, right?
0:23:57 Like, just even look at how people play Go now and the patterns they learned off of AI,
0:23:58 which can beat any person at Go.
0:24:03 I mean, gaming is a really good example of that, where every wave of gaming advancements where
0:24:07 you pitted AI against people, people said, well, fine, they beat people at checkers, but
0:24:09 they’ll never beat them at chess.
0:24:11 And then they beat them at chess and say, well, fine, chess, but they’ll never beat them at
0:24:12 Go.
0:24:13 They beat them at Go.
0:24:16 And they’re like, well, what about complex games where there’s bluffing?
0:24:17 They’ll never beat them at poker.
0:24:18 And then Noam Brown had his poker paper.
0:24:20 And they say, well, okay, poker.
0:24:22 Well, they’ll never beat them at things like diplomacy, where you’re manipulating people
0:24:23 against each other.
0:24:27 And then, you know, a Facebook team solved diplomacy, right?
0:24:31 And so gaming is a really great example where you have superhuman performance against every
0:24:31 game now.
0:24:35 And you see that in other aspects of things as well.
0:24:40 I guess where my mind was going is in terms of mathematical problems.
0:24:43 I mean, we’ve solved a couple maybe that we haven’t been able to solve, but we haven’t
0:24:51 made real leaps or biology or health or longevity, like where, you know, here’s the, not the solution
0:24:56 maybe to Alzheimer’s because that’s like a big leap, but maybe it’s like, you’re not looking
0:24:57 in the right area.
0:24:59 You need to research in this area more.
0:25:01 Like when is that sort of advancement coming?
0:25:02 Yeah, I think it’s a really good question.
0:25:06 I mean, AI is already having some interesting advancements in biology, right?
0:25:12 The Nobel Prize in Chemistry this past year went to Demis Hassabis and a few other people who built
0:25:16 predictive models using AI about how proteins will fold, right?
0:25:20 And so I think it’s already being recognized as something that’s impacting the field at the
0:25:21 point where it gets a Nobel.
0:25:26 The hard part with certain aspects of biology and protein folding is a good counter example.
0:25:27 We actually have very good data.
0:25:31 You had tens of thousands or maybe hundreds of thousands of crystal structures.
0:25:34 You had solved structures for all these proteins and you could use that to train the model.
0:25:35 Right.
0:25:41 If you look at it, about half or more than half of all biology research in top journals is
0:25:42 not reproducible.
0:25:44 So you have a big data problem.
0:25:47 Half the data is false.
0:25:48 It’s incorrect.
0:25:49 Right.
0:25:53 And this is actually something that Amgen published a couple of years ago where they showed this
0:25:56 because they weren’t able to reproduce cancer findings in their lab because they’re trying
0:25:57 to develop a drug.
0:26:00 And they’re like, wait a minute, this thing we thought could turn into a drug isn’t real.
0:26:01 Right.
0:26:06 And so there’s this really big replication issue in certain sciences.
0:26:08 Isn’t that part of the advantage for AI then?
0:26:10 Like, I’m thinking out loud here.
0:26:10 Sure.
0:26:15 Like, if I uploaded all of the Alzheimer’s papers to AI.
0:26:16 Yeah.
0:26:18 And it would be like, these ones aren’t replicable.
0:26:20 There’s mathematical errors here.
0:26:21 This looks like fraud.
0:26:24 But all of these things have generated future research.
0:26:28 So what you’re doing is you’re being like, oh, you’ve spent billions of dollars on this.
0:26:33 Statistically, it’s probably not going to yield results.
0:26:35 You should focus your attention here.
0:26:37 And that would have a huge impact on…
0:26:37 Yeah.
0:26:40 I think there’s almost like three different things that are mixed in here.
0:26:42 One is just fraud.
0:26:44 You know, you fudged an image, you’re reusing it, whatever.
0:26:46 I think AI is wonderful for that.
0:26:51 And I actually think, and I’m happy to sponsor it if anybody who’s listening to this wants to do it,
0:26:55 or maybe we should do a competition or something, to basically build fraud detectors
0:26:56 using AI or plagiarism detectors.
0:26:58 You could do it for liberal arts as well as sciences, right?
0:26:58 Yeah.
0:27:00 And I bet you’d uncover a ton of stuff.
0:27:05 Separate from that, there’s people publishing things that are just bad.
0:27:08 And the question is, is it bad because they ignored other data?
0:27:10 Did they throw out data points?
0:27:13 How would you know as an AI system, right?
0:27:15 That somebody threw out half their data to publish a paper.
0:27:21 And so there’s other issues around how science is done right now.
0:27:25 Or you just rush it and you have the wrong controls, but it still gets published because
0:27:26 it’s a hot field.
0:27:26 That happens a lot.
0:27:31 If you look during COVID, like there were so many papers that in hindsight were awful papers,
0:27:34 but they got rushed out because of COVID.
0:27:37 And unless somebody goes back and actually redoes the experiment and then publishes
0:27:40 that they redid it and it didn’t work, which nobody does because nobody’s going to publish it for
0:27:40 you.
0:27:42 How do you know that it’s not reproducible?
0:27:45 And so that’s part of the challenge in biology.
0:27:48 And so the biology problem isn’t, can an AI model do better?
0:27:49 I’m sure it could.
0:27:54 The biology problem is how do you create the data set that actually is clean enough and has
0:27:56 high enough fidelity that you can train a model that then goes and cleans everything
0:27:57 else up, right?
0:27:58 And it’s doable.
0:27:59 Like all these things are very doable.
0:28:00 You just have to go and do it.
0:28:01 And it’s a lot of work.
0:28:04 If you look at things like math and physics and other things like that, people are just
0:28:06 starting to train models against that now.
0:28:09 So I do think we’ll, in the coming years, see some really interesting breakthroughs there.
0:28:15 Do you think that’ll be rapid or do you, like how will those breakthroughs happen?
0:28:17 Yeah, it’s kind of the same thing.
0:28:22 You kind of need to figure out what’s the data set you’re using, what kind of model and
0:28:25 model architecture you’re using, because different architectures seem to work better or worse for
0:28:26 certain types of problems as well.
0:28:30 Like the protein folding ones have three or four different types of models that often get
0:28:32 mixed in, at least traditionally.
0:28:35 A lot of them have moved to these transformer backbones, but then they’re augmented by other
0:28:36 things.
0:28:40 So it’s a little bit of like, do you have enough and the right data?
0:28:43 Do you have the right model approach?
0:28:44 And then can you just keep scaling it?
0:28:48 Walk me through why I’m wrong here.
0:28:51 Like, I’m just, you know, what came to mind when you were saying this is like, we’re training
0:28:53 AI based on data.
0:28:55 So it’s like, here’s how we’ve solved problems in the past.
0:28:57 This is how you’re likely to solve it in the future.
0:29:02 But if I remember correctly, DeepMind trained Go by just being like, here are the rules.
0:29:06 We’re not actually going to show you people that have played before.
0:29:10 And that led to the creativity that we now see.
0:29:12 Yeah, that’s called self-play.
0:29:15 And as long as you have enough rules, you can do it.
0:29:18 You need a utility function you’re working against, right?
0:29:21 And so in the context of a game, it’s winning the game.
0:29:23 And there’s very specific rules of the game.
0:29:24 You know when to flip over the Go piece.
0:29:26 You know what winning means, right?
0:29:30 And so it’s easy to train against that because you have a function to select against.
0:29:31 This game you did well.
0:29:32 This game you did badly.
0:29:35 Here’s positive feedback or negative feedback to the model.
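The self-play idea described here, just the rules plus a win/lose signal and no human games, can be sketched with a toy example. The following is a minimal, illustrative trainer for the game of Nim (take 1 to 3 stones; whoever takes the last stone wins), not any real system's implementation; every name in it is made up for the sketch.

```python
import random
from collections import defaultdict

# Toy self-play for Nim: two copies of the same policy play each other,
# and the win/lose outcome is the only training signal -- no human
# games, just rules and a utility function, as described above.
N_STONES = 10
ACTIONS = (1, 2, 3)

# Preference score for taking `a` stones when `s` stones remain.
values = defaultdict(float)

def pick_action(stones, explore=0.2):
    """Mostly greedy with respect to learned values, plus exploration."""
    legal = [a for a in ACTIONS if a <= stones]
    if random.random() < explore:
        return random.choice(legal)
    return max(legal, key=lambda a: values[(stones, a)])

def play_one_game():
    """Self-play one game; return each side's moves and the winner."""
    stones, player = N_STONES, 0
    moves = {0: [], 1: []}
    while True:
        a = pick_action(stones)
        moves[player].append((stones, a))
        stones -= a
        if stones == 0:
            return moves, player  # taking the last stone wins
        player = 1 - player

def train(n_games=20000, lr=0.1):
    """Positive feedback for the winner's moves, negative for the loser's."""
    for _ in range(n_games):
        moves, winner = play_one_game()
        for player, history in moves.items():
            reward = 1.0 if player == winner else -1.0
            for state_action in history:
                values[state_action] += lr * (reward - values[state_action])

random.seed(0)
train()
# Optimal Nim play leaves the opponent a multiple of 4 stones, so from
# 10 stones the learned policy should come to prefer taking 2 (10 -> 8).
best_first_move = max(ACTIONS, key=lambda a: values[(10, a)])
print(best_first_move)
```

The point of the sketch is the feedback loop itself: the policy improves only because each finished game scores every move it made, which is exactly why domains with fast, unambiguous outcomes (games, code that either runs or throws errors) are where self-play works first.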
0:29:37 They’re starting to do that more and more.
0:29:40 So if you look at the way people are thinking about models now and scaling them, there’s three
0:29:41 or four components to it.
0:29:42 One is ongoing data scale.
0:29:44 Second is the training cluster.
0:29:46 People always talk about all the money they’re spending on GPUs.
0:29:48 The third is reasoning modules.
0:29:53 And that’s the new stuff from OpenAI in terms of o1 and o3 and all these things.
0:30:01 There are other forms of inference-time optimizations, how you do them, and some
0:30:03 aspects eventually of this self-play.
0:30:09 And some of the places where that may really come into focus soon is coding because you
0:30:11 can push code and you can see if it runs and you can see what errors are thrown.
0:30:16 And there’s more stuff you can do in domains where you have a clear output you’re shooting
0:30:18 for and that you can test against it.
0:30:19 And there’s rapid feedback.
0:30:21 And that’s the key.
0:30:25 How quickly can you get feedback to keep training the system and iterating?
0:30:27 What happens when I give an AI a prompt?
0:30:30 Like what happens on the inside of that?
0:30:32 What’s the difference between a good prompt and a bad prompt?
0:30:37 Like does it basically take my prompt and break it into reasoning steps that a human would
0:30:38 use?
0:30:42 Like first I do this, second I do this, third I do this, and then I give the output.
0:30:47 And then the follow-on to this is like what can we do to better prompt AI to get better
0:30:47 outcomes?
0:30:48 Yeah, great question.
0:30:53 So a lot of the people working on agents have basically built what you’re describing, which
0:30:59 is something that will take a complex task, break it down into a series of steps, store those
0:31:01 steps, and then go back to them as you get output.
0:31:03 So you’re actually chaining a model.
0:31:07 You’re pinging it over and over with the output of the prior step and asking it now to do the
0:31:07 next step.
0:31:10 So one approach to that is you literally break it up into 10 pieces.
0:31:15 If it’s a simple problem and you’re just like write me a limerick with XYZ characteristics,
0:31:19 then the model can just do that in a single sort of call to the model.
0:31:24 But if you’re trying to do something really complex, you know, book me a flight or find
0:31:25 me and book me a flight to Mexico.
0:31:26 It’s like, okay, first I need to find the flight.
0:31:30 And so that means I need to go to this website and then I need to interact with the website
0:31:30 and pull the data.
0:31:32 Then I need to analyze that information.
0:31:34 And then I have to figure out what fits with your trip.
0:31:37 And then I, you know, I go through the booking steps and then I get the confirmation.
0:31:41 So it really depends on what you’re asking the model to do.
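The chaining pattern described above, where a complex task is broken into steps and the model is re-queried with each prior step's output, can be sketched roughly as follows. `call_model` is a placeholder stub, not a real API; a real agent would hit an actual LLM endpoint and parse its responses.

```python
# Minimal sketch of the agent loop described above: break a task into
# steps, call the model once per step, and feed each step's output back
# into the next prompt. `call_model` is a stand-in, not a real library.

def call_model(prompt: str) -> str:
    """Placeholder for an LLM call; here it just echoes a canned answer."""
    return f"[model output for: {prompt[:40]}...]"

def run_agent(task: str, steps: list[str]) -> list[str]:
    """Chain model calls: each step sees the task plus all prior outputs."""
    transcript = []
    context = ""
    for step in steps:
        prompt = (
            f"Overall task: {task}\n"
            f"Work done so far:\n{context}\n"
            f"Next step: {step}"
        )
        output = call_model(prompt)
        transcript.append(output)
        context += f"- {step}: {output}\n"  # prior output feeds the next call

    return transcript

# The flight-booking example from the conversation, as a step plan.
plan = [
    "Search flights to Mexico for the given dates",
    "Pick the option that fits the traveller's constraints",
    "Book the flight and capture the confirmation number",
]
results = run_agent("Find and book me a flight to Mexico", plan)
print(len(results))  # one output per step
```

A simple prompt like "write me a limerick" would be a single `call_model`; the loop only earns its keep when intermediate results genuinely change what the next step should do.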
0:31:44 When I think of a model, though, I don’t think of an agent.
0:31:46 I just think, well, I can’t AI do that.
0:31:51 Like, why do I need a specific type of AI to book a flight to Mexico?
0:31:53 Why can’t ChatGPT just do it?
0:32:02 ChatGPT in its current form, or at least in the simplest form, is effectively interrogating
0:32:05 a mix of like a logic engine and a knowledge corpus, right?
0:32:10 It’s like a thing that will look at what it knows and based on that, provide you with some
0:32:11 output.
0:32:14 That’s a little bit different from asking somebody to take an action.
0:32:18 And that’s similar to if I was talking to you and I said, hey, where’s a nice place to
0:32:19 go?
0:32:23 And you’d say, oh, you should go to Cabo, or you should go to wherever, right?
0:32:27 That’s different from me saying, hey, could you get me there, right?
0:32:30 And you have to go to the computer and load up the website and book it for me.
0:32:32 It’s the same thing for AI, right?
0:32:37 And so right now we have AIs that are very capable at understanding language, synthesizing
0:32:44 it, manipulating it, but they don’t have this remembrance of all the steps that they’ve
0:32:45 taken and will take.
0:32:49 And so you need to overlay that as another system on top of it.
0:32:53 And you see this a lot in the way your brain works, right?
0:32:56 You have different parts of your brain that are involved with vision and understanding
0:32:57 it.
0:32:59 You have different parts of your brain for language.
0:33:01 You have different parts of your brain for empathy, right?
0:33:05 You have mirror neurons that help you empathize with somebody or relate to them.
0:33:10 So your brain is a bunch of modules strung together to be able to do all sorts of complex
0:33:11 tasks, be they cognitive or physical.
0:33:16 And one could assume that over time you end up with roughly something like that as well
0:33:18 for certain forms of AI systems.
0:33:21 How are you using AI today?
0:33:23 I use it a lot.
0:33:32 I use it for everything from, you know, like I’ll go to a conference and I’ll dump the
0:33:36 names of the attendees in and ask like, who should I chat with based on these criteria?
0:33:38 And could you pull background on them?
0:33:41 You know, obviously a lot of people use it for coding right now or coding related tasks.
0:33:45 I use it a lot for what are known as regexes, regular expressions.
0:33:49 It’s like if I want to pull something out of certain types of data, I’ll do that sometimes.
0:33:53 So there’s all sorts of different uses for it.
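As an aside on the regex use case mentioned above, this is the kind of extraction pattern one might ask a model to draft; the data and the pattern here are purely illustrative.

```python
import re

# Illustrative example of "pull something out of certain types of data":
# extract ISO-style dates from messy log lines with one regex.
log_lines = [
    "job=42 started 2024-03-01 ok",
    "no date here",
    "retry at 2024-03-02, then 2024-03-05",
]

date_pattern = re.compile(r"\b(\d{4}-\d{2}-\d{2})\b")
dates = [d for line in log_lines for d in date_pattern.findall(line)]
print(dates)  # ['2024-03-01', '2024-03-02', '2024-03-05']
```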
0:33:56 What have you learned about prompting that more people should know?
0:34:03 I think a lot of people, and I’m by no means an expert, you know, there are these people whose
0:34:05 jobs are called prompt engineering and that’s all they do.
0:34:11 I think fundamentally a lot of it just comes down to like, what are you specifically asking
0:34:12 and can you create enough specificity?
0:34:16 And sometimes you can actually add checks into the system where you say, go back and double
0:34:19 check this just to make sure that you didn’t omit something because there are enough errors
0:34:22 sometimes depending on which model you’re using and for what use case and everything else that
0:34:28 if you put in simple safeguards of, hey, generate a table of XYZ as output, but then go back
0:34:31 and double check that these two things are true, I think it’s helped me clean up a lot of things
0:34:32 that would normally have been errors.
0:34:35 It’s almost like adding a test case.
0:34:36 Yeah, yeah.
0:34:40 Basically, if you think about it as like a smart intern, you know, often with your intern,
0:34:43 you say, okay, go do this thing, but why don’t you double check these three things about it?
0:34:47 And as the models get more and more capable, they’ll be less like an intern and more like
0:34:51 a junior employee, and then they’ll be like a senior employee, and then they’ll be like
0:34:54 a manager and they’ll kind of, you know, as the models get better and better and the
0:34:56 capabilities get stronger, you’ll see all these other things emerge.
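The generate-then-double-check safeguard described here can be sketched as a two-pass loop: one call produces the output, a second call verifies specific properties of it, and a failed check triggers a retry. As before, `call_model` is a hypothetical stub standing in for whatever LLM API is used.

```python
# Sketch of the "go back and double check" safeguard: generate output,
# then make a second model call that verifies named properties of it.
# `call_model` is a placeholder stub, not a real API.

def call_model(prompt: str) -> str:
    """Stand-in LLM: verification prompts get 'OK', others get a table."""
    if prompt.startswith("Verify"):
        return "OK"
    return "| name | count |\n| a | 1 |"

GENERATE = (
    "Generate a markdown table of XYZ with columns `name` and `count`."
)
VERIFY = (
    "Verify the table below: (1) every row has exactly two columns, "
    "(2) `count` is a non-negative integer. Reply OK or list the errors.\n\n"
    "{table}"
)

table = call_model(GENERATE)
check = call_model(VERIFY.format(table=table))
if check.strip() != "OK":
    # Feed the listed errors back in, like handing notes to an intern.
    table = call_model(GENERATE + "\nFix these issues: " + check)
print(check)
```

The second pass is cheap insurance: it is exactly the "smart intern" framing from the conversation, where you ask for the work and then ask for three things about it to be double-checked.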
0:34:59 Where do you see the bottlenecks today?
0:35:02 And like what comes to mind for me are different aspects of AI.
0:35:09 So you have, from going all the way up the stack, you have electricity, you have compute,
0:35:12 you have LLMs, you have data.
0:35:17 Where do you see the bottlenecks being, where’s the biggest bang for the buck?
0:35:19 Like what’s preventing this from going faster?
0:35:22 You know, it’s a really interesting question.
0:35:27 And I think there’s people who are better versed than I am in it because there’s this ongoing
0:35:30 question of when does scaling run out for which of those things, right?
0:35:34 When do we not have enough data to generate the next versions of models or do we just use
0:35:35 synthetic data and will that be sufficient?
0:35:37 Or how big of a training cluster can you actually get to economically?
0:35:42 You know, how do you fine-tune or post-train a model and at what point does that not yield
0:35:43 as many results?
0:35:46 That said, each one of these things has its own scaling curves.
0:35:48 Each one of these seems to still be working quite well.
0:35:52 And then if you look at a lot of the new reasoning stuff that OpenAI and others have been working
0:35:53 on, Google’s been working on some stuff here as well.
0:35:59 When you talk to people who work on that, they feel that there are still enormous scaling laws
0:36:00 left for that, right?
0:36:02 Because those are just brand new things that just rolled out.
0:36:06 And so these sort of reasoning engines have their own big curve to climb as well.
0:36:11 So I think we’re going to see two or three curves sort of simultaneously continue to inflect.
0:36:19 Is this the first real revolution where incumbents have an advantage?
0:36:23 And I say that because data costs money, compute costs money, power costs money.
0:36:24 Yeah.
0:36:30 And it sort of favors the Googles, the Microsofts, the people with a ton of capital.
0:36:31 Yeah.
0:36:36 I think in general, every technology wave has a differential split of outcome for incumbents
0:36:36 versus startups.
0:36:40 So the internet was 80% startup value.
0:36:41 It was Google.
0:36:42 It was Amazon.
0:36:45 You know, it was all these companies we now know and love.
0:36:46 Meta, you know.
0:36:54 And then mobile, the mobile revolution was probably 80% incumbent value or 90%, right?
0:36:59 And so that was mobile search was Google and mobile CRM was Salesforce and mobile whatever
0:37:00 was that app you were already using.
0:37:05 And the things that emerged during that revolution of startups were things that took advantage of the
0:37:07 unique characteristics that were new to the phone.
0:37:08 GPS.
0:37:09 So you had Uber.
0:37:11 Everybody has a camera.
0:37:12 You have Instagram, et cetera, right?
0:37:17 And so the things that became big companies in mobile that were startups were able to do
0:37:20 it because they took advantage of something new that the incumbents didn’t necessarily have
0:37:21 any provenance over.
0:37:26 Crypto was 100% or roughly 100% startup value, right?
0:37:29 It’s Coinbase and it’s the tokens and everything else.
0:37:33 So you kind of go through wave by wave and you ask, what are the characteristics that make
0:37:34 something better or worse?
0:37:38 And if you actually look at self-driving, which was sort of an earlier AI revolution in some
0:37:43 sense, the two winners, at least in the West, seem to be Tesla, which was an incumbent car
0:37:47 maker in some sense, by the point that they were willing to step out, and Google through
0:37:47 Waymo.
0:37:51 So two incumbents won in self-driving, which I think is a little bit under discussed because
0:37:54 we had like two dozen self-driving companies, right?
0:37:58 Wouldn’t that make sense, though, because they have the most data in the sense of like
0:38:04 Tesla acquires so much data every day and now the way that they’ve set up full self-driving,
0:38:07 my understanding is it’s gotten really good in the last six months.
0:38:12 One of the reasons is they stopped coding, basically, and they started feeding the data into AI and
0:38:15 having the AI generate the next version effectively.
0:38:19 Yeah, a lot of the early self-driving systems were basically people writing a lot of kind of
0:38:20 edge case heuristics.
0:38:23 So you’d almost write a rule if X happens, you do Y or some version of that.
0:38:26 And they moved a lot of these systems over to just end-to-end deep learning.
0:38:31 And so this modern wave of AI has really taken over the self-driving world in a really strong
0:38:33 way that’s really helped these things accelerate, to your point.
0:38:36 And so Waymo similarly has gotten dramatically better recently.
0:38:38 So I think all that’s true.
0:38:43 I guess it’s more of a question of when does that sort of scale matter and why wasn’t there
0:38:46 anybody who was able to partner effectively with an existing automotive company?
0:38:49 Or what other things happened in the market?
0:38:52 For this current wave of AI, it really depends on the layer you’re talking about.
0:38:56 And I think there’s going to be enormous value for both incumbents and startups.
0:39:01 On the incumbent side, it really looks like the foundation model companies are either paired
0:39:04 up or driven by incumbents.
0:39:05 Maybe one or two kind of examples.
0:39:09 So, you know, OpenAI is roughly partnered with Microsoft.
0:39:10 Microsoft also has its own efforts.
0:39:13 Google is its own partner in some sense, right?
0:39:16 Amazon has partnered with Anthropic.
0:39:20 Obviously, Facebook has Llama, the open source model.
0:39:26 But I think for three of the four, and then there’s xAI, which, you know, is Elon Musk’s just
0:39:32 sort of ability to execute in such an insane way that’s really driving it and access to capital
0:39:33 and all the rest.
0:39:38 But if you look at it, and I wrote a blog post about this maybe two, three years ago, which
0:39:40 is basically, what’s the long-term market structure for that layer?
0:39:46 And it felt like it had to be a monopoly or, you know, at most an oligopoly.
0:39:48 And the reason was this point that you made about capital.
0:39:52 And back then, it costs, you know, tens of millions to build a model.
0:39:57 But if you extrapolated the scaling curve, you’re like, every generation is going to be a few
0:39:58 X to 10 X more.
0:40:02 And so, eventually, you’re talking about billions, tens of billions of dollars, not that many
0:40:03 people can afford it.
0:40:06 And then you ask, what’s the financial incentive for funding it?
0:40:10 And the financial incentive for the cloud businesses is their clouds, right?
0:40:14 If you look at Azure’s last quarter, I think it was like a $28 billion quarter or something
0:40:14 like that.
0:40:19 I think they said that 10 to 15% of the lift on that was from AI being sold on the cloud.
0:40:21 So, that’s what?
0:40:23 One and a half to three billion, a quarter, right?
0:40:28 So, the financial incentive for Microsoft to fund OpenAI is it feeds back into its cloud.
0:40:30 It feeds back in other ways, too, but it feeds back to its cloud.
0:40:36 And so, I don’t think it’s surprising that the biggest funders of AI today, besides sovereign
0:40:39 wealth, have been clouds, because they have a financial incentive to do it.
0:40:40 And people really miss that.
0:40:45 So, I think that that is part of what really helped lock in this oligopoly structure early
0:40:49 is you had enormous capital scale going to a handful of the best players through these
0:40:49 cloud providers.
0:40:52 And so, the venture capitalists would put hundreds of millions of dollars into these companies.
0:40:54 The clouds put tens of billions in.
0:40:55 Yeah.
0:40:56 And that’s the difference.
0:41:04 And I guess the optimism there is that I can go use the full scale of AWS or Azure or
0:41:07 Google and just rent time.
0:41:09 So, I don’t need to make the capital investments.
0:41:10 I don’t need to run the data center.
0:41:11 I don’t need to.
0:41:13 Well, you could have done that either way, right?
0:41:16 You didn’t have to take money from them because they’re happy to be a customer.
0:41:17 That’s what I’m saying, right?
0:41:21 So, like the optimism is like you can compete with them now because you’re just competing
0:41:22 on ideas.
0:41:24 You have access to the structure.
0:41:25 Yeah.
0:41:25 Yeah.
0:41:28 And you would have done that no matter what, just given that everything moved to clouds,
0:41:30 like these third-party clouds that you can run on.
0:41:35 So, that’s enabling, but at least for these sort of language models, they’re increasingly
0:41:37 just a moat due to capital scale.
0:41:41 Do you think that we just end up with like three or four and they’re all pretty much equivalent?
0:41:43 Yeah, I’m not sure.
0:41:45 I think you can imagine two worlds.
0:41:47 World one is where you have an asymptote.
0:41:51 Eventually, things kind of all flatline against some curve because you can only scale a cluster
0:41:53 so much, you only have so much data or whatever.
0:41:56 In which case, eventually, things should converge really closely over time.
0:42:00 And in general, things have been converging faster than not across the major model platforms
0:42:01 already.
0:42:07 Whereas a second world is, if you think about the capability set built into each AI model,
0:42:12 if you have something that’s far enough ahead and it’s very good at code and it’s very good
0:42:16 at data labeling and it’s very good at doing a lot of the jobs that allow you to build the
0:42:20 next model really fast, then eventually you may end up with a very strong positive feedback
0:42:25 loop for whoever’s far enough ahead that their model always creates the next version of the
0:42:26 model faster than anybody else.
0:42:28 And then you maybe have liftoff, right?
0:42:32 Maybe that’s the thing that ends up dramatically far ahead because every six months becomes more
0:42:33 important than the last five years.
0:42:38 And so, there’s another world you could imagine where you’re in a liftoff scenario where there’s
0:42:41 a feedback loop of the model effectively creating its next version.
0:42:47 So, GPT-5 or 7 or whatever, GPT-7 would create GPT-8, which would help create GPT-9, which
0:42:48 would even faster create GPT-10.
0:42:54 And at that point, you have an advantage, but the advantage is expanding at the velocity at
0:42:55 which you’re creating the next model.
0:42:55 Correct.
0:43:00 Because GPT-10 perhaps is so much more capable than 9 that while everybody else is at 9, it’s
0:43:00 already building 11.
0:43:04 And it can build it faster, smarter, et cetera, than everybody else.
0:43:09 And so, it really comes down to what proportion of the model building task or model training
0:43:12 and building task is eventually done by AI itself.
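As a toy illustration of why that feedback loop compounds (my own sketch, not the speaker's model), here is a minimal simulation in which each lab's capability grows at a rate proportional to its own current capability; a small head start widens into a large gap rather than closing:

```python
def simulate(lead: float, laggard: float, steps: int) -> tuple[float, float]:
    """Toy 'liftoff' dynamic: capability feeds back into its own growth."""
    for _ in range(steps):
        # Each generation improves by a factor proportional to the
        # capability of the model building it (self-improvement loop).
        lead *= 1 + 0.1 * lead
        laggard *= 1 + 0.1 * laggard
    return lead, laggard

ahead, behind = simulate(lead=1.2, laggard=1.0, steps=10)
print(ahead / behind)  # the ratio keeps growing: the gap widens, not closes
```

Under the first world (an asymptote), the growth factor would saturate and the ratio would converge toward 1 instead; the entire question is which regime the real scaling curves are in.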
0:43:29 What do you think of Facebook?
0:43:34 They’ve spent, I don’t know, 50, 60 billion and they’ve basically given it away to society.
0:43:35 Yeah.
0:43:35 Yeah.
0:43:39 I’ve been super impressed by what they’ve done with Llama.
0:43:40 I think open source is incredibly important.
0:43:43 And why is open source important?
0:43:45 It does a couple of things.
0:43:51 One is it levels the playing field for different types of uses of this technology and it makes
0:43:53 it globally available in certain ways that’s important.
0:43:59 Second, it allows you to take out things that you may not want in there, because it’s open
0:44:01 weights and it’s open source.
0:44:08 So, if you’re worried about a specific political bias or a specific cultural outlook, because
0:44:13 it’s really interesting if you look at the way people talk about norms and what should be
0:44:17 built into models and safety and all the rest, it’s like, who are you to determine all
0:44:22 of global norms with your own values, right?
0:44:25 That’s a form of cultural imperialism if you think about it, right?
0:44:28 You’re basically imposing what you think on everybody else.
0:44:32 And so, open source models gives you a bit more leeway in terms of being able to retrain
0:44:39 a model or have it reflect whatever norms of your country or your region or whatever lens
0:44:40 on that you want to take.
0:44:42 So, I think it’s also important from that perspective.
0:44:49 As an investor, what’s the ROI on a $1,600 billion open source model?
0:44:53 How do you think through what Facebook is trying to do or accomplish?
0:44:57 Is it just like, I don’t want the competitors to get too far ahead?
0:45:00 I don’t know how meta specifically is thinking about it.
0:45:03 So, I think I’d be sort of talking out of turn if I just made some stuff up.
0:45:09 I think that in general, there’s been all sorts of times where open source has been very important
0:45:10 strategically for companies.
0:45:15 And if you actually look at it, almost every single major open source company has had a
0:45:17 giant institutional backer.
0:45:20 IBM was the biggest funder of Linux in the 90s as a counterbalance to Microsoft.
0:45:27 And the biggest funders of all the open source browsers are Apple and Google with WebKit.
0:45:31 And you just go through technology wave after technology wave, and there’s always a giant
0:45:32 backer.
0:45:35 And maybe the biggest counter to that is Bitcoin and all the crypto stuff.
0:45:40 And you could argue that they’re their own backer through the token, right?
0:45:43 So, Bitcoin financially effectively has fueled the development of Bitcoin.
0:45:49 It’s kind of paid for itself in some sense as an open source tool or open source sort of
0:45:50 form of money.
0:45:52 You know, I don’t know why AI would be different.
0:45:58 I, a couple years ago, was trying to extrapolate who is the most likely party to be the funder
0:46:00 of open source AI.
0:46:04 And back then, I thought it would be Amazon, because at the time, they didn’t have a horse
0:46:07 in the race like Microsoft and Google, or maybe be NVIDIA.
0:46:12 And Meta was kind of on the list because of all the money they have, and their prowess in
0:46:15 engineering and FAIR, and, you know, they have a lot of great things, but they weren’t the
0:46:17 one I would have guessed as the most likely.
0:46:19 They were on the list, but they weren’t the most likely.
0:46:23 And then there’s other players with tons of money, and tons of capabilities.
0:46:25 And the question is, are they going to do anything?
0:46:26 What does Apple do?
0:46:27 What does Samsung do?
0:46:31 You know, there’s like half a dozen companies that could still do really interesting things
0:46:32 if they wanted to.
0:46:33 And the question is, what are they going to do?
0:46:39 How would you think about sort of the big players and who is best positioned for the
0:46:41 next two to three years?
0:46:42 How would you rank them?
0:46:44 In terms of AI or in terms of other things?
0:46:46 In terms of AI.
0:46:47 Yeah.
0:46:53 Like who’s most likely to accrue some of the advantages of AI?
0:46:59 Yeah, it’s kind of hard because AI is the only market where the more I learn, the less I
0:47:00 know.
0:47:02 And in every other market, the more I learn, the more I know.
0:47:05 And the more predictive value, or the more I’m able to predict things.
0:47:06 And I can’t predict anything anymore.
0:47:09 You know, I feel like every six months, things change over so rapidly.
0:47:13 You know, fundamentally, there’s a handful of companies in the market that are doing very
0:47:14 well.
0:47:21 Obviously, there’s Google, there’s Meta, there’s OpenAI, there’s Microsoft, Anthropic and
0:47:24 AWS, xAI.
0:47:27 You know, Mistral has done some interesting things over time.
0:47:30 So I think there’s like a handful of companies that are the ones to watch.
0:47:32 And the question is, how does this market evolve?
0:47:33 Does it consolidate or not?
0:47:34 Like what happens?
0:47:37 How do you think about regulation around AI?
0:47:38 Yeah.
0:47:43 So there’s basically like three or four forms of AI safety that people talk about and they
0:47:44 kind of mix or conflate them.
0:47:47 The first form of AI safety is almost what I call like digital safety.
0:47:48 It’s like, will the thing offend you?
0:47:51 Or will there be hate content or other things?
0:47:55 And there’s actually a lot of rules that already exist around hate speech on the internet or hate
0:47:58 speech in general or, you know, what’s free speech or not and how you should think about
0:47:58 all these things.
0:48:00 So I’m less concerned about that.
0:48:01 I think people will figure that out.
0:48:06 There’s a second area, which is almost like physical safety, which is will you use AI to
0:48:07 create a virus?
0:48:09 Will you use AI to derail a train?
0:48:10 You know, et cetera.
0:48:15 And similarly, like when I look at the arguments made about how it will create a biological
0:48:18 virus and et cetera, et cetera, like you can already do that, right?
0:48:24 The protocols for cloning and PCR and all this, it’s all on the internet.
0:48:25 It’s all posted by major labs.
0:48:26 It’s in all the textbooks.
0:48:30 Like that’s not new knowledge that people can’t just go and do right now if they really wanted
0:48:30 to.
0:48:33 So I don’t know why that matters in terms of AI.
0:48:41 And then the third area is sort of this existential safetyism, like AI will become self-aware and
0:48:42 destroy us, right?
0:48:45 And when people talk about safety, they mix those three things.
0:48:48 They conflate them and therefore they say, well, eventually maybe something terrible happens
0:48:50 here, so we better shut everything else down.
0:48:52 While other people are just saying, hey, I’m worried about hate speech.
0:48:56 And so I think when people talk about safety, they have to really define clearly what they
0:48:56 mean.
0:49:00 And then they have to create a clear view of why it’s a real concern.
0:49:04 It’s sort of like if I kept saying, I think an asteroid could at some point hit the earth
0:49:06 and therefore we better do X, Y, Z.
0:49:07 We should move the earth or whatever.
0:49:11 You know, it’s just at some point these things get a little bit ridiculous in terms of safetyism.
0:49:16 There’s actually a broader question societally of like why has society become so risk averse
0:49:17 in certain ways and so safety centric?
0:49:20 And it impacts things in all sorts of ways.
0:49:22 I’ll give you a dumb example.
0:49:28 After what age does the data suggest that a child doesn’t need a special seat?
0:49:29 They can just use a seatbelt.
0:49:33 I think it’s like 10 or 12, isn’t it?
0:49:37 Well, so in California, for example, the law is up until age eight.
0:49:37 Okay.
0:49:40 You have to be in a booster seat or a car seat or whatever.
0:49:46 If you actually look at crash data, real data, and people have now reproduced this across
0:49:48 multiple countries, multiple time periods.
0:49:49 It’s the age of two.
0:49:50 Oh, wow.
0:49:53 So for six extra years, we keep people in booster seats and car seats and all that,
0:49:55 at least against the data, right?
0:49:55 Okay.
0:49:58 The Freakonomics podcast actually had a pretty good bit on this.
0:50:02 And there’s like multiple papers now that reproducibly show this retrospectively.
0:50:03 You just look at all the crashes.
0:50:04 That’s crazy.
0:50:04 Yeah.
0:50:05 So why do we do it?
0:50:05 Safety.
0:50:07 But it’s not safe.
0:50:08 Exactly.
0:50:09 But it’s positioned as safe.
0:50:11 As a parent, of course, you want to protect your children.
0:50:11 No, seriously, right?
0:50:14 And so, but then it has other implications.
0:50:17 It’s like you can’t easily transport the kids in certain scenarios because you don’t have
0:50:22 the car seat or, you know, you can only fit so many car seats in a car and it’s a pain
0:50:22 in the butt.
0:50:24 And do you upgrade the car if you want more kids?
0:50:25 And can you afford it?
0:50:27 And, you know, so it has all these ramifications.
0:50:33 And it’s because I think, A, it’s lucrative for the car seat companies to sell more car seats
0:50:34 for longer, right?
0:50:35 You get an extra six years on the kid or whatever.
0:50:39 Parents will, of course, say, I want safety no matter what.
0:50:44 And certain legislatures are happy to just, you know, legislate it.
0:50:48 So I think there’s lots and lots and lots of examples of that in society if you start picking
0:50:51 at it and you realize it pervades everything.
0:50:52 It pervades aspects of medicine.
0:50:55 It pervades things like AI now.
0:50:56 It’s just, it’s everywhere.
0:51:03 There’s one in Ottawa that I see in the mornings. Anywhere within, I don’t know,
0:51:04 five or six blocks of a school,
0:51:07 They basically have crossing guards everywhere now.
0:51:13 So it’s basically, even for high schools, like, so kids can’t walk to school on their
0:51:19 own, which I think you think, oh, well, how do you argue with that, right?
0:51:23 Like, and then I was thinking about this the other day because I was driving and, you know,
0:51:27 I got stopped by one of these people and I was like, we’re just teaching kids that like
0:51:28 they don’t even have to pay attention.
0:51:30 They can look at their phone.
0:51:32 The crossing guard is going to save them.
0:51:36 And then if the crossing guard is not there, like, we’re not developing ownership or
0:51:37 agency in people.
0:51:39 How do you think about that?
0:51:44 I think it’s, I think it’s really bad for society at scale.
0:51:48 I mean, it’s kind of like, there was a different wave of this, which was, you know, 10, 15 years
0:51:53 ago with fragility and microaggressions and everything can offend you and you need to be super fragile
0:51:53 and all this stuff, right?
0:51:55 Which I think is very bad for, for kids.
0:51:57 And I think that has a lot of mental health implications.
0:52:03 The wave we’re in now, which is basically taking away independence, agency, risk-taking.
0:52:09 I think that has some really bad downstream implications in terms of how people act, what they consider
0:52:14 to be risky or not, and what that means about how they’re going to act in life and also their
0:52:15 ability to actually function independently.
0:52:16 So I agree.
0:52:20 I think, I think all those things are things that we’ve accumulated over the last few decades
0:52:21 that are probably quite negative.
0:52:27 You’re one of the most successful investors that a lot of people have probably never heard
0:52:27 of.
0:52:32 One of the things that you’ve said is that most companies die from self-inflicted wounds and
0:52:33 not competition.
0:52:38 What are the most common self-inflicted wounds that kill companies?
0:52:40 Yeah, I think there’s two or three of them.
0:52:43 You know, it depends on the stage of the company.
0:52:47 For a very early company, the two ways that they die is the founders start fighting and
0:52:51 the team blows up, or they run out of money, which means they never got to product market
0:52:51 fit.
0:52:54 They never figured out something that they could build economically that people would care
0:52:54 about.
0:52:58 So for the earliest stages, that’s, that’s roughly everything.
0:53:04 Every once in a while, you have some competitive dynamic, but the reality is most incumbent companies
0:53:05 don’t care about startups.
0:53:10 And startups have five, six years before an incumbent wakes up and realizes it’s a big deal and then
0:53:11 tries to crush them.
0:53:13 And sometimes that works.
0:53:16 Sometimes you just end up with capped outcomes.
0:53:19 So for example, you could argue Zoom and Slack got capped by Microsoft launching stuff into
0:53:24 teams in terms of taking parts of the market or creating a more competitive market dynamic
0:53:25 for them.
0:53:29 You know, the other types of self-inflicted wounds, honestly, sometimes people get very competitor
0:53:31 centric versus customer centric.
0:53:33 Go deeper on that.
0:53:35 I mean, there’s a lot of examples of that.
0:53:41 Sort of like if you focus on your, your competitor too much, you stop doing your own thing.
0:53:46 You stop building that thing the customer actually wants and you lose differentiation relative to
0:53:47 your competitor.
0:53:51 Or you start doing things that can hurt your competitor, but they don’t necessarily help
0:53:51 you.
0:53:54 And sometimes your competitor will retaliate.
0:54:00 An example of that would be in the pharmaceutical distribution world.
0:54:05 You know, 20 years ago, there was roughly three players that really mattered of any scale.
0:54:08 And they used to go after each other’s market share really aggressively, which eroded all the
0:54:10 pricing, which meant they were bad businesses.
0:54:16 And at some point, I think one of them decided to stop competing for share, but just protect
0:54:17 itself.
0:54:20 And then the others copied it and suddenly margins went way up in the industry, right?
0:54:24 They stopped being as focused on banging on each other and more just like, let me just
0:54:26 build more services for my customers and let’s just focus on our own set.
0:54:30 We’re all going to win a lot more that way, right?
0:54:33 In some cases, yeah, if you have an oligopoly market, that’s usually where it ends up.
0:54:38 Eventually, this is why people are so worried about collusion, right?
0:54:42 Eventually, the companies decide, hey, we should be in a stable equilibrium instead of beating
0:54:44 up on each other and shrinking margins.
0:54:49 Scaling a company often means scaling the CEO.
0:54:56 What have you learned about the ways that successful CEOs scale themselves and things that get in the
0:54:57 way?
0:54:58 Yeah, I think it’s two or three things.
0:55:02 One is figuring out who else you need to fill out your team with and how much can you trust
0:55:03 them and all the rest.
0:55:08 And so one piece of it is very innovative founder CEOs always want to innovate and so
0:55:10 they reinvent things that they shouldn’t reinvent.
0:55:14 Like sales is like effectively process engineering that’s been worked through for decades.
0:55:16 You don’t need to go reinvent sales.
0:55:18 You know, you just hire a sales team and it’ll work just fine.
0:55:21 So one aspect is getting out of your own way on reinvention.
0:55:23 There’s certain things you want to rethink, but many of them you don’t.
0:55:27 Part of it is hiring people who are going to be effective in those roles and more effective
0:55:28 than you might be.
0:55:31 Often you end up finding people who are complementary to you.
0:55:39 Now that really breaks down during CEO succession because what happens is often the CEO will promote
0:55:43 the person who’s their complement as the next CEO instead of finding somebody like them who
0:55:47 can innovate and push on people and drive new products and new changes.
0:55:52 And so often you see companies have a golden age under a founder and then decay.
0:55:56 And the decay is because the founder promoted their lieutenant who was great at operations
0:56:00 or whatever, but wasn’t a great product thinker or technology vision like themselves.
0:56:04 And so that’s actually a failure mode for like longer term related areas.
0:55:08 You could argue Satya at Microsoft is a good example of somebody who has more of a founder mindset.
0:56:09 I’m going to reinvent things.
0:56:10 I’m going to rethink things.
0:56:11 I’m going to do these crazy deals.
0:56:14 They backed open AI at GPT-2, which is like a huge risk.
0:56:16 They’ve done all sorts of really smart acquisitions.
0:56:22 So like that’s an example of somebody who actually did a, they did a smart succession there in
0:56:25 terms of finding somebody who’s a bit more like product founder mentality.
0:56:35 You know, in terms of other ways that CEOs fail is they listen too much to conventional wisdom
0:56:37 on how to structure their team.
0:56:42 And really the way you want to have your team function at a large organization is based
0:56:43 on the CEO.
0:56:45 What does the CEO need?
0:56:46 What are the compliments they need?
0:56:47 What is the structure they need?
0:56:52 And if you were to plop out that person and plop in a different CEO, that structure probably
0:56:53 shouldn’t work like half the time.
0:56:57 There’s some types of people where there’s lots of commonalities, particularly if it’s people
0:57:01 who came up the corporate ladder and they’re all used to doing things the same way.
0:57:05 But if you’re more of a founder CEO and you’re going to have your quirks and you’re going
0:57:08 to have your obsessions and you’re going to have all these things that founders often
0:57:11 have, you need an org structure that reflects you.
0:57:13 And so like Jensen from NVIDIA talks about this, right?
0:57:15 The claim is he has like 40 direct reports.
0:57:19 He claims that, you know, he doesn’t do many one-on-ones or things like that.
0:57:22 And the focus is more on finding very effective people who’ve been with him for a while and
0:57:23 who can just drive things, right?
0:57:25 And then he sort of deep dives in different areas.
0:57:31 That’s a very different structure from how Satya’s running Microsoft or Larry Ellison
0:57:32 has run Oracle over time.
0:57:36 Or, you know, you look at these other sort of giants of industry and management and everything
0:57:36 else.
0:57:40 And so I think you really need an org structure that reflects you.
0:57:43 Now, there’s going to be commonalities and there’s only so many reports most people can
0:57:44 handle and all the rest of it.
0:57:48 But I do think you kind of want to have the team that reflects your needs versus the generic
0:57:49 team that could reflect anybody’s needs.
0:57:54 Is that the problem with sort of a lot of these business leadership books that are written
0:57:59 about a particular person in style that they have and then people read them and they try
0:58:02 to implement them, but it’s not genuine to who they are?
0:58:03 I think that’s very true.
0:58:07 And it really depends on whether you’re talking about the generic case of, hey, it’s a big
0:58:12 company and you’re at a related large company that’s 100 years old that’s been run a certain
0:58:12 way.
0:58:17 Like, I wouldn’t be surprised if you could roughly interchange the CEOs of a subset of the pharma
0:58:19 companies in terms of the org structure.
0:58:23 They may not have the chemistry with the people or the trust or whatever, but like the org
0:58:25 structures are probably reasonably similar.
0:58:28 That’s probably pretty different than if you looked at, you know, how Oracle has been run
0:58:32 over time versus Microsoft over time versus Google over time versus whoever.
0:58:38 When you say that, I think the wording you use like conventional wisdom, CEOs should pay less
0:58:39 attention to conventional wisdom.
0:58:46 Do you mean that in the sense of the, I guess the nomenclature that came out around Brian
0:58:47 Chesky, founder mode?
0:58:55 Yeah, I think, um, I think we lived through a decade or so, maybe longer where a lot of
0:59:01 forces came into play in the workplace that were not productive to the company actually obtaining
0:59:02 its missions and objectives.
0:59:08 And a lot of that was all the different forms of politics and bring your whole self to work
0:59:11 and all these things that people are talking about, which I don’t want somebody’s whole
0:59:11 self at work.
0:59:18 You know, I remember at Google, um, for Halloween, uh, and maybe we should edit this part out,
0:59:20 but there’s somebody who would show up in assless chaps every Halloween.
0:59:23 And you’re like, I don’t want to see that.
0:59:24 Like I’m in a work environment.
0:59:26 Why is this, why is this engineer walking around like this?
0:59:26 Yeah.
0:59:27 Yeah.
0:59:30 And then the second you start bringing kids to work, you’re like, I sure as hell don’t
0:59:31 want this guy walking around.
0:59:31 Right.
0:59:32 Yeah.
0:59:33 And that’s bring your whole self to work.
0:59:34 Like, why would you do that?
0:59:37 You actually should bring your professional self to work.
0:59:40 You should bring the person who’s going to be effective in a work environment and can work
0:59:44 with all sorts of diverse people and be effective and doesn’t bring all their mores
0:59:47 and values and everything else in the workplace that don’t have a place in the workplace.
0:59:49 There’s a subset of those that do, but many don’t.
0:59:54 We lived through a decade where not only were those things encouraged, but the traditional
0:59:58 conventionalist executives brought that stuff with them.
1:00:00 And I think it was probably bad for a lot of cultures.
1:00:02 It defocused them from their mission.
1:00:04 It defocused them from their customers.
1:00:06 It defocused them from doing the things that were actually important.
1:00:11 And the first person to speak out against that, that I remember, was Brian Armstrong,
1:00:13 in a very public and visible way.
1:00:17 And then Tobi Lütke followed him not long after.
1:00:20 And they said, no, the workplace is not about that.
1:00:22 It’s about X, Y, and Z.
1:00:24 And if you don’t like it, like basically leave.
1:00:24 Yeah.
1:00:29 And was that the moment where we started to go back to founder mode effectively?
1:00:31 I think it took some time.
1:00:33 I think Brian was incredibly brave for doing that.
1:00:33 Totally.
1:00:35 And he got a lot of flack for it.
1:00:35 And I think it-
1:00:36 They tried to cancel him.
1:00:39 They tried to cancel him aggressively, which was sort of the playbook, right?
1:00:42 Oh, and this was happening inside of companies too, right?
1:00:44 You’d say something and you’d get canceled for it.
1:00:47 And so you can have a real conversation around some of these things.
1:00:48 And again, that just reinforced it.
1:00:52 And I think Brian stepping forward made a huge difference.
1:00:54 To your point, Tobi, I think, did it really well.
1:01:00 I still sometimes send the essay that he wrote for that to other people where he had a few
1:01:04 central premises, which is we have a specific mission and we’re going to focus on that.
1:01:05 We’re not focusing on other things.
1:01:07 We’re not a family.
1:01:07 We’re a team.
1:01:08 Yeah.
1:01:09 Right?
1:01:12 The family is like, hey, your uncle shows up drunk all the time.
1:01:14 You kind of tolerate it because it’s your uncle.
1:01:18 If somebody showed up drunk at work all the time, you shouldn’t tolerate that, right?
1:01:19 You’re not a family.
1:01:20 You’re a sports team.
1:01:22 You’re trying to optimize for performance.
1:01:26 You’re trying to optimize for the positive interchange within that team.
1:01:30 And you want people pulling in the direction of the team, not people doing their own thing,
1:01:31 which is a family, right?
1:01:35 And so I think there was a lot of these kind of conversations or discussions that were more
1:01:40 like it’s a family and bring yourself to work and all the holisticness of yourself.
1:01:44 And it’s actually, well, no, you probably shouldn’t show up at work drunk and, you know,
1:01:45 look at bad things on the internet.
1:01:49 You know, you should focus on your job and you should focus on good collaboration with your
1:01:50 co-workers and things like that.
1:01:57 You’re around a lot of outlier CEOs, not only in the context of you know them, but
1:01:58 you hang out with them.
1:01:59 You spend a lot of time with them.
1:02:03 What are sort of the common patterns that you’ve seen amongst them?
1:02:06 Are there common patterns or is everybody completely unique?
1:02:10 But I imagine that at the core, there’s commonality.
1:02:11 Yeah.
1:02:13 You know, this is something I’ve been kind of riffing on lately, and I don’t know if it’s
1:02:15 quite correct, but I think there’s like two or three common patterns.
1:02:19 I think pattern one is there are a set of people who are, and by the way, all these people
1:02:24 are like incredibly smart, you know, incredibly insightful, et cetera, right?
1:02:28 So they all have a few common things.
1:02:30 But I do think there’s two or three archetypes.
1:02:32 I think one of them is just the people who are hyper-focused.
1:02:34 They don’t get involved with other businesses.
1:02:36 They don’t do a lot of angel investments.
1:02:38 They don’t, you know, do press junkets that don’t make sense.
1:02:40 They just stay on one track.
1:02:44 And a version of that was Travis from Uber.
1:02:47 I knew him a little bit before Uber, and I’ve, you know, run into him once or twice since
1:02:50 then, but like, he was always just incredibly focused.
1:02:52 He used to be an amazing angel investor.
1:02:55 I think he made great investments, but he stopped doing it with Uber, and he just focused on Uber.
1:02:59 And as far as I know, he never sold secondary until he left the company, right?
1:03:02 He was just hyper-focused on making it as successful as possible.
1:03:04 So that’s one class of archetype.
1:03:12 There’s a second class, which I’d view as people who are equally smart and driven, but a bit more,
1:03:15 polymathic may be the wrong word, but they just have very broad interests, and they express
1:03:17 those interests in different ways while they’re also running their company.
1:03:22 And often they have a period where they’re just focused on their company, and then they
1:03:23 add these other things over time.
1:03:28 And so examples of that, I mean, obviously Elon Musk is now that, right?
1:03:29 In terms of all that.
1:05:35 Patrick Collison is that; he’s running a biology institute, or Silvana and the other Patrick
1:05:38 are running it alongside him, called Arc.
1:03:44 Brian Armstrong is now running a longevity company in parallel to Coinbase, or he has somebody
1:03:45 running it.
1:03:51 So there’s a lot of these examples of people doing X2, X3, and doing it in other fields.
1:03:56 Honestly, that’s a little bit of a new development relative to what you were allowed to do before,
1:04:01 because there’s both activist investors who try to prevent that, and public markets in
1:04:02 particular.
1:04:07 But also, it was just a different mindset of how do I show impact over time?
1:04:12 Are these people going from the first one, hyper-focused, to this?
1:04:16 Or were they always sort of, I don’t want to use the word dabble because it really understates
1:04:19 how focused they are on their businesses.
1:04:24 But are they always like that, and as they get larger, it scales differently?
1:04:30 Or is it, no, we’ve gone from sort of the first, which is this hyper-focus, to the second?
1:04:36 I think it’s more like when you talk to them, the way that they think about the world and
1:04:40 the set of interests they have is a little bit different from the first group of folks.
1:04:43 And I’m not talking about Travis specifically, because I didn’t know him well enough to have
1:04:44 a perspective on that.
1:04:49 But I just mean more generally, I’ve noticed that they have this commonality of when you
1:04:55 talk to them very early, they’re like 20 years old or whatever, and you meet them, the set
1:04:57 of interests that they have is very, very broad.
1:05:02 And they tend to go very deep on each thing that they get interested in, whether it benefits
1:05:03 them or not.
1:05:04 They just go deep on it, right?
1:05:05 Because it’s interesting.
1:05:09 They’re driven by a certain form of interestingness, in addition to being driven by impact.
1:05:13 And then I think there’s a third set of people who end up with outsized successes.
1:05:16 And sometimes that’s just product market fit.
1:05:18 And then they grow into the role, you know?
1:05:24 And so there’s some businesses that just have either such strong network effects or just such
1:05:26 strong liftoff early on.
1:05:29 And they’re obviously very smart people and all the rest of it, but you don’t feel that
1:05:33 same drive underlying it or that same need to do big things.
1:05:34 It’s almost accidental.
1:05:37 And you sometimes see that.
1:05:39 Would you say that’s more luck?
1:05:41 I don’t know.
1:05:45 I mean, say somebody is really good at product market fit, but they’re not that aggressive.
1:05:47 And once they hit a certain level, they’re not that ambitious.
1:05:49 Part of it too is like, what’s your utility curve?
1:05:50 Like, what do you care about in life?
1:05:52 Do you care about status?
1:05:53 Do you care about money?
1:05:54 Do you care about power?
1:05:55 Do you care about impact?
1:05:57 Do you do things because it’s interesting?
1:05:58 Like, why do you do stuff?
1:06:04 And imagine people where that is a big part of everything they do, right?
1:06:07 Because I think the average person may have mixes of that, but they’re also just happy
1:06:08 going to their kids and hanging out, you know?
1:06:10 And like, it’s a different life, right?
1:06:16 Like, the average Google engineer is not going to be this insanely driven, hyper, you know,
1:06:17 hyper drive person anymore.
1:06:20 What do you think keeps people going?
1:06:25 I mean, a lot of people become successful and maybe they hit whatever number they have in
1:06:29 their head that they can like retire comfortably or live the life they want to live and they
1:06:31 become complacent.
1:06:32 Maybe not intentionally.
1:06:36 I mean, they’re not thinking that way, but they take their foot off the gas and, you
1:06:39 know, all of a sudden I’m focused on 10 different things instead of one thing.
1:06:45 And then there’s another subset of people that are like, they just blow right by that and
1:06:46 they keep going.
1:06:50 And whether it’s a hundred million or a billion or 10 billion or, you know, in Elon’s case,
1:06:53 a hundred billion or more, but they keep going.
1:06:54 Yeah.
1:06:55 It’s back to what’s your utility, like, what do you care about?
1:06:56 What’s your utility function?
1:06:57 What’s driving you?
1:07:02 And based on what’s driving you, like the people that I know who have been very successful
1:07:03 and driven solely by money end up miserable.
1:07:07 Because they have money and then, and then what?
1:07:08 It’s never enough.
1:07:09 What do you do then?
1:07:10 Well, it’s not just never enough.
1:07:12 It’s just, what do you do?
1:07:13 What fulfills you?
1:07:15 You can already buy everything you could ever buy.
1:07:17 Like what fulfills you?
1:07:22 And you also see versions of this where you see people who make it and then they don’t know
1:07:23 what to do with themselves.
1:07:24 I think I mentioned this earlier.
1:07:28 There’s one guy I know who’s incredibly successful and he spends all his time buying domain names.
1:07:34 You’re like, well, is that fulfilling or, you know, it’s almost like what’s your meaning or purpose?
1:07:41 I feel like the people who end up doing these other things have some broader meaning or purpose driver even very early on.
1:07:43 And obviously people want to win and all the rest.
1:07:47 There’s this really good framework from Naval Ravikant.
1:07:53 And so in the 90s, John Doerr, who’s one of the giants, the legends of investing, used to ask founders,
1:07:54 are you a missionary or mercenary?
1:07:59 And of course, the question that you were expected to say is, I’m a missionary, right?
1:08:02 I’m doing it because it’s the right work to do and all this.
1:08:09 And Naval’s framework is like, when you’re young, of course, you’re at least half, if not more, mercenary.
1:08:10 Yeah.
1:08:11 You want to make it.
1:08:11 You’re hungry.
1:08:12 You don’t have any money.
1:08:13 You need to survive.
1:08:16 You know, you’re driven because of that in part.
1:08:23 And then in the middle phase of your career or life, you’re more of a missionary if you’re not a zero-sum person, right?
1:08:24 You suddenly can have a broader purpose.
1:08:25 You can do other things.
1:08:26 You can engage.
1:08:28 And then he’s like, late in your life, you’re an artist.
1:08:30 You do it for the love of the craft, right?
1:08:42 I much prefer that framework of the people that I see who do the most interesting big things over time fall into that latter category where always there is some mercenary piece.
1:08:45 Of course, you want to have money to survive and all this stuff.
1:08:49 And then that morphs into you become more mission-centric.
1:08:52 And then over time, you just do it for the love of whatever the thing you’re doing is.
1:08:54 And those are the people that I see that become happy over time.
1:08:58 What’s the difference between success and relevance?
1:09:03 Yeah, it’s a great question because there’s lots of different ways to define success.
1:09:06 Success could mean I have a million Instagram followers.
1:09:09 It depends on your own version of success, right?
1:09:13 So, societally, one of the big versions of success is a big financial outcome.
1:09:16 One could argue a bigger version of that is like a happy family.
1:09:18 You know, like there’s lots of versions of success.
1:09:26 Relevance means that you’re somehow impacting things that are important to the world and people seek you out because of that.
1:09:28 Or alternatively, you’re just impacting things, right?
1:09:32 But usually, people end up seeking you out because of that for a specific thing.
1:09:37 And the amazing thing is that there’s lots and lots of people who’ve been successful who are no longer relevant.
1:09:44 You just look at the list of even the billionaires or whatever metric you want to use and like how many of those people are actually sought out.
1:09:45 Yeah.
1:09:47 Because they’re doing something interesting or important.
1:09:51 And so, there’s this interesting question that I’ve been toying with, which is,
1:09:55 are there characteristics to people who stay relevant over very long arcs of time?
1:09:59 People are constantly doing interesting things, right?
1:10:05 One could argue Sam Altman has sort of maintained that over a very long arc between YC and the early things he was involved with the investing side.
1:10:08 And then, of course, now OpenAI and other areas.
1:10:12 Patrick is obviously doing that between Stripe and Arc and other areas.
1:10:16 And there’s people with longer arcs than that, right?
1:10:21 Marc Andreessen invented the browser, or was one of the key people behind that,
1:10:26 and then started multiple companies, including Netscape, which was a giant of the internet.
1:10:28 And then started, you know, one of the most important venture firms in the world.
1:10:33 And so, that’s a great example of a very, very strong arc over time.
1:10:36 Or Elon Musk is a very strong arc over time, right?
1:10:38 From Zip2 to PayPal to all the stuff he’s done now.
1:10:41 So, the question is, what do those people have in common?
1:10:42 Peter Thiel, right?
1:10:48 Think of all the stuff he’s done across politics and the Thiel Fellows and the funds and Palantir and Facebook and all this stuff.
1:10:57 The commonality that stands out to me across all those people is they tend to be pretty polymathic.
1:10:58 So, they have a wide range of interests.
1:11:04 They tend to be driven by a mix of stuff, not just money.
1:11:07 So, of course, money is important and all the rest.
1:11:10 But I think for a subset of people, it’s interestingness.
1:11:11 For a subset, it’s impact.
1:11:12 For a subset, it’s power.
1:11:14 For whatever it is, but there’s usually a blend.
1:11:17 And for each person, there’s a different spike across that.
1:11:21 And the other, I think, commonality is almost all of them had some form of success early.
1:11:31 Because the thing that people continue to underappreciate is kind of like the old Charlie Mungerism that the thing he continues to underappreciate is the power of incentives, right?
1:11:34 The thing I continue to underappreciate is the power of compounding.
1:11:40 And you see that in investing and financial markets, but you also see that in people’s careers and impact.
1:11:46 And the people who are successful early have a platform upon which they can build over time in a massive way.
1:11:50 They have the financial wherewithal to take risks or fund new things.
1:11:53 And importantly, they’re in the flow of information.
1:11:58 You start to meet all the most interesting people thinking the most interesting things.
1:12:03 And you can synthesize all that in this sort of pool of ideas and thoughts and people.
1:12:08 This is full circle back to almost where we started, right?
1:12:17 Like how important is that flow of information to finding the next opportunity, to capitalizing on other people’s mistakes, to staying relevant?
1:12:19 Yeah, there’s two types of information.
1:12:24 There’s information that’s hidden.
1:12:27 And there’s information that…
1:12:30 So I’ll give you an example, right?
1:12:36 When I started investing in generative AI, all these early foundation model things, et cetera, basically nobody was doing it.
1:12:39 And it was all out in the open, right?
1:12:41 GPT-3 had just dropped.
1:12:43 It was clearly a big step function from two.
1:12:46 If you just extrapolated that, you knew really, really interesting things were going to happen.
1:12:48 And people were using it internally in different ways at these companies.
1:12:54 And so it was in plain sight that GPT-3 existed out there, but very few people recognized that it was that important.
1:12:56 And so the question is why, right?
1:12:57 The information was out there.
1:13:05 There’s other types of information that early access to helps impact how you think about the world.
1:13:07 And sometimes that could just be a one-on-one conversation.
1:13:10 Or sometimes, again, they could be doing things out in the open.
1:13:16 And so, for example, all the different things that Peter Thiel talked about and the insights he had like 10 years ago ended up being true.
1:13:19 Not all, but a lot of them, right?
1:13:21 So wait, let me go through some of these.
1:13:29 So there’s, I found, I found information that is publicly available that you haven’t found.
1:13:33 There’s, I weigh the information differently than you do.
1:13:33 Yeah.
1:13:35 So I weigh the importance of it differently.
1:13:40 And then there’s access where I have access to information that you don’t have.
1:13:42 Are there other types of information advantage?
1:13:47 No, because I think the one where you interpret it differently that you mentioned has all sorts of aspects to that.
1:13:48 Go deeper on that.
1:13:50 Well, do you have the tooling to do it?
1:13:52 Do you need a data scientist, right?
1:13:53 It’s all the algorithmic trading stuff.
1:13:57 All the information’s out there, but can you actually make use of it?
1:14:00 There’s, do you have the right filter on it?
1:14:04 Do you pick up or glean certain insights or make intuitive leaps that other people don’t?
1:14:08 You know, there’s all the different, it’s sort of like when people talk about Richard Feynman, the physicist.
1:14:14 And they said, with other physicists who won Nobel Prizes, they’re like, oh yeah, I could understand how that person got there.
1:14:16 It’s this chain of logical steps and maybe I could have done that.
1:14:19 They’re like with Feynman, he just did these leaps and nobody knew how he did it.
1:14:26 And so I do think there’s people who uniquely synthesize information in the world and come to specific conclusions.
1:14:34 And those conclusions are often right, but people don’t know how they got there.
1:14:40 You’re bringing it back to clusters and all the stuff about information and how to think about it and how to interpret it.
1:14:41 It’s all about being in a cluster.
1:14:44 How do you go about constructing a better cluster?
1:14:55 Like if you take the presumption that the material that goes into my head, whether I’m reading, you know, that’s one way, I’m conversing, I’m searching.
1:15:03 How do I improve the quality of the information, through a cluster or not, that my raw material is built on later?
1:15:05 Yeah, I think it’s a few things.
1:15:09 And I think different people approach their processes in different ways.
1:15:15 And this is back to the best people somehow tend to aggregate or maybe best is the wrong word.
1:15:24 There’s a bunch of people with common characteristics, a subset of whom become very successful, that somehow repeatedly keep meeting each other quite young in the same geography.
1:15:26 And again, it’s happened throughout history.
1:15:33 And so, A, there’s clearly some attraction between these people to talking to each other and hanging out with each other and learning from each other.
1:15:37 And sometimes you meet somebody and you’re like, wow, I just learned a ton off of this person in like 30 minutes.
1:15:43 And this was a great conversation versus, okay, yeah, that was nice to meet that person.
1:15:44 They’re nice or whatever, you know.
1:15:56 And I feel like a lot of folks who end up doing really big interesting things just somehow meet or aggregate towards these other people and they all tell each other about each other and they hang out together and all the rest.
1:16:00 And so, I do think there’s sort of self-attraction of these groups of people.
1:16:04 Now, the internet has helped create online versions of that.
1:16:12 There’s been a lot of talk now about these IOI or gold medalist communities where people do like math or coding competitions or other things.
1:16:17 Scott, the CEO of Cognition, is a great example of that where he knows a lot of founders in Silicon Valley.
1:16:19 And one of the reasons they all know each other is through these competitions.
1:16:25 And there’s a way to aggregate people growing up all over the country or all over the world who never would have connected.
1:16:26 And then they connect through these competitions.
1:16:29 And so, that’s become a funnel for a subset of people.
1:16:37 So, the move towards the internet, I think, has actually created a very different environment where you can find more like-minded people than you ever could before, right?
1:16:39 Because before, how would you find people?
1:16:42 And how would you even know to go to Silicon Valley?
1:16:46 Do you think it’s true that if I change your information flow, I can change your trajectory?
1:16:53 And if so, what are the first steps that people listening can take to get better information?
1:17:00 If you want to work in a specific area and be top of your game in that area, you should move to the cluster for whatever that is.
1:17:00 Yeah.
1:17:02 So, if you want to go into movies, you should go to Hollywood.
1:17:05 If you want to go into tech, you should go to Silicon Valley, if you want to, you know, etc.
1:17:11 And the whole, hey, you can succeed at anything from anywhere is kind of true, but it’s very rare.
1:17:13 And why make it harder for yourself?
1:17:14 Yeah.
1:17:15 Why play on hard mode?
1:17:15 Yeah.
1:17:19 How do you think about that in terms of companies and remote work?
1:17:31 Like, we were talking about this a little bit before we hit record in the sense of, you know, one of the things that people lose is the culture of the company and feeling part of something larger than themselves.
1:17:36 How does that impact the quality of work we do or the information flow we have?
1:17:42 There’s no more water cooler conversation where, like, hey, you know, in that presentation, you should have done this, not that.
1:17:42 Yeah.
1:17:43 No, that’s a great point.
1:17:44 I think it’s interesting.
1:17:53 If a company is really young and still very innovative, I think a lot of remote work tends to be quite bad in terms of the success of the company.
1:17:54 Now, that doesn’t mean it won’t succeed.
1:17:55 It just makes it much harder.
1:18:00 And a company I backed, I don’t know how long ago now, 14 years or something like that, was GitLab.
1:18:02 Which has done quite well.
1:18:03 It’s a public company now, et cetera.
1:18:08 And they were one of the very first remote first companies.
1:18:10 And so when I backed them, it was like four people or something.
1:18:11 I can’t remember, four or five people.
1:18:13 They were fully remote.
1:18:14 They stayed remote forever.
1:18:17 And they built a ton of processes in to actually make that work.
1:18:18 And they were brilliant about it.
1:18:24 And they actually have all this published on their website where you can go and you can read hundreds of pages about everything they’ve done to enable remote work.
1:18:30 Everything from, like, how they thought about salary bands based on location on through to processes and all the rest.
1:18:37 And it was a very quirky, it may still be, culture where I’d be talking to the CEO and he’d say, oh, this conversation is really interesting.
1:18:43 And he dropped the link to our Zoom into a giant group chat and random people just start popping in while we’re talking.
1:18:44 Oh, wow.
1:18:45 You know, and you’re like, who are these people?
1:18:48 Like, we’re just talking about should you do a RIF and like 30 people just joined.
1:18:49 Like, is this a good idea?
1:19:01 It was a very, and it probably still is, very innovative, very smart culture, very process driven, you know, very just excellent at saying, okay, if we’re going to be remote, let’s put in place every single control to make that work.
1:19:03 So they’re very smart about that.
1:19:06 I have not seen many other companies do anything close to that.
1:19:13 And so I think for very early companies, the best companies I know are almost 100% in person.
1:19:15 And there’s some counter examples of that.
1:19:16 And crypto has some nuances on that.
1:19:18 And, you know, which is a little bit different.
1:19:22 But for a standard AI, tech, SaaS, et cetera, that’s generally the rule.
1:19:29 As a company gets later, you’re definitely going to have remote parts of your workforce, right?
1:19:30 Parts of your sales team are remote.
1:19:32 Although really, they should be at the customer site, right?
1:19:35 Remote should mean customer site or home office or something, right?
1:19:37 It shouldn’t mean truly remote.
1:19:42 But, and you always, even 10 years ago or whatever, would make exceptions, right?
1:19:48 You’d say, well, this person is really exceptional and I know them well and they’re moving to Colorado and we’ll keep this person because we know that they’re, you know, as productive.
1:19:51 They’re more productive than anybody else on the team, even if they’re not going to be in the office every day.
1:19:57 Later stage companies, there’s this really big question of like, how much of your team do you want to be remote?
1:19:58 How many days a week?
1:20:07 And then is enforcing a no-remote policy just also enforcing that you’re prioritizing people who care about the company more than they care about other things?
1:20:07 Right.
1:20:12 And each CEO needs to come and make a judgment call about how important that is.
1:20:15 How much does that impact how they can tap into global talent?
1:20:17 Because that’s often the question or concern.
1:20:19 So there’s like a set of trade-offs.
1:20:26 I mean, the argument for it, I guess, is like it’s more flexible for employees if that is part of what you’re optimizing for.
1:20:31 But we can also hire world-class talent that we might not be able to hire otherwise.
1:20:31 Yeah.
1:20:34 And I don’t know if I 100% buy that, but it’s possible.
1:20:40 I’ve been in the sauna at the gym with a number of people on like Microsoft Teams calls.
1:20:42 Yeah, you can see people who are clearly not working.
1:20:50 Now, the flip side of that is, you know, there are certain organizations that you knew people weren’t working very hard at before things went remote, right?
1:20:58 Like some of the big tech companies before COVID, you’d go in and it’d be pretty empty until like 11 and then people would roll in for lunch and then they’d leave at like 2.
1:21:08 And so one argument I make sometimes is that big tech is effectively a big experiment in UBI, universal basic income, for people who went to good schools, right?
1:21:12 You’re literally just giving money to people for not doing very much in some cases.
1:21:19 Do you think that that’s starting to change and the complacency maybe that caused that is starting to go away as we get into this?
1:21:23 Like it seems like we had this, everybody was super successful.
1:21:26 They all had their own area, but now we have a new race.
1:21:27 Like we have to get fit again.
1:21:32 You know, it’s kind of like the person who goes to the gym and never breaks a sweat.
1:21:35 If you’re talking about fitness, you know, they lift a weight and they’re like, I’m going to get on my phone now.
1:21:37 That’s what I feel like has basically happened.
1:21:47 And so I think the reality is if you look at what Musk did at Twitter, where they cut 80% or whatever it was, I wouldn’t be surprised if you could do things that are pretty close to that at a lot of the big tech companies.
1:21:49 That’s fascinating.
1:21:57 One of the things that we talked about was sort of how the best in any field, there’s sort of like 20 people who are just exceptional.
1:21:58 Go deeper on that for me.
1:21:59 Yeah.
1:22:00 So we were talking about clusters, right?
1:22:01 So there’s geographic clusters.
1:22:03 Like, hey, all of tech is happening in one area.
1:22:07 And honestly, all of AI is happening in like, you know, a few blocks, right?
1:22:09 If you were to aggregate it all up.
1:22:14 So there’s these very strong cluster effects at the regional level.
1:22:20 And then as we mentioned, there’s groups of people who keep running into each other who are kind of the motive force for everything.
1:22:29 And if you look at almost every field, there’s at most a few dozen, maybe for very big fields, a few hundred people who are roughly driving almost everything, right?
1:22:36 You look at cancer research, and there’s probably 20 or 30 labs that are the most important labs where all the breakthroughs come out of it.
1:23:40 Not just that, the lineage of those labs, the people they came from, is in common.
1:23:47 And the people who end up being very successful afterwards all come from one of those, or mainly come from those same labs.
1:22:49 You actually see this for startups, right?
1:22:54 My team went back and we looked at where do all the startup founders come out of school-wise.
1:22:58 And three schools dominate by far in terms of big outsized outcomes.
1:23:01 Stanford is number one by far, and then MIT and Harvard.
1:23:07 And then there’s a big step down, and there’s a bunch of schools that have some successes, Berkeley and Duke and a few others.
1:23:10 And then there’s kind of everything else, right?
1:23:15 And so there are these very strong rules of like lineage of people as well, right?
1:23:19 And oddly enough, you see this in religious movements, right?
1:23:20 The lineage really matters.
1:23:23 Schools of yoga, the lineage really matters.
1:23:25 Like all these things, the lineage really matters.
1:23:30 And so what you find is that in any field, there’s a handful of people who drive that field.
1:23:32 And a handful, again, could be in the tens or maybe hundreds.
1:23:33 And that’s true in tech.
1:23:40 Like, you know, there was probably early on 20, 30, whatever, maybe 100 at most AI researchers who were driving much of the progress.
1:23:42 There’s a bunch of ancillary people, but there’s a core group.
1:23:45 That’s true in areas of biology.
1:23:46 That’s true in finance.
1:23:49 That’s, you know, and eventually most of these people end up meeting each other, right?
1:23:53 In different forms, and some become friends, and some become rivals, and some become both.
1:23:56 But it’s surprising how small these groups are.
1:24:03 And a friend of mine and I were joking that we must be in a simulation because we keep running into the same people
1:24:05 over the 10 or 20-year arc who keep doing the big things.
1:24:06 Yeah.
1:24:10 Does that mean those people are almost perpetually undervalued?
1:24:14 Especially if it’s not a CEO and they’re running their own show, if it’s a researcher.
1:24:22 If you take the hypothesis that maybe there’s only 20 people, 20 great investors, or, you know,
1:24:28 20 great researchers, or 20 great whatever, but they’re employees of somebody else,
1:24:30 then they’re perpetually undervalued?
1:24:35 Because it’s like, no matter how much I’m paying you, it’s almost not enough.
1:24:38 Because you’re going to drive this forward.
1:24:39 Yeah, it depends on how you define greatness.
1:24:40 Yeah.
1:24:42 If somebody is the world’s best kite flyer.
1:24:43 Yeah.
1:24:45 No, seriously, though, right?
1:24:48 Like, there’s going to be a handful of people who are the best at every single thing.
1:24:50 But there’s not a ton of economic value created by that.
1:24:52 Yeah, and so that’s the question, right?
1:25:00 And so, you know, part of the question is, what is the importance of each person relevant
1:25:02 to an organization or field?
1:25:05 And then are they properly recognized or rewarded relative to those contributions?
1:25:06 And if not, why not?
1:25:07 And if so, then great.
1:25:10 And so I think there’s a separate question, right?
1:25:12 Of rewards, effectively.
1:25:16 And rewards could be status, it could be money, it could be influence, it could be whatever it is.
1:25:19 What else have you guys learned about investing in startups?
1:25:25 So you had these clusters like, oh, you know, most people come from Stanford, MIT, or Harvard.
1:25:26 Yeah.
1:25:30 What are the other things that you’ve picked up that you were like, oh, that’s surprising
1:25:34 or counterintuitive or challenges an existing belief that I had?
1:25:37 Oh, I mean, I’ll give you one that challenges and then I’ll give you one that I think is consistent.
1:25:40 Maybe I’ll start with a consistent one, which is back to clusters.
1:25:44 We take all of market cap of companies worth a billion dollars or more that are private.
1:25:48 And every quarter or two, we basically look at geographically where are they based, right?
1:25:52 And traditionally, the US has been about half of that globally.
1:25:54 The Bay Area has been about half of that.
1:25:59 So 25% of all private technology wealth creation happens in one place, right?
1:26:00 In one city.
1:26:03 If you add in New York and LA, then you’re at like 40% of the world.
1:26:04 Wow.
1:26:05 Right?
1:26:06 And LA is mainly SpaceX and Anduril.
1:26:07 Yeah.
1:26:10 So it’s very concentrated, right?
1:26:14 That’s why when I see venture capitalists build these global firms with branches everywhere,
1:26:15 you’re like, why?
1:26:19 You know, like from a resource allocation perspective, unless you’re just trying to, you know, have
1:26:20 a specific footprint for reasons.
1:26:27 And if you look at AI, it’s like 80 to 90% of the market cap is all in the Bay Area.
1:26:29 Right?
1:26:30 And so it’s a super cluster.
1:26:33 And you see that going the other way.
1:26:37 Like for fintech, a lot of the value of fintech was split between New York and the Bay Area.
1:26:37 Yeah.
1:26:41 So one aspect of it is these things are actually more extreme than you’d think for certain areas.
1:26:50 And space and defense is roughly all, or was Southern California until SpaceX moved some of its operations.
1:26:53 The counterintuitive thing is more tactical things.
1:26:58 So, you know, there’s a few things that people say a lot in Silicon Valley that just aren’t correct.
1:27:05 So if you look, for example, there’s this thing that you should always have a co-founder or an equal co-founder.
1:27:11 And if you look at the biggest successes in the startup world over time, they were either solo founders or very unequal founders.
1:27:15 So that, and there’s kind of counterexamples to that, of course, but that was Amazon, right?
1:27:16 Jeff Bezos was the only founder.
1:27:19 Microsoft, it was unequal.
1:27:21 And eventually the other founder left.
1:27:26 You know, you kind of go through the list and there aren’t that many where there was true equality, you know.
1:27:30 But it’s now kind of this myth that you should be equal with your co-founder.
1:27:32 And I think there’s negative aspects to doing that.
1:27:37 A second thing is, that’s a little bit counterintuitive, is reference checks on founders.
1:27:42 So if you do a, if you get a positive reference check on someone, then it’s positive.
1:27:46 If you get a negative reference check on a founder, it’s usually neutral.
1:27:50 Unless people are saying they’re ethically bad or there’s some issue with them or whatever.
1:27:52 But there’s two reasons for that.
1:27:55 One is I think product market fit trumps the founder fidelity.
1:27:58 And so like, you could be kind of crappy, but if you hit the right thing, you can do really well.
1:28:01 But the other piece of it is it’s contextual.
1:28:11 Like somebody who’s kind of lazy and not great in one environment may actually be much better when they have their, when they’re responsible and they need to drive everything.
1:28:18 And, you know, as an example of that, there was somebody I worked with at Twitter who was a very nice person, but never really seemed that effective to me.
1:28:20 He was always kind of hanging out, drinking coffee, chatting.
1:28:24 And then a few years later, I met up with him and he was running a very successful startup.
1:28:25 And I said, what happened?
1:28:27 I mean, I said it nicer than that, right?
1:28:28 Yeah, of course.
1:28:29 Like, hey, like, it’s so interesting.
1:28:30 You built this great company.
1:28:31 Like, you know.
1:28:32 He said, you know what?
1:28:34 I finally feel like my ass is on the line.
1:28:36 And that’s why I’m working so hard.
1:28:37 And that’s why I’m so, you know.
1:30:43 Now, in general, I think that the true giant outsized success archetype is somebody who can’t tolerate that.
1:28:44 Right.
1:28:44 Right.
1:28:46 They’re always on and they can’t help it.
1:28:52 But there are examples where the context of the organization and the context of your situation really shapes what you do.
1:28:59 When you invested in Anduril, what was your, you mentioned you had criteria and they checked it all.
1:28:59 Sure.
1:29:05 What was your mental, oh, if I’m going to invest in a tech forward defense company, it needs to have X, Y, Z.
1:29:06 What was that criteria?
1:29:18 Yeah, so Anduril happened in a unique moment in time where Google had just shut down Maven and defense had suddenly become very unpopular in Silicon Valley and people were making arguments that ethically you shouldn’t support the defense industry.
1:29:26 And all the stuff that I thought was pretty ridiculous, because if you cared about Western values and you wanted to defend them, of course you needed defense tech.
1:29:34 So I started looking around to see who’s building interesting things in defense because if the big companies won’t do it, then what a great opportunity for a startup, right?
1:29:36 It seemed like a good moment in time.
1:29:45 And it felt like there was four or five things that you needed in order to build a next-gen defense tech company because there was a bunch of defense tech companies that just never worked or hit small scale.
1:29:49 Number one is you needed a why now moment for the technology.
1:29:53 What is shifting in technology that the incumbents can’t just tack it on, right?
1:30:00 Because the way the defense industry works is there’s a handful of players called primes who sell directly to the DoD and they subcontract out everything else, right?
1:30:10 And if you’re not a prime and you don’t have a direct relationship, then you end up in a bad spot in terms of being able to really win big programs and survive as a company or succeed.
1:30:14 So number one is what is the technology why now that creates an opening?
1:30:17 For Anduril, it was initially machine vision and drones, which were new things.
1:30:22 Two is, are you going to build a broad enough product portfolio that you can become a prime?
1:30:23 Right.
1:30:25 Which they did from day one.
1:30:32 Third is, do you have connectivity slash ability to, you know, really focus on faster sales cycle?
1:30:40 Fourth is, can you raise enough money that you’ll last long enough that you can put up with really long timelines to actually get to these big programs of record?
1:30:43 And I think Anduril did their first program of record in something like three and a half years.
1:30:44 It was remarkably fast.
1:30:48 I think it was the fastest program of record since the Korean War or something, which is super impressive.
1:30:55 And then lastly, the way that the business model for the defense industry works is this cost plus.
1:30:56 Oh, yeah.
1:31:03 So you basically make, say, 5% to 12% on top of whatever your cost to build the product is.
1:31:04 And that includes your labor.
1:31:06 That includes every component.
1:31:10 And that’s why there’s a very big incentive in the defense industry to overrun on time.
1:31:11 Yeah.
1:31:14 Because you’ve charged 10% on that time, right?
1:31:16 So if something’s late, you make more money.
1:31:18 And not have a cost incentive at all.
1:31:19 You have no cost incentive.
1:31:24 That’s why you have a $100 screw, because you make $5 on the screw that costs $100 instead of using a $0.10 screw, right?
1:31:25 Yeah.
1:31:32 And so the cost plus model is extremely bad if you want efficient, fast-moving defense industry, right?
1:31:36 And they were really focused on trying to create a more traditional hardware margin business,
1:31:44 where an example would be if Lockheed Martin sold a drone to the government for a million dollars and made 5% cost plus, they’d make $50K.
1:31:52 If Anduril sold a $100,000 drone with the same capabilities to the government and had a 50% hardware margin, they’d make $50,000 too.
1:31:54 But the government could buy 10 of them for the same price.
1:31:55 Yeah.
1:31:59 So the government gets 10 times the hardware or the capability set.
1:32:08 Anduril gets 10 times as much margin if, again, that structure works, and everybody basically wins, right?
1:32:11 And so I just thought that business model shift was really important.
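The cost-plus versus fixed-margin comparison above can be sketched in a few lines. This is a minimal illustration using only the numbers mentioned in the conversation (a $1M incumbent drone at 5% cost-plus, a $100K startup drone at a 50% hardware margin); the figures and function names are illustrative, not actual contract terms.

```python
def cost_plus_profit(cost: float, plus_rate: float) -> float:
    """Cost-plus: the vendor recovers its cost and earns a fixed
    percentage on top, so profit grows with cost (and with overruns)."""
    return cost * plus_rate

def margin_profit(price: float, margin: float) -> float:
    """Fixed hardware margin: profit is a share of the sale price,
    so the vendor is rewarded for driving cost down, not up."""
    return price * margin

# Incumbent: $1M drone sold at 5% cost-plus.
incumbent_profit = cost_plus_profit(1_000_000, 0.05)  # 50000.0

# Startup: $100K drone with comparable capability at a 50% margin.
startup_profit = margin_profit(100_000, 0.50)  # 50000.0

# Same profit per unit, but the buyer's $1M budget now buys 10 drones,
# and the startup's aggregate margin on that budget is 10x the incumbent's.
units = 1_000_000 // 100_000  # 10
```

This also captures the $100 screw incentive: under cost-plus, `cost_plus_profit(100, 0.05)` returns $5 while `cost_plus_profit(0.10, 0.05)` returns half a cent, so the costlier part is the more profitable one.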
1:32:16 Why now, though, in the sense of why wouldn’t the defense industry encourage more competition?
1:32:19 They know they’re paying cost plus.
1:32:21 They know the screw shouldn’t be $100.
1:32:26 Like, why didn’t they encourage this way before Anduril?
1:32:32 Yeah, I think at the time, cost plus was viewed as the most fair version of it because you’re like, oh, just give me your bill of materials and I know exactly what it costs.
1:32:33 And then you’ll just get a fixed margin.
1:32:35 And so that’s more fair.
1:32:41 And I know from my budgeting perspective, really, like how much budget I need to ask for, how much.
1:32:41 Yeah.
1:32:46 And I think in hindsight, maybe it worked in that moment in time, but it no longer seems applicable.
1:32:50 And then the other thing that’s happened in the defense industry is there’s been massive consolidation over the last 30 years.
1:32:56 And so a lot of the growth of these companies came through M&A, and so you had fewer and fewer players competing for the same business.
1:33:00 And so that also means that it’s back to the oligopoly market structure that we talked about earlier.
1:33:03 How do you see defense changing in the future?
1:33:07 Like, is it less about ships and more about cyber and drones?
1:33:16 And how do we see the future of defense spending in a world where what used to dominate is these like billion dollar ships?
1:33:28 And now we’re in a world of asymmetry where, you know, for a couple million bucks, I might be able to hire the best cyber attack team in the world, or I might be able to buy a thousand drones.
1:33:30 Or how do you think about that?
1:33:33 Like, how do you think about defense in the next five, 10 years?
1:33:44 Yeah, I mean, in general, defense is inevitably going to move to these highly distributed drone-based systems as a major component of any branch of the military.
1:33:54 And it’s not just because it’s faster, cheaper, et cetera, et cetera, but also there’s certain things that you can’t do with a human operator inside the cockpit.
1:34:05 So, for example, you have a plane, the G-forces that a human piloted plane can tolerate is much lower than if you’re just a drone and you don’t have to worry about people inside the…
1:34:09 Plus, we must be at a point where AI can outperform a human fighter pilot, I would imagine.
1:34:11 I haven’t kept up on defense.
1:34:20 Yeah, there’s a few different contracts, both in Europe and the U.S., that are moving ahead around autonomous flight and autonomous drones and all the rest of it, autonomous capabilities in general in the air.
1:34:31 You know, I think the thing that people have stuck to so far is if there’s any sort of decision that is involved with, like, killing somebody or hurting something, then you need a human operator to actually trigger it.
1:34:37 And so that way you’re not turning over control to a fully autonomous system, which I think is smart, right?
1:34:42 You don’t want the thing to do the targeting and go after the target and make all these mistakes, right?
1:34:44 You want a human to make that decision.
1:34:48 But we exist in a world where not everybody is going to follow those rules.
1:34:50 That’s true.
1:34:56 And then the question is, what’s the relative firepower of that group of people and how do you deal with them and, you know, what do you do to retaliate and everything else?
1:35:01 I mean, in general, one could argue warfare has gotten dramatically less bloody.
1:35:04 Oh, wait, go deeper on that.
1:35:17 Well, if you think about the type of warfare that happened 150 years ago, or imagine if some equivalent to the Houthis was constantly shooting at your ships 100 years ago, what do you think the response would have been?
1:35:19 Do you think you would have said, ah, don’t worry about it?
1:35:29 Obviously, we’ve become much more civilized in our approach and very thoughtful about the implications of certain ways that people used to fight battles and all the rest of it.
1:35:32 But the way that we deal with problems today is very different from how we used to deal with them.
1:35:38 Is there an equivalent to Anduril, but in the software space, from a defense perspective?
1:35:42 And I mean that as like cyber weapons or cyber defense.
1:35:43 Who’s the best?
1:35:45 Yeah, I’ve been looking around for that for a while.
1:35:49 I don’t think I’ve seen anything directly yet, but it may exist and I may just have missed it.
1:35:52 But I do think things like that are coming.
1:35:58 And you do see some AI security companies emerging, which are basically using AI to deal with phishing threats or other things.
1:36:03 You could argue Material Security is doing that, but there’s people working across pen testing and other areas right now as well.
1:36:05 This has been a fascinating conversation.
1:36:09 We always end with the same question, which is what is success for you?
1:36:12 Yeah, you know, I’ve been noodling on that a lot recently.
1:36:25 And I think if I look at the frameworks that exist and certain Eastern philosophies or religions, it’s almost like there are these expanding circles that change with time as you go through your life, right?
1:36:35 Early on, you’re focused more on yourself and your schooling and then you kind of add work and then you add your family and community and then you add society.
1:36:40 And then eventually you become a sadhu and you go off and you meditate in a cave in the forest or whatever.
1:36:44 And different people weigh those different circles differentially.
1:36:49 And, you know, a big transition I’m making right now probably is I’ve been focused a lot on work and family.
1:36:55 And the thing I’m increasingly thinking about are like, what are positive things I can do that are more society level?
1:36:56 Thank you.
1:36:57 This was awesome conversation.
1:36:58 Oh, no, thanks so much for having me on.
1:36:59 It was really great.
1:37:04 Thanks for listening and learning with us.
1:37:09 Be sure to sign up for my free weekly newsletter at fs.blog slash newsletter.
1:37:19 The Farnam Street website is also where you can get more info on our membership program, which includes access to episode transcripts, my repository, ad-free episodes, and more.
1:37:24 Follow myself and Farnam Street on X, Instagram, and LinkedIn to stay in the loop.
1:37:27 Plus, you can watch full episodes on our YouTube channel.
1:37:31 If you like what we’re doing here, leaving a rating and review would mean the world.
1:37:35 And if you really like us, sharing with a friend is the best way to grow this community.
1:37:36 Until next time.
1:37:36 Thank you.
What if the world’s most connected tech investor handed you his mental playbook? Elad Gil, an investor behind Airbnb, Stripe, Coinbase and Anduril, flips conventional wisdom on its head and prioritizes market opportunities over founders. Elad decodes why innovation has clustered geographically throughout history, from Renaissance Florence to Silicon Valley, where today 25% of global tech wealth is created. We get into why he believes AI is dramatically under-hyped and still under-appreciated, why remote work hampers innovation, and the self-inflicted wounds that he’s seen kill most startups.
This is a masterclass in pattern recognition from one of tech’s most consistent and accurate forecasters, revealing the counterintuitive principles behind identifying world-changing ideas.
Disclaimer: This episode was recorded in January. The pace of AI development is staggering, and some of what we discussed has already evolved. But the mental models Elad shares about strategy, judgment, and high-agency thinking are timeless and will remain relevant for years to come.
Approximate timestamps: Subject to variation due to dynamically inserted ads.
(2:13) – Investing in Startups
(3:25) – Identifying Outlier Teams
(6:37) – Tech Clusters
(9:55) – Remote Work and Innovation
(11:19) – Role of Y Combinator
(15:19) – The Waves of AI Companies
(20:24) – AI’s Problem Solving Capabilities
(26:13) – AI’s Learning Process
(30:41) – Prompt Engineering and AI
(32:00) – AI’s Role in Future Development
(34:37) – AI’s Impact on Self-Driving Technology
(40:16) – The Role of Open Source in AI
(43:23) – The Future of AI in Big Players
(44:23) – Regulation and Safety Concerns in AI
(49:11) – Common Self-Inflicted Wounds
(51:34) – Scaling the CEO and Avoiding Conventional Wisdom
(55:21) – Workplace Culture
(58:39) – Patterns Among Outlier CEOs
(1:15:50) – Remote Work and its Implications
(1:18:47) – The Impact of Clusters and Exceptional Individuals
(1:25:41) – Investing in Defense Technology
(1:27:38) – Business Model Shift in the Defense Industry
(1:31:46) – Changes in Warfare
SHOPIFY: Upgrade your business and get the same checkout I use. Sign up for your one-dollar-per-month trial period at shopify.com/knowledgeproject
NORDVPN: To get the best discount off your NordVPN plan go to nordvpn.com/KNOWLEDGEPROJECT. Our link will also give you 4 extra months on the 2-year plan. There’s no risk with Nord’s 30 day money-back guarantee!
Newsletter – The Brain Food newsletter delivers actionable insights and thoughtful ideas every Sunday. It takes 5 minutes to read, and it’s completely free. Learn more and sign up at fs.blog/newsletter
Upgrade — If you want to hear my thoughts and reflections at the end of the episode, join our membership: fs.blog/membership and get your own private feed.
Watch on YouTube: @tkppodcast
Learn more about your ad choices. Visit megaphone.fm/adchoices
-
Trump Blinks on China
AI transcript
0:00:01 Hi, I’m Frances Frey.
0:00:02 And I’m Anne Morris.
0:00:06 And we are the hosts of a new TED podcast called Fixable.
0:00:09 We’ve helped leaders at some of the world’s most competitive companies
0:00:11 solve all kinds of problems.
0:00:13 On our show, we’ll pull back the curtain
0:00:16 and give you the type of honest, unfiltered advice
0:00:18 we usually reserve for top executives.
0:00:21 Maybe you have a coworker with boundary issues
0:00:24 or you want to know how to inspire and motivate your team.
0:00:26 No problem is too big or too small.
0:00:29 Give us a call and we’ll help you solve the problems you’re stuck on.
0:00:32 Find Fixable wherever you listen to podcasts.
0:00:37 Support for the show comes from Yonder.
0:00:40 While technology can be incredibly helpful for teaching and learning,
0:00:43 it can also be a source of seemingly endless screens and distractions.
0:00:47 And those distractions can keep us from being present and focused in the moment,
0:00:49 especially in places like school.
0:00:52 Yonder says they are committed to fostering phone-free schools
0:00:55 so students can learn without distractions, social media pressure,
0:00:56 or worries about being filmed.
0:00:59 Yonder has put its years of experience forward
0:01:01 so they can support schools through the whole process,
0:01:04 from policy and planning to culture transition and launch.
0:01:07 Learn more at overyondr.com.
0:01:11 That’s O-V-E-R-Y-O-N-D-R.com.
0:01:13 Overyondr.com.
0:01:21 We used to have big ideals and dreams when we were still in university.
0:01:24 We wrote these beautiful application essays
0:01:28 about how we were going to fix tax avoidance and tax evasion,
0:01:31 how we’re going to tackle global hunger and work at the United Nations.
0:01:33 And look at us.
0:01:33 What has happened?
0:01:35 What has happened?
0:01:39 This week on The Gray Area, we’re talking about our moral ambition.
0:01:41 Where did it go?
0:01:43 And what we can do to get it back?
0:01:46 New episodes of The Gray Area drop on Mondays.
0:01:47 Available everywhere.
0:01:56 Welcome to Raging Moderates.
0:01:57 I’m Scott Galloway.
0:02:00 Jessica is jet-setting across Europe this week,
0:02:03 which I think is awfully nervy given she’s a new employee.
0:02:08 Our vacation policy is you don’t take vacation the first couple of years here at a
0:02:10 Galloway-sponsored corporation.
0:02:16 But anyways, she has decided to head to Europe where I think she’s in Italy or something like that.
0:02:18 But our loss is our gain.
0:02:21 On with us is literally our favorite side piece.
0:02:23 The Bulwark’s own Tim Miller.
0:02:26 Tim is literally our favorite third in a threesome.
0:02:29 We have become the same person or the same podcast, Tim.
0:02:31 If I’m on something, you’re on it before.
0:02:33 You’re on with Jess a lot.
0:02:35 Anyways, it’s great to have you, Tim.
0:02:35 How are you?
0:02:37 I love being a third, you know?
0:02:39 So I really appreciate it.
0:02:40 You know, it spices things up.
0:02:41 And we are.
0:02:42 We are becoming the same person.
0:02:44 I had your sidekick, Ed.
0:02:45 That’s right.
0:02:48 On my Gen Z podcast, like last week, I love Ed.
0:02:52 I’m thinking about kicking my co-host, Cameron Kasky, off and replacing him with Ed.
0:02:56 So if you have any problems with him, if he’s taking too much vacation, I might poach him.
0:02:57 I watched that.
0:02:58 How old is your young guy?
0:02:59 I love how we both-
0:02:59 He’s 24.
0:03:00 He’s 24?
0:03:01 Wow.
0:03:02 Yeah.
0:03:02 Ed is 26.
0:03:04 Yeah.
0:03:05 But you’re a kid, too.
0:03:06 I think it’s more adorable.
0:03:08 Because I have the grandfather thing.
0:03:09 You’re just like the big brother.
0:03:09 Yeah.
0:03:10 Little big brother vibe.
0:03:12 Got to keep making them behave.
0:03:12 There you go.
0:03:14 Are you in New Orleans today?
0:03:14 Where are you?
0:03:15 I’m in New Orleans, yeah.
0:03:18 I was in New York over the weekend, back in New Orleans.
0:03:20 I’m here for a couple weeks.
0:03:24 And then we got a live show in Chicago and Nashville, if any Raging Moderates listeners want to come.
0:03:26 May 27th and 28th.
0:03:26 Look at me.
0:03:28 I’m just plugging, baby.
0:03:29 Tell me a little bit about the live shows.
0:03:30 How many people do you get?
0:03:31 What’s the business model?
0:03:32 Do you enjoy it?
0:03:33 I just lied.
0:03:34 It’s May 28th and 29th.
0:03:35 28th and Chicago, 29th and Nashville.
0:03:37 I love them.
0:03:37 We love them.
0:03:41 We are getting, I think, almost 1,000 people in Chicago.
0:03:46 And they’re closer to like 400 in Nashville, kind of a small, you know, big market, small market thing.
0:03:54 And we haven’t quite figured out on the business, Scott Brain, we haven’t like really quite figured out how to monetize them in a way that is that useful to the bottom line.
0:03:57 But I think it’s still useful because it’s cool for the community.
0:03:59 People love it.
0:04:03 They like, especially in kind of our world, people love like, when I see people on the street, I’m sure you get this too.
0:04:09 It’s just like, I just like listening to you because I feel like I’m going insane and it makes me feel sane to listen.
0:04:15 And so then it makes you feel even more sane when you’re around other sane people that you can kind of vent to about the craziness of the world.
0:04:17 And so I think it’s good for the community side of things.
0:04:18 I like being out with the people.
0:04:23 I feel a little bit disconnected sometimes when I’m up here in my hole in New Orleans.
0:04:25 I can’t leave my little studio hole.
0:04:27 And so it’s nice to have human contact.
0:04:31 One of my colleagues, Jonathan Last, doesn’t like human contact.
0:04:33 So it’s not a plus for him, but it is a plus for me.
0:04:34 So it’s kind of invigorating.
0:04:35 So I dig it.
0:04:39 I mean, I think that, you know, we only do maybe six a year, seven a year.
0:04:43 So it would become a burden if you’re like, we’re doing a real tour, like a rock and roll tour.
0:04:49 Yeah, I’ve always said that it’s really a shame that these LLMs and AI is crawling the digital world and not crawling the real world.
0:04:55 Because I find online people not so nice, but people out in the wild couldn’t be more lovely.
0:04:55 Totally.
0:04:57 Well, it’s not because people are cowards.
0:05:04 And so there are some people that you see in public that are lovely, that are nuts online, you know.
0:05:17 Some of it is that and other others of it is just like online draws in like the people who want to be engaged for the most part, present company excluded are like, I think it draws people in with mental illness.
0:05:18 I don’t know.
0:05:25 I just like I like, for example, I just think back to, you know, something like after the Biden debate when I was super critical of him because it was just obvious.
0:05:34 I the commenters on my social media and on the blog were really mad at me, like the lefty commenters are like, no, I don’t you understand the assignment.
0:05:35 I got a lot of that, too.
0:05:42 Yeah, but then out in the real world or on my email or text message in private communications, everybody was like, thank God you’re saying this.
0:05:43 I mean, this is crazy.
0:05:44 Like, that was insane.
0:05:45 I couldn’t even watch it.
0:05:46 It was so painful to watch, right?
0:05:48 And that’s just one example.
0:05:49 There are a million examples of this.
0:05:54 I do think that social media kind of draws in the most mentally ill people to be the most active.
0:05:55 I don’t know.
0:05:57 I maybe need to reflect on that myself, possibly.
0:06:04 But also, I think that just being alone makes you more mentally ill and makes you more isolated and makes you less empathetic and more angry.
0:06:13 And I think those are the people who have a disproportionate share or voice online because, quite frankly, they’re home and they’ve got not a lot else to do.
0:06:18 I think a lot of what ails us is the social isolation and the fact that we don’t recognize we’re mammals.
0:06:22 And, you know, you put an orca in a tank alone, it literally goes crazy.
0:06:26 And, you know, a Cape buffalo gets excommunicated from the herd.
0:06:28 It usually goes crazy or gets eaten and dies.
0:06:29 Totally agree with that.
0:06:32 I could not be more human contact, just pro-human contact.
0:06:34 It’s just another thing that, you know, we’re aligned on.
0:06:35 Good.
0:06:40 So, in today’s episode, we’re going to be discussing the Qataris may gift Trump a luxury jet.
0:06:44 Tell me that thing probably doesn’t look like an Iraqi whorehouse inside.
0:06:45 What do you think the decor looks like inside?
0:06:50 Yeah, I mean, it probably looks like the Uday and Qusay suite in the palace, for sure.
0:06:58 And, look, there’s so much horrifying about this story, but the funny part of it is, I guess it was parked at the West Palm Beach FBO in February.
0:07:01 And Trump was like, I want to go check that out.
0:07:08 He’s, like, immediately drawn to the opulence of the Qatari, you know, whorehouse in the sky.
0:07:12 And that, I guess, is what started us down this path to this bribe coming through.
0:07:16 And, you know, that is just very Trumpy, you know, a very Trumpy origin story.
0:07:19 But it’s really, I mean, obviously, it’s just bad on the corruption front.
0:07:28 Like, the idea that our country should be taking a $400 million bribe for another country that we have a complicated geopolitical relationship with is insane.
0:07:38 Simultaneously to that, if the corruption of the government part isn’t bad enough, Eric Trump signed a deal for, like, a golf course in Qatar for, I think, $5 billion.
0:07:44 So, there’s private corruption on top of the public corruption that is happening with Qatar.
0:07:51 And it’s particularly jarring, and I think it’ll be interesting to see what the kind of pro-Israel right folks say about this.
0:07:55 Like, Qatar was funding Hamas and was funding the campus protests.
0:08:05 So, in addition to just the corruption part, there’s the hypocrisy of, we are currently taking away the green cards and jailing people who participated in the campus protests.
0:08:13 At the same time, as we’re taking an Air Force One bribe from the country that was funding the same protests.
0:08:15 And the whole thing is just preposterous.
0:08:22 I have a chat group or a text group with some of my friends from the fraternity at UCLA, and the majority of them are Jewish.
0:08:24 And a lot of them voted for Trump.
0:08:30 I think most of them voted for Trump because, quite frankly, he’s seen as viewed as more resolute on Israel.
0:08:32 And I said, be clear.
0:08:37 You know, this guy likes Jews the way that hardcore evangelicals like Jews.
0:08:41 If you kind of go one layer deeper, their plan for us is not all that great.
0:08:43 You know, it’s all about the rapture.
0:08:46 When Jesus comes back, then they decide to kill most of us.
0:09:00 And the fact that, essentially, we have the Qataris giving the president a $400 million plane and sort of turning this into kind of, you know, the ultimate frequent flyer program.
0:09:09 I mean, first off, it’s embarrassing that America needs to take a plane manufactured in the U.S. from the Qatari government, that that’s where we are.
0:09:29 But also, the notion that you have the primary sponsor of Hamas and the political mouthpiece, and you have a country that has given about $4.8 billion of the $14 billion we have received from foreign governments to sponsor, quote, unquote, Middle Eastern studies departments.
0:09:42 I mean, Jews have to get past the fact that this notion that the president is going after universities because of anti-Semitism is just fucking ridiculous.
0:09:44 It has nothing to do with anti-Semitism.
0:09:59 It’s him attacking progressive institutions and trying to implement thought leadership, that if he really cared about anti-Semitism, he wouldn’t be taking $400 million bribes from the primary sponsor of a group that murdered 1,200 Jews.
0:10:02 And I’m like, you guys don’t see this?
0:10:09 You don’t see the inconsistency here in that this isn’t about—I mean, we have totally become, at this point, pay for play.
0:10:11 The question I would put forward to you—
0:10:12 Do they see it?
0:08:14 Has the text chain fired up since last night?
0:10:16 What they see is they see that it’s problematic.
0:10:28 What they see more is this: Tim is one of the guys in our group, this wonderful high-integrity guy, who has this great small business that does specialty products.
0:10:33 You know those products—you go to a conference and you see those banners and the mugs and the water bottle with logos.
0:10:37 He has that kind of business, and he does—it’s a great business.
0:10:42 Fifteen million bucks, good living for 200 or 300 people, has put kids through college on it.
0:10:43 It’s a family business.
0:10:57 And his business is basically shut down overnight because of the ridiculous, sclerotic, reckless, like, approach to negotiating, where we’re negotiating against ourselves and both sides are losing around China.
0:11:01 And my friends are very—I don’t want to say economically focused.
0:11:02 Is that fair?
0:11:05 I’d say they’re more focused on America as a platform for prosperity.
0:11:06 I think they’re like most voters.
0:11:08 They think about who’s going to put more money in my pocket.
0:11:15 They think that essentially Washington is feckless and useless around social issues, and they’re focused on who they think is better for the economy.
0:11:28 And to have kind of one of us have our, you know, one of our close friends’ business basically just, like, turned off like it was a tap and threaten, you know, a multi-generational business, I think that hits hard.
0:11:30 I think that hits them at home.
0:11:42 The question I would have for you is I’m kind of—I want to move beyond the part of the program where we’re, like, screaming into TikTok about the corruption here and the obvious fraud or whatever you want to call it.
0:11:45 I have, like, seven more minutes of TikTok bits, though, but it’s fine.
0:11:47 We can move on quicker than you wanted.
0:11:47 It’s your show.
0:11:52 I guess I want to move to the part of the program where how do the Democrats become the party and not fucking around?
0:12:08 And this is my idea, and I’m curious what you think, that we should draft legislation, the, you know, Foreign Enemies Act, Part 2, 2.0, that says if you’re operating black sites in your country, El Salvador, if you’re trying to bribe our public officials, Qatar,
0:12:16 even if the president at that moment agrees with it, it doesn’t mean you’re not guilty of a crime or a violation of Emoluments Act or whatever.
0:12:25 And in three years and nine months, we are going to implement significant economic sanctions and rethink our geopolitical relationship with you.
0:12:31 And also be clear, in America, the White House and the branches of government or Congress tend to turn over.
0:12:35 I don’t think there’s any shaming the Trump administration and his acolytes.
0:12:54 So I’m about how do we start sending a chill down the backbone of some of these foreign governments and also some of the lower level people of these organizations that say, if you’re illegally incarcerating people, whether the president or whether the current head of ICE says that is okay, it doesn’t mean you’re not committing a crime.
0:13:00 I’m trying to figure out how we, quite frankly, move from the strongly worded letter to being a little bit more aggressive.
0:13:02 Any ideas or thoughts?
0:13:03 Yeah, I do.
0:13:03 I have a couple of thoughts on that.
0:13:07 And I was literally just talking to Bill Kristol about just on the Qatari plane thing.
0:13:10 Again, this is more of a strongly worded letter side of things.
0:13:12 And I have additional thoughts on top of it.
0:13:21 But I do think, just at minimum, somebody in the House, among the Democrats, should try to put forth through a privileged resolution creating a vote on the new Air Force One.
0:13:24 Like, make the Republicans actually vote to codify this.
0:13:26 Like, you have a majority, right?
0:13:33 Just say, look, if you guys want to take a $400 million bribe from the funder of Hamas, then put your money where your mouth is and vote for it.
0:13:46 Because, you know, we’re already seeing everybody from Ari Fleischer, who is Bush’s spokesperson, who’s been pro-Trump, to Laura Loomer, the insane MAGA conspiracy theorist, to the Free Press, which has been kind of like anti-anti-Trump, the Bari Weiss outfit.
0:13:49 Like, all of them are out this morning criticizing the Qatari plane thing today.
0:13:53 So I would at least force these Republicans to actually have to codify it.
0:13:53 That’s one.
0:14:01 The thing I liked about your El Salvador idea, legal is not my background, so I don’t have a lot of deep thoughts on how you can scare people into feeling like they might go to jail.
0:14:03 Although I like where your head’s at on that.
0:14:24 Economically, though, I mean, I think it would make sense for, you know, Democratic leaders either in or out of government right now to be talking with the EU and Carney and, like, the guy who just got reelected in Australia about isolating El Salvador and saying that when we come back in charge, we’ll isolate El Salvador too.
0:14:28 Like, we will turn you into Nicaragua or Venezuela if you want to.
0:14:36 Like, if you want to be completely isolated from the world community, I know you’re very happy about this deal you’ve done with Donald Trump and his crime family, but they’re not going to be around forever.
0:14:50 And if you want the El Salvador economy to look like the North Korea, Nicaragua, or Venezuela economy, then keep going down this path of having, you know, of violating the Human Rights Council and what they’ve already signed and agreed to.
0:15:01 Like, you know, you cannot be part of the, you know, liberal, small L liberal world of nations if you are going to put somebody in a hole in a torture camp and not give them access to a lawyer.
0:15:07 Like, that’s just, that’s a no-go and we’ll stop doing trade with you and we’ll stop doing tourism with you.
0:15:19 And it would be hard to actually impact the El Salvador economy in a big way without the U.S. being involved, but you could start to lay the groundwork down for it in a way that might make Bukele start to think twice.
0:15:25 I think we’ve got to say to these folks, look, the president does not provide blanket immunity from economic sanctions or even criminality.
0:15:27 Just to say, look, you’re right.
0:15:32 Well, you’re good for three years, eight months and two weeks.
0:15:41 But after that, be clear, if the Democrats get control back, which there’s always a good chance at some point they will, it’s going to be really ugly for you.
0:15:55 Because I think we’ve got to go after the infrastructure and the enablers and the co-conspirators at this point, as opposed to, because it’s just pretty clear we’re not going to shame him or our current branches of government who have been weaponized and politicized.
0:15:57 That’s just not an effective strategy.
0:15:59 So let’s move on to the tariffs here.
0:16:00 All right.
0:16:07 So back in Washington, Federal Reserve Chair Jerome Powell warned that Trump’s escalating trade war could drive the U.S. towards stagflation.
0:16:10 That’s probably a word you don’t know because you’re too young.
0:16:12 We haven’t had it since the 70s.
0:16:13 I’ve read about it in history.
0:16:13 You’ve read about it?
0:16:22 Well, I mean, as a Reagan fan, you know, back in high school Republicans, people talked about, you know, how he ran against stagflation.
0:16:23 So I’m familiar with it in that context.
0:16:30 So it’s this toxic mix of rising prices and rising unemployment where basically interest rates go up and the economy slows down.
0:16:36 And stagflation is sort of a step or a bridge to a depression.
0:16:40 But on Thursday, Trump announced a new trade framework with the U.K. that lowers tariffs,
0:16:43 but only on luxury cars, including Rolls-Royce and Bentley.
0:16:43 Well, thank God.
0:16:46 And plane engines, I think, got thrown in there, too.
0:16:48 I think Rolls-Royce has given us some plane engines, too.
0:16:52 Toys, including Barbie and Hot Wheels, will face a 100 percent tariff.
0:16:54 Then over the weekend, there was a surprise detour.
0:17:00 The U.S. and China agreed to a 90-day truce, temporarily rolling back some of the steep tariffs that had been hammering both economies.
0:17:11 By May 14th, the U.S. will slash its tariff on Chinese goods from 145 to 30 percent, while China will lower its own tariffs on American products from 125 percent to just 10 percent.
0:17:14 The move helped calm global markets.
0:17:17 But it’s anyone’s guess if the pause will hold.
0:17:19 They now have 90 days to make a deal.
0:17:21 What do you think will come out of this?
0:17:23 What’s your impression of what’s happened as of this morning, Tim?
0:17:28 Well, for starters, like, obviously, Trump blinked and had very serious concerns about the economy.
0:17:37 I mean, if you just look at the broad contours of this, so a 30 percent tariff now on China is 20 percentage points higher than it was under Biden, right?
0:17:40 So it was at 10, and now it’s up to 30.
0:17:48 And so we’ve added the 20 percent tax on consumers who consume Chinese goods in exchange for nothing.
0:17:50 I mean, the Chinese didn’t even.
0:17:53 There were some, I guess, promises around fentanyl or something.
0:18:00 You know, in the past, you know, in the first Trump term when they did the tariffs to China, there was also a deal where, like, they were buying our soybeans.
0:18:03 And, you know, there are other, and maybe that’ll come over the next 90 days.
0:18:03 I don’t know.
0:18:10 But as of right now, you know, we still put a 20 percent, essentially, sales tax increase on Americans, like, for nothing.
0:18:13 Just so that Donnie could, like, feel tough for a little bit.
0:18:15 So how does it go from here?
0:18:21 I don’t, I mean, I think that I’d be interested in your take on, like, I noticed the markets are up quite a bit today.
0:18:33 I just generally think, and it’s maybe my pessimistic nature, that, like, the markets and business leaders have been, like, a little bit too sanguine about, like, kind of where we’re heading.
0:18:37 I think that this is going to be, like, relatively ugly.
0:18:46 Like, this move away from a total trade embargo on China has, like, walked us away from the brink of, like, a worst-case scenario economically, at least temporarily.
0:18:57 But even still, and if you would have went to any of these people in October and said, hey, I’m from May 2025, and here’s what the economic outlook’s going to look like then.
0:19:02 We’re going to have a 10 percent across-the-board tariff on everybody, 30 percent on China increased.
0:19:08 The tax bill that you guys were counting on is going to be floundering in Congress, and we’ll see what happens with that.
0:19:14 But we haven’t really made any meaningful progress on it yet as of May, and, you know, GDP growth will be down to zero.
0:19:22 I feel like everybody would think that was – like, I feel like business people outside of politics would say that that’s, like, almost a worst-case imaginable scenario.
0:19:34 And that’s where we are now, but people are kind of spinning it as a positive because it ends up being better than what the worst-case scenario was that we were staring down the pike of had they kept the 145 percent in.
0:19:35 So, I don’t know.
0:19:36 What do you make of that?
0:19:40 Well, he’s definitely – so, he’s pulled the knife out of the back sort of halfway.
0:19:42 That’s the good news.
0:19:59 The bad news is the injury is going to take, I think, decades to heal because even worse than the tariffs themselves, which obviously increase consumer prices and slow the economy, I think the most lasting damage here is that we have now become the land or the economy of toxic uncertainty.
0:20:02 And that is people don’t even know how to plan their businesses.
0:20:19 And the U.S. S&P trades at a price earnings multiple of around 26, meaning for every dollar of profits that our great American companies generate, the world rewards us with $26 in value, which flows right into not only the pockets of shareholders but employees.
0:20:21 It lowers interest rates.
0:20:23 We can borrow money at a much lower rate.
0:20:29 The U.S. dollar is kind of the reserve currency because everybody wants to buy American stocks, so there’s greater demand for dollars.
0:20:40 And the U.S. being the reserve currency globally, literally lowers on average the interest rate that you pay across your student loans, your mortgages, and your car loans, somewhere between half and 1%.
0:20:50 So, that’s just literally hundreds of billions of dollars in cost savings that the Americans enjoy because of the fact that our markets trade at a higher multiple on earnings.
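The re-rating arithmetic being described here can be sketched with a toy calculation. The roughly 26x S&P price/earnings multiple comes from the conversation; the earnings figure and the 20x "re-rated" multiple are hypothetical numbers chosen purely for illustration:

```python
# Toy sketch of how a price/earnings (P/E) re-rating flows through to market value.
# The ~26x multiple is from the conversation; all other numbers are hypothetical.

def market_value(earnings: float, pe_multiple: float) -> float:
    """Under the P/E framing: value investors assign = earnings * multiple."""
    return earnings * pe_multiple

earnings = 100.0  # hypothetical aggregate profits, in billions of dollars

value_at_26x = market_value(earnings, 26)  # 2600.0
value_at_20x = market_value(earnings, 20)  # 2000.0

# Same profits, lower multiple: a meaningful share of market value disappears
# without any change in underlying earnings.
decline = 1 - value_at_20x / value_at_26x
print(f"Re-rating from 26x to 20x erases {decline:.0%} of value")
```

The point of the sketch is that the multiple itself carries real dollar value: if trust and consistency compress the multiple, shareholder wealth falls even while company profits are unchanged.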
0:20:52 Now, why do they trade at a higher multiple?
0:20:53 A lot of reasons.
0:20:55 We’re more risk-aggressive.
0:20:56 Our technology is better.
0:20:59 We have more of a zeitgeist or a culture of entrepreneurship.
0:21:01 We have great universities, great intellectual property.
0:21:04 But we also have rule of law and consistency.
0:21:06 We’re seen as good trading partners.
0:21:08 We’re seen as people we can count on.
0:21:16 We’re seen as a place where there isn’t going to be a ton of corruption where you come and, say, open a bunch of restaurants, and then the government shows up one day and says, sorry, we now own them.
0:21:18 And that happens in other countries around the world.
0:21:24 Rule of law and consistency have been thrown out the window in just 110 days.
0:21:28 And you’re starting to see a reduction in the price earnings multiple.
0:21:35 And I believe over the next several years, we’re going to see a re-rating down of our price earnings multiple, which effectively increases the costs on all American businesses and consumers.
0:21:44 Because, and the market has sort of said this to a certain extent, the market has said, we don’t really know what this guy is going to do and we don’t trust him.
0:21:46 145% tariff.
0:21:49 I mean, this is what a bad negotiator is.
0:21:52 The first thing we need to do is dispel the myth that this guy is a good business person.
0:21:58 He would be wealthier if he’d taken his massive inheritance and invested it in index funds.
0:22:04 His business career includes a trail of bankruptcies and unpaid subcontractors.
0:22:11 To be fair, he’s an outstanding reality talk show host, made several hundred million dollars hosting and envisioning a reality talk show.
0:22:14 As a business person, he’s not very good.
0:22:18 And in terms of negotiating, he’s negotiating against himself at this point.
0:22:27 He put on 145% tariff and then a few days later, without any counter from the Chinese, other than this is unacceptable and we’re not even going to talk, he said they’re unsustainable.
0:22:30 It’s like, well, boss, you’re the one that did it.
0:22:37 So to go to 145% and then to go down to 30%, and effectively what you have is the Chinese are divesting away.
0:22:40 This will keep the factories sort of humming in China.
0:22:46 This will basically loosen up or cancel the trade embargo for the time being.
0:22:50 But also in negotiation, you have to understand your leverage and the amount of leverage you have.
0:22:57 And what is typical of America and Donald Trump is that we’re under the impression we’re much more powerful than we are.
0:23:02 People think of us as, you know, we’re the only customer at the taco stand here.
0:23:04 And that without us, they go out of business.
0:23:06 We’re the third largest trading partner.
0:23:12 The Association of Southeast Asian Nations and the EU are bigger trading partners with China.
0:23:15 China has been divesting away from us.
0:23:17 This is kind of, I think this is good for them.
0:23:25 They get to continue to sell not as many products, but still not the shock that this trade embargo was going to implement on them.
0:23:29 At the same time, they will slowly but surely continue to divest away from us.
0:23:31 And that is what the whole world is doing, Tim.
0:23:36 The whole world is rerouting their supply chain around the United States, not even because of tariffs,
0:23:40 but because they don’t know how to plan their business with us because of this toxic uncertainty.
0:23:48 And I think these, that resupplying or that rerouting of the supply chain will take years, if not decades, to reheal.
0:23:57 And I do think the Americans have taken for granted, the American public, of just how inexpensive our goods are because of the supply chain that runs through the U.S.
0:24:01 of every major economy because they trust us and think there’s rule of law.
0:24:04 And those things are no longer a given with us.
0:24:09 The scariest data I’ve seen is that I think it was Pew or the Hoover.
0:24:15 Some polling organization did a poll of global citizens, took a statistically significant sampling.
0:24:23 And for the first time in history, more people around the world think that China is a greater force for good in the world than the U.S.
0:24:28 Which says to me, people are much more inclined to do business with China than they are with the U.S.
0:24:31 And as someone who has run businesses, I’ve run businesses my whole life.
0:24:34 They’ve always been global businesses because they’ve been strategy and brand firms.
0:24:43 When I walk in and meet with LVMH or Samsung or, you know, I don’t know, Tata Motors, we’re taken seriously.
0:24:46 And also, when they do business with us, they want to do business with us.
0:24:55 If I’d started a brand strategy firm in Pakistan or even in Thailand, they’re just less inclined to do business with you because they don’t know you.
0:24:56 They don’t trust you as much.
0:24:58 They don’t think you’re as innovative.
0:25:01 They probably don’t think your employees are as good.
0:25:05 They’re not as confident you’re going to uphold your side of a legal contract.
0:25:12 The legal contracts aren’t as easily agreed to because they’re not as consistent with the kind of American or Western law.
0:25:16 All of that, we have had massive benefit from.
0:25:20 And I don’t think American consumers realize how much they benefited from that.
0:25:25 And they’re going to realize it when everything just gets a little bit more difficult.
0:25:25 Your thoughts?
0:25:26 You know, I agree with most of that.
0:25:29 I’ll just defer to your expertise on the economy side of things.
0:25:31 I concur with it.
0:25:35 I just like putting on my former Republican hat just on like the China hawk side of it.
0:25:48 I remember we used to have kind of a Republican Party that was strong against communism and that felt like, you know, wanted to use more of a Reaganite policy, you know, tour and great power struggles.
0:25:52 We’ve seen these guys basically fold in the face of China.
0:25:59 And I just think broadly, think about the advantages China has gained over the past five months.
0:26:11 I mean, in addition to the stuff you just laid out, the fact that we’ve totally gutted USAID and we’ve eliminated any soft power we have throughout the world and created a huge opportunity for China to fill that void.
0:26:26 To your point on the economic trading partner side of things, I think that if you are one of those Asian countries like Thailand that you just mentioned, like, I mean, don’t you feel like you can trust China a little bit more as a trading partner than you could have five months ago?
0:26:27 For sure.
0:26:29 Then you look at the policy side of things.
0:26:31 Just look at talk about leverage and weakness.
0:26:36 I mean, we completely fold on the, you know, Liberation Day tariffs.
0:26:53 Meanwhile, we also completely fold when it comes to TikTok and like the U.S. government puts into law a TikTok ban, but Donald Trump and this administration won’t enforce it because they’re afraid of the backlash from the American population.
0:27:07 So, like, China has been so successful in infiltrating American culture through TikTok that, and the power of that is so great that, like, the U.S. government is scared.
0:27:11 Let’s just be honest, scared to, like, enforce its own laws when it comes to banning TikTok.
0:27:13 You know, China State TV this morning, I saw this.
0:27:14 They put this out.
0:27:20 The outcome of trade talks with Trump team shows China’s firm countermeasures and resolute stance have been highly effective.
0:27:24 China gets nearly all tariffs off for doing very little other than agreeing to talk.
0:27:26 That’s their spin this morning.
0:27:35 So, I just think across every metric, we have made China stronger over the past five months in ways that, as you say, are going to be hard to unravel.
0:27:43 And there isn’t really any evidence that we are trying to, like, win a great power struggle with them.
0:27:54 And I guess I would add just the last thing I’d say on it is, if you’re China, I don’t, I’ve, it’s hard for me to get inside the head of Xi Jinping, but, and I don’t know what their plans are with regards to Taiwan or their timing.
0:28:09 But I just do not think they could look at America right now and think that we would put up any real resistance to their efforts to overtake Taiwan if they wanted to do it, based on how we’ve acted with regards to Ukraine, how we’ve acted with regards to this trade war.
0:28:16 So, I just think that we’ve weakened ourselves pretty noticeably across, you know, a variety of different metrics vis-a-vis China.
0:28:20 Yeah, I think the short-term winner is Europe because China wants to keep these factories humming.
0:28:24 So, they’ll have a lot of excess supply that they’ll be willing to sell at a discounted rate.
0:28:30 And I think the EU is about to get a massive, kind of, fire sale on a ton of products.
0:28:33 The medium and the long-term winner, I actually think I agree with you, is China.
0:28:39 I know firsthand that their commerce executives and business people are roaming around Europe and Latin America saying,
0:28:43 hey, you may not like us, but you can trust us.
0:28:44 We do what we say we’re going to do.
0:28:48 And I was actually at dinner with the CEO of one of the largest companies in China.
0:28:53 And he said, yeah, for the first time, we’re talking to European companies about providing cloud services.
0:28:58 And the general reaction was always, we don’t trust China to store our data in the Chinese cloud.
0:29:03 And now the question is, okay, we don’t trust you, but we don’t necessarily trust American companies now either.
0:29:04 Look at the Elon Musk Starlink.
0:29:08 Like, they’re like, you know, are we going to trust Elon Musk with the internet access?
0:29:13 Obviously, I think that there are going to be some countries, they’re going to look at that both ways now.
0:29:19 Some will want to do that deal because they feel like it might be a way to get good favor of the Trump administration.
0:29:25 But I think others are going to feel about that the way they might have felt about China a couple of years ago.
0:29:26 You also have to do a better job as Democrats.
0:29:30 I think of who has been good at pushing back on autocrats.
0:29:32 I learned this, I did an interview with Anne Applebaum.
0:29:40 And she said, if you take Alexei Navalny as an example of someone who was able to push back on an emerging kleptocracy or an autocrat,
0:29:46 it was because he was able to connect it to people’s lives, that he had this sort of motto when he was running against Putin,
0:29:49 that, okay, they’re getting rich and there’s still potholes in Moscow.
0:29:54 And Elizabeth Warren or Senator Warren kind of summarized it nicely by saying,
0:29:58 they’re getting rich and you’re getting your health care taken away.
0:30:05 And I don’t think we’ve done a really good job of explaining to the American people that a kleptocracy creates a small number of very rich people,
0:30:11 whether it’s the people who are tipped off to the launch of the Trump coin the Friday night before inauguration,
0:30:16 small number of wallets, like 30 wallets made $800 million, thing spikes, they dumped the bag.
0:30:22 And then over the course of the next several weeks, 800,000 smaller investors lost billions.
0:30:32 And we haven’t done a good enough job connecting that, okay, when Elon Musk, as part of our negotiation with the UK around tariffs, gets a Starlink deal,
0:30:35 that means every other American tech company, every other small business,
0:30:41 and by the way, 98% of the companies that make their living from import and export in the United States are small and medium-sized businesses
0:30:45 who create two-thirds of the jobs in America, but don’t have lobbyists.
0:30:55 And they don’t have enough money to get on Trump’s lunch calendar or be part of Eric’s executive, like, you know, $500,000 a year kind of fraternity, if you will.
0:30:57 Those are the people that get hurt the most.
0:31:03 And we, I don’t think we as Democrats have done a good enough job connecting the dots there.
0:31:14 Or that say, look, kleptocracy is a small tax and then a medium tax on everyone such that we can funnel a massive amount of money to a small number of people.
0:31:17 All right, let’s move on here.
0:31:18 We’ll take one quick break.
0:31:19 Stay with us.
0:31:25 What’s better than a well-marbled ribeye sizzling on the barbecue?
0:31:32 A well-marbled ribeye sizzling on the barbecue that was carefully selected by an Instacart shopper and delivered to your door.
0:31:36 A well-marbled ribeye you ordered without even leaving the kiddie pool.
0:31:40 Whatever groceries your summer calls for, Instacart has you covered.
0:31:45 Download the Instacart app and enjoy $0 delivery fees on your first three orders.
0:31:48 Service fees, exclusions, and terms apply.
0:31:51 Instacart. Groceries that over-deliver.
0:31:54 Support for the show comes from Shopify.
0:31:56 It’s no small thing to start a business.
0:32:01 I remember when I was starting out and could have used Shopify, could have used some infrastructure, could have used some help.
0:32:08 Anyways, if you’re running a small business and you’re looking for a partner to help your business grow, Shopify is your answer.
0:32:15 Shopify is the commerce platform behind millions of businesses around the world and 10% of all e-commerce in the U.S., according to their data.
0:32:26 Shopify has built-in tools to help you with social media and email campaigns and boasts a 99.99% conversion rate from browsers to buyers, both online and in-store.
0:32:32 And the best part, you can tackle all the important tasks in one place, from inventory to payment to analytics and more.
0:32:36 Shopify even has global selling tools to support sales in more than 150 countries.
0:32:41 Simply put, Shopify is a small choice that can have monumental impact on your business.
0:32:45 Get all the big stuff for your small business right with Shopify.
0:32:50 Sign up for your $1 per month trial and start selling today at Shopify.com slash Prof G.
0:32:53 Go to Shopify.com slash Prof G.
0:32:55 Shopify.com slash Prof G.
0:33:03 Welcome back.
0:33:13 On the immigration front, a federal judge has blocked what may be one of the Trump administration’s most extreme efforts yet, the planned deportation of detainees to Libya.
0:33:27 ICE detained Asian nationals in Texas and allegedly pressured them to voluntarily agree to be transferred to prisons controlled by armed militia in eastern Libya, despite widespread reports of torture and human rights abuses in those facilities.
0:33:34 At the same time, Trump is touting a 95% drop in illegal crossings at the U.S.-Mexico border compared to last year.
0:33:41 Tim, is this a fear tactic designed to intimidate future migrants into staying away or self-deporting?
0:33:42 What do you think is going on here?
0:33:47 There definitely is a desire to try to intimidate people into self-deporting, and they’re actively doing this.
0:33:52 They’re running ads calling on people to self-deport right now throughout the country.
0:33:53 So there is that.
0:34:00 I think there is a little bit of a sadism to the Stephen Miller wing of the Trump administration.
0:34:11 I think some of them like the idea of doing these kind of outlandish types of deportation plans because, A, it’s intimidating.
0:34:17 B, I think they get some kind of pleasure out of it, maybe erotic pleasure.
0:34:17 I don’t know.
0:34:20 But, you know, it’s hard to keep track of all this stuff.
0:34:21 But the Libya thing is the latest.
0:34:29 There’s another story, I guess a week or two ago, of a guy, Omar Ameen, who got deported to Rwanda.
0:34:41 He was from Iraq and had been pretty thinly and I think quite clearly falsely accused of being part of ISIS when he was in Iraq.
0:34:49 He had come to America, brought his whole family here, went through the refugee vetting process, was living in Sacramento, was working, did not have any crimes in America.
0:34:56 But, you know, there was some cable where he was on a list of people that were ISIS members.
0:34:59 You know, he, him and his lawyer says that’s false.
0:35:06 Anyway, he gets jailed during the first Trump administration, has been jailed since then, and we just sent him to Rwanda, you know, where he’s not from.
0:35:14 And, you know, we have the – there’s another situation I was just reading this morning in New York with these two guys who are 19 and 20, graduated high school, were from El Salvador.
0:35:24 You know, their parents brought them here when they were kids, they hadn’t broken any laws, were good students, spoke English, and they went to their immigration checkup.
0:35:32 They ended up getting shackled, sent to Louisiana, and now are about to be sent back to El Salvador where they, you know, they don’t remember or know anybody in El Salvador.
0:35:42 So, like, all of this, you know, is part of the broader effort to, yes, intimidate and to send a signal to the world that people aren’t welcome here anymore.
0:35:44 And that’s what they want, right?
0:35:49 I mean, look, the only people they’re going to welcome into this country are white Afrikaners from South Africa.
0:35:54 I mean, I don’t think you’ve got to, you know, read between the lines too much on that.
0:35:55 And it’s outrageous.
0:36:14 And I think that, you know, part of this stuff, to your point in the last thing about Democrats, I think that the more Democrats can just speak, whether it’s about the economy or whether it’s about this stuff, in normal language and focus on things that people understand, people really don’t want kids that were brought here when they were four to be sent back to El Salvador.
0:36:16 Like, that is not a popular policy.
0:36:24 They don’t want people to be grabbed off the street by masked agents and sent to a prison camp in El Salvador or a prison camp in Libya.
0:36:26 Like, those are not popular things.
0:36:38 And I think that you can talk about those things in regular language that speak to American values while also, you know, not going down crazy lefty open borders like territory.
0:36:40 And I think that it’s important to be able to do both.
0:36:42 Yeah, I feel I’m of two minds on this.
0:36:44 The first is this is still his most popular policy.
0:36:47 And there’s just no getting around it.
0:36:52 I feel like a lot of this was the Democrats sticking their chin out and just waiting to get tagged hard.
0:36:56 And a quarter of a million people crossed the border in December of 23.
0:36:57 We were just sort of asking for it.
0:37:07 And then they see them getting, you know, Biden-Harris hats and free phone cards and hotel rooms, and Americans saying, OK, there’s something wrong here.
0:37:16 But I’ve never understood about this whole argument or where I feel Americans fail to see what’s going on is that immigration is obviously the secret sauce of America.
0:37:19 But I’ve always thought I’m kind of where Friedman was on this.
0:37:27 And that is the most profitable part of immigration is illegal immigration because they’re essentially the most flexible, inexpensive workforce in history.
0:37:35 When there’s crops to be picked or old people to be taken care of or dishes to be washed and we can’t afford or find domestic workers.
0:37:39 The reason why it’s fairly inexpensive to eat out is because of illegal immigration.
0:37:45 And in some cities, somewhere between 15 and 25 percent of fast food workers are undocumented workers.
0:37:54 And in addition, they generate a surplus of 100 billion dollars in the Social Security program because most of them are younger and they don’t stick around for Social Security.
0:37:56 They pay their taxes and then they go back.
0:38:03 A mass deportation effort, some estimates put at a 4 to 7 percent reduction in GDP.
0:38:09 And 90 percent of the undocumented population is working age.
0:38:16 About a third of agricultural workers, a quarter of ground maintenance workers, and about a quarter of all food service workers are undocumented.
0:38:19 So, I mean, you’re just – your prices are going to go up, folks.
0:38:22 And there’s this trope and there are some very bad people.
0:38:24 I do not believe in open borders.
0:38:25 I believe you have to have a country.
0:38:35 But the question I would have for you because I don’t feel as if I am very knowledgeable or have a deep domain expertise around immigration is that we want to demonize immigrants.
0:38:46 But wouldn’t the fastest way to solve this problem be to go to the demand side, and that is to say to Chipotle or lawn care companies, we’re doing random audits.
0:38:53 And based on the percentage of people who are clearly undocumented, we’re going to fine you $10,000 a day. Because they don’t come here to rape.
0:38:56 These immigrants don’t come here to commit crimes or to start gang warfare.
0:38:58 They come here for jobs.
0:39:12 And if you went on the demand side and basically hit those nice American people with real fines such that they started implementing – and they could do this with biometrics or, you know, just simple documentation, verification.
0:39:19 If there’s no jobs, if it’s like, okay, I’m sorry, I can’t hire you, they melt back to where they came from.
0:39:20 But we don’t want to do that, do we?
0:39:22 Yeah, two thoughts on this.
0:39:28 One is – and I think that it reveals a lot about why they’re targeting, who they’re targeting, what the motivations are.
0:39:34 They don’t – this administration doesn’t want to go after business owners directly, right?
0:39:37 And if they get hurt by the tariffs because Trump’s obsessed with tariffs, that’s one thing.
0:39:42 But they’re not trying to make enemies of people that they think voted for them or possibly voted for them, small business owners.
0:39:51 Plus on top of that, Donald Trump is an employer of illegal immigrants who worked at, you know, his hotels and golf courses, which he knows.
0:39:53 So they don’t have any interest in doing that.
0:39:54 You’re exactly right.
0:39:55 They could – there’s e-verification.
0:40:00 I mean this has been something that like border hawks, immigration hawks have been proposing ever since I’ve been in politics.
0:40:07 In fact, many of the candidates I worked for like supported that as something that’s on your policy agenda in the campaign.
0:40:15 But then you don’t actually put into place in government e-verify because you don’t want to actually punish the small business owners that are likely Republican voters.
0:40:19 So what they’re doing now is they feel like low risk, right?
0:40:33 Like who is – sure, there are probably some working class Hispanic voters that moved over to Trump that are starting to have a – maybe a change of heart because they had a cousin or a friend or something who is not a criminal, who’s been deported, or they know somebody who has.
0:40:49 But broadly speaking, if you take an 18- and 19-year-old El Salvador kid that was brought here that can’t vote, brought here illegally by their parents, and you send them back to El Salvador, you’re not paying a political price for that in any meaningful way.
0:40:50 And it is immoral.
0:40:58 It’s an affront to what the country is supposed to be about, and it’s an affront to the very American ethos of people wanting to come here and have an opportunity.
0:41:00 But you’re not paying a political price there.
0:41:03 So I just think that they’re doing it.
0:41:07 There’s obviously some racial elements to it as referenced earlier with the white Afrikaners.
0:41:19 But it’s also just they feel like it’s much more politically palatable to go after 18- and 19-year-old kids that would have been dreamers or whatever than it is to go after American business owners.
0:41:21 And just one really quick thing on the economy.
0:41:22 I agree.
0:41:23 This all takes time.
0:41:24 This is going to happen now.
0:41:32 But just adding on to what we talked about earlier with tariffs, you add on to that, we’re deporting a bunch of people that are working, doing cheap work.
0:41:35 We’re not bringing in nearly as many people as we were.
0:41:43 So the shutting down of the border is good in that it’s shutting down some of the fentanyl traffic and some of the gang labor, but it’s also shutting down people that were coming here to work.
0:41:48 And then on top of that, we’re firing a lot of people in the federal workforce or putting them on the sidelines.
0:41:53 They’re probably going to end up getting paid anyway, so we’re going to be paying them to do nothing while that goes to the courts.
0:42:00 And it’s harder for recent college grads to get jobs, you know, in a lot of these areas because people don’t know what’s going to be happening with the government.
0:42:09 I just think that there are a lot of economic factors there that are pointing to a pretty bad situation, you know, once it all, you know, starts to hum through the economy.
0:42:10 Curious.
0:42:11 Let me put forward a thesis.
0:42:12 I want to get your response to it.
0:42:14 So I have a 17-year-old son.
0:42:28 And there have been reports and verification of actual people who aren’t criminals, people who were brought here, grew up here, being deported, some ending up in these hellscape prisons, and also reports of U.S. citizens.
0:42:40 My view is unfortunately that a lot of Democrats who are very wealthy clutch their pearls and say at dinner parties how outraged they are.
0:42:48 But they don’t really do a fucking thing about it because there’s this emerging what I’ll call transnational oligarchy.
0:42:53 And that’s if you’re in the 1%, A, you have a disproportionate amount of power.
0:42:56 And it’s very hard to get anything done without your support.
0:42:59 And so it’s easy to complain about it at dinner parties.
0:43:04 But the reality is in America that your rights have become a function of your wealth.
0:43:09 My kid is not going to be sequestered by ICE and sent to El Salvador.
0:43:11 There’s just zero chance that could happen.
0:43:16 I will not be silenced because I have the money to lawyer up.
0:43:25 Anyone in my life that becomes pregnant, I don’t care if I’m in deepest, reddest Mississippi, I can get access to family planning because I have money.
0:43:34 And if shit really gets real and on the unlikely chance we start rounding up Jews and I got on the wrong list, well before that, I have the money to peace out to Milan or Dubai.
0:43:45 And what that creates is this lack of incentive or this divesting of the most powerful interests in America.
0:43:52 And that is, whereas before, I think the really wealthy thought, I’m going to stay here.
0:43:54 If this could happen to them, it could happen to us.
0:44:05 But now there’s a feeling amongst me, well, I always turn this back to me, but among the really wealthy, they were insulated from some of this, that it really doesn’t impact us.
0:44:07 So we have our own schools.
0:44:08 We have our own security.
0:44:10 We have our own health care.
0:44:12 We have our own legal rights.
0:44:14 We’re protected by the law, but we’re not bound by it.
0:44:29 And it creates this really unhealthy ecosystem where the most powerful in our nation, even who claim to have progressive values, really don’t feel that same sense of vested interest in the maintenance and fidelity to American values.
0:44:34 Because at the end of the day, we’re kind of global citizens and our governance is the dollar.
0:44:37 We’re basically how much money we have.
0:44:40 And we can find those rights somewhere else, even if they’re violated here.
0:44:41 Your thoughts?
0:44:42 I think that there’s some of that.
0:44:46 You know, I mean, look, you’re always going to paint with a broad brush in these sort of situations.
0:44:52 Like there are certainly rich liberals that are out there doing what they can and others that feel like how you did.
0:44:59 And there’s – and I hear from Bulwark listeners, like upper middle class people that will come up to me and say that they’re thinking of leaving.
0:45:02 Like I said, a woman just over the weekend that was like, I’m thinking about moving to Australia.
0:45:04 My husband is a citizen or something.
0:45:06 And I was like, don’t leave.
0:45:07 I’m not going anywhere.
0:45:09 You know, you’re fine here.
0:45:17 You actually – because of what you just laid out, Scott, like if you’re a citizen of this country that has enough money for a lawyer, like you’re in pretty good shape right now.
0:45:21 We’ll see how things look when Donald Trump is deteriorating at age 81 in 2027.
0:45:23 Maybe my assessment will change on that.
0:45:26 But as of right now, you’re fine and you should be staying here and should be fighting.
0:45:28 So I do.
0:45:29 I think there’s a little bit of that.
0:45:35 I also think the Democratic Party – and this is going to go against what like my policy preferences is probably.
0:45:38 But I just – I think that from a political standpoint, this is important.
0:45:52 Like the Democratic Party has not done a particularly great job of recruiting people that are from the working class, that are from the non-globalist parts of America to be spokespersons for the party.
0:45:58 A lot of times those people are probably going to be more – they’re probably going to have different views from me on social and economic issues, right?
0:46:01 Like I’m kind of a fiscally conservative, socially liberal, whatever cliche.
0:46:17 Like the Democrats should probably be recruiting people that are more – that are more fiscally left than me and have some maybe contrarian social views because that is like the most popular, you know, combination of political views for people – for working class people.
0:46:31 And I think the Democrats have done a lot of recruiting of people that like are maybe from somewhere in America and then were the valedictorian and then went to a fancy school and then worked at McKinsey and then went back to where they came from or, you know.
0:46:34 And nothing against any of those people.
0:46:39 But they’re going to have a set of views that are closer to what you just lined out that have a more of a globalist kind of mindset.
0:46:50 And I do think it would help the Democrats to have people that like authentically sound like they are from the communities that are going to be hurt by this.
0:46:51 Yeah, they claim to represent.
0:46:54 Yeah, it’s just if they’re held to that kind of purity test.
0:46:56 And we’re going off script here, but I can’t help it.
0:47:05 It shocked me a couple of days ago there was a new poll showing that if the election were held again today that Harris would still lose or that Trump would win.
0:47:07 In my sense is –
0:47:07 I’m not shocked by that.
0:47:11 I was shocked by that, especially when you see the swing among young people.
0:47:13 But, I mean, it is what it is.
0:47:16 And as unpopular as Trump is, the Democratic Party is less popular.
0:47:30 And I think as we sit, again, crying into TikTok, the reality is – the analogy I used to use was that the panzer tanks come rolling into Poland, and the Democrats were fighting them on horseback.
0:47:33 And then someone reminded me, actually, that was a successful military operation.
0:47:36 I should stop disparaging the heroes of the Polish army in World War II.
0:47:46 So – but my point is the only thing that feels more corrupt than the Trump administration or kind of coarse and cruel right now is just how weak the Democratic Party is.
0:47:48 Well, it’s not more corrupt, but it’s sadder.
0:47:49 And it has more impact.
0:47:50 No, no, no.
0:47:52 More weak.
0:47:53 Yeah, more weak for sure.
0:47:57 Well, isn’t America basically saying they’d rather have a corrupt autocrat than a weak Democratic Party?
0:48:02 A lot of Americans are – and here’s the – and this goes to your screaming the TikTok thing, which I do a lot, so I’ll defend it.
0:48:04 I’ll defend the honor of it.
0:48:06 But I do think it has its limitations, which is this.
0:48:27 Like – and this is why I bet you get pushback sometimes from people when you say this because there is a not nothing – you know, there’s 40 percent of the country, maybe 33 percent of the country who are super engaged in politics, are decently well off, middle class to upper middle class, went to college, read the news, you know, listen to podcasts or watch, you know, cable or do – or read the newspaper.
0:48:37 Read magazine, whatever, engaged, know who their representative is, and are mad about what’s happening, are legitimately mad about it, and are trying to figure out what to do about it.
0:48:41 And the good news for Democrats is those people show up in these special elections and local elections.
0:48:44 That’s why Democrats are trying to do better in those than in the national elections.
0:48:51 The problem is there’s just another huge part of the country that are less informed, and I don’t really even say that as a pejorative.
0:48:54 It’s just like they don’t engage in political news.
0:48:57 Maybe for some of them it’s because they’re working too damn hard and don’t have time.
0:49:01 Maybe for others it’s because they’d rather play video games for eight hours a day.
0:49:03 But either way, like that is happening.
0:49:12 And the Democrats have been – like to that demographic, the Democrats feel very weak and they feel very disconnected and out of touch and not fighting for them.
0:49:22 And it is just an absolute necessity that Democrats figure out how to find a voice that can connect with people that don’t read the New York Times.
0:49:37 And part of that I think is, as I just mentioned in the last answer, is like recruiting people, you know, not who play video games eight hours a day, but who like look and sound and feel more like folks that are not part of that class of the one-third of the country that’s super engaged.
0:49:38 Well, let me ask you then.
0:49:46 Right now, if you had to say who are the leaders of the Democratic Party, you would point to minority leader Jeffries and minority – or Senate minority leader Schumer.
0:49:49 And I’m a fan of leader Jeffries.
0:49:52 I don’t think he is the leader we need right now.
0:49:54 And I think Senator Schumer is a fucking disaster.
0:50:00 Who do you think, in your view, who are some of those emerging – everyone keeps saying we have such a strong bench.
0:50:04 And then they say – they point to Wes Moore and the list runs shallow.
0:50:09 And then everyone was getting excited about John Fetterman and there’s all these stories coming out saying that he’s struggling.
0:50:13 Who do you see as kind of the up-and-coming draft choices in the Democratic Party?
0:50:15 I am on the weak bench side of this.
0:50:16 Me and Carville argue about this.
0:50:19 Carville feels very – like the bench is really good.
0:50:20 I don’t really think so.
0:50:22 But I do like Wes Moore.
0:50:27 I think that you’ve seen like the AOC and Bernie are actually channeling something.
0:50:29 I don’t – obviously Bernie’s really old.
0:50:30 Let me just push a pause there.
0:50:32 My thesis is they’re great.
0:50:33 They’re inspiring.
0:50:36 There’s no fucking way America is going to elect either of them.
0:50:36 Yeah.
0:50:37 It was funny.
0:50:42 I was at a panel – and this is part of like getting again outside of these pockets of –
0:50:46 Republicans are praying that Bernie or AOC are the nominee.
0:50:50 I was on a panel here in Louisiana and I got the same question.
0:50:51 I was giving the same answer I’m giving right now.
0:50:53 And then I mentioned AOC.
0:50:58 And a guy who I know who’s an older guy who’s a Democrat, Louisiana Democrat, came up to me and he’s just like,
0:51:04 my party is more insane than I even thought it was if they think that AOC can win this country.
0:51:07 It’s just like people that are outside of certain worlds just don’t see things differently.
0:51:10 That said, they’ve showed leadership and I just wanted to mention it.
0:51:11 They’re inspiring.
0:51:11 Yeah.
0:51:13 But look, here’s what I think, man.
0:51:18 Look, it’s May 12, 2025.
0:51:26 If you took us to May 12, 2013 and said Donald Trump is going to be the Republican Party leader, everybody would say you’re insane.
0:51:33 If you took us to May 12, 2005 and said that Barack Hussein Obama is going to be the Democratic Party leader, everybody would say you’d be insane.
0:51:36 And I think a lot of times people have limits on their imaginations.
0:51:37 Same with Clinton.
0:51:38 Nobody knew who Clinton was.
0:51:38 Yeah.
0:51:43 And I think that you look at – like the two names that – two examples I just come up with that are just totally different.
0:51:45 Neither of these guys are going to be the leader of the Democratic Party.
0:51:53 But Dan Osborne ran for Senate in Nebraska, way overperformed as kind of a working class, socially conservative, fiscally liberal guy.
0:51:56 And, like, Democrats should recruit guys like that to run the midterms.
0:51:58 Mark Cuban is, like, the inverse of him.
0:51:59 There’s, like, a business guy.
0:52:06 And you could tell me that, like, either of those types of people could be the Democratic nominee in four years – three years, and I would believe you.
0:52:07 And so, I don’t know.
0:52:08 I like Wes Moore fine.
0:52:11 I like Josh Shapiro fine.
0:52:12 You know?
0:52:22 I mean, there are others – Pete, I don’t know – you know, whether he can connect, like, as a seven-language-speaking grad student who worked at McKinsey.
0:52:25 Maybe he really, like, reached the working class people I’ve been talking about.
0:52:25 I don’t know.
0:52:29 But he did pretty damn well on that bro podcast, that Andrew Schultz podcast the other day.
0:52:31 So, maybe he can do better than I think.
0:52:33 So, there are people out there.
0:52:34 But it’s going to take, you know, real work.
0:52:37 Scott Galloway, if he didn’t live in London, might be an example.
0:52:39 Yeah, that shows just how desperate we are.
0:52:45 Well, let me ask you this, just so we can fill the comment section up with people calling me names.
0:52:48 I think America is ready for a gay president.
0:52:49 I don’t know if the Democratic Party is.
0:53:00 I think the way the Democratic primaries are held, that there are certain elements of the Democratic Party that would have an issue with Secretary Buttigieg as evidenced by his poor performance.
0:53:01 You’re talking about black voters?
0:53:01 Thank you.
0:53:02 This is just a truth.
0:53:04 I’m just going to say this.
0:53:05 Like, I have plenty of friends at the Pete campaign.
0:53:11 And, like, Pete had some of his own issues with black voters in South Bend that might be totally unrelated to gay issues.
0:53:13 But, like, they did focus groups with black voters.
0:53:17 And in all those focus groups, there were some black voters that weren’t cool with it.
0:53:18 And that’s just, like, that’s just a fact.
0:53:19 That’s just reality.
0:53:22 There are more white people that hate gays than black, you know what I mean?
0:53:24 So, I’m not, like, trying to make it a racial thing.
0:53:25 That’s just kind of a fact.
0:53:26 And that would be a challenge for him.
0:53:28 Is that still going to be true in three years?
0:53:29 I don’t know.
0:53:31 Is it something about Pete himself?
0:53:37 Again, like, Carville’s line is always, like, the person that wins the Democratic primary is the one that can win the black church.
0:53:43 And it’s, like, I could maybe imagine a gay candidate that would be able to do better in a black church than Pete.
0:53:45 It’s just, like, is that, like, his natural space?
0:53:46 Like, probably not.
0:53:48 But maybe so, by the way.
0:53:49 I don’t know.
0:53:52 Like, maybe he could really surprise and prove himself.
0:53:55 I didn’t think he’d do that well on that podcast.
0:53:56 Pete has surprised at every turn.
0:53:58 So, you know, I don’t know.
0:54:05 I think that, broadly speaking, even outside of black voters, Democrats are, like, what’s the fucking old saying?
0:54:07 Once bitten, twice shy, whatever.
0:54:08 Just about.
0:54:20 I think that they’ll probably want to go for a straight white guy or a straight black guy just because, like, after the Hillary and Kamala experience, I think that a lot of Democrats are just going to be freaked out about nominating somebody.
0:54:22 And I don’t know if that’s true or right.
0:54:23 I think there are other issues there.
0:54:25 But I do think that there will be some of that.
0:54:30 Yeah, I think the Democratic Party at this point is like, okay, we absolutely need a female president.
0:54:32 And we will have a female president.
0:54:38 She’ll be a Republican who has a reputation for likely drone striking your entire family if you run a stop sign.
0:54:41 That’s who’s going to be the first female president.
0:54:52 Kristi Noem, she’ll have a whole new face, you know, and she’ll have murdered a dog and have a pinup photo shoot in front of, you know, an El Salvador torture prison.
0:54:58 Well, I hope and trust that she’ll be out of government soon, but she’s going to slipstream right into some sort of Cinemax prison film.
0:55:04 I mean, that picture of her where it looked like a Sephora had exploded on her face and she was in front of a bunch of half-naked dudes.
0:55:10 It literally felt like those prison films I used to watch in the 90s after my parents had gone to sleep on Cinemax.
0:55:12 Okay, let’s take a quick break.
0:55:13 Stay with us.
0:55:18 Welcome back.
0:55:23 History was made last week as Cardinal Robert Prevost became Pope Leo XIV, the first American to lead the Catholic Church.
0:55:32 Born in Chicago and shaped by decades of missionary work in Peru, Leo stepped onto the balcony of St. Peter’s Basilica Sunday to deliver his first blessing.
0:55:39 He called for peace in Ukraine and Gaza, urged leaders to reject war, and emphasized caring for young people and the vulnerable.
0:55:44 His message and background signal a potentially progressive path forward.
0:55:51 Tim, what does the new pope’s background and his first public message tell you about the direction that he may take the church?
0:55:53 I’ve got to tell you, my mother couldn’t be more thrilled.
0:55:55 I’m a little bit of a lapsed Catholic myself.
0:56:00 As we were just talking about the gay stuff, but there were some jokes on the internet that he was the bulwark pope.
0:56:07 And it was like right in my mom’s lane because he was, you know, he voted in Republican primaries, I guess, in 2012, up until 2016.
0:56:12 And then he stopped, and then he had multiple tweets criticizing Trump and Vance.
0:56:15 It’s pretty wild that we can go through the pope’s tweet history now.
0:56:17 We are in a different world.
0:56:19 He also graduated Villanova, where my little brother went.
0:56:23 So, like, touching a lot of bases for my daily church-going mother.
0:56:25 Very thrilled.
0:56:25 That’s huge.
0:56:26 Yeah, very thrilled.
0:56:27 Good for the Miller family.
0:56:29 Big weekend for the Miller family.
0:56:30 Well, the broader thing, I don’t know.
0:56:38 You know, I think that the College of Cardinals probably had a lot of things in their minds, not just the fact that this person was American or our domestic political concerns.
0:56:40 And, frankly, he hasn’t even been in America that much.
0:56:45 He was in Peru and Italy for most of his service to the church.
0:56:46 And so, I don’t know.
0:56:47 I will say this.
0:57:02 Whether they intended this or not, I do think it is nice to have an American on the world stage that is offering a counter view about what it means to be a person and a human than our president.
0:57:10 And, you know, I don’t know that he’s going to be, like, the woke pope of every lefty’s dreams on a variety of issues.
0:57:18 You know, like, the Catholic Church still has the Catholic Church’s views on gender and sexuality and abortion and women priests and all that.
0:57:26 I think that he is someone that is just—it’s very clear that he actually cares about his fellow humans.
0:57:27 He cares about humanity.
0:57:33 And that is in direct contrast to the president who only cares about himself.
0:57:37 And so, I don’t—you know, we’ll see what exactly it means for the Catholic Church.
0:57:45 I think probably a continuation of Francis more than any big, massive changes based on my—the Catholics in good standing in my life.
0:57:47 TBD, a little bit on all that.
0:57:55 But just from a—as a former PR person, for marketing and PR people, like, it’s nice PR for America at a time when our PR is pretty shitty.
0:57:55 Yeah.
0:57:59 My thesis is that this is the third world leader that got elected by Trump.
0:58:05 Anthony Albanese, Mark Carney were both supposed to lose.
0:58:07 Especially Carney, who overcame a 25-point deficit.
0:58:16 And I think there is such a gag reflex globally around Donald Trump that he is electing world leaders, just not the world leaders that he’s hoping would be elected.
0:58:17 And I think this is another example.
0:58:23 When—I think the papacy is really strategic and says, where can we have the most impact?
0:58:33 And part of that is which region is struggling and would benefit most and get the most attention around these very humanistic values and code of decency.
0:58:36 And when the Eastern Bloc was struggling, they’d pick someone from Poland.
0:58:51 And I don’t think it’s any accident that they picked an American pope, that they said, we are really troubled by this lack of humanity. The call sign – or, I think, the statement that literally identifies America right now is the following.
0:58:55 And it was made by Bill Gates, and I’m paraphrasing, but it’s thematically the same.
0:59:00 The world’s richest man is killing the world’s poorest people.
0:59:07 And that to me—like, when Bill Gates said that, it was one of those moments where I thought—it, like, just hit me so hard in the gut.
0:59:09 I thought, wow, that’s what we’ve become?
0:59:11 Like, literally, that’s us now.
0:59:22 And so I think they see an opportunity, not only for attention, but a chance to restore and have influence on Americans who obviously disproportionately carry weight and gravity and influence around the world.
0:59:41 For, I think, more Americans and more elected officials will pay closer attention to what this pope says, and that a restoration, a rejuvenation, an EpiPen, a Narcan to the American value system is really needed right now.
0:59:47 And this guy, in addition to understanding technology and referencing AI, he is unafraid.
0:59:50 He’s called Putin’s actions wicked.
0:59:52 Which is an upgrade from Francis, worth mentioning.
0:59:53 There you go.
0:59:57 But I think, again, Trump has gotten another world leader elected.
1:00:06 And I think they see a big opportunity here to have an American pope who will, again, get probably greater sort of bandwidth or airtime because of his origins.
1:00:11 And that America is really in need of sort of a values upgrade, if you will.
1:00:13 Thoughts?
1:00:13 Bad, but true.
1:00:14 It’s hard to argue.
1:00:20 Again, I don’t claim to, you know—I could give you a lot better analysis of the Electoral College than the College of Cardinals.
1:00:22 And so it’s hard for me to get inside there.
1:00:24 They do get branding, though, don’t they?
1:00:25 White smoke?
1:00:27 Yeah, maybe it’s more interpersonal.
1:00:27 You know what I mean?
1:00:29 I just—I don’t really know.
1:00:35 But I think that the impact of what you’re saying, whether or not the intention was there, is definitely correct.
1:00:35 I agree with the analysis.
1:00:40 And I’m sure that for certain members of the college, it was part of it.
1:00:46 And Francis, to my understanding, did put in a lot more people in his mold, you know, into that college.
1:00:57 And so I wouldn’t be surprised if at least among some of them they thought that this was a nice contrast to the American president, particularly at a time of, you know, where America is struggling.
1:01:01 And I’m glad you mentioned that Bill Gates quote because that also hit me like a ton of bricks.
1:01:02 It’s just—it’s terrible.
1:01:05 The USAID thing is so unimaginably terrible.
1:01:11 And it’s like, you know, you run out of reasons to talk about it on shows like this, right?
1:01:13 Because there’s no, like, new news about it.
1:01:25 But it is truly abhorrent that we took something that was a tenth of a penny in our federal budget that was, you know, giving HIV medicines to people in Africa and feeding the poor.
1:01:32 And we’ve cut it because Elon Musk, like, broke his brain by, like, reading too many tweets.
1:01:34 It’s a truly deplorable state of affairs.
1:01:46 Well, we’ve spent 80 years developing an expensive and worthwhile brand association that, in the short term, we make a lot of mistakes, but it’s mostly out of stupidity or naivete.
1:01:48 This brand association is real, though.
1:01:54 I just—so I worked for McCain, and I talked to Mark Salter, his ghostwriter and longtime speechwriter, who traveled the whole world with him.
1:02:06 And Salter said, like, the American brand—you would go with McCain to these random corners where people were fighting against autocrats or where there had been a big natural disaster.
1:02:16 And he would travel there, and he’s like, you know, people in random villages and, you know, in small towns and remote corners of the world would be like, America, John McCain, John McCain.
1:02:27 Like, that brand was that strong, and I do think that we’ve essentially just ruined it forever, certainly tarnished it in a matter of five months.
1:02:34 We’ve set it back decades. But that association—one of the core associations, on a very basic level, is I’ve always felt like we’re the good guys.
1:02:36 That, yeah, do we make dumb mistakes?
1:02:38 Are we gluttonous?
1:02:39 Are we obnoxious?
1:02:40 Yeah, but we’re the good guys.
1:02:42 Our heart’s in the right place.
1:02:46 And I think in just probably in three and a half months, we’re no longer the good guys.
1:02:52 And there’s this notion that in regions where there’s no investment, you get just such an enormous return on investment.
1:02:54 It’s basic economic theory.
1:03:06 We were getting such enormous ROI on these small investments in terms of preventing kids from getting infected, having AIDS transmitted from their mothers to them, which is very inexpensive, wiping out malaria,
1:03:14 building toilets such that thousands, even millions of young boys and girls didn’t die of dysentery.
1:03:20 And we’ve taken what is probably the greatest ROI investment because there’s so little investment in these regions.
1:03:22 No other nation would make those investments.
1:03:25 And we decided those are the investments that we’re going to pull back.
1:03:26 It really is depraved.
1:03:35 But circling back, I do think that the papacy recognized this and decided that they could have the most impact with a pope that more Americans would listen to.
1:03:38 So, Tim, I want to go off script for a minute.
1:03:40 I’m fascinated by Tim Miller.
1:03:48 I have found myself just so drawn to your content and how you bring the strength and fearlessness and real emotion and real empathy.
1:03:50 What’s your origin story?
1:03:52 I don’t know that much about you.
1:03:55 How did Tim Miller get to here right now?
1:03:56 I appreciate that.
1:04:02 I don’t know about you, but this is not false modesty because I can be a narcissist like any other content creator.
1:04:07 But I do find it weird to process people consuming my stuff all the time, right?
1:04:15 Because when I try to just emote and be authentic and just say what I really think and not actually think about the audience as much as possible,
1:04:22 And so sometimes it makes me uncomfortable when I start hearing about, you know, thinking about Scott Galloway consuming my rants.
1:04:24 But so I was Republican operative.
1:04:29 I was just a PR flack, essentially, for Republicans, usually moderate Republicans, but I was also a hired gun.
1:04:32 So I have some shameful Republicans on my resume as well.
1:04:36 And, you know, I came out of the closet during that process.
1:04:47 And so I was probably like the most, there have been a lot of prominent Republicans who like were either outed or came out after, like when they retired, like Ken Mehlman or Larry Craig or like, you know, whatever.
1:04:55 But as an active person in the party, like right around all the time of the gay marriage stuff, I was like the most like visible.
1:04:59 And so I do think that gave me just kind of a relationship with all of it.
1:05:01 That may be a little bit different than other hired guns.
1:05:09 Like I had dealt with like being separate from the party on something that was very core to me, you know, throughout this process.
1:05:11 And so when Trump came along, I don’t know.
1:05:15 I just part of that, I think it was made it easier for me just to say, no, fuck this.
1:05:27 And as part of my hired gun process, a bunch of rich guys hired me in 2016 to be the point, like the face of a basically Republicans against Trump effort, like anybody, like it’s like whoever it is.
1:05:31 So I just went on cable and argued with Trumpers and pitched negative stories about Trump to people.
1:05:37 And then when Trump won, I went through a massive midlife crisis about what the hell to do with my life.
1:05:38 Pretty early midlife.
1:05:40 Yeah, early midlife crisis.
1:05:43 I had a very early midlife crisis and an extended one.
1:05:47 And I started doing some of these podcast stuff on the side, literally.
1:05:48 And I was like, you know, I was lost.
1:05:51 I was like, should I do corporate PR?
1:05:53 We adopted a kid at that time.
1:06:03 I was like, should I just be a nine to five, you know, guy and like do PR for Clorox bleach or something and like have a regular job and coach the kids sports teams and forget all this?
1:06:08 Should I, whatever, do like figure out, like try to fight within the party against Trump?
1:06:10 And I was like totally lost.
1:06:16 And my colleague, Sarah Longwell, who was an old friend of mine, started the Bulwark and I started kind of doing Bulwark stuff for fun.
1:06:17 And I don’t know, man.
1:06:33 I just, I think that people were, there’s something about the fact that I think that I was lost and did not have like a little birdie in the back of my head saying, hey, you know, think about your career and like what other job, you know, you might be White House press secretary in the future.
1:06:36 You might want, you know, who knows what will happen after Trump ends.
1:06:37 Like I just didn’t have that.
1:06:40 I was a little bit unfettered, I think.
1:06:46 And, and so we created at the Bulwark with Sarah and JVL and others, like a community of people who really liked that.
1:06:51 And I think that was important to them, like the OGs, because they were also kind of lost.
1:06:55 And so, I don’t know, man, that’s how I ended up doing this.
1:07:10 And I think there is something freeing about being a former Republican versus being somebody who comes up as a Democrat in their background, because, you know, I just don’t, A, I have some of the, like the Republican traits of aggressiveness.
1:07:15 I’ve not been beaten down by the Democratic traits of community building.
1:07:16 So, I think that has helped.
1:07:26 And also, I just don’t, like, you know, I’m not plotting who might hire me for the 2028 primary in the way that maybe some Democratic commentators are.
1:07:27 So, I don’t know.
1:07:27 Is that good?
1:07:29 Was that a good backstory for me?
1:07:31 Curious if you, I have trouble.
1:07:34 I would say that from zero to 30, I didn’t have enough stress.
1:07:36 Almost failed out of UCLA a couple times.
1:07:36 Didn’t bother me.
1:07:39 Was on the verge of being kicked out of UCLA, which would have been really bad for me.
1:07:40 I didn’t really care.
1:07:42 Almost lost a couple businesses.
1:07:46 Was very reckless with my relationships.
1:07:49 Just didn’t have as much anxiety, quite frankly, as I should.
1:07:51 I think from 30 to 40, I had the perfect amount of anxiety.
1:07:55 Worried enough to be successful, but not worried enough where I couldn’t sleep.
1:07:57 And now I have too much anxiety.
1:07:59 I worry about everything.
1:08:00 And you have a kid.
1:08:03 Like, if I’m not anxious about one of my kids during the day, something’s wrong.
1:08:05 That makes me anxious.
1:08:06 I feel like I’m missing something.
1:08:12 And I’ve had trouble disassociating from what is going on with America and our government right now.
1:08:16 For the first time, politics is really sort of rattling me and taking a toll on me emotionally.
1:08:19 Do you struggle with the same thing?
1:08:21 Sometimes when I watch your content, I get the sense.
1:08:22 I can hear it in your voice.
1:08:25 Like, this shit really upsets you.
1:08:27 Like, it really rattles you.
1:08:30 One is, am I sensing that correctly?
1:08:40 And two, how do you attempt to disassociate and/or keep things in perspective and get about your day and focus on your family and progress at the Bulwark?
1:08:46 I’m pretty good at compartmentalizing, which got me into trouble in that past life that I talked about earlier.
1:08:51 Like, I probably shouldn’t have compartmentalized with the fact that I was gay with the fact that I was a spokesman for Republicans.
1:08:52 But I was able to do it then.
1:08:57 It’s serving me a little bit now because I do get – I get rattled, emotional, and very mad.
1:09:01 And, like, probably three times during the day I get very mad.
1:09:07 And when I get actually mad, I try to channel that into the content because I said this after the election.
1:09:10 I was like, I’m not going to do the fake mad thing.
1:09:11 Like, I’m not.
1:09:13 I’m not going to pretend to care about things I don’t care about.
1:09:31 And, like, sometimes there’s Trump stuff that makes other people really mad that I just either don’t talk about or will talk about a little bit just because I’m like, I just – I can’t – I don’t have any room in my body for the anger over this thing because – in part because I’m so mad about the immigration stuff and some of – and in particular the immigration stuff.
1:09:33 But also other things they’re doing.
1:09:35 The trans military ban is one that got me recently.
1:09:39 I try to just be – to have my honest emotions with people.
1:09:42 Outside of that, A, I’m drinking too much.
1:09:44 But I’m trying to go.
1:09:45 I live in New Orleans.
1:09:47 I knew we were brothers from another mother.
1:09:47 Yeah.
1:09:50 I’m trying to go to – and I live in New Orleans.
1:09:51 So I’m going to show.
1:09:52 I’m going to see music.
1:09:55 And when I’m there, I’m drinking too much bourbon.
1:09:58 But it is allowing me – and I’m enjoying my time there.
1:10:03 And I’m being with – I have a lot of buddies here who don’t stress me out about politics.
1:10:04 And I appreciate all of them.
1:10:06 And that is good.
1:10:10 I have a couple hours a day where I take on the parenting and I just try to parent.
1:10:11 And I’m like, I’m here.
1:10:12 We’re going to play.
1:10:13 We’re going to go to the basketball court.
1:10:15 We’re going to – whatever.
1:10:15 Do your homework.
1:10:16 We’re going to be silly.
1:10:20 And I try to do that and not think about it.
1:10:23 Every once in a while, bad thoughts come through when I’m parenting or drinking.
1:10:24 But usually not.
1:10:26 Like I’m pretty good at compartmentalizing it.
1:10:31 A therapist might tell me that this strategy is eventually going to fail.
1:10:37 And like those three parts of my brain are going to collide in a way that will create crippling anxiety.
1:10:38 But that hasn’t happened so far.
1:10:41 Most importantly, what did you do for Mother’s Day yesterday?
1:10:41 Nothing.
1:10:46 One of the great joys of being gay is that we don’t have to do Mother’s Day.
1:10:49 I mean I sent my mother a gift and we did a FaceTime with her.
1:10:51 She lives in Colorado and I have a great mother.
1:10:52 Yeah.
1:10:54 It’s nice.
1:10:56 I feel like we get a little bit freed from the conventions.
1:11:02 So, you know, some people trying to be nice and woke like will wish us a happy Mother’s Day.
1:11:03 And I’m like, no, it’s cool.
1:11:03 No worries.
1:11:05 And by the way, I don’t even have to do a happy Father’s Day.
1:11:06 It’s fine.
1:11:08 Like we have a little different family style.
1:11:14 We went and had crawfish at Clesi’s, which if you find yourself in New Orleans during crawfish season, I got to shout out Clesi’s.
1:11:15 It’s the best spot.
1:11:16 I watched the Nuggets game.
1:11:18 It was a loss, unfortunately.
1:11:23 And, you know, I yelled at the YouTube camera, took the kid to the park.
1:11:23 It was great.
1:11:24 It was a wonderful day.
1:11:25 What about you?
1:11:26 I had a wonderful Mother’s Day.
1:11:27 I did nothing.
1:11:31 I’m here in New York on my own and I walked around Soho.
1:11:32 I went to Jack’s Wife Frida.
1:11:35 I went to San Vicente Bungalows for brunch.
1:11:37 It was, you know, just.
1:11:40 You weren’t guilt-tripped by the mothers in your life over that?
1:11:41 My mom is gone.
1:11:49 The mother of my children is, I don’t want to say it’s Mother’s Day every day, but we’re pretty much in awe of her and we plan a lot of stuff.
1:11:49 And do a lot of stuff.
1:11:52 But we had some stuff planned for her to make sure that she felt loved.
1:11:56 And quite frankly, she said that she just wanted to be alone, that that was her Mother’s Day gift.
1:11:58 She just wanted all.
1:12:00 She has three kids.
1:12:09 But, look, one of the really wonderful things about getting older as a man is you develop these really nice kind of paternal instincts or fraternal instincts.
1:12:12 where you’re happy for people, you’re happy for younger men.
1:12:14 I have gotten real reward.
1:12:18 I don’t know you that well, but I’ve gotten real reward from watching you in this moment.
1:12:23 I think you are so authentic and so courageous and have such great command of the medium.
1:12:27 I get reward from watching your success.
1:12:28 I’m really happy for you.
1:12:31 I think you’re doing a great job and your voice is really important.
1:12:41 And I just – I hope that you take time with your husband and your kid to pause and recognize how successful you are and what a difference you’re making.
1:12:44 And it’s just fun to just observe it and watch it.
1:12:47 Really appreciate all that you do and very much appreciate you coming on the show today.
1:12:48 Thank you, Scott.
1:12:49 I genuinely appreciate that.
1:12:50 It means a lot.
1:12:51 I’m getting tingly.
1:12:52 I also – it sucks.
1:12:53 I don’t know about you.
1:12:57 I do get uncomfortable with the compliments, especially when so much shit is happening.
1:12:58 And I’m like, I don’t know.
1:12:59 I’m doing the best I can.
1:13:01 But I do appreciate it very much.
1:13:02 It means a lot.
1:13:02 All right.
1:13:03 That’s all for this episode.
1:13:05 Thank you for listening to Raging Moderates.
1:13:07 Our producers are David Toledo and Shinianye Onike.
1:13:09 Our technical director is Drew Burrows.
1:13:13 You can now find Raging Moderates on its own feed every Tuesday and Friday.
1:13:14 That’s right.
1:13:15 Its own feed.
1:13:21 That means exclusive interviews with sharp political minds, including this one who joined us today, that you won’t hear anywhere else.
1:13:23 This week, we have another anti-Trump Republican.
1:13:27 Jess is talking with former Congressman Charlie Dent.
1:13:31 Make sure to follow us wherever you get your podcasts so you don’t miss an episode.
1:13:33 And, Tim, where can they find more, Tim?
1:13:34 I’m everywhere.
1:13:35 The Bulwark YouTube.
1:13:36 You’re everywhere.
1:13:37 To resist is futile.
1:13:37 Yeah.
1:13:39 The Bulwark YouTube.
1:13:43 Nicolle Wallace’s show sometimes at MSNBC and some others.
1:13:45 And, you know, Twitter, TimODC.
1:13:46 I’m still suffering through X.
1:13:47 I think you left.
1:13:48 Instagram.
1:13:49 Everywhere.
1:13:50 I’m everywhere, baby.
1:13:51 Get off of X.
1:13:51 Get off of X.
1:13:52 Trust me on this.
1:13:55 The most accretive thing you can do for your mental health is to get off of X.
1:13:56 That’s good advice.
1:13:57 All right.
1:13:58 Thanks again, Tim.
1:13:59 Thanks, man.
1:13:59 Thanks, man.
Scott is joined by The Bulwark’s Tim Miller to break down reports that Trump may accept a $400M jet from Qatar, a shaky tariff truce between the U.S. and China, and Trump’s plan to deport migrants to Libya. Plus, history is made with the election of the first American Pope—and they discuss what his leadership could mean for the future of the Church.
Follow Jessica Tarlov, @JessicaTarlov.
Follow Prof G, @profgalloway.
Follow Raging Moderates, @RagingModeratesPod.
Learn more about your ad choices. Visit podcastchoices.com/adchoices
-
Will AI Replace Amazon? The Future of Shopping Revealed
AI transcript
0:00:07 Welcome to the next wave. I’m your host, Nathan Lanz. And today we’re going to talk all about
0:00:10 the future of shopping with AI. You know, in the last few months, we’ve seen Perplexity
0:00:15 add shopping. And now ChatGPT has added shopping directly into ChatGPT. You can
0:00:20 buy a product just by talking to your AI. Today, we’ve got on AJ Bam, the CEO of Vyrl,
0:00:24 a hot startup in Silicon Valley. It’s absolutely amazing because it makes video searchable.
0:00:28 And brands that are using this are already seeing their sales double. So I think you’re
0:00:32 going to learn a lot about where shopping is headed with AI, as well as ways you can take
0:00:36 advantage of this today in your business. So let’s just jump right into it.
0:00:43 Cutting your sales cycle in half sounds pretty impossible, but that’s exactly what
0:00:48 Sandler Training did with HubSpot. They used Breeze, HubSpot’s AI tools to tailor every customer
0:00:53 interaction without losing their personal touch. And the results were pretty incredible. Click-through
0:01:02 rates jumped 25%. And get this, qualified leads quadrupled. Who doesn’t want that? People
0:01:08 spent three times longer on their landing pages. It’s incredible. Go to HubSpot.com to see how
0:01:10 Breeze can help your business grow.
0:01:15 Hey, AJ. Great to have you here today.
0:01:17 Yeah, thank you. How are you?
0:01:23 I’m doing good. I’m doing good. It’s morning here in Kyoto, but waking up and excited to talk
0:01:25 with you today. You know, the reason I wanted to bring you on is, you know, I’ve been thinking
0:01:30 a lot lately about like what the future of shopping and e-commerce is going to look like with AI,
0:01:35 right? Like a few months back, you had Perplexity roll out. They’re like kind of AI-powered
0:01:38 shopping, which I thought was an okay experience. But I was like, okay, but I get where it could
0:01:44 go. And then OpenAI recently launched their thing, which I think they partnered with Shopify. And
0:01:50 they’re kind of like baking, shopping directly in to the LLM, which is like nuts. Like you can,
0:01:55 you know, I just imagine in the future, just being able to chat with my AI and get exactly what I want.
0:01:58 And I know you kind of play like a different role with your company, Vyrl, where you’re more on the
0:02:03 AI video side with the social shopping. So I just like love to hear like your thoughts on
0:02:05 the landscape and where things are at.
0:02:09 Yeah, absolutely. So first of all, thanks for having me on the show.
0:02:09 Yeah.
0:02:14 And I think you’re spot on that. I think what’s happening now is, you know, if you look at the
0:02:19 Gen Zs, they’re shopping now with all things video. And I think the proof in the pudding is TikTok,
0:02:20 right?
0:02:20 Right.
0:02:26 I mean, there’s a reason why TikTok became a phenomenon. And TikTok proved that when you have
0:02:34 short form video at scale, and it’s authentic, interesting, funny, silly, perhaps as well,
0:02:40 it works. You know, I mean, just reflecting back on my company, right? Like what’s really
0:02:47 changed for us is life before TikTok and life after TikTok. So before TikTok, you know, I’d be knocking
0:02:52 doors, both with brands and retailers. And I’ve always believed that at the end of the day,
0:02:58 you know, people shop not because Kim Kardashian said you should buy a car, but people shop because
0:03:05 your neighbor who looks like you said, hey, Ajay, you know, I just bought my new EV and I love it.
0:03:11 And the reason I love it is it’s doing 200 miles on one charge. And so what really matters at the end
0:03:16 of the day is authentic opinions, right? Right. And where video is very transformative, where video
0:03:22 makes a huge difference is you can see the product, the person and the emotions inside the video.
0:03:22 Right.
0:03:28 So what that leads to really is high brand trust, high product trust, right? You can see the product in
0:03:34 action. So even like simple thing as my headset, right? Just being able to see in a video, whether
0:03:39 this fits, you know, loose or tight, I have a round face, right? Yeah. Like just being able to
0:03:45 see that versus someone telling you in a review that it’s in text makes a huge difference.
0:03:49 You know, it actually reminds me of is there’s this book I’ve been reading. It’s actually my second or
0:03:54 third time reading it, like Ogilvy on Advertising. Oh, yeah, yeah, yeah. Of course. Yeah. Yeah. Yeah.
0:03:57 Maybe you’re even like kind of preaching that Bible. I don’t know. But they talk on that book about how
0:04:02 they found in all their years of doing advertising that people thought that hiring like a huge celebrity
0:04:07 celebrity work to promote something. But what they found was typically what happened is you pay those
0:04:12 people so much money. And if people remember the celebrity, not the product, you have to use a
0:04:16 person that people don’t know. It actually works better. Like they had like a, I think like a 80
0:04:21 year old grandma in France or something. And they had her doing, I think it was a butter commercial or
0:04:25 something like this. And that was like a huge hit for like 30 years or something. It was like a long
0:04:30 running ad they ran because people like authenticity. It’s just like a real person using a product.
0:04:34 Yeah. Yeah. Yeah, absolutely. I mean, I think you hit something very important here,
0:04:38 you know, and we’re seeing this trend as well in the market, which is brands are moving away from
0:04:45 very high paid influencers to more micro influencers and shoppers and authentic creators. Right. I mean,
0:04:52 today, one of the reasons TikTok has exploded is also that TikTok brings that average creator who
0:04:56 previously had to make a lot of effort on their phone to make a video. Now with TikTok tools,
0:05:02 anybody in the world can be a creator. Anybody can be an influencer. Right. So in general,
0:05:07 we’re seeing a trend where even in the market is, you know, I think between, I would say between
0:05:15 2018 and 2023, influencer marketing was all the craze. Right. And now what’s happening is, you know,
0:05:19 when you go online and you see another influencer, you’re like, oh, not again. You know, he was paid
0:05:24 to say this. Right. Right. And I think that’s where now there’s a massive tectonic shift. And I think
0:05:30 this is where again, you know, TikTok proved that when you have authentic content, it works. And I
0:05:35 think also what changes that now with, you know, we all have smartphones, I would say, give or take,
0:05:42 90% of all phones in the world can record pretty well these days, even in poor lighting. Right. So the
0:05:47 previous concerns that brands had about, you know, light being poor or the camera not being right or the
0:05:52 quality of the video not being good. And it’s not just the quality of the video as in capturing the
0:05:57 video. But even like with our 5G networks now, I mean, you live in Japan, Japan has had
0:06:04 fast, high speed phones. And, you know, Docomo was in fact, a leader when it came to mobile shopping.
0:06:07 You know, I lived in San Francisco for 13 years, and it’s crazy to me. I was like,
0:06:12 why is the internet not better in Silicon Valley? Like, what the hell is going on? You’re supposed to
0:06:16 be like in the Mecca of technology, then you go to other places. And it’s like, oh, they have better
0:06:20 technology, like infrastructure, they have at least better infrastructure, right? So it’s always kind
0:06:21 of shocking to me.
0:06:26 Right. I mean, Japan has been ahead in this game, to be honest with you, since 20 years ago. Heck,
0:06:32 you could pay in the subway in Japan with your Docomo phone, which has only arrived here now. I think
0:06:36 COVID was what changed it. It took a virus to change our behaviors in the US, right?
0:06:37 Yeah.
0:06:40 But here we are now, I think Apple Pay has been accepted. But like I said,
0:06:45 tectonic shifts in your hardware, in your software, in the way you create content,
0:06:49 and in the way you consume content as well. So, you know, like watching a short form video today,
0:06:54 you can watch without disruption, right? I mean, previously, I remember like watching a video on
0:06:59 the phone, you would have to wait for that lag, you know, the latency on the video to happen,
0:07:04 right? And now I think, so all these factors have honestly have contributed to video really
0:07:07 taking off with when it comes to commerce. So what you’re saying makes me think that like
0:07:12 the future of, like, let’s say e-commerce 2.0 or whatever this is going to be, you know,
0:07:17 beyond just Amazon ruling everything, it feels like there’s probably two things. I assume that
0:07:22 eventually we’re going to have AIs that understand us as a person very well, and they’ll probably be
0:07:27 able to recommend things to us really well, maybe better than Amazon in the future. But then on the
0:07:32 discovery side, you know, if you’re somebody who wants to discover new things, you know, maybe that
0:07:36 starts leaning into the videos, right? Like, oh, I watched a video and, you know, maybe my wife saw
0:07:40 a video and she saw a purse and it’s like, who’s got that purse? There’s been a few times actually
0:07:44 where we’ve seen people that had a purse that she liked and literally once or twice, I asked them
0:07:48 where they got it. And she was shocked that I did that. She’s very introverted. I’m kind of,
0:07:52 I just started talking to this woman. I’m like, my wife loves your purse. Where did you get that?
0:07:56 And so I could see in the future that being like a main way that people discover it just through
0:08:01 social. And especially if you just make it simple and just click a button or something and you buy it,
0:08:05 that makes a ton of sense to me. Yeah. What we’re seeing is that I think this is what Instagram and
0:08:09 TikTok have really nailed, right? I mean, the algorithm is able to figure out, you know, who
0:08:14 you are, what your interests are. And based on your interests and your history, they’re able to
0:08:20 recommend certain products on your TikTok feed or your Instagram feed as well, right? People forget
0:08:24 that TikTok was like considered like the big AI startup as of like five years ago, right? It was like,
0:08:28 oh, they’ve got the best algorithm. They got the best AI. And then it just came out,
0:08:32 right? Yeah. Right, right. And I think the shift that’s happening now is this is moving now towards
0:08:38 retailers and like Amazon, right? I mean, if you go on shop on Amazon now, Amazon is leading all their
0:08:43 product pages now with video content. So there is the branded video at the top of the page. There is a
0:08:49 user-generated content in the middle of the page. And then Amazon runs its own retail media network as
0:08:54 well. They’re showing competitor ads, video ads on the bottom of the page, right? Okay. I mean,
0:09:01 my guesstimate is Amazon has amassed 250 million video reviews and video on their platform, right?
0:09:08 So that shift is now video is now jumping, I would say from social to retailers. And eventually we will
0:09:14 see video across, you know, all different retailer and DTC and other sites. And so essentially it’s
0:09:20 bringing social commerce to your retail and your DTC shopping experience, right? Now, there are some
0:09:26 challenges with that. And I think one of the big challenges with video in general is no one has time
0:09:32 to watch videos, right? So what I mean by that is when you’re in your shopping mindset, you do want to
0:09:37 watch the video, but you just want to watch the right video, right? So you’re buying a car now and you’re on
0:09:43 the BMW website. You have selected your car model and that 35 video on the page, right? There could be some
0:09:48 branded videos, BMW showcasing the product, the car features. There might be some testimonials from
0:09:54 customers, right? So let’s say you’re looking, Nathan, for a car with leather seats that is blue in color and
0:10:00 I’m looking for a car with child seats, right? With a female driver, right? So how do you know which video is
0:10:06 talking about leather seats versus child seats? So as we move towards video content consumption, I think the next
0:10:12 big problem is how do I find something in a video that’s relevant for me, right? Perhaps even personalized for me.
0:10:16 I mean, how many times have we been on a fashion website where you’re a guy and you’re being shown
0:10:22 videos featuring women, right? I think video personalization is not there yet, but it’s where
0:10:27 things are headed, right? So coming back to the BMW example, today most shoppers select a random video,
0:10:32 so they want to find that answer about leather seats. They select a random video based on a thumbnail.
0:10:37 A lot of videos today don’t have description and in fact, TikTok, Instagram videos don’t have a title
0:10:42 in description anymore, right? So how do you find something in a video? So I think finding ways to
0:10:48 help shoppers find what they need inside the video with video search and personalization recommendations
0:10:54 is going to be extremely important and key to really driving more conversion. And I’ll tell you this,
0:11:00 that video just, you know, from our own experience here at Vyrl, video increases brand trust by 20x.
0:11:00 Okay.
0:11:07 It increases conversion anywhere from, you know, 5 to 28% and it drives deeper engagement. So when
0:11:09 done right, it works, it’s magic.
0:11:12 You’re talking about like if somebody’s seeing a product, like if they see it, it’s someone using
0:11:15 a product versus just reading about it or whatever. It’s like a 20x jump.
0:11:22 Yeah, yeah, absolutely. Like, and again, brand trust, right? For someone to see how something works,
0:11:25 right? Whether it’s you’re putting your furniture together in the house and it’s a how-to video,
0:11:30 or perhaps it’s a makeup video and you’re seeing actually how you can apply makeup for your skin,
0:11:36 right? All of this, or you’re looking for a recipe and you’re just able to find in a 30-minute
0:11:40 recipe, you know, you want to know how to braise the chicken, right? If you’re able to find those
0:11:44 answers, that really creates a delight. And ultimately that’s going to drive a purchase, or
0:11:49 maybe people will spend more time on your site and do more, right? I mean, that’s the ultimate goal
0:11:54 for the brand. So search is a big problem. And the other big problem is no one has time to watch
0:11:58 videos. They just want to, both on the brand side, you know, when they get a lot of video data,
0:12:03 they want to be able to find the video that’s going to help their shoppers, right? So how do
0:12:08 they find those nuggets? And on the shopper side, how do you find the video that’s going to answer
0:12:13 your burning question? So you can make a go/no-go decision, or even a comparison, right?
0:12:17 Right. How do brands do it today? I know you guys have been working on a solution for this,
0:12:20 like before Vyrl, like how would brands do that? How would they find that?
0:12:24 Yeah. Yeah. So to be honest with you, like, I think before Vyrl, what we were seeing was
0:12:29 brands are creating content on their TikTok, YouTube, and Instagram, but we were seeing that
0:12:34 most customers, I would say both brands and retailers are not utilizing that content on the
0:12:40 e-commerce site, on their social site. So I would say the first thing is make sure you bring all your
0:12:44 video content that you’re producing even on your social media to your e-commerce site and website.
0:12:48 So we’re seeing that gap right now where they’re producing a lot of content for social, but hey,
0:12:53 they’re forgetting that at the end of the day, people do land on your website or your retailer on
0:12:57 your product page. And it’s the last mile where they make the product decision. So if you can also
0:13:02 influence them in that last mile in the shopper journey with video content, that’s going to be
0:13:07 extremely helpful. So I would say that’s number one. The number two is we see a lot of companies just
0:13:13 putting a video carousel and the video carousel is just placed there. And again, the problem is no one
0:13:17 has the 10 videos, right? I don’t have time to watch 10 videos. I just want to find that relevant
0:13:21 video, right? So how do you bring that relevancy is still a big problem on most sites. And then
0:13:27 brands are also, they have a lot of videos, how-to videos, support videos. And in most cases, we just
0:13:32 see a page with a list of videos. So again, they’re missing out by providing search, personalization,
0:13:39 some sort of recommendation engine. They’re missing out on really driving that self-service journey,
0:13:44 if you will, for the shopper to either buy your product or address a comment or concern they might
0:13:48 have with the video. So you’re talking about like a person on social media, they’re using a product and
0:13:52 now the brand is leveraging that video to promote the product. I wonder if there’s any way you could
0:13:58 create this kind of like, you know, loop where the person who created the video also benefits from that
0:14:02 being used somehow. Like there’s like a, you know, a link to them or something, because that could even
0:14:06 incentivize like, oh, now more influencers are going to want to use my product because, you know,
0:14:08 they might get mentioned now, right? They might get a boost from that.
0:14:11 Absolutely. Absolutely. I think that loop is coming full circle, if you will.
0:14:17 So I’ll give you just a couple of examples. Now we’re like, we’re seeing now brands ask customers
0:14:21 for videos. So today it’s a bit of a manual process, but then there are tools like Vyrill,
0:14:27 like today after you make a purchase, now with Vyrill, you can set up a QR code or a message to invite
0:14:33 customers to make a video review, right? So what I’m going to do is I’m going to show first a demo example
0:14:38 on our site, and then I’m going to show you how a number of our live customers are
0:14:43 using the video carousel more intelligently. I would say at Vyrill, we have built the next
0:14:49 generation video shopping experience, if you will. And really what we do is, so this is an example
0:14:54 where you’re buying an electronic toothbrush on an Oral-B website, you land on the product page,
0:14:57 and there are five video reviews on the page.
0:15:03 So now, how do you know which of these videos are talking about what, right? So for example,
0:15:08 let’s say I want to know what people are saying about, it’s an electronic toothbrush, I want to know
0:15:13 what people are saying about the brush. So I can instantly search, and voila, Vyrill found the videos
0:15:19 talking about the brush. And you can see we generate all the key highlights, and this is all AI and ML driven.
0:15:20 Oh, that’s awesome.
0:15:25 So essentially, we make all your video, the audio, text, images, and transcription searchable,
0:15:29 and we generate the clips. We were searching for what people are saying about the brush, and you can instantly
0:15:34 find all the clips about the brush. So before Vyrill, you would have to watch the entire video. Now
0:15:39 with Vyrill, you can just find the clips. And so what this does is this helps you find the answers you’re
0:15:43 looking for across one video or multiple videos. So if you look at the experience, right, we generate the
0:15:49 video summary, which is extremely helpful for SEO. So Google can read the summary and the tags and
0:15:54 the transcript. We also generate highlights on top of the videos. So these are all the key highlights
0:15:59 that we have generated on top of this about what’s being said in the video as well. So if I go to the
0:16:04 next video, it will show you the next search result about talking about brush. And it also shows you the
0:16:08 highlights as well, right? And what’s cool is this is integrated with the shopping experience.
0:16:13 So for the shopper, on the shopping website, this is integrated with buy now. So
0:16:18 essentially a customer can hit buy now and add the product to the shopping cart and make a purchase.
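The in-video search Ajay demos here can be sketched roughly: the audio is transcribed into timestamped segments, and the shopper’s query is matched against the segment text to surface only the relevant clips. This is a minimal Python illustration; the function name, data shapes, and substring matching are assumptions for the sketch, not Vyrill’s actual pipeline.

```python
# Hypothetical sketch of in-video search: match a query against timestamped
# transcript segments and return only the matching clips.

def find_clips(segments, query):
    """Return (start, end, text) clips whose transcript mentions the query."""
    q = query.lower()
    return [
        (s["start"], s["end"], s["text"])
        for s in segments
        if q in s["text"].lower()
    ]

# Timestamped transcript segments for one toothbrush review (made-up data).
segments = [
    {"start": 0.0, "end": 6.5, "text": "Unboxing my new electric toothbrush"},
    {"start": 6.5, "end": 14.0, "text": "The brush head is soft but firm"},
    {"start": 14.0, "end": 21.0, "text": "Battery lasted me about two weeks"},
]

# Instead of watching the whole video, the shopper jumps straight to the
# clip starting at 6.5 seconds.
clips = find_clips(segments, "brush head")
```

The same index answers questions across one video or many: searching “battery” over every review for a SKU would return the battery clips from each of them.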
0:16:24 Hey, we’ll be right back to the show. But first I want to tell you about another podcast I know
0:16:29 you’re going to love. It’s called Marketing Against the Grain. It’s hosted by Kip Bodner and
0:16:34 Kieran Flanagan. And it’s brought to you by the HubSpot Podcast Network, the audio destination for
0:16:38 business professionals. If you want to know what’s happening now in marketing, especially how to use AI
0:16:43 marketing, this is the podcast for you. Kip and Kieran share their marketing expertise,
0:16:48 unfiltered in the details, the truth, and like nobody else will tell it to you. They recently
0:16:56 had a great episode called Using ChatGPT o3 to Plan Our 2025 Marketing Campaign. It was full of like
0:17:02 actual insights as well as just things I had not thought of about how to apply AI to marketing.
0:17:08 I highly suggest you check it out. Listen to Marketing Against the Grain wherever you get your podcasts.
0:17:15 So I’m going to just show you really quick a couple of other examples of live customers. So we have a
0:17:23 beauty brand, RX. They do beauty products for women, especially for curly hair. And they have now added
0:17:29 video content on all their sites. It’s a very rich, engaging experience. As you can see, you know,
0:17:32 the previous video didn’t have any speech, so we didn’t generate the highlights. But whenever there’s
0:17:37 speech, we generate the summary, we generate all the key highlights of what they’re saying. So even as a shopper,
0:17:42 you don’t have to like watch the entire video. Yeah. You can watch the clip that’s relevant for you
0:17:47 and make a purchase. This seems great for brands. You know, I think for the average person, like, product videos
0:17:51 are so boring, right? Yeah. And usually you watch them as like, okay, whatever. It’s like highly
0:17:55 edited, you know, whatever. Yeah. But like something like this, like you actually see real people using
0:17:58 the product. Yeah. That would convince me more. Yeah. Also for a brand, that’s great, right? Because
0:18:01 they don’t even have to spend all this money on this advertisement. It’s not even going to work.
0:18:07 Exactly. Show the real people using it. That’s the ad. Right. And the beauty is, you know,
0:18:12 we have integrated with Instagram, TikTok, YouTube, Dropbox, Google. So we can bring all your TikTok
0:18:17 content that you’re producing. We can bring all that content to your e-commerce site. So essentially,
0:18:22 we have built a mechanism to pull in all your videos that you have already produced, or you can also
0:18:27 leverage our platform to capture video reviews as well. So here’s an example. This brand
0:18:32 has increased their site engagement by about 3.8x. They’ve increased their overall revenue
0:18:38 conversion on average by about 6% last quarter. So video has a direct impact because again, you can
0:18:42 see these are real testimonials from customers. They’re talking about how they can use the product
0:18:47 and more, right? So I’ll give you another example. So here’s a company that does hot sauce, right?
0:18:52 So they have a number of recipes. You know, there’s only so much heat I can bear. I don’t eat like very,
0:18:57 very hot foods. So I want to know, is there a comment about heat and about the sauce, right?
0:19:01 So instantly, it’ll find the video talking about heat, right? So this is a game changer. What is
0:19:06 this doing, right? This is going to help you instantly decide, like, for example, this comment
0:19:11 will help me decide whether, you know, where I want to use the sauce. Is it talking about heat? It even,
0:19:16 we generate all the key highlights about this hot sauce as well, right? So here we bring search,
0:19:22 personalization, SEO, recommendation engine, and buy now to all this content.
0:19:26 Yeah. I love that idea. I guess my one question there, it’s like, it’s almost like introducing a
0:19:30 new user behavior though. Like, are people interacting with that feature a lot, the search? Because
0:19:33 it’s an amazing feature. I’m just curious if people know how to use it, right?
0:19:37 Yeah. It’s a great question. So what’s happening right now is there’s a couple of things happening,
0:19:41 right? Like with everything new, we make it very clear on top of the videos, right? That you can search,
0:19:46 right? So we’re also in the process of integrating this video search with your site search,
0:19:49 right? Okay. So where you have the site search, I mean, most people are familiar with text search,
0:19:53 right? Right. So when you start typing the product name or you look for a keyword,
0:19:58 you will essentially find the videos. It will also surface video content as well. So we’re actually
0:20:01 building this as we speak. You should probably do like good suggestions as well, right? Like almost
0:20:05 like auto-complete. Absolutely. You’re searching for that and it’s like, how hot is it or whatever,
0:20:09 something like this. And then that’s, and then it shows the videos. Right. And I think to your point
0:20:12 though, we’re doing sort of a crawl, walk, run approach, right? The crawl approach is,
0:20:17 hey, let’s put an amazing video experience on your site. The walk approach is, let’s make sure we
0:20:22 integrate search across your entire site, on all your pages as well, right? You might
0:20:27 even have a support page for your brand, right? So we have different implementations you can put
0:20:32 essentially anywhere, even on your homepage or your product page. You can also put videos there as well,
0:20:36 right? So essentially, Vyrill has a set of e-commerce tools where you can bring search capability
0:20:41 to your site search, to your homepage. We also bring a very rich video experience as I just
0:20:46 showed you on your site as well. It’s a whole new way of shopping that’s never happened before.
0:20:49 Yeah. It seems awesome. I guess one thing that keeps popping into my head is like, I’m sitting here
0:20:54 thinking about like the future of the web and my slight concern is like, does ChatGPT eat the web?
0:20:59 Do people, you know, basically live in ChatGPT? They’re not even, you know, maybe in the future
0:21:03 they ditch the browser, like you literally just open up the ChatGPT app and that’s your
0:21:07 surface into the web versus using a browser. Yeah. I kind of worry about that.
0:21:11 I think to be honest, like Nathan, it really is going to come down to trust. Yeah. Can you trust
0:21:16 ChatGPT to complete your transaction? I think there’s already enough scams and frauds with payments and
0:21:21 whatnot, right? Yeah. The challenge with any LLM is, even if they are able to, I mean, you know,
0:21:26 like to your point, if you go on Perplexity, it shows you, you know, I was looking for an Oral-B
0:21:31 toothbrush. It essentially brings up a quick pop-up with the image. It summarizes the reviews
0:21:36 from multiple platforms. So I think there’s a benefit to getting that review, perhaps a summary
0:21:43 of the reviews on ChatGPT, but ultimately my reward points are tied to Amazon. My reward points are tied
0:21:48 to Target, right? You see where I’m going with this? So ultimately I believe that when a person is
0:21:52 shopping, they might even want to do comparison shopping. And, you know, as you can imagine,
0:21:56 retailers do a fairly decent job now showing you similar recommendations for similar products
0:22:01 across multiple brands, right? And it also will depend on the size of the purchase as well.
0:22:06 So for example, like, you know, people definitely buy, if the product is under $50, you’re very likely
0:22:11 to purchase it on TikTok, Instagram. I think the amount is much lower for trust and fraud to worry
0:22:17 about versus when you’re buying a car or when you’re buying a $300 cappuccino machine and really want to
0:22:21 see all the videos on how the machine works, the espresso machine works, right? So I would say that
0:22:25 the last miles still happen on the retailer or the brand website.
0:22:29 Yeah. I feel like what’s probably going to happen is ChatGPT will send people to websites still.
0:22:32 I hope that’s what’s going to happen, right? Like I’ve been seeing, that’s actually been a big thing
0:22:38 I’ve been thinking about and seeing people post on X about is that everyone’s saying that traffic from
0:22:43 ChatGPT is converting better than Google, which is just like, oh crap. So what does that mean?
0:22:47 That’s a huge disruptor to Google long-term, which Google’s now getting good at AI,
0:22:50 but it feels like people will still go to the website. And then when they’re looking for the products,
0:22:54 yeah, a lot of people would love to see a video of a real person using it.
0:22:58 And again, you know, also don’t forget the in-store shopping experience. I would say 90% of,
0:23:04 you know, big purchases still happen in-store as well, right? But you can bring video to in-store shopping,
0:23:10 you know, we’re seeing more and more packaging with QR codes. We’re seeing even on televisions in
0:23:15 stores as well, as they’re playing different product reviews and product videos, you are now actually
0:23:21 able to just scan the QR code and open a video experience to read the reviews or watch the
0:23:26 videos in-store as well, right? So my point is, I think it’s going to be an omni-channel experience
0:23:27 at the end of the day.
0:23:31 You know, there’s a few places in Japan, I’m not sure if America has this yet, but when you shop,
0:23:37 you just put the product down and then they know how many of the products you have. I’m not sure
0:23:38 exactly how they’re doing that.
0:23:42 You literally just put the products there and there’s no scanning. It just knows how many of
0:23:46 the products you have. That’d be really awesome in the future if there’s some area in stores where it’s
0:23:51 like, you just put the product down and then it’s just like, oh, here’s on social media, you know,
0:23:55 all these people. Maybe there’s like one famous person, but there’s people who are not famous and
0:23:58 it’s kind of a mixture and you could just click it and watch the videos. That’d be so cool.
0:24:02 Again, the future of commerce with video is going to be very different.
0:24:06 Right now, I would say it’s more two-dimensional video. You know, I wouldn’t be surprised if we
0:24:12 see 3D AR, VR experiences as well with video content and reviews, which you might be able to just
0:24:17 experience with your phone. And maybe, you know, as you’re seeing more smart eyewear come through,
0:24:23 I can see an application where you just stare at a product with your eyewear and it’s literally
0:24:27 pulling the video review on your screen. I’m not kidding, right? It’s coming. It’s a matter of time.
0:24:29 Yeah, I can definitely see that in the future. You’re looking at a product,
0:24:34 you see the reviews. I want to see videos as well. That’s so cool.
0:24:38 So I think this might be a good segue into talking about, you know, what Vyrill does behind the scenes.
0:24:39 Yeah, for sure.
0:24:42 I’ll do a quick demo of the dashboard, just a very teaser demo.
0:24:45 Yeah. So if you could show me how Vyrill works, show me the tech behind the scenes.
0:24:48 You showed me the widget, so that’s cool. But like, how would a brand actually use this?
0:24:54 Yeah, absolutely. So essentially, you know, the good news is most brands that do 5 million plus in
0:24:59 revenue already are producing some video content. So, you know, as I mentioned before,
0:25:04 the biggest challenge today is making all your video content searchable and making it useful.
0:25:11 So really what I’m showing you is a demo for Oral-B. It’s really a sample snapshot of some of the data.
0:25:17 So in this case, we captured 24 video reviews. The brand was interested in managing videos for a few
0:25:22 of their electronic toothbrush products. So what we did was we captured reviews from customers via QR code
0:25:28 campaigns. And we also captured reviews from social media. So in this case, these 24 video reviews came
0:25:33 in from either social media or via campaign. So essentially on the Vyrill platform, a brand can
0:25:38 essentially set up a QR code. So here’s an example where you can set up a QR code campaign and you can
0:25:44 invite your shoppers. So the brand actually sets this up and they set up the personalized experience.
0:25:48 And what you’re seeing is actually a mobile experience. They might give you a reward
0:25:53 for making a video, with an optional campaign description and brand instructions. And the big problem we have
0:25:59 solved, Nathan, is licensing. So the biggest fear that brands have is someone putting a video on their
0:26:01 site or something that has been unlicensed, and they get into a lawsuit.
0:26:06 How do you capture video reviews at scale, right? That is the problem that Vyrill has solved. So now
0:26:12 with us, you can essentially create a campaign and it generates a QR code and you can tie this QR code
0:26:17 to your shopping experience. So you can, after a customer makes a purchase, you can invite the
0:26:22 customer to upload a video review. So when they scan the QR code, it will prompt them with the campaign
0:26:27 details and it will invite them to upload a video. And what happens is the video directly comes on the
0:26:32 Vyrill dashboard and it gets analyzed. So in this case, we have 24 reviews that came in. If the video is
0:26:36 social, we track your social media engagement. We track the demographics of who’s in the video by age,
0:26:40 ethnicity, gender. So you’ll notice at a glance that there are 40 to 50 year olds missing in the
0:26:45 video. So we have actually analyzed who is in the video. And by the way, we don’t store any personal
0:26:49 information. This is simply at a very high level, helping brands understand, you know, are people
0:26:53 making your videos, are they on target with your demographic, right? So you notice there are no
0:26:57 40 to 50 year olds in the videos. So maybe if they’re targeting that demographic, they should have
0:27:02 videos featuring 40 to 50 year olds. And then below, we launch a rating system.
0:27:06 So at a glance, brands can decide whether they want to publish the videos or not. So for these
0:27:11 products, for this SKU, we captured 11 reviews and they have a score of 89. So the higher the score,
0:27:16 the better the video. If the score is above 60, the brand can promote the video. If the score is below
0:27:20 40, it’s very negative. And we even generate the highlights. So if you select the video, it will
0:27:24 take you inside the video to the key highlight. So as a brand manager, e-commerce team, you don’t have
0:27:29 to really spend time watching all your videos. Now, what makes Vyrill really special and unique
0:27:35 is, you know, we recently just landed a contract with TikTok to power video reviews for TikTok and
0:27:40 for the 500,000 TikTok shops. And the reason we got that contract is we have 150 different filters
0:27:45 on the platform. So on the right, you see all these filters. So essentially, it’s a mechanism for brands
0:27:50 to moderate their video content. And all of this is offered via an API. So you can now search by positive
0:27:56 sentiment. You can say, show me everyone in the video that’s 18 to 24, 25 to 30, or 30 to 40 years
0:28:01 old. And I’m looking for a video that talks about battery life that I want to promote. So you can
0:28:05 instantly search the word battery life or a product feature. It’ll show you all the comments about
0:28:10 battery life. And not only can you see the comment, you can even open the video. It will take you inside
0:28:16 the video. So essentially, either TikTok will help capture a video after the purchase is made on TikTok
0:28:21 shop. Yeah. And then Vyrill will analyze the video, and we’ll have a merchant dashboard where TikTok
0:28:25 merchants can log into the Vyrill dashboard. They can see all the insights for all the video reviews,
0:28:31 and they can then publish that content to the TikTok shop. So TikTok has given us exclusive right
0:28:35 to be able to publish video reviews that are vetted, right? Oh, that’s awesome. That’s huge.
0:28:40 Because TikTok has 30 rules. You know, you cannot mention the word Amazon in the video. The video has
0:28:44 to be less than three minutes long. You cannot have minors in the video. It has to be brand safe. It has to be
0:28:48 properly licensed, right? There should be no profanity in the video. So Vyrill is building a
0:28:54 TikTok filter, an Amazon filter, or a Walmart filter to be able to vet this content with
0:28:58 these filters we have on the right, right? That’s the use case for TikTok. For other brands and retailers,
0:29:03 they can use our platform to manage all their video content across multiple SKUs, across multiple
0:29:08 platforms. So we have one dashboard. We are now going agentic as well. So we just launched an AI
0:29:14 e-commerce agent, where you can ask questions about your video data, and you can get answers. So here’s a couple of
0:29:19 examples. You know, I wanted to know what are some of the top topics being discussed in the videos, right?
0:29:19 Right.
0:29:22 So I can ask questions now. So it gives you an answer.
0:29:22 Oh, that’s awesome.
0:29:27 So now brands can, instead of having to watch videos, they can spend more time on action and
0:29:31 content. They can instantly get insights. So they want to create content. Well, they just launched a
0:29:35 new toothbrush. Hey, what are customers saying about cleaning? Right. They get a summary along with the
0:29:36 clips, right?
0:29:40 Right. You probably could get unique insights from that too. Like, oh, people in Japan are talking about
0:29:43 this or whatever. What are they saying? And like, you could actually learn about new opportunities for
0:29:44 your brand through that too.
0:29:48 And speaking of insights, these are all the insights we deliver. And I’ll just give you a quick
0:29:53 example of an insight. So if I open the speech report, by the way, brands love the speech report.
0:29:58 They use it for SEO, for identifying competitor mentions and trend analysis. So we can take you
0:30:01 inside the keyword and you can even play the clip. It’ll take you inside the video, right?
0:30:02 Right.
0:30:06 So we have different reports. And I want to show you one last thing, the level of insights we offer for
0:30:11 our customers and the way we make the video searchable. We generate the video summary. We detect the
0:30:14 sentiment score, whether people are saying positive, negative about your product or brand.
0:30:19 We detect languages, sentiment analysis, topics, demographic. We have marketing workflows.
0:30:23 You can do sentiment analysis of the audio. We even break it down by product feature,
0:30:28 which has never happened before. So we have built our own models. So it’s not LLMs. We actually have
0:30:33 our own AI, about 18 plus models that understand everything inside your video review and your video
0:30:34 content.
0:30:35 You say 18 models?
0:30:40 Yes. So that’s why we’ve been at this for a while. And we’re going deep with our models in e-commerce,
0:30:44 right? So here’s where, you know, we give you all the insights on your content. So we then have
0:30:48 a mechanism to publish content. So you can build your own carousel. So programmatically,
0:30:52 you can publish content to your site and you can customize the whole widget, the colors,
0:30:56 you know, whether you want the highlights, whether you want the search or not. And instantly,
0:31:00 we can enable a search experience. So we have solved the holy grail of commerce,
0:31:05 which is in-video search, personalization, SEO, recommendations, and buy now inside the video.
0:31:06 So cool.
0:31:08 Yeah. So that’s a quick, very short demo.
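The publish workflow from the demo, a quality score with promote/flag thresholds plus rule-based moderation filters, can be sketched like this. The thresholds come from the conversation (promote above 60, flag below 40); the specific rules, names, and cutoffs are illustrative assumptions, not Vyrill’s 150-filter system.

```python
# Hypothetical sketch of review vetting: marketplace-style content rules
# first, then a bucket decision based on the review's quality score.

def vet_review(score, duration_sec, transcript, banned_words=("amazon",)):
    """Apply simple content rules, then bucket the review by score."""
    text = transcript.lower()
    if duration_sec >= 180:                         # e.g. "less than three minutes"
        return "reject"
    if any(word in text for word in banned_words):  # e.g. no competitor mentions
        return "reject"
    if score > 60:
        return "promote"
    if score < 40:
        return "flag_negative"
    return "review_manually"

print(vet_review(89, 95, "Love the battery life on this toothbrush"))  # promote
print(vet_review(89, 95, "It was cheaper on Amazon though"))           # reject
print(vet_review(35, 60, "Broke after a week"))                        # flag_negative
```

A real moderation pass would also cover the rules Ajay mentions that text matching alone can’t check, such as detecting minors on camera or verifying the license grant.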
0:31:12 Yeah. One thing that like really stuck with me, I was thinking one of the big values was like
0:31:17 having the more organic videos talking about a product. And so it seems like most brands are
0:31:21 only doing the opt-in. Is that like, they have like a campaign. Is that the main thing they’re
0:31:21 doing?
0:31:25 Well, today what happens is outside Vyrill, if they’re not using Vyrill, they usually do branded
0:31:27 and influencer videos. Right.
0:31:31 The problem is how do you capture licensing rights from your shoppers at scale? Right.
0:31:32 Right.
0:31:37 So with Vyrill, you can now program the QR code either with our API or with our campaigns on your
0:31:38 store. Yeah.
0:31:43 And after a customer makes a purchase, they get the QR code. So essentially the customer scans the QR code,
0:31:45 makes the video, accepts the licensing terms. Yeah.
0:31:48 They’re sharing their licensing terms, by the way. Right.
0:31:50 And essentially they’re giving the right to use the video at scale.
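The post-purchase flow described above, create a campaign, encode its URL in a QR code, and only treat a submitted video as publishable once the shopper accepts the licensing terms, might look roughly like this. All field names and the URL scheme are hypothetical, not Vyrill’s API.

```python
# Hypothetical sketch of a QR-code review campaign and its licensing gate.
import uuid

def create_campaign(brand, sku, reward=None):
    """Set up a campaign; the returned URL would be rendered as the QR code."""
    campaign_id = str(uuid.uuid4())
    return {
        "id": campaign_id,
        "brand": brand,
        "sku": sku,
        "reward": reward,
        "qr_url": f"https://example.com/review/{campaign_id}",
    }

def submit_review(campaign, video_file, license_accepted):
    """Record a shopper's video; without the license grant it is never publishable."""
    return {
        "campaign_id": campaign["id"],
        "video": video_file,
        "licensed": license_accepted,
        "publishable": license_accepted,  # analysis and moderation would follow
    }

campaign = create_campaign("Oral-B", "electric-toothbrush", reward="10% off next order")
review = submit_review(campaign, "review.mp4", license_accepted=True)
```

The point of the gate is the lawsuit risk Ajay describes: a video with no recorded license grant can be analyzed, but it never reaches the publish step.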
0:31:55 Yeah. Random idea. Like I would imagine like now you could take something like, I don’t know,
0:31:58 the new Gemini 2.5 or something like that. I’m not sure how well this would work at scale,
0:32:03 but it seems like a really easy experiment. But you could have that email or contact all these
0:32:08 people and like hype it up where your name’s going to be linked on there and stuff. So you may gain
0:32:12 some followers. Like if you’re someone who has like 200 followers or a thousand followers and like a
0:32:16 brand’s contacting you and it’s like, they’re going to show my face and they’re going to like link to me
0:32:21 somewhere. Like I think a lot of people, if there was like a one click opt-in, you know, and then you
0:32:24 got the contract, it’s like, okay, it’s fine now. It’s good. I’m not sure if you guys would do that.
0:32:26 That is something worth considering in the future.
0:32:30 Yeah. I mean, I’ll just, I’ll just give you an example. Like I think where things are headed,
0:32:34 I’ll give you a fun example. So imagine you just watched a Mission Impossible movie
0:32:39 and you’re coming out of the theater and you bought a ticket on Fandango or AMC theaters.
0:32:44 And imagine you get a quick nudge by the time you’re home from Tom Cruise saying, Hey, Ajay,
0:32:51 do you have a minute to chat with me? So this is where future of commerce and conversational AI is
0:32:55 going with video reviews, right? So imagine you open the video and you’re just intrigued. Hey,
0:32:58 Tom wants to talk to me, right? So, you know, it’s a licensed avatar of Tom Cruise,
0:33:04 just to be clear. Right. And imagine like Tom Cruise says, Hey, I have three questions for you.
0:33:08 Do you think I should make Mission Impossible 5? Right. And how is the theater experience? Right.
0:33:14 And what else can we do better? Right. Or any other ideas? Right. So this is where brands can program
0:33:20 25 questions, right? Maybe Tom Cruise, his production company could have asked 25 questions to his fans and
0:33:23 followers. Yeah. That’s definitely going to happen. So I’m not sure if you know this,
0:33:23 but the original way I was using Lore, actually Lore.com, is I was partnered with
0:33:27 Barry Osborne, the producer of Lord of the Rings and The Matrix. Yeah. And he helped create Weta with
0:33:31 Peter Jackson, right? The big special effects company. They also do a lot of the gear. And so I got like
0:33:43 a VIP tour of Weta. And what I was shocked by was they had the facial scans of so many famous actors.
0:33:47 And apparently all of these studios and production companies, they already were thinking about AI.
0:33:51 Oh, yeah. Like for a long time now. They’re like, yeah, in the future, you’re going to want to have
0:33:55 Tom Cruise’s rights to be able to use it in other products. And you have some kind of revenue share
0:33:58 with him or whatever his estate in the future. That’s crazy. That’s coming.
0:34:03 Even with your podcast, I wouldn’t be surprised if in another six months, a pop-up comes up and I’m
0:34:07 chatting with an avatar of you on your site, and it’s saying, hey, Ajay, what podcast should I be
0:34:11 making? What topics are you interested in? And you’re actually chatting with me to capture some
0:34:16 information from your visitor, site visitor, right? Right, right. So I’m telling you like where things
0:34:19 are headed, I think the future is going to be amazing. I can’t wait. It’s going to be fun.
0:34:24 It’s going to be fun. And also just last thing I would say is I think all these platforms, including
0:34:29 viral, you know, we have to be transparent and honest about what’s AI generated versus not, right? I mean,
0:34:35 just a quick point I’ll make is that, you know, when it comes to video reviews, people do not want AI
0:34:39 generated video reviews. I’ll tell you that. So we’re building tools to identify, to make sure that the
0:34:44 review is human-generated, and we’re validating it’s a human on the camera and not something else.
0:34:49 If you were building a business today, how would you be preparing for like these changes are coming
0:34:53 with shopping? Like how could you take advantage of the changes that are happening? Well, I think from
0:34:58 a company perspective, I would say that it’s really important now that all your employees are up to speed
0:35:03 and trained on AI, on what AI does and how AI works. In fact, I would say that the first question in the
0:35:09 interview should be AI related. Right. So I would say that training is very important. The other thing is also
0:35:16 make sure you are on top of things like whatever domain you’re in. There’s plenty of newsletters. There’s plenty of
0:35:22 tools now. I would say be hands on. Like I think we live in an age right now where, whether you’re the CEO or
0:35:28 anyone else in the company, right, people at all levels need to be playing with tools, because now,
0:35:33 you know, tools are free. Right. They’re easily accessible in your browser. There’s no reason to complain that you don’t have
0:35:39 access. Right. I think that excuse has gone away. Right. Right. And also like getting your employees to play with tools as
0:35:44 well. Let’s have people try different things. Right. Right. And in general, there’s a transformation happening
0:35:50 as well. Right. And then eventually figuring out, you know, what is going to make your job easy, cost-effective.
0:35:55 How can we bring more efficiencies for employees with AI? I think that’s starting to happen.
0:35:59 And we’re already seeing that in our company as well. Right. Yeah. I’ve been seeing tons of
0:36:05 different CEOs starting to share like almost like an AI first approach to hiring. Like not trying to have
0:36:09 less people, but make sure that everybody you’re hiring knows how to use AI because that’s just going
0:36:14 to amplify their outputs by, you know, so much more. Yeah. Well, it’s been awesome talking with you. I
0:36:19 think I learned a lot, especially about TikTok and about how people are finding products through social
0:36:23 videos. It makes a ton of sense. Like I think in the future, it’ll be like how I kind of described in the
0:36:27 beginning where the LLMs will know you very well. They’ll be really great for like buying basic
0:36:32 things. Yeah. But there’ll also be things where you want to discover new products and, you know,
0:36:37 I think social proof is huge there. It’s huge. I would say like there are five areas directly that will be
0:36:44 impacted in the future with AI, or with ChatGPT. And that is content generation, conversational AI
0:36:51 with video, personalization, and SEO. You know, I think the reason now LLMs are doing a better job is
0:36:56 they’re able to better understand content than ever before. Right. I think that’s the problem
0:37:02 they’ve solved, right. Whether it’s video or audio or text. So SEO will get better. SEO will improve
0:37:08 significantly with LLMs as well. I’m now even like coining the word LEO, which is LLM based engine
0:37:13 optimization, if you will. Oh, okay. You know what I mean? Like, yeah, I’ve been saying AIO. That’s true.
0:37:17 Doesn’t exactly make sense, but it’s kind of catchy. Where should people check you out? Like,
0:37:20 should they check out your website or are you active on social media anywhere?
0:37:25 Absolutely. So, you know, we’re very active on LinkedIn. You can check out my website as well.
0:37:30 So if you just go to vyrill.com slash commerce, if you want to play with the experience, you can do
0:37:34 that. By the way, the word Vyrill, it’s a play on the word viral. As in, videos go viral, right? But it’s
0:37:41 spelled a little bit differently. It’s V-Y-R-I-L-L. So it’s Vyrill. Right. Again, V-Y-R-I-L-L.com.
0:37:46 Cool. Yeah. We’ll put a link in the description. So yeah, it’s been great. And yeah. Hope to see
0:37:48 you again sometime. Thank you. Awesome.
Episode 58: What does the future of shopping look like as artificial intelligence weaves itself deeper into how we buy and sell online? Nathan Lands (https://x.com/NathanLands) sits down with Ajay Bam (https://www.linkedin.com/in/ajaybam), CEO of Vyrill—a Silicon Valley startup revolutionizing shoppable video—to reveal the seismic changes AI is bringing to e-commerce, social discovery, influencer marketing, and everything in between.
In this episode, Nathan and Ajay uncover how Gen Z and beyond are shopping via authentic short-form videos, the power shift from mega-influencers to everyday creators, and how AI-driven platforms like Vyrill are making it possible to instantly search inside videos for the exact info you need before buying. Ajay shares real-world examples of brands doubling sales by making their video content searchable—and why the brands who nail video trust will dominate the next wave of e-commerce. They also discuss what ChatGPT-powered shopping means for traditional giants like Amazon, and break down actionable strategies for businesses and creators to thrive in this new era.
Check out The Next Wave YouTube Channel if you want to see Matt and Nathan on screen: https://lnk.to/thenextwavepd
—
Show Notes:
-
(00:00) AI Revolutionizing Shopping Experience
-
(03:15) Authenticity Outshines Celebrity Endorsements
-
(09:21) Video Search and Personalization Challenges
-
(11:42) Integrate Social Videos on E-commerce
-
(14:43) Comprehensive Video Content Searchability
-
(16:35) Boost E-commerce with Integrated Videos
-
(21:57) Japan: Automated Shopping Concept
-
(25:03) Video Demographic Analysis Tool
-
(29:05) E-commerce Video Content Solutions
-
(30:20) Gemini 2.5 Promotion Idea
-
(33:22) Emphasizing AI Training and Tools
-
(35:50) Explore Vyrill.com on LinkedIn
—
Mentions:
-
Ajay Bam: https://www.vyrill.com/about
-
Vyrill: https://www.vyrill.com/
-
Shopify: https://www.shopify.com/
-
Perplexity: https://www.perplexity.ai/
-
Gemini: https://gemini.google.com/
Get the guide to build your own Custom GPT: https://clickhubspot.com/tnw
—
Check Out Matt’s Stuff:
• Future Tools – https://futuretools.beehiiv.com/
• Blog – https://www.mattwolfe.com/
• YouTube- https://www.youtube.com/@mreflow
—
Check Out Nathan’s Stuff:
-
Newsletter: https://news.lore.com/
-
Blog – https://lore.com/
The Next Wave is a HubSpot Original Podcast // Brought to you by Hubspot Media // Production by Darren Clarke // Editing by Ezra Bakker Trupiano
-
-
$1M+/yr Local Businesses Hidden in Plain Sight
AI transcript
0:00:03 I have a business that no one on a podcast has ever discussed.
0:00:08 It’s literally the first time this has probably ever been talked about on YouTube or in the audio format.
0:00:10 I’m breaking grounds here.
0:00:11 Okay, Jackie Robinson.
0:00:17 I feel like I can rule the world.
0:00:19 I know I could be what I want to.
0:00:22 I put my all in it like no day’s off.
0:00:24 On the road, let’s travel, never looking back.
0:00:30 So this weekend, I went to my daughter’s spring recital.
0:00:33 And Sam, when you see this, what does this look like?
0:00:35 This just looks like, I don’t know.
0:00:37 Just like a great ballerina, typical.
0:00:38 A program.
0:00:38 Yeah.
0:00:39 Right.
0:00:40 Yeah, show program.
0:00:42 And that’s what everybody in that crowd thought.
0:00:44 But not me.
0:00:45 I saw a business plan, Sam.
0:00:46 I saw a business plan.
0:00:47 I saw information.
0:00:49 I saw a giant information leak.
0:00:50 Okay, so check this out.
0:00:57 This woman has built a million-dollar-plus kids’ dance studio just down the street from us.
0:00:59 And I think this is remarkable.
0:01:04 And I think it’s a good reminder that, like, there’s these million-dollar businesses, like, all around you.
0:01:08 You don’t have to do something really grand or innovative to do it.
0:01:11 You just got to provide a service that people love, and you got to scale it in the right way.
0:01:12 So check this out.
0:01:16 On the back is a list of all the dancers in the show.
0:01:21 Now, all the dancers at the show are all the dancers across her three locations, all the kids, basically.
0:01:22 Everyone performs.
0:01:27 Okay, so I look at this, and everyone else is looking for their kid’s name.
0:01:29 I’m looking for top-line revenue numbers, okay?
0:01:30 I’m trying to figure it out.
0:01:34 And so I see, all right, each of these columns is about 50 names.
0:01:35 There’s six columns.
0:01:38 Okay, we got 300 kids at this dance show.
0:01:39 How much does this cost?
0:01:44 Now, I know that we pay something like $250 a month to be a part of the dance studio.
0:01:46 And this is the spring recital.
0:01:49 So immediately my head says, all right, we’re doing at least spring and fall.
0:01:51 Might even be doing four recitals a year.
0:01:52 I’m not sure.
0:01:57 I had just bought the tickets to this recital, so I know that in addition to the $250 a month membership,
0:02:00 you’re going to be paying for the uniforms.
0:02:03 You’re going to be paying for tickets to watch the show.
0:02:07 Of course, every single parent is going to watch their kid at the thing.
0:02:10 In fact, we brought grandparents with us and a few extras.
0:02:15 But you look around that theater, it’s totally sold out, standing room only.
0:02:23 I know a mom who was in our class who she did not log on to buy her tickets right away and therefore only got two tickets and got them in the back.
0:02:28 And so she kicked her husband out of the two tickets and was like, hey, tell your mother.
0:02:32 She got her mother-in-law to come with a walker so that they could go sit in the ADA seat.
0:02:36 Like, that’s how vicious the competition is for these.
0:02:37 The demand is insatiable.
0:02:37 Okay.
0:02:43 So basically, if you do the math on this and you say, okay, we spent $100 on the tickets for this recital,
0:02:46 then you spent $250 a month, and then you’re in this thing year-round,
0:02:55 you end up seeing that this is a business that’s generating a little over a million dollars a year in revenue.
0:02:56 So about $1.25 million.
0:02:58 So you said 300 names, $250.
0:03:02 That’s $75,000 a month in sales.
0:03:02 A month.
0:03:03 Just off that.
0:03:04 Just off that.
0:03:07 And then you add on the plus-plus, the shows, the tickets, the photos.
0:03:09 Oh, for the photo package, you spend $100.
0:03:13 It’s one thing after another, basically, that they sell to you.
0:03:13 And it’s great.
0:03:14 We’re happy customers.
0:03:16 And so you get there.
0:03:17 She basically does no marketing.
0:03:18 The show is for marketing.
0:03:25 And what ends up happening is that you – and then at the end of the show, she brings out the teachers to take a bow.
0:03:26 These are the teachers.
0:03:27 So I’m like, oh, thank you.
0:03:29 Now I see the OPEX line.
0:03:30 How do we got seven teachers here?
0:03:31 Okay, cool.
0:03:32 Seven teachers.
0:03:33 Got it.
0:03:34 And so I’m trying to figure this out.
0:03:42 Okay, so I’m pretty sure that this dance studio is netting somewhere between $500,000 and $700,000 in EBIT every year.
0:03:43 Okay?
0:03:44 Amazing.
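A quick sketch of the back-of-the-envelope math above. Every input is a rough guess from the conversation, except the uniform spend, which is an added assumption (uniforms are mentioned but never priced); the tuition line alone gets you to $900K/yr, and the add-ons push the total toward the ~$1.25M figure.

```python
# Back-of-envelope model of the dance studio revenue estimated above.
# All inputs are the hosts' rough guesses, not verified figures.
students = 300              # ~6 columns x ~50 names on the recital program
monthly_tuition = 250       # quoted membership price
tuition = students * monthly_tuition * 12          # $900,000/yr

recitals = 2                # "at least spring and fall"
tickets = 100               # per family per recital, as quoted
photos = 100                # photo package, as quoted
uniforms = 150              # ASSUMED annual uniform spend, not stated
extras = students * (recitals * (tickets + photos) + uniforms)

total = tuition + extras
print(f"tuition: ${tuition:,}  extras: ${extras:,}  total: ${total:,}")
```

With these guesses the model lands a bit above $1M; more recitals per year or higher add-on spend closes the gap to the $1.25M estimate, and seven teachers on the OPEX line is how the hosts get to $500K–$700K of EBIT.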
0:03:48 Did she also – did she, like, arrive to the class in, like, an S-class?
0:03:50 Like, what type of car was she driving?
0:03:54 Yeah, I installed a tracking device underneath just to see where she lives now.
0:03:59 Did her Birka bag give out any hints as to how well the business was doing?
0:04:00 Called a Birkin.
0:04:01 Birkin bag, whatever.
0:04:02 I don’t know what she just called it.
0:04:05 Like a Middle Eastern bag that she just referenced.
0:04:08 The other day, Sarah wanted to go to, like, some concert.
0:04:10 And I was like, yeah, like, Charlie XC90.
0:04:13 And, you know, it’s actually, like, Charlie – I forget what it is.
0:04:15 You know, like, this new, like, hot girl or whatever.
0:04:15 I also don’t know.
0:04:17 That’s one of those where I just don’t say it.
0:04:18 XC90 is a Volvo.
0:04:20 I was like, yeah, Charlie XC90.
0:04:22 So he kept saying, like, XC90.
0:04:24 It’s like XC something.
0:04:26 All right, but go ahead.
0:04:40 So I just thought this was inspiring, that, like, this local – just this local service, dance shows for little kids, or dance classes for little kids, scaled to three locations, can be, you know, such a great business for somebody.
0:04:43 And they’re basically kind of, like, found that sweet spot of doing what they love.
0:04:45 She’s been doing it for, like, 25 years now.
0:04:53 She’s an institution locally, has a great community of people around her, and, you know, is making people happy, making families happy.
0:04:54 And I’ve just been seeing this everywhere.
0:04:56 All right, let me – can I – let me show you something interesting.
0:05:02 I was just talking to this guy, but have you ever heard of Goldfish Swimming Classes?
0:05:03 No.
0:05:10 It’s a franchise that I’m pretty sure does about $600 million a year in revenue, and it’s a children’s swim class.
0:05:12 And it was one of those things where he was talking to me.
0:05:21 He was basically – I’m not going to – I can’t reveal too much, but he was like, I quit my prestigious job because I want to get into the swim class business.
0:05:26 You know, everyone’s, like, probably giving this guy the same, like, look of, like, but you’re throwing it all away.
0:05:32 And I kind of, like, was like, all right, tell me more because I’m sure there’s, like, there’s a story here because you worked in finance.
0:05:34 Like, you’re not doing this to feel good.
0:05:34 Tell me more.
0:05:38 And he started breaking down the economics of this Goldfish chain.
0:05:45 And he was saying something like each location does, like, $2 million in revenue, and they have something like 300 locations.
0:05:50 All right, here’s the deal.
0:05:52 If HubSpot tripled their price, I’d be screwed.
0:05:56 The reason I would be screwed is because my entire company is run on HubSpot.com.
0:06:02 My website, my email marketing, my dashboards, how I track my customers, literally everything.
0:06:08 And if they tripled the price, I would pay them more money, and that’s because the product is so freaking powerful.
0:06:10 My entire company is built on it.
0:06:15 And so if you’re running a business and you want to grow faster, you want to grow better, you want to be more organized, check it out, HubSpot.com.
0:06:16 All right, back to the pod.
0:06:19 Yeah, I’m on their site.
0:06:20 They got a lot of locations.
0:06:21 I don’t know about 300, but they got a lot.
0:06:23 It was something insane like that.
0:06:27 And it was another one of those things that was just, like, hidden in plain sight.
0:06:30 But I have a hidden in plain sight business.
0:06:37 Not hidden in plain sight, but a business that, like, no one on earth has, no one on a podcast has ever discussed.
0:06:41 And this is going to be the thing I’m about to talk about.
0:06:46 It’s literally the first time this has probably ever been talked about on YouTube or in the audio format.
0:06:48 I’m breaking grounds here.
0:06:50 Okay, Jackie Robinson.
0:06:56 Is there a Hall of Fame for things like these?
0:06:58 Because I would be in it.
0:07:00 All right.
0:07:03 So I just slacked you a URL.
0:07:08 Go to dysbuilders.com.
0:07:09 So you’re on this website.
0:07:11 Let me tell you a story really quick.
0:07:16 You know how the Amish are famous for creating amazing furniture?
0:07:17 Mm-hmm.
0:07:22 I wanted to buy, like, a bed for my kid.
0:07:27 And I wanted, like, an heirloom quality bed where I was like, man, I wish I had my bed from when I was a kid.
0:07:28 My crib from when I was a kid.
0:07:33 How cool would it be to give my daughter a bed that I can, you know, reuse for all of our kids.
0:07:35 And eventually one of them can let their kids do it.
0:07:36 So my grandkids have my bed.
0:07:38 So I was looking up Amish furniture.
0:07:44 And I came across this website randomly because I got interested in the Amish, like, craftsmanship.
0:07:47 Now, this website, it’s at dysbuilders.com.
0:07:48 I think they make homes.
0:07:51 Scroll all the way to the bottom where it says contact.
0:07:54 And read to me the email address that you see.
0:07:59 dysbuilders at ibyfax.com.
0:08:00 Okay.
0:08:04 So I noticed, this is just one example because it was easy to see.
0:08:15 But I noticed on many of these Amish websites, when I was looking at how to, like, place an order, I had to email, like, you know, amishfurniture at ibyfax.com.
0:08:21 There was all these really weird URLs that I had to email to fax.
0:08:22 And I got really curious.
0:08:27 So I want you to go to ibyfax.com.
0:08:29 It says, send and receive emails with your fax machine.
0:08:32 So I was seeing this.
0:08:35 And I got a tip from one of our listeners, Andy Allen.
0:08:36 And he emailed me this.
0:08:39 And it was just all coincidence that, like, six months prior, I was, like, wondering what this was.
0:08:43 So let me tell you the background of this.
0:08:47 So if you’re Amish or you’re Mennonite, a lot of them are very entrepreneurial.
0:08:49 And they work with the outside world.
0:08:53 So they make furniture that they sell to, you know, people like me.
0:08:54 They have websites.
0:09:02 However, according to their religion, they are not allowed to use certain technology that’s considered individualistic.
0:09:14 So looking down on your iPhone or sitting down in your home and staring at a computer screen, they think that it either brings, like, things either bring them closer to God and other people or it takes them away.
0:09:18 And they feel, according to their rules, that, like, looking at your phone and using the Internet brings them away from other people.
0:09:22 However, they have all these websites and they sell furniture.
0:09:23 Well, how do they do it?
0:09:25 Well, there’s this small website.
0:09:29 I think it’s called iBuyFax, as in InternetBuyFax.com.
0:09:37 And it’s a service where you pay something like $20 a month plus, like, $0.10 or $0.50 per fax.
0:09:46 But basically, on the campus, I don’t know what they call it, the campus of a lot of Amish towns, there is literally, like, a small house, like a shanty.
0:09:49 And in that small house is a fax machine.
0:10:02 And if you’re an Amish guy running a website and you want to see how your orders are doing or somebody emails you and they’re asking a question about a bed frame and, like, can you do this or can you do that, they go to this fax machine and this phone.
0:10:07 They have in their small box, which I have a photo of, by the way, in our document.
0:10:11 But it’s, like, literally a tiny, tiny little outhouse.
0:10:18 It’s, like, a little outhouse where you make the call and you talk to your customer, but you have to use iBuyFax.
0:10:25 And so the iBuyFax service, what they’re going to do is they’re going to collect all of your emails and they’re going to fax, they’re going to…
0:10:27 This photo is outrageous.
0:10:28 It’s outrageous.
0:10:29 It’s like an outhouse.
0:10:33 It’s literally, it looks like a phone booth slash port-a-potty.
0:10:38 It’s in the middle of the road and on the wall is just a tiny phone, like a corded, a phone with a cord.
0:10:39 Yes.
0:10:40 And so that’s because.
0:10:41 There’s not even a fax machine.
0:10:42 Where’s the fax machine?
0:10:43 So some of them have fax machines.
0:10:44 Some of them have phones.
0:10:54 And so iBuyFax, they’ll either call you and be, like, the middleman and answer the questions, like, hey, Linda at gmail.com, she’s in this place.
0:10:56 She wants to know, can you make a bed like this?
0:10:59 Or Dave wants a child’s bed, but he wants it to be in this color.
0:11:00 Can you do it?
0:11:01 And they’ll reply.
0:11:11 And they’ll either handwrite the reply and fax it back, or iBuyFax has people who they say that they will actually, you talk to them, and they’ll be your middleman.
0:11:19 And another thing that they’ll do is, let’s say that you need, let’s say you want to buy something off eBay, or you’re trying to figure out what the price is of a certain farm equipment.
0:11:24 You can ask, iBuyFax, please tell me, you know, how much it would cost to buy blank on eBay.
0:11:28 And they’ll reply back by fax or telephone answering your question.
0:11:34 And so this way, the Amish can do business with the rest of the world, but aren’t breaking their rules.
0:11:40 And this website that I found, it’s used on all of the websites.
0:11:47 So if you look at Amish furniture, if they’re like really OG Amish, a huge percentage of them are using this website.
0:11:50 And the Amish community, it’s not tiny.
0:11:52 It’s about 400,000 people.
0:12:01 And they are like very entrepreneurial, like that’s like part of like the religion is to be like self-reliant or part of the community rules is to be like self-reliant.
0:12:04 And like Amish furniture is definitely like a well-known thing.
0:12:06 Amish crafts of all types are a well-known thing.
0:12:13 And this guy who emailed me, he goes, I own a business that buys and sells wooden pallets.
0:12:15 And in particular, we are based in Pennsylvania.
0:12:18 And so we work with mostly Amish people.
0:12:24 And whenever they communicate with us, which we work with a lot of them, they only communicate with us by iBuyFax.com.
0:12:26 And they’re all using this website.
0:12:29 And this website, here’s where it gets kind of funny.
0:12:31 I looked on LinkedIn.
0:12:34 I can’t find too much about any of the background on it.
0:12:36 The only, and I’m not going to blow this guy’s spot up too much.
0:12:38 So I’ll only say his first name.
0:12:41 The owner, his first name is Jamal.
0:12:47 And so in my head, I’m like, and he lives in New York.
0:12:54 I’m like, is there a brother who just came up with a brilliant idea to create an Amish faxing website?
0:12:57 And is this like how we cheer the world?
0:13:04 I swear to God, that’s his first name.
0:13:06 So is he Amish or no, when you found him?
0:13:08 No, he lives in New York.
0:13:09 No, he lives in New York.
0:13:10 And his first name is Jamal.
0:13:11 I can’t find a photo of him.
0:13:15 But like, I don’t, I don’t, I don’t think he is.
0:13:18 Because the Amish can’t work websites.
0:13:21 So like you have to be like an ally, you know what I’m saying?
0:13:22 But you can’t be part of the community.
0:13:23 This is great.
0:13:25 That is so funny.
0:13:27 Be like Jamal is like the new slogan.
0:13:29 Find the opportunity.
0:13:30 Be like Jamal.
0:13:41 By the way, why don’t people like, could you just basically go buy a bunch of, could you work with the Amish by fax and just be like, cool, I’m going to buy furniture from you.
0:13:55 But you have, you run a website, so you run a website, when somebody comes, places an order with you, and your website says, made by the Amish, Amish made furniture, handcrafted Amish furniture, finest Amish fine goods, whatever.
0:13:58 And then when somebody places an order with you, you just fax these guys and sell?
0:14:00 Like, could you just be a layer on top selling to them?
0:14:00 Yeah, so.
0:14:02 Or do they say no to that?
0:14:09 Well, first of all, I think that like there is some fraud there of like, yeah, we’re Amish, like, you know, and they’re not, you know, they’re not.
0:14:11 No, but I’m saying you really do buy it from them.
0:14:17 Yeah, I think that the way, yes, I think, and they call, so they call those people, I think the slang that they use is English.
0:14:24 So like, they have an English guy, they like an English guy, meaning that’s like their straight man, their front man who like can work with the world.
0:14:29 And he’s, he’s, we trust him, you know, he’s an outsider, but like, he’s, you know, he’s had our back for decades.
0:14:30 So we trust him.
0:14:33 I guess that’s for Jamal in this case, but you can have other.
0:14:35 Yeah, you’re like a dealer.
0:14:43 Yeah, you can have like, and they have like a name for that, which is like, they call it English, but it’s like, yeah, it’s a well-known thing that this is our person who’s got our back and we give them a cut.
0:14:47 On this doc, it says AWS for the Amish.
0:14:47 What is that?
0:14:48 Do you like that?
0:14:51 That’s just a little, I was workshopping.
0:14:51 Yeah.
0:14:58 I was workshopping.
0:14:59 What do you think about that?
0:15:00 That’s good.
0:15:01 I like that.
0:15:05 This is a pretty nifty website.
0:15:12 And if you go to a similar web and look at their, and guess their traffic, it’s not nothing.
0:15:12 Yeah.
0:15:13 400,000 Amish people.
0:15:20 Let’s just assume for a second that of the entrepreneurial Amish people, a very high percentage of them are going to need to use a service like this.
0:15:29 So if you say that even 5% of Amish people are entrepreneurial of the Amish population, but maybe it’s a little high.
0:15:31 Let’s say there’s 20,000 people.
0:15:37 I think you could pretty easily get to some version where you have 5,000 customers paying this thing, 20 bucks a month.
0:15:39 That’s a $1 million business.
0:15:40 It’s a $1.2 million.
0:15:42 Sure, that’s the exact math that I had.
0:15:45 I said $1 to $1.2 million a year.
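The subscription math above can be sketched out directly. The subscriber count is the hosts' guess, and the per-fax volume is an added assumption; the quoted pricing was ~$20/month plus roughly $0.10 to $0.50 per fax.

```python
# Sketch of the iBuyFax revenue math from the conversation.
# Subscriber count is the hosts' guess; per-fax volume is an assumption.
subscribers = 5_000
monthly_fee = 20
subscription_rev = subscribers * monthly_fee * 12   # the ~$1.2M/yr figure

# Per-fax charges (quoted as roughly $0.10 to $0.50 each) stack on top.
faxes_per_month = 30        # ASSUMED per subscriber, not stated
per_fax_fee = 0.25          # midpoint of the quoted range
fax_rev = subscribers * faxes_per_month * per_fax_fee * 12
print(f"subscriptions: ${subscription_rev:,}  fax fees: ${fax_rev:,.0f}")
```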
0:15:47 And I’m pretty sure it’s just this guy.
0:15:49 Jamal is the only guy running it.
0:15:53 I think his name could have been Javal, J-A-V-A-L.
0:15:55 But like it was some name.
0:16:02 Do you think Jamal just, does he have trouble sleeping because he just laughs himself to sleep every night thinking about what his career is?
0:16:04 He’s like, I just can’t believe it.
0:16:06 I’m just tickled.
0:16:08 I just can’t believe that this is what I did.
0:16:08 And I’m a millionaire.
0:16:10 This is called providing value.
0:16:10 This is it.
0:16:11 Oh, it’s for sure.
0:16:18 And I just know that these Amish guys, they’re not exactly known for, you know, being open to change.
0:16:27 So once you get a customer, like you’re with them, like they measure churn, not like in terms of like, you know, like percentages per year, but it’s like per generation.
0:16:31 Because this is absolutely something that is going to be passed on from generation to generation.
0:16:35 And the website looks like it was launched like web 1.0.
0:16:38 Yeah, there’s like seven sentences on the site.
0:16:39 All right.
0:16:40 This is amazing.
0:16:43 So this episode is basically local million dollar businesses.
0:16:45 That’s what this episode is.
0:16:46 You want me to do one more?
0:16:48 I have one more that could fit in this category.
0:16:50 This one, it’s depressing.
0:16:51 But well, okay.
0:16:53 So when you have…
0:16:54 Don’t do it if it’s depressing.
0:16:55 Well, it’s important.
0:17:00 So when you have to euthanize your pet, it’s like a horrible experience.
0:17:01 Obviously, it’s like the worst thing next to your children.
0:17:05 And so I used a service that came to my house.
0:17:10 And it like was the best of the horrible situation.
0:17:12 It couldn’t have been better for the worst thing ever.
0:17:18 And I, after, you know, I’m a nerd, like after a few months, I was like, this service, like, it was like phenomenal.
0:17:19 What was this?
0:17:20 How did I learn about this?
0:17:23 Google at home pet euthanasia.
0:17:25 It’s the first one that comes up because they crush.
0:17:27 I’m not, I’m not Googling that.
0:17:27 Okay.
0:17:28 Bad karma.
0:17:29 Not doing it.
0:17:29 Okay.
0:17:30 It’s called…
0:17:31 I don’t even want that in my search history.
0:17:33 Lap of love.
0:17:35 So lapoflove.com is the website.
0:17:38 And I was reading this press release by them.
0:17:40 And here’s how this business works.
0:17:42 So they have vets.
0:17:43 So they contract it all out.
0:17:46 So they have like best practices that they use.
0:17:47 And then they are like the call center.
0:17:51 And they dish it out to a local vet who, you know, does what they need to do.
0:17:53 And they teach them their ways, whatever.
0:18:00 This website, Lap of Love, they put out a press release that said that they are getting 10,000 customers a week.
0:18:04 And they charge, I think, $600.
0:18:12 And so if you do the math, this business is making hundreds of millions of dollars in revenue, of which they split with the vet.
0:18:17 But I was like, very happy with my service with these guys.
0:18:24 And it was amazing that I had never heard of it and how this company just owns the entire space.
0:18:24 Wow.
0:18:26 10,000 a week.
0:18:27 Can you believe that?
0:18:29 How did you even know that that was a thing?
0:18:30 How did you find out about it?
0:18:32 Well, it’s word of mouth.
0:18:35 You know, like your wife or me was like…
0:18:35 Telling people.
0:18:35 Yeah.
0:18:36 We were like, look, this is horrible.
0:18:38 I don’t want to do this.
0:18:39 And they’re like, well, I use this service.
0:18:45 And you hear about that one-liner and you’re like, oh, my God, that’s so much better than the alternative.
0:18:47 And that’s how we found it.
0:18:49 And I was like researching it.
0:18:51 And what they do is they…
0:18:51 Because I was like Googling.
0:18:54 They just own…
0:18:59 Like if you Google that word, you know, like related phrases, they own the Yelp pages in every city.
0:19:05 So it will be like at-home pet euthanasia in New York, in Nashville, in this.
0:19:08 And they grow entirely through local search.
0:19:09 Wow.
0:19:11 That’s a crazy business.
0:19:13 It’s a crazy business, isn’t it?
0:19:13 That’s a crazy business.
0:19:15 And so like…
0:19:17 And they said how much…
0:19:18 10,000 a week, you said?
0:19:19 Sorry, I got confused.
0:19:20 10,000 a month.
0:19:22 So they handle 10,000 a month.
0:19:26 And it costs anywhere from $500 to $1,000, depending on a variety of things.
0:19:27 But isn’t this wild?
0:19:28 And it is like…
0:19:32 This is another one of services where it sucks, but like it’s incredibly necessary.
0:19:36 And I was amazed at how large this was.
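Worth noting how much the week-versus-month correction changes the math. Using the $600 price quoted earlier (the stated range is $500 to $1,000), the two readings are an order of magnitude apart:

```python
# The transcript first says 10,000 customers a week, then corrects it
# to 10,000 a month; this sketch shows why that correction matters.
customers = 10_000
price = 600                 # mid-range of the quoted $500-$1,000
per_week_reading = customers * 52 * price    # if it were per week
per_month_reading = customers * 12 * price   # at the corrected rate
print(f"per week:  ${per_week_reading:,}/yr")
print(f"per month: ${per_month_reading:,}/yr")
```

Only the per-week reading supports "hundreds of millions"; at the corrected per-month rate, gross revenue is closer to $72M a year, split with the contracted vets.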
0:19:37 All right.
0:19:37 Here’s another one.
0:19:41 Under the radar business that just crushes with local businesses.
0:19:45 I saw this guy, Tane, on Twitter talk about this.
0:19:46 He said…
0:19:46 Who?
0:19:48 I don’t know how you say his name exactly.
0:19:49 Don’t make me say it again.
0:19:49 Okay.
0:19:49 Sorry.
0:19:50 All right.
0:19:51 So…
0:19:58 I just found out I’ve been calling my piano teacher, Steven, and his name is Vinny for like
0:19:59 the last two months.
0:20:02 All right.
0:20:06 I’m still reconciling that fact of what I’ve been doing.
0:20:07 Did he just take it?
0:20:11 Well, I didn’t say it often, and I say it fast, because I was like…
0:20:12 I was a little unsure.
0:20:14 But dude, I was slow…
0:20:16 Saying the word Vinny slow does not sound like you’re saying…
0:20:19 Or saying the word Vinny fast does not sound like you’re saying Steven at all.
0:20:21 I’d be like, all right, say bye to Steven.
0:20:26 So I’d tell my daughter to say bye, and then she would say bye, Steven.
0:20:27 And I’d be like, oh, my God.
0:20:29 His name’s Vinny.
0:20:33 Blame her.
0:20:38 Listen, honey, you’re going to learn an important lesson today.
0:20:39 It’s called taking one for the team.
0:20:45 I need you to tell him, to say out loud the following words today.
0:20:46 All right.
0:20:50 So there’s this business called Tarro, and Tarro…
0:20:56 His tweet was, today I learned about Tarro, a $100 million company that routes phone orders
0:20:59 from Chinese, sushi, and pizza restaurants in the U.S. to call centers in the Philippines.
0:21:06 All right, folks, this is a quick plug for a podcast called I Digress.
0:21:10 If you’re trying to grow your business, but feel like you’re drowning in buzzwords and BS,
0:21:12 then check out the I Digress podcast.
0:21:15 It’s hosted by this guy named Troy Sandage.
0:21:19 He’s helped launch over 35 brands that drive $175 million in revenue.
0:21:23 So if you want to get smarter about scaling your business, listen to I Digress wherever
0:21:24 you get your podcasts.
0:21:25 All right, back to the pod.
0:21:33 And so what it does is, what these guys did was, these two brothers, back in 2015, they
0:21:36 start this business and they’re basically like, hey, we’ll help local businesses take orders
0:21:37 over the phone.
0:21:39 We will be your phone staff.
0:21:40 So your staff is busy.
0:21:43 You don’t want to have somebody on the phones or just constantly interrupting their workflow.
0:21:46 We’ll just take the call and then we’ll put the order into your system.
0:21:48 And so they started doing that.
0:21:55 They’ve, uh, they basically service 3,000 local restaurants in the United States
0:21:56 with phone ordering.
0:22:02 And they basically are like, cool, like we’ll do this for 10% cheaper than your labor costs
0:22:03 if you do this yourself.
0:22:08 And by having phone ordering, you’re going to get an extra 10 to 20% of revenue that you
0:22:09 wouldn’t otherwise get.
0:22:10 Simple, simple proposition, right?
0:22:14 Get more revenue and I can do it for you at a lower cost than you could do this for yourself.
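The pitch above reduces to simple arithmetic from the restaurant's side. The two percentages are the ones quoted in the conversation; the baseline revenue and phone-labor numbers below are illustrative assumptions, not Tarro's actual pricing.

```python
# Illustrative sketch of the Tarro pitch as described: ~10% cheaper
# than staffing phones in-house, plus 10-20% extra revenue from
# phone orders. Baseline figures are ASSUMED for illustration.
monthly_revenue = 80_000        # assumed restaurant baseline
in_house_phone_cost = 3_000     # assumed monthly cost of staffing phones
tarro_cost = in_house_phone_cost * 0.90        # "10% cheaper"
added_revenue = monthly_revenue * 0.15         # midpoint of 10-20%
print(f"Tarro cost: ${tarro_cost:,.0f}/mo, added revenue: ${added_revenue:,.0f}/mo")
```

Under these assumptions the restaurant saves a little on labor and picks up far more in incremental orders, which is why the proposition is an easy yes.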
0:22:18 And by the way, nobody cares who picks up the phone to take this order.
0:22:22 They, um, as of this year say that they reached a hundred million dollar run rate.
0:22:24 And, um…
0:22:25 And how do you spell it?
0:22:27 T-A-R-R-O.
0:22:30 And it stands for technology all restaurants run on.
0:22:34 It’s the Adidas of online phone ordering.
0:22:40 Uh, and is this bootstrapped?
0:22:45 Uh, I don’t think it’s, I don’t know if it’s bootstrapped or not, but, uh, it could be because
0:22:47 this is the type of business you could definitely bootstrap.
0:22:48 It’s like a heavy cash flow business.
0:22:52 Well, now it says that it’s AI powered phone ordering.
0:22:54 Does that mean that they don’t use Filipinos anymore?
0:22:55 I think both.
0:22:56 Right.
0:22:57 So I think it’s basically…
0:22:59 Is AI a name of one of their workers?
0:23:05 There’s Alfred Ignacio over there.
0:23:07 He’s powering all your orders.
0:23:09 Um, it’s both.
0:23:12 So I think there’s, there’s funny things happening with AI and call centers.
0:23:16 So there’s like AI tools that will change the accent of the person.
0:23:20 So they, you know, you call somebody there in India, but their Indian accent gets remixed
0:23:22 on the fly using AI.
0:23:24 So it sounds like he’s Steve in Wichita.
0:23:28 And so that’s like one tool that all these guys are using now is like the AI doesn’t take
0:23:32 the order, but they just changed the accent so that you have an American accent.
0:23:33 What’s that called?
0:23:35 Uh, I don’t know.
0:23:36 There’s a few, a few companies trying to do that.
0:23:38 Um, so there’s accent changers.
0:23:44 There’s AI that handles just, let’s say, 50 to 70% of the routine things.
0:23:49 And then it routes to the human in the sort of 30 to 50% of calls that couldn’t
0:23:50 be solved with AI.
0:23:56 So like basically the AI makes their call center need like, you know, half as many people
0:23:56 as it did before.
0:23:59 And the rest is just profit that just falls to the bottom line for them.
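The headcount effect described above is easy to model. The resolution rate is the 50 to 70% range quoted in the conversation; the call volume and per-agent capacity are assumptions for illustration only.

```python
# Rough headcount model for the AI-assisted call center described above:
# if AI resolves a share of calls, the human team shrinks accordingly.
# Volume and per-agent capacity are ASSUMED for illustration.
calls_per_day = 2_000
calls_per_agent = 50            # assumed calls one agent handles per day
ai_resolution = 0.60            # midpoint of the quoted 50-70%
human_calls = calls_per_day * (1 - ai_resolution)
agents_before = calls_per_day / calls_per_agent
agents_needed = human_calls / calls_per_agent
print(f"before AI: {agents_before:.0f} agents, after: {agents_needed:.0f} agents")
```

At a 50% resolution rate the team exactly halves, which matches the "half as many people" figure; the saved labor cost is what falls to the bottom line.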
0:24:02 Um, so there’s, there’s some cool stuff happening with that.
0:24:07 It’s actually kind of interesting to track call center stocks to see what the market thinks
0:24:07 is going on.
0:24:08 Like, are they going to be extinct?
0:24:10 Like his call center is just going to go away.
0:24:14 Are they actually going to be, are they going to survive, but become much more profitable
0:24:17 because now they’re AI powered and they cut a lot of their human costs.
0:24:19 It’ll be interesting to see how that plays out.
0:24:19 Yeah.
0:24:20 What happened to that?
0:24:23 Interesting, you know, to the most boring person on earth, I guess.
0:24:27 What happened to, like... the NBA playoffs are on.
0:24:28 I guess that’s probably more interesting.
0:24:30 Yeah.
0:24:33 But if you’re like listening to this podcast, you definitely might be in that category of
0:24:35 people who are, but it’s boring.
0:24:39 What’s a, like, I think Coachella was last weekend and I only was watching the live stream
0:24:40 of the Berkshire Hathaway conference.
0:24:45 What’s the name of that really big company?
0:24:47 Was it called TaskUs or something like that?
0:24:47 Yeah.
0:24:49 Are they publicly traded?
0:24:53 I think they are.
0:24:56 Is their market cap just getting obliterated right now because of all this?
0:24:59 It’s a $1.2 billion market cap.
0:25:01 And yeah, in five years,
0:25:04 it's down like 5X.
0:25:09 It got nuked. Well, it kind of was at the
0:25:10 peak of the 2021 range.
0:25:11 So like, let's see.
0:25:17 Yeah, it basically went public right at the peak of the market, like September 2021,
0:25:19 and then has just been down since then.
0:25:22 Oh yeah, you're right.
0:25:23 So it could be a bunch of stuff.
0:25:23 Yeah.
0:25:25 Dude, by the way, I just invested in this company.
0:25:28 I think you might actually be an investor.
0:25:29 Are you an investor in owner.com?
0:25:30 Yeah.
0:25:30 Yeah.
0:25:31 He’s cool.
0:25:33 That thing is crushing.
0:25:34 It’s crushing.
0:25:37 This business owner.com is kind of amazing.
0:25:42 So what they’re doing is they go to restaurants across America and they’re basically like,
0:25:46 Hey, you, you need software.
0:25:47 You hate your current software.
0:25:48 You’re using 15 different tools.
0:25:50 Use the owner system instead.
0:25:50 Okay.
0:25:53 It’s actually like not that new of a pitch.
0:25:57 You know, there’s other companies that have claimed to be like, oh, we’re an all-in-one
0:26:00 or we have the best point of sale checkout system.
0:26:05 And these guys have just got it like really, really right because they’re growing incredibly
0:26:05 fast.
0:26:11 And so what they do is they go to a company and they're like, hey, look, today, if I Google...
0:26:12 there's a great case study on their website.
0:26:15 Like if your case study is good, this is when I decided to invest.
0:26:17 I was like doing like the diligence on it.
0:26:21 And I was, I watched their case study and most case studies on business websites are awful.
0:26:21 God awful.
0:26:25 I watched it and I was so thoroughly convinced.
0:26:28 And I just thought to myself, if they do their case studies this well, imagine how they’re
0:26:30 doing like the other important parts of their business.
0:26:34 Cause this is the sort of thing that's kind of a throwaway
0:26:36 for most businesses, and they're pretty poor at executing it.
0:26:42 So it was this dude who was a pizza shop owner somewhere, maybe Pennsylvania or somewhere
0:26:42 like that.
0:26:46 And he’s basically showing, he’s like, look, if you Googled my pizza shop’s name, if you
0:26:53 Googled like whatever, like town slice pizza, Pennsylvania, the first result is slice.
0:26:55 The second result is DoorDash.
0:26:59 The third result is all these other companies stealing my traffic. People are searching for
0:26:59 me,
0:27:03 not for DoorDash, and they make these websites that rank in SEO.
0:27:05 Me? I was on page two.
0:27:08 And he’s like, basically I started working with owner and owner.
0:27:12 First of all, now I’m the first result because I’m the first result.
0:27:15 When people are searching for my business, those orders come through me directly.
0:27:20 I don’t have to pay DoorDash, the 15% fee, uh, my, on my website and my online ordering
0:27:21 works really well.
0:27:22 Cause that’s what owner does.
0:27:23 They provide that like out of the box.
0:27:25 I don’t have to know anything about tech to be able to do that.
0:27:29 And then, you know, I get customers' emails and phone numbers, and I'm
0:27:33 able to text them, and we have promotions and sales and deals and things like that.
0:27:36 And basically I’m making an extra like 10 grand a month.
0:27:37 And that’s huge for me.
0:27:40 Like, that’s like a, that’s like the difference between being on the brink of failure or having
0:27:41 like a margin of safety.
0:27:44 It’s the difference of like hiring an extra person or not.
0:27:50 And I just saw that same business proposition, which is like,
0:27:52 look, it's 2025.
0:27:54 You need to have a website.
0:27:58 You need to be able to take online orders yourself and you need to rank for your own name at the
0:27:58 top of Google.
0:28:03 And, um, look, you don’t want to have to deal with, you know, 15 tools to be able to do that.
0:28:06 And we should do it for you out of the box and do it really well.
0:28:08 And this business is scaling very, very fast right now.
0:28:10 Very impressive growth.
0:28:16 And, uh, this guy seems like one of those founders that’s kind of like high octane.
0:28:21 Um, I don’t know him super well yet, uh, but just seems very, very high octane.
0:28:22 I’m pretty sure.
0:28:27 The intro I got to him was someone said, this is the best company I’ve ever invested in.
0:28:28 And this is the best founder I’ve ever invested in.
0:28:33 And I was like, are you just saying words or like, do you mean these words?
0:28:34 And he’s like, I mean these words.
0:28:36 So I could be wrong, but I mean them.
0:28:38 I was like, wow, that’s a hell of an endorsement.
0:28:42 The way that I invested in him was way less fancy than yours.
0:28:46 I just, Jason Lemkin was like, he’s the best.
0:28:47 And I just said, okay.
0:28:53 And I remember talking to him. This was, I think... no,
0:28:59 maybe this was four years ago. He was 21 years old and he was telling me a story.
0:29:04 And once I heard like 21 years old and Jason Lemkin saying he’s the best, I was like, well,
0:29:05 okay, cool.
0:29:06 I think, I think I’m in.
0:29:09 And the valuation I believe was really, really expensive.
0:29:13 It was like a, a nine figure something valuation.
0:29:14 Yeah.
0:29:17 And, uh, I was like, this has got to be huge to like really be worth it.
0:29:20 And I think he’s going to actually make it a massive, massive business.
0:29:21 Have you ever even talked to him?
0:29:23 Only through email.
0:29:24 We traded like five emails in one night.
0:29:27 Cause I was like, tell me the answer to these five questions.
0:29:28 And then he did.
0:29:31 He is, uh, the Terminator.
0:29:34 When I had a conversation with him, I was like, oh, you’re going to destroy everything in your
0:29:34 path.
0:29:41 Like I, I could sense that, but he gave me that vibe where I was like, I don’t want to be
0:29:41 your enemy.
0:29:43 You’re on a high protein diet, huh?
0:29:50 He’s, uh, on his videos for work and stuff like his like YouTube videos or whenever they
0:29:53 got to do like interviews, he’s like a, comes off like a really sweet, nice guy.
0:29:58 When I talked to him one-on-one, uh, I was, I, he’s incredibly intense.
0:30:03 He will annihilate people, in an ethical, good way.
0:30:07 But like, he’s the type of guy where, uh, I do not want to compete against this guy.
0:30:10 It’s so funny how you get a vibe off people and, um, very quickly.
0:30:15 So like, I remember Joe Rogan once describing somebody who he
0:30:19 just thought was, in his mind, a total loser,
0:30:20 just like a spineless person.
0:30:24 But the way he was describing them, he was like, they just had
0:30:25 no energy.
0:30:28 You don’t meet somebody and like, are your veins empty?
0:30:31 Like, where’s the, is there any blood in your veins?
0:30:35 And I just remember when he said that immediately in my mind, I could think of three people who are
0:30:37 like that, just like very low energy people in my life.
0:30:42 And then you can meet these other people that literally like, they walk at a different pace.
0:30:44 They have like a different amount of energy.
0:30:48 I remember like when we were, um, hanging out in North Carolina and Mr. Beast took us to Walmart
0:30:49 to like show us the things.
0:30:53 And I remember like, I was like, why is this dude walking so fast?
0:30:56 Like this guy’s literally like, has like a little, like an extra heartbeat or something
0:30:57 in his cadence.
0:30:59 He’s just like walking faster than everybody else.
0:31:01 And literally had more energy than anybody else.
0:31:03 And he was busier than everybody else.
0:31:09 And I couldn’t tell like, is he so busy because he’s got so much energy or did he have to
0:31:11 raise his level of energy?
0:31:15 Does this guy have just more ATP in his body because his schedule demanded it?
0:31:19 And I’m not, I still don’t know like cause and effect of that, but it was very obvious
0:31:23 to me as I’ve met more and more people that literally having more energy is a common trait
0:31:25 of like the most successful people.
0:31:26 And I don’t know if it’s cause or effect.
0:31:32 So, I've got, you know, a small team, maybe 15 people.
0:31:37 And a lot of them are these like young 25 year olds and they’re animals, like they’re rabid
0:31:40 animals and they like do crazy animal stuff.
0:31:44 And every once in a while I've got to correct them, and I have to remind them: stay crazy.
0:31:46 I just need to direct your crazy a little bit.
0:31:50 And they were asking what I meant and I was trying to think of like, well, how can I give
0:31:51 you a good analogy?
0:31:54 And I was like, have you guys ever seen curling?
0:31:58 I was like, you know, when they take that big rock that's capable of just
0:32:01 smashing through everything if it wanted to, and they push it.
0:32:05 And then there are all those people in the front with these brooms that are just sweeping
0:32:10 the area to clear the path for that big
0:32:13 rock, that brute-force, blunt object.
0:32:17 They're just guiding it into the right lane and clearing the path.
0:32:23 I was like, I'm the broom and you're the rock, and we're going
0:32:28 to push you down this lane, and I just need to be in front of you constantly clearing
0:32:28 the space.
0:32:33 And if I'm ever not clearing the space for you, or I need to reprimand you, it's
0:32:38 just me kind of saying, hey, I need you to go in this direction, but you need
0:32:40 to continue being this brute-force rock.
0:32:41 Yeah.
0:32:42 That’s just going to like smash through stuff.
0:32:46 And it’s just our job to like change, uh, directions every once in a while, but I need
0:32:48 you to like stay what you are.
0:32:53 And that's when I know I've hired the right people: when, A, I feel like that,
0:32:56 and B, sometimes I feel like I'm intimidated by them.
0:33:02 Like, have you ever hired someone and you're like, I want to keep you happy, because
0:33:06 if you go work for someone else, it's going to be bad news?
0:33:08 Or you almost get intimidated.
0:33:11 Have you ever had someone who you hired, who you’re intimidated by?
0:33:14 I mean, I don’t know if intimidate is the right word, but I think I know what you mean.
0:33:18 Like Furkan was like this. Immediately I was like, oh, whoa.
0:33:19 Okay.
0:33:24 So there's no "yeah, but." It's like, yeah, he's super smart, but he doesn't
0:33:25 know anything about business.
0:33:25 Nope.
0:33:26 Actually he does.
0:33:26 Yeah.
0:33:29 But he doesn’t, you know, he doesn’t, he doesn’t work that hard.
0:33:29 Nope.
0:33:31 Actually he works way harder than everybody else.
0:33:33 So it’s like, wait, wait, wait, you’re, there’s no butts.
0:33:37 It’s just super smart and works super hard and is like well-rounded and knows, knows enough
0:33:39 about the other stuff to get it right.
0:33:40 It’s like, Holy shit.
0:33:40 Okay.
0:33:44 You know, and I would say the biggest thing is their self-assumption.
0:33:48 So how do they carry themselves and how do they think about themselves?
0:33:52 You hire a lot of people that want to fit into your company or they want to defer to you
0:33:55 or they defer to your judgment and every once in a while you hire somebody that doesn’t want
0:33:56 to do any of those things.
0:33:58 They come in, they see broken stuff.
0:33:58 They want to fix it.
0:34:00 They don’t think what you were doing was right.
0:34:02 They just, if it’s, if it’s good, they think it’s cool.
0:34:04 If it’s broken, they think it’s broken.
0:34:04 Like that’s it.
0:34:08 They don’t think that anybody else is more qualified in the companies to do it.
0:34:10 Like they think they could do it themselves.
0:34:13 They don’t think that there will be an employee forever.
0:34:15 Like they’re like, cool, I’m here right now.
0:34:15 There’s a partnership.
0:34:18 And like, you know, eventually I'm going to be doing my own thing, or I'll be,
0:34:21 you know, in the leadership of this company, I'll have more equity in this
0:34:22 company than I have today.
0:34:26 Like there’s some people who have a confidence about that, about themselves because they have
0:34:29 a certain self-assumption and it’s a self-fulfilling prophecy.
0:34:30 You know what I mean?
0:34:35 Furkan was a guy who worked with you at Monkey Inferno, the incubator.
0:34:39 And he previously helped start AppLovin, which is a hundred billion dollar company.
0:34:47 But between him starting it and AppLovin becoming a hundred billion dollar company,
0:34:48 he worked for you.
0:34:52 And it wasn't like a clear runaway hit for a minute, but then it was.
0:34:53 And his…
0:34:53 What, AppLovin?
0:34:54 Right.
0:34:55 It wasn’t like a clear hit.
0:34:57 No, AppLovin was already a runaway hit when he left.
0:34:58 Oh, got it.
0:34:58 Got it.
0:34:59 So he knew it was a winner.
0:35:01 So then what the hell was he doing working with you?
0:35:03 Or maybe it had not paid out yet?
0:35:08 Believe it or not, he doesn’t care.
0:35:11 I don’t know if I believe…
0:35:13 I think I’m a not in that category.
0:35:16 Because I was there when the money hit.
0:35:17 Oh, got it.
0:35:18 Okay.
0:35:23 He was still working on our like beta release of our app that had 400 users and he was up
0:35:26 till 3 a.m. that night and he couldn’t have cared less.
0:35:31 And nothing changed between going from, you know, whatever, a normal person
0:35:35 to being worth, you know, nine figures in an instant.
0:35:36 Nothing changed.
0:35:37 Nothing.
0:35:38 Well, I remember…
0:35:41 And I remember even telling him, I said, look, man, I was like prepping him like a psychologist.
0:35:44 I was like, look, man, it’d be crazy if nothing changed.
0:35:45 I understand.
0:35:46 We got to figure out like how we’re going to…
0:35:48 Are you just going to want to retire?
0:35:49 Are you going to like…
0:35:50 Are you going to lose that edge?
0:35:52 Is it going to be temporary?
0:35:54 You want to take some time, just go on vacation and enjoy it?
0:35:58 Like, I was like, it’s hard to walk into an opium den and not get high.
0:35:58 I was like, it’s hard.
0:36:01 It would be, I think, kind of crazy to assume you're going to get
0:36:08 massively, massively rich, generationally wealthy, like in the next few months, and that nothing changes.
0:36:11 And then he was like, cool analogy.
0:36:12 Can you go out?
0:36:14 Can you get out of my way now so I can just do what I was doing?
0:36:15 And I was like, all right.
0:36:16 And then nothing changed.
0:36:16 It was amazing.
0:36:20 I think I like walked into the office one time, like after…
0:36:21 What happened?
0:36:24 I think I, at this point, I knew that he was like wildly successful.
0:36:31 And I saw him, like, I don't remember exactly how it was, but I had this feeling like he had his hat turned
0:36:32 around backwards on his head.
0:36:34 And he had like a screwdriver in his teeth.
0:36:38 And he was behind the TV, installing a Raspberry...
0:36:39 What was that thing called?
0:36:40 A Raspberry Pi.
0:36:40 Yeah.
0:36:43 He was like installing this like computer chip to like the TV.
0:36:45 And I was like, Furkan, what are you doing?
0:36:50 And he was like explaining to me, like how it’d be cool because this Raspberry Pi thing is like a computer.
0:36:52 So he’s turning the TV into the computer.
0:36:57 And the things that he wanted to do, like, there was a handful of amazing things.
0:37:01 And then it really kind of boiled down to, like, isn't this cool?
0:37:04 And I was like, yes, it is cool.
0:37:06 And I remember I was trying to get him to justify it.
0:37:07 Like, but why are you doing this?
0:37:09 And then he just kept going, but it’s cool.
0:37:10 It’s cool.
0:37:11 Don’t you think it’s cool?
0:37:11 And I was like, right.
0:37:12 Yeah, you’re right.
0:37:16 That’s actually the best reason why you should be looking like a mechanic and like doing this.
0:37:18 And it was like, we were doing one of our sessions.
0:37:20 It was at like 8 p.m. at night.
0:37:21 That sounded different than how I meant for it to sound.
0:37:25 But we were like talking about business at like 8 p.m. at night at your office.
0:37:30 And he was there, installing this Pi into your TV.
0:37:31 Yeah, yeah.
0:37:33 And the funny thing is when Furkan joined the company, this is a good lesson, I would say,
0:37:35 because it could have gone either way.
0:37:37 So Furkan builds AppLovin.
0:37:42 I think AppLovin at the time when he left was maybe like $100 million a year business,
0:37:44 but it was clear it was scaling fast.
0:37:48 And he leaves and he leaves because he’s like, cool, the rest of the job is managing people
0:37:50 if I want to stay CTO.
0:37:53 And like, I liked the beginning when I was building stuff.
0:37:54 So I’m going to just build stuff.
0:37:59 So he leaves and he decides to go build, he decides to learn mobile development.
0:38:01 So he’s like, oh, I think mobile is going to be big now.
0:38:04 And so he’s like, I want to actually learn Android development.
0:38:07 And he starts building games for fun in his bedroom alone.
0:38:09 And he does that for a little bit.
0:38:13 I find him there and I’m like, yo, I think you’re super interesting.
0:38:13 You should come join us.
0:38:15 Don’t be in your room alone.
0:38:16 No fun building that way.
0:38:16 Come build with us.
0:38:19 So he ends up joining us, and he joins us as, like, the head of Android.
0:38:24 And within the first few weeks, it was like extremely obvious that this guy
0:38:24 does not fit in.
0:38:26 First of all, he’s smarter than everybody.
0:38:28 Second of all, he works harder than everybody.
0:38:29 He was there like everybody would leave.
0:38:32 Our company culture was everybody would leave around 5 p.m.
0:38:33 Most people had kids.
0:38:33 They would go home.
0:38:37 He would come in at 11 and he would leave something like 11 p.m.
0:38:40 And then he would still be on Slack at 2 a.m.
0:38:42 And then he would come in again the next day at 11, 1130.
0:38:45 And he would just do that every single day.
0:38:48 And when everybody else would quote me a timeline like, all right, cool.
0:38:52 I’ll show you the prototype at the same meeting next week.
0:38:55 He would show me the prototype the next morning.
0:38:59 And so I was like, okay, this guy, he’s going to break our culture one way or the other.
0:39:09 And you could tell the other engineers, they liked him, but were also a little bit like, this guy doesn't come in until lunch.
0:39:13 He's, you know, pushing updates at 2 a.m.
0:39:18 And, like, we weren't working then, so we weren't involved with it.
0:39:20 And, you know, what’s his deal?
0:39:22 This guy dropped out of college.
0:39:23 He’s not like classically trained.
0:39:24 So, like, what are we going to do with this guy?
0:39:27 And in my head, I was like, all right, one of two things is going to happen.
0:39:29 Either it's going to be like organ rejection.
0:39:32 He's going to have to leave because he just doesn't fit.
0:39:38 Or it needs to be the opposite, the thing where the organ takes over the body, the guest takes over the host.
0:39:39 I was like, okay.
0:39:41 So I went out with him one night.
0:39:48 And I remember we were at a bar, and I was just like, look, this is not official, but you're going to be running this company.
0:39:52 And I need you to start building out the team the way you want it to be built.
0:39:54 And you should work the way you want to work.
0:39:58 Don’t try to fit in because we’re going to change this whole company with you kind of driving that engineering change.
0:40:01 I was like, this is how a startup is supposed to feel.
0:40:02 You’re doing it right.
0:40:08 And so pretty quickly, I just told him, you hire your own people, and they don't have to interview with everybody else.
0:40:11 You don't need everybody's blessing to sign off on a hire.
0:40:11 The others would still interview them,
0:40:14 but if he liked the person, he could hire them onto his team.
0:40:16 And eventually, you know, he became CTO.
0:40:17 He became my co-founder.
0:40:19 And, you know, he became, like, you know, leader of the company.
0:40:24 But I had to basically, like, Amazon has this phrase, bar raisers.
0:40:29 It’s like you hire somebody, and then they raise your bar of what good looks like.
0:40:30 He was a clear bar raiser.
0:40:34 And there was a part of us that, like, didn’t know how to deal with that.
0:40:38 And the right way to deal with it was to totally lean into it and be like, oh, that’s the new normal for us.
0:40:44 You’re the new bar setter of, like, what our engineering team should look like.
0:40:45 Did the other people quit?
0:40:47 Like, who won?
0:40:49 I mean, obviously, he is still there, or he was there.
0:40:51 Well, a couple people adapted.
0:40:52 So they were like, cool.
0:40:56 My lifestyle is not that I’m going to be up until 2 a.m., but I’m going to crush in my 9 to 5.
0:40:58 And I will work at that pace.
0:40:59 And, like, cool.
0:41:00 You’re going to work a crazy schedule.
0:41:02 I’m not going to work a crazy schedule.
0:41:03 But, like, I’m here for it.
0:41:04 I want to work at that pace.
0:41:10 I want to be, like, that effective, and I want to change the expectations of what speed looks like inside the company.
0:41:11 And so a couple people became that.
0:41:15 And then a couple people, we actually had a legacy business that was making a few million dollars a year profit.
0:41:18 And so we spun out the rest of the team onto that business.
0:41:23 I was like, you guys work on that company in that schedule and that pace.
0:41:25 And this team is going to work in this pace.
0:41:28 And we basically split the company in half, like a divorce.
0:41:30 Dude, it was like Lord of the Flies.
0:41:32 Like a happy, married divorce.
0:41:33 It was like, you guys get custody of those assets.
0:41:35 We’re going to have custody of these assets.
0:41:36 That’s pretty fascinating.
0:41:40 It’s like throwing people out on the different island and just saying, like, you better figure it out.
0:41:41 Survive.
0:41:44 And you losers are going to go to this dying thing.
0:41:45 It wasn’t dying.
0:41:47 I mean, it was, like, fine.
0:41:49 It was, honestly, it was what some people wanted.
0:41:53 Some people, like, not everybody wants to, like, grind like crazy.
0:41:56 And what this was a very good way to do was say, there are two paths.
0:42:00 You keep your same job, your salary, all that stuff.
0:42:02 One is a certain lifestyle.
0:42:02 One is another.
0:42:04 This lifestyle is easy.
0:42:04 This lifestyle is hard.
0:42:06 Self-select.
0:42:07 And the self-selection was very helpful.
0:42:16 New York City founders, if you’ve listened to My First Million before, you know I’ve got this company called Hampton.
0:42:19 And Hampton is a community for founders and CEOs.
0:42:24 A lot of the stories and ideas that I get for this podcast, I actually got it from people who I met in Hampton.
0:42:27 We have this big community of 1,000-plus people, and it’s amazing.
0:42:34 But the main part is this eight-person core group that becomes your board of advisors for your life and for your business, and it’s life-changing.
0:42:41 Now, to the folks in New York City, I’m building an in-real-life core group in New York City.
0:42:52 And so if you meet one of the following criteria, your business either does $3 million in revenue, or you’ve raised $3 million in funding, or you’ve started and sold a company for at least $10 million, then you are eligible to apply.
0:42:55 So, go to joinhampton.com and apply.
0:42:58 I’m going to be reviewing all of the applications myself.
0:43:02 So, put that you heard about this on MFM so I know to give you a little extra love.
0:43:03 Now, back to the show.
0:43:07 Dude, he’s pretty badass.
0:43:08 We should have him on again.
0:43:13 Now he’s got, like, his space in Fort Mason, I think it’s called.
0:43:14 He’s the man.
0:43:15 We should have him on again.
0:43:16 Yeah, of course.
0:43:17 Love talking to Furkan.
0:43:18 What do you think?
0:43:19 Is that it?
0:43:21 I don’t know exactly where I want to go with this, but I just want to share this with you.
0:43:30 So, I’ve been trying to help certain people in my life, like, either start businesses or upgrade their business.
0:43:32 Like millions of people who listen to you?
0:43:33 No, no, no.
0:43:37 Like, my, like, micro, like, people who I care about, you know?
0:43:38 Like, my trainer, for example.
0:43:43 My trainer, today he’s got a training business where his calendar is full.
0:43:48 He’s got more clients on his roster than he can handle, but, you know, he’s trading time for money still.
0:43:49 So, he’s not scalable.
0:43:51 He can only train so many people per day, right?
0:43:52 So, he’s doing five, six sessions a day.
0:43:55 He’s driving to people’s locations and he’s training them.
0:43:58 But, like, you can’t do 12 people per day, for example.
0:44:00 He couldn’t double his money if he wanted to.
0:44:04 He definitely couldn’t do his goal, which is, you know, make twice as much money with half the time invested.
0:44:07 Like, half the, you know, with double the time flexibility.
0:44:12 And so, he’s been starting, he started, like, a drink, an energy drink company.
0:44:16 He started, like, apparel companies, like, trying to do these side hustles.
0:44:20 But all of them, I’m like, dude, the beverage industry is, like, brutal, right?
0:44:22 It’s, like, a brutally competitive business to be in.
0:44:25 The apparel business is just a brutal business to be in.
0:44:27 Might be better just to, like, get another trainer.
0:44:30 Yeah, so, I’m like, hey, have you thought about getting another trainer?
0:44:34 Or, like, and in this case, I was like, what would be, like, an appealing version of this?
0:44:35 Like, here’s some ways you could scale.
0:44:37 So, I was like, you could start a studio.
0:44:40 And he’s like, oh, I would love to have my own space, my own studio.
0:44:44 And I’m like, okay, that’s a way that you could get to your goal if you started a studio.
0:44:50 And it’s been very interesting to see, kind of, like, how he would approach it versus how I would approach it.
0:45:01 And so, I basically told him, I was like, look, when I wanted to get in shape, instead of just being like, I guess I'll just wing it, I guess me, who's never done this, will just figure it out,
0:45:05 I was like, no, let me get a coach, somebody who’s already done this before, and I hired you as my trainer.
0:45:08 I was like, I think you should basically have me as your trainer, your business trainer.
0:45:10 And I was like, don’t pay me anything.
0:45:13 All you got to do is book your, what do you call your first session?
0:45:13 An assessment?
0:45:14 Book an assessment.
0:45:17 And he’s like, all right, tomorrow?
0:45:18 I’m like, great, yeah, let’s meet tomorrow.
0:45:26 So, we started talking, and we started doing this thing where basically we would talk, and I would just give him... I was like, how do I keep this so simple?
0:45:31 Because in prior times when I talked to him, I remember, like, I was such a terrible coach.
0:45:41 I was like a trainer who would come in and try to train all your body parts in one session, and show you the beginner thing, but then couldn't resist showing you the advanced thing, and then you're going to do that, you're going to get hurt, you're going to pull a back muscle.
0:45:44 So, I’ve been trying to be a better coach, and so I was like, all right.
0:45:45 How do I keep this super simple?
0:45:49 And so, I leave him every time with one blue sticky note, with one thing.
0:45:50 It’s, all right, this is the one action.
0:45:53 Do this between now and the next session, and we’re good.
0:45:58 And it’s been very interesting to see how much progress we can make just doing this very simple method.
0:45:59 I’ll just share with you kind of, like, how this works.
0:46:05 So, the same way that at the dance studio, while I was there watching my daughter,
0:46:09 I was picking up information and learning a little bit about the business.
0:46:14 I basically realized, like, that's something I've been doing for about 15 years now.
0:46:23 And I think most people could just start doing that one thing: start paying attention to the businesses around you and do a little bit of napkin math, right?
0:46:27 Try to figure out how many customers a place has times the price every customer pays.
0:46:28 That gives you a good approximation of top line.
0:46:34 You could just Google or ask AI, what's a good profit margin for a fitness studio?
0:46:37 Typically, is it 10%, 20%, 40%, 50%?
0:46:40 Like, what are the net profit margins for these things?
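That napkin math can be written out in a couple of lines. The studio numbers here (member count, average price, net margin) are made-up placeholders, not figures from the episode:

```python
# Napkin-math sketch: customers x average price approximates top line,
# then an assumed net margin gives a rough bottom line.

def napkin_math(customers: int, avg_price: float, net_margin: float):
    revenue = customers * avg_price   # rough top line
    profit = revenue * net_margin     # rough net profit
    return revenue, profit

# e.g. a hypothetical fitness studio: 200 members paying $150/month, 20% margin
revenue, profit = napkin_math(200, 150, 0.20)
print(revenue, profit)  # 30000 6000.0
```

Two observed inputs and one Googled margin get you a defensible monthly profit estimate for almost any storefront business.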
0:46:49 And so, what I realized is that most people don’t, as I’ve been helping two or three people in my life do this, most people don’t approach business this way.
0:46:53 And I think if they did, they would have a lot higher chance of success.
0:46:59 Like, I was reluctant to hire a fitness coach.
0:47:07 But then it made total sense, because I remember, well, I was the best at whatever sport I wanted in high school, and then in college, where I played for a little while.
0:47:11 When you are basically, like, a professional athlete, you have someone just telling you what to do every single day.
0:47:12 You just do what they say.
0:47:15 And I remember being so reluctant to hire a fitness coach.
0:47:16 And then I did.
0:47:19 And I started seeing, like, my body change in, like, two months.
0:47:21 And then I was like, yeah, that works.
0:47:23 And then I was like, well, maybe should I get a nutritionist?
0:47:25 And I remember being, again, so reluctant to do it.
0:47:28 And then I got, I use my body tutor.
0:47:32 And I was just, like, doing exactly what they told me to do.
0:47:34 So I had accountability, but I also had education.
0:47:34 They would teach me.
0:47:37 And it just kind of hit me.
0:47:42 I’m like, why have I always been so reluctant to pay someone money to just tell me what to do?
0:47:50 And once I kind of let go of that, I think I realized, and I’ve learned this in business as well, there’s a lot of creativity that you need to have.
0:47:58 But in general, there is, like, a process that you can follow where, in a lot of cases, you will get to be fairly successful.
0:48:02 Like, you know, you still have to invent stuff and you still have to, like, stick with it for years.
0:48:10 But in general, just like with changing your body, it’s just like if you do these five things, you will get 80% to where you want to go.
0:48:14 And you don’t need to think, you just need to execute these five things.
0:48:20 And I think what people don’t understand, I think you and I understand it a bit, even though emotionally sometimes we forget it.
0:48:30 But a lot of the listeners understand this, which is business is the exact same, where there’s a series of steps where you can sort of iterate your way there, just like you can with your body, with your nutrition and things like that.
0:48:32 Yeah, totally.
0:48:37 The way I think about it is you’re going to have some rate of learning, some learning curve, right?
0:48:41 So it might take you six months, might take you a year, might take you two years.
0:48:42 You could definitely get there on your own.
0:48:46 A coach is pretty much just a guaranteed way to speed up that learning curve.
0:48:49 And that’s like the first benefit you get.
0:48:56 And then the second benefit that you get is you’re much less likely to quit during plateaus because the coach has some accountability.
0:48:58 A coach has also seen those plateaus many times before.
0:49:01 And a coach can get you through the plateau faster than you’re going to get through it yourself.
0:49:07 And so those two reasons, I think I have probably five active coaches right now.
0:49:09 It’s kind of insane.
0:49:11 Roughly, what category are they in?
0:49:12 So you have a fitness.
0:49:14 I think you also use my body tutor.
0:49:14 So you have nutrition.
0:49:18 So I have exercise and then I have food.
0:49:26 Food coach, which is probably the one that felt the weirdest to do and now is, in retrospect, the most obvious no-brainer of all of them.
0:49:28 It’s almost like a therapist, too.
0:49:28 Food’s a weird thing.
0:49:31 It’s more of a therapist than it is anything else.
0:49:31 Yeah.
0:49:31 Yeah.
0:49:36 It’s like, I think when people think food coach, oh, so they’re giving you a meal plan and macros.
0:49:36 It’s like, no, no, no.
0:49:42 She’s helping me figure out why I don’t stick to any food plan or macros that I’ve ever set for myself in the last 10 years.
0:49:47 And slowly uprooting those and, like, being in my corner along the way.
0:49:48 I started learning the piano this year.
0:49:54 And so I got a piano teacher. I actually ended up getting two different ones to try to do that.
0:50:01 Because one of the other realizations I had is that there’s a massive difference between an average coach and a great coach.
0:50:08 So, like, in the same way that in tech there’s this phrase about, like, 10X engineers, there’s for sure a 10X coach or a 100X coach.
0:50:14 So you’re pitting your piano coaches, like, you know, Miss Linda and, like, these two old ladies next to each other.
0:50:17 Like, you know, Miss Linda said we should do it this way.
0:50:18 What do you think about that?
0:50:20 I don’t even say anything.
0:50:21 I just show up and I’m better.
0:50:23 And they’re like, wow, you’ve been putting in a lot of work.
0:50:26 And I’m like, well, I had a couple great sessions, you know?
0:50:30 Okay, so you got two piano coaches.
0:50:31 That’s pretty wild.
0:50:32 Yes.
0:50:39 Business coach, executive coach, like a, yeah, like, I don’t know, what do you call it?
0:50:39 Executive coach, I think.
0:50:40 Yeah, executive coach.
0:50:43 I think those are all I have right now.
0:50:46 I had a PT briefly for my knee rehab.
0:50:54 But, yeah, basically anything I do now, my first step is to start the same day I have the idea.
0:50:55 That’s, like, my rule.
0:50:57 Oh, you want to do X?
0:50:57 Great.
0:51:00 Like, same day you need to do something in that area.
0:51:02 You need to go have your first session in some way.
0:51:03 Drop everything and do it.
0:51:05 So I have this sort of, like, drop everything and do it rule.
0:51:10 And then the next thing that I’ll do is I’ll try to find a coach because I know a coach is going to speed me up in the process.
0:51:12 And it’s like, obviously, these things cost money.
0:51:14 So you can’t, like, always get coaches for everything.
0:51:16 But you kind of can.
0:51:20 Like, there’s a guy in our basketball league, this guy, Alex, and he’s just nasty on the court.
0:51:21 He’s so good.
0:51:23 And I’m like, wow, Alex, what did you do?
0:51:25 And he’s smaller than me.
0:51:30 He’s quicker, but it’s not his athleticism that makes him so good.
0:51:32 This guy’s just better at basketball.
0:51:35 And growing up, I thought I was training to be good at basketball.
0:51:36 That was, like, a goal of mine.
0:51:37 It’s just I never had any coaches.
0:51:39 I never took it seriously.
0:51:41 I didn’t know how to train properly.
0:51:43 And he told me this story.
0:51:44 I was like, Alex, what were you doing differently?
0:51:48 And he basically was like, when I was young, he’s like, I didn’t have any money for a coach.
0:51:50 But I was at the gym training myself,
0:51:52 and I saw this trainer training this other kid.
0:51:55 So I went up to the trainer and I was like, hey, how much for a session?
0:51:56 He’s like, oh, it’s like $75.
0:52:00 And he was like, oh, my God, no way my parents are going to pay $75 for a session.
0:52:11 And so he asked a great question, which was, is there anything I could help you with that you’d be willing to give me a session for, if I helped you with that thing?
0:52:13 Like, you know, for example, do you have another session coming in?
0:52:16 Like, could I be, could I run around and just be a rebounder?
0:52:17 Shag balls for you.
0:52:17 Could I clean up?
0:52:19 Could I show up early?
0:52:24 Could I, you know, help you with your text messages to all the people you’re scheduling?
0:52:25 Like, what can I do?
0:52:28 And the guy was like, all right, like, that’s endearing.
0:52:29 Look, fine.
0:52:32 And so the trainer basically lets him help during sessions.
0:52:36 And then that way he was actually like learning while teaching somebody else.
0:52:38 And then he’d have his own session at the end.
0:52:40 And he’s like, cool, give me 30 minutes, give me 40 minutes at the end.
0:52:42 And he just did that.
0:52:45 And he got so good as a young kid, just doing that.
0:52:50 And then eventually built his own business training others while he was getting trained.
0:52:51 You know what I mean?
0:52:56 And ended up actually turning it into a revenue generator versus just a cost for himself.
0:52:56 Dude, okay.
0:52:59 So I am so bought into everything you’re saying.
0:52:59 I do this as well.
0:53:00 I’ve got all types of coaches.
0:53:03 To add to it.
0:53:09 The second thing that I do after getting a coach is I put a date where I’m like, I must perform on this date.
0:53:12 And one of the ways that I got that idea.
0:53:16 So like, for example, if it’s like a fitness thing, it’s like, I want to achieve this body fat by this time.
0:53:20 Or I want to run this race on this date.
0:53:22 Or, you know, I want to be able to do X, Y, and Z.
0:53:23 Lift this amount of weight.
0:53:24 It’s a goal, but it’s not just a goal.
0:53:25 It’s a performance.
0:53:26 Is that what I’m hearing?
0:53:27 I tried to make it a performance.
0:53:29 So for example, if it’s just like a 5K.
0:53:32 Just I want to run a 5K in 21 minutes.
0:53:34 That’s not particularly fast, but it was hard for me.
0:53:36 And it was just a really nice thing to work back from.
0:53:37 So there’s an end date.
0:53:39 So I find it quite motivating.
0:53:43 Or I want to bench this amount of weight on this date.
0:53:47 And do you remember that TV show on MTV called Made?
0:53:51 Where they would teach people how to learn something in approximately 30 days.
0:53:52 And so it was like…
0:53:53 I love that show.
0:53:54 I love that show.
0:53:58 And so what they would do is they would take this young woman who was like,
0:54:01 I want to do a backflip on a BMX bike.
0:54:05 Or I want to be able to win a skateboard competition.
0:54:07 These kind of crazy ideas are like, I want to be…
0:54:08 I don’t remember.
0:54:10 The backflip one always stuck in my brain.
0:54:12 And they hired a BMX coach.
0:54:14 And this little girl, her whole shtick was like,
0:54:16 she’s like a prissy, cool, popular girl.
0:54:19 There’s no way she wants to do this nitty-gritty BMX thing
0:54:21 with the kids from the other side of the railroad tracks.
0:54:23 And she ends up doing it.
0:54:25 And in a competition, she did a backflip on a bike.
0:54:27 And I remember that show.
0:54:31 We should do an MFM version of Made.
0:54:32 Where everyone just…
0:54:33 It doesn’t matter what the challenge…
0:54:34 It doesn’t matter what the thing is.
0:54:35 You just got to pick a thing.
0:54:40 It could be like, I want to go try to meet a girl,
0:54:42 but use their language in a foreign country.
0:54:44 Or I want to go ask directions in Spanish.
0:54:46 Or whatever.
0:54:48 I want to go enter a chess competition.
0:54:50 Just something where it’s like,
0:54:52 you have a very short amount of time.
0:54:53 And you have to hire help.
0:54:57 And you have to jump off the cliff a little bit
0:54:58 to master your skill.
0:54:59 Or just even learn your skill a bit.
0:55:02 So for you, it would be a piano recital.
0:55:03 Or I want to have friends over.
0:55:05 And I want to play a song for them.
0:55:05 Right.
0:55:06 Yeah, I think that’s great.
0:55:07 I love that idea.
0:55:09 It’s kind of like we did the My First Muscle Challenge last year.
0:55:12 I think it’s like a cousin of that.
0:55:13 I’m totally on board for this.
0:55:19 I did a thing once that was similar in Australia.
0:55:21 There were three of us.
0:55:26 And we basically each wrote down a thing that we would love to have done,
0:55:28 but are scared as shit to do.
0:55:34 So for example, one person’s was to perform a stand-up comedy set.
0:55:37 Like just go on in an open mic and do five minutes.
0:55:43 One of the guys had been in a long-term relationship with his high school girlfriend.
0:55:45 They had just broken up, like, five years later or something.
0:55:49 And he really, he was like, I’ve never asked anyone out.
0:55:53 And he’s like, I just want to, he’s like, I want to be out somewhere,
0:55:54 see someone who I think is cute.
0:55:55 He’s like, I want to approach her.
0:55:56 I want to ask her out.
0:56:00 And he’s like, I just want to overcome that one thing.
0:56:02 And like, he’s like, I know that sounds stupid.
0:56:03 We’re like, no, it doesn’t sound stupid.
0:56:04 Everybody’s got these things, right?
0:56:12 Another person’s was, I forget exactly how they phrased it,
0:56:17 but it was something like, you know, at parties when the dance circle forms,
0:56:21 they want to go in, do a thing, and then get out.
0:56:22 And I was like, what?
0:56:29 And so we took a hip hop dance class together with our friend who’s a girl who she’s a great dancer.
0:56:33 She’s like a professional dancer and me, my buddy Trevor and her went to a hip hop dance
0:56:35 class, just prepping for the circle.
0:56:40 Like the whole time, by the way, when you go to that class, it’s two different classes.
0:56:44 If you just show up to a class just for whatever, or you show up thinking,
0:56:45 this is me going in the circle at some point.
0:56:46 Yeah.
0:56:47 It’s the best, right?
0:56:49 You’re like, I’m going to Julia Stiles as some bitch.
0:56:52 And like, we got kind of addicted to it.
0:56:54 We would start to make up new ones to do every few days.
0:56:58 So it’d be like, I’m going to go for a walk right now,
0:57:01 but I’m going to have, like, three conversations on my walk.
0:57:04 I’m not just going to smile and nod.
0:57:07 I’m going to give a smooth compliment to, like, three people along the way.
0:57:10 If I notice something I like, I’m going to say it and it’s going to go well.
0:57:12 I’m going to have that interaction.
0:57:14 Or like, I’m not going to answer the question.
0:57:15 How are you today?
0:57:15 With the word good.
0:57:16 Exactly.
0:57:17 Exactly.
0:57:19 Yeah.
0:57:23 My friend Noah Kagan used to have this thing where he was like every single day, he’s like,
0:57:26 when I was trying to get my business going, I would ask for a discount on
0:57:27 every single thing that I bought.
0:57:31 He’s like, I just need to get over like the nerves and just be not afraid of confrontation
0:57:32 and asking for things.
0:57:33 Yeah.
0:57:34 That’s amazing.
0:57:34 Yeah.
0:57:37 We have a whiteboard in our living room called, it was a corny name, Fear Nation.
0:57:40 We just wrote everything we’d be afraid of.
0:57:41 And then you, you try to cross them out.
0:57:45 You pick one each day and you try to cross it out or pick one every couple of days and try
0:57:45 to cross it out.
0:57:48 We’re going to do MFM made instead of MTV made.
0:57:49 We’re going to have MFM made.
0:57:51 What’s yours going to be?
0:57:54 That’s an interesting question.
0:57:55 I would need to think about it.
0:57:59 I, um, I don’t think it would be a fitness related thing.
0:58:02 Cause that’s too easy, but I would have to pick like an emotional thing.
0:58:04 Like the equivalent of asking a girl out.
0:58:05 I think it’d have to be dancing.
0:58:07 Dude, that would be the worst.
0:58:12 Maybe that would be, yeah, I would rather, uh, like punch myself in the stomach 20 times
0:58:13 than go and dance in a circle.
0:58:17 You’re like, nevermind.
0:58:18 Edit this out already.
0:58:19 We’re not, we’re not doing this episode.
0:58:20 Uh, what do you think?
0:58:20 Is that it?
0:58:22 That’s it.
0:58:28 Amish, dying pets, performances. Very eclectic episode.
0:58:31 Kids playing, uh, the buffet you never knew you wanted.
0:58:34 All right.
0:58:34 That’s it.
0:58:35 That’s the pod.
0:58:38 I feel like I can rule the world.
0:58:40 I know I could be what I want to.
0:58:43 I put my all in it like no days off.
0:58:44 On the road, let’s travel.
0:58:45 Never looking back.
0:58:51 Hey everyone.
0:58:51 A quick break.
0:58:55 My favorite podcast guest on my first million is Dharmesh.
0:58:56 Dharmesh founded HubSpot.
0:58:57 He’s a billionaire.
0:58:59 He’s one of my favorite entrepreneurs on earth.
0:59:03 And on one of our podcasts recently, he said the most valuable skill
0:59:08 that anyone could have when it comes to making money in business is copywriting.
0:59:12 And when I say copywriting, what I mean is writing words that get people to take action.
0:59:15 And I agree, by the way, I learned how to be a copywriter in my 20s.
0:59:17 It completely changed my life.
0:59:19 I ended up starting and selling a company for tens of millions of dollars.
0:59:23 And copywriting was the skill that made all of that happen.
0:59:27 And the way that I learned how to copywrite is by using a technique called copywork,
0:59:30 which is basically taking the best sales letters.
0:59:32 And I would write it word for word.
0:59:35 And I would make notes as to why each phrase was impactful and effective.
0:59:38 And a lot of people have been asking me about copywork.
0:59:39 So I decided to make a whole program for it.
0:59:40 It’s called Copy That.
0:59:42 CopyThat.com.
0:59:43 It’s only like 120 bucks.
0:59:47 And it’s a simple, fast, easy way to improve your copywriting.
0:59:49 And so if you’re interested, you need to check it out.
0:59:50 It’s called Copy That.
0:59:53 You can check it out at copythat.com.
Episode 706: Sam Parr ( https://x.com/theSamParr ) and Shaan Puri ( https://x.com/ShaanVP ) talk about offline businesses that are crushing it.
—
Show Notes:
(0:00) Be a $1M dollar business spotter
(4:56) Swim lessons franchise
(6:34) AWS for the Amish
(17:25) At home pet euthanasia service
(25:37) Take out order call center
(42:35) Cheat code: Coaches
(52:13) Add a performance
—
Links:
• Steal Sam’s guide to turn ChatGPT into your Executive Coach: https://clickhubspot.com/ogh
• Goldfish Swim School – https://goldfishswimschool.com/
• IbyFax – http://ibyfax.com/
• Lap of Love – https://www.lapoflove.com/
• Tarro – https://www.tarro.com/
• Owner – https://www.owner.com/
—
Check Out Shaan’s Stuff:
• Shaan’s weekly email – https://www.shaanpuri.com
• Visit https://www.somewhere.com/mfm to hire worldwide talent like Shaan and get $500 off for being an MFM listener. Hire developers, assistants, marketing pros, sales teams and more for 80% less than US equivalents.
—
Check Out Sam’s Stuff:
• Hampton – https://www.joinhampton.com/
• Ideation Bootcamp – https://www.ideationbootcamp.co/
• Copy That – https://copythat.com
• Hampton Wealth Survey – https://joinhampton.com/wealth
• Sam’s List – http://samslist.co/
My First Million is a HubSpot Original Podcast // Brought to you by HubSpot Media // Production by Arie Desormeaux // Editing by Ezra Bakker Trupiano