No Small Boy Stuff, Investing Wisdom from Nassim Taleb, plus ChatGPT Prompts We’re Using

AI transcript
0:00:01 – What’s up, Sam? – Hey.
0:00:03 – You like the, uh, you like the fit?
0:00:04 – Where did you get that?
0:00:06 – Our boys at Jambie sent it over,
0:00:07 the No Small Boy stuff.
0:00:08 Christmas edition.
0:00:10 (laughing)
0:00:12 ♪ I feel like I can rule the world ♪
0:00:15 ♪ I know I could be what I want to ♪
0:00:17 ♪ I put my all in it like no days off ♪
0:00:18 ♪ On the road let’s try ♪
0:00:19 – You know what’s pretty funny?
0:00:21 I actually used the phrase No Small Boy stuff,
0:00:22 like kind of a lot.
0:00:24 – Yeah, I remember the guy who tweeted it.
0:00:27 I think his name was Bengali87.
0:00:29 And this was back in 2022, he said,
0:00:32 “Best business slash entrepreneurship podcast out there.
0:00:35 “Big money that is, No Small Boy stuff.”
0:00:36 (laughing)
0:00:37 – I loved that.
0:00:38 – And that’s basically the phrase
0:00:40 that we use for this podcast a lot, No Small Boy stuff.
0:00:43 But frankly, I kind of use it a lot in my life.
0:00:46 Like, I don’t know, man, that’s Small Boy kind of stuff.
0:00:48 Like it’s sort of like in Succession where they say,
0:00:50 “You’re not a very serious person.”
0:00:52 It’s kind of like that.
0:00:56 – I also use the phrase, but I’d never say it.
0:00:58 ‘Cause saying it to me feels so cringe,
0:01:01 but I think it like a thousand times
0:01:02 for every one time that I say it.
0:01:05 And every one time I say it, it feels so awkward to me.
0:01:09 It’s like saying “Just Do It,” the Nike slogan,
0:01:10 or something.
0:01:12 You don’t really want to say that.
0:01:14 Hey guys, just in the fourth quarter,
0:01:16 we just, Nike baby, just do it.
0:01:17 And then they’ll be like, what?
0:01:19 Why are you saying slogans at us?
0:01:20 But I do think it a lot.
0:01:21 It’s actually like–
0:01:22 – I think it a lot.
0:01:23 – Using this phrase has meaningfully affected
0:01:25 the trajectory of my life.
0:01:26 ‘Cause there’s so many situations
0:01:28 where there’s like a little Small Boy response,
0:01:31 ooh, I’ll behave like a little Small Boy in this situation.
0:01:32 Or–
0:01:35 – Yeah, that phrase and what Amjad said recently
0:01:37 about what will make the better story,
0:01:39 that has had a fairly meaningful change.
0:01:41 Just in, you know, it’s only been a few weeks,
0:01:43 but like I think about that actually a lot.
0:01:47 – He also said something else where he was talking about,
0:01:49 he was basically so comfortable with this like
0:01:52 10-year-plus odyssey that he’s been on building this.
0:01:54 And we’re like, wow, you’ve been doing this for so long
0:01:56 and wow, you did this for years
0:01:58 before you had really any recognition
0:02:00 or any funding and you just kept going.
0:02:04 And he was just like, yeah, I persist.
0:02:05 And he was just like, yeah, he’s like,
0:02:06 I think that’s what I do.
0:02:08 He’s like, I didn’t really think about it that consciously,
0:02:11 but like, I’m pretty comfortable pushing the boulder
0:02:13 for a long time up the mountain.
0:02:14 And he goes, I realized like,
0:02:16 I guess that’s my like competitive advantage.
0:02:18 Like I’m in it for the long haul
0:02:20 and I’ll just persist.
0:02:23 It’s like, oh, and we were both like, a small, like–
0:02:24 (laughing)
0:02:25 – Yeah.
0:02:26 – A quick intake of breath.
0:02:27 (laughing)
0:02:29 What do you want to start with today?
0:02:31 – All right, I got a good story for you.
0:02:35 So there’s this great Nassim Taleb quote or tweet
0:02:38 where he, Taleb, who wrote The Black Swan and Antifragile,
0:02:40 he’s kind of this like contrarian thinker.
0:02:44 – Wasn’t he like a successful hedge fund investor,
0:02:44 but he was successful
0:02:46 because he had an interesting life philosophy?
0:02:49 And then he became like a thinker.
0:02:50 Is that his story?
0:02:51 – I believe so.
0:02:51 I believe so.
0:02:53 I believe he’s like a successful trader
0:02:56 and part of his success, unless I’m mixing him up
0:02:57 with somebody else, part of his success
0:03:00 was that he noticed that humans are,
0:03:05 we would rather win frequently in small amounts
0:03:07 and then lose a bunch when we’re wrong.
0:03:09 It’s like gambling, it’s like playing craps, right?
0:03:11 You know, one roll of dice, you made a little money,
0:03:12 two rolls of dice, you made a little bit of money,
0:03:14 but eventually you roll a seven
0:03:15 and it wipes out the entire board.
0:03:17 All of the chips go away.
0:03:19 But he’s like, humans are more comfortable with that
0:03:23 versus he was willing to bleed a little every day
0:03:26 and look stupid every day for years.
0:03:29 But then, sort of like The Big Short,
0:03:32 when his big contrarian bet pays off,
0:03:34 he makes back all the money in one day.
0:03:35 – Got it, okay.
0:03:36 – And I think that’s his story.
0:03:37 If it’s not his story,
0:03:40 his book is talking about the guy who does that.
0:03:42 So I can’t recall if he’s just the author
0:03:45 or both the author and the hero of the story.
0:03:51 Okay, so he, Taleb, tweeted this thing.
0:03:54 He goes, “I conjecture that if you gave an investor
0:03:57 “the next day’s news, 24 hours in advance,
0:04:00 “he would go bust in less than a year.”
0:04:03 And this is basically the back to the future premise, right?
0:04:04 So I don’t remember the movie Back to the Future,
0:04:07 but what’s his name, Biff or whatever.
0:04:10 – Biff, he finds like the sports betting book
0:04:12 that tells all the winners for the next decade of the game.
0:04:14 – Exactly, he goes back in time
0:04:15 and then he just becomes a gazillionaire
0:04:17 ’cause he knows the scores.
0:04:19 Okay, so like let’s take,
0:04:22 yes, if you knew the exact score,
0:04:24 you’d have to be pretty dumb to not win.
0:04:26 What these guys did and what Nassim Taleb was saying
0:04:29 is I could give you the news, so not the price change,
0:04:31 but I could give you the news
0:04:34 and I bet you would trade incorrectly.
0:04:36 – Dude, I think about this all the time, by the way,
0:04:39 all the time I think, if I knew what I know today,
0:04:43 but it was 10 or 15 years ago, how would I capitalize on that?
0:04:44 I think about that all the time.
0:04:47 – Right, and usually the easy answers for that are,
0:04:49 oh, I just buy Bitcoin, I just buy Google, right?
0:04:51 Like, yeah, it actually wouldn’t be that hard
0:04:55 if you could convince yourself, hey, do this one thing
0:04:56 and just shut up and trust me.
0:04:58 Like, don’t touch it for 15 years.
0:04:59 Now, what these guys did was a little bit
0:05:01 of a different experiment.
0:05:04 So what they did was they took 118,
0:05:06 as they call them, adults trained in finance
0:05:08 and they did the crystal ball test
0:05:10 and the crystal ball test was as follows.
0:05:12 They said, we’re gonna give you money,
0:05:13 so they gave them $50 each.
0:05:17 So they said, you have 50 bucks and you get to place trades
0:05:19 and you’re gonna trade, but before you make a trade,
0:05:22 we’re gonna show you the front page of the Wall Street Journal,
0:05:24 the actual front page of the Wall Street Journal,
0:05:28 from 15 random days in the last,
0:05:30 like, I think 20 years or something like that, okay?
0:05:32 So 15 random days, we’re gonna show you
0:05:33 the front page of the Wall Street Journal
0:05:35 and that’s a Wednesday edition
0:05:37 and you’re gonna place the trade
0:05:39 that would execute on the Tuesday,
0:05:40 so the day before that news.
0:05:42 So you had the news 24 hours in advance.
0:05:44 They blacked out the stock prices,
0:05:45 so they wouldn’t just show you,
0:05:47 oh, Johnson & Johnson’s up 20%, right?
0:05:49 But they would say, like there would be the headline
0:05:50 about Johnson and Johnson,
0:05:53 they would just redact the actual stock price change.
0:05:55 – So Johnson and Johnson beats earnings.
0:05:57 – Beats earnings, exactly.
0:06:01 Record unemployment, record jobs posting,
0:06:03 Fed indicates, blah, blah, blah, right?
0:06:04 Things like that.
0:06:07 And by the way, anybody can go play this game online.
0:06:08 There’s like a link to it, we’ll put it in the show notes,
0:06:11 but you can go actually do it yourself, I did it too.
0:06:13 There’s a couple other caveats to this,
0:06:16 when you go do it, which is, it’s not just a buy and hold,
0:06:16 ’cause what I was gonna do,
0:06:19 I went and did the thing, I was like, oh, cool.
0:06:21 This is news from 15 years ago.
0:06:23 I’ll just put all my money in, buy,
0:06:25 and I won’t trade, I won’t do anything else
0:06:26 for the next 15 years.
0:06:27 I know there was a bull market,
0:06:28 so I don’t need to be smart.
0:06:31 But the way that this test was designed was,
0:06:35 the trade executes, and you either go up or down that day.
0:06:36 So it’s kind of like the trade closes that day.
0:06:37 You know what I mean?
0:06:40 You get one day gain based on the one day news, okay?
0:06:45 So they do it, and the results are not good,
0:06:46 as you might expect,
0:06:48 otherwise I wouldn’t really be talking about this.
0:06:50 So the results are not good.
0:06:51 Half the players lost money,
0:06:54 even having been given the news.
0:06:56 One out of every six players lost everything,
0:06:57 and the way they lost everything
0:06:59 was they let you trade on leverage.
0:07:01 So you can trade up to like 20X leverage,
0:07:02 if you want to in this thing,
0:07:04 like an options trader could,
0:07:05 or you could just trade, or you could skip.
0:07:07 You don’t even have to trade any given day.
0:07:08 You could just say pass.
0:07:09 I don’t feel confident, I’ll sit this one out.
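The game mechanics just described, 15 one-day trades, a direction call, leverage up to 20x, and the option to pass, can be sketched as a quick Monte Carlo. This is my own rough parameterization, not the study's actual code: the function name and the ~1% daily-volatility figure are assumptions.

```python
import random

def crystal_ball_game(p_correct, leverage, skip_prob=0.0,
                      rounds=15, bankroll=50.0, vol=0.01, seed=0):
    """Sketch of the crystal-ball game: 15 one-day trades, you call
    the direction (right with probability p_correct), pick a leverage
    (the game allows up to 20x), and may pass on any day."""
    rng = random.Random(seed)
    for _ in range(rounds):
        if rng.random() < skip_prob:
            continue                      # pass: no conviction today
        move = abs(rng.gauss(0.0, vol))   # size of the day's move
        if rng.random() < p_correct:
            bankroll *= 1 + leverage * move            # called it right
        else:
            bankroll *= max(0.0, 1 - leverage * move)  # called it wrong
        if bankroll <= 0:
            break                         # busted
    return bankroll
```

Averaging many runs at 51% accuracy with 20x leverage versus 57% accuracy at modest leverage reproduces the pattern discussed in the results: the hit rate matters far less than the sizing.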
0:07:11 – An example headline would be,
0:07:12 like an example trade would be,
0:07:14 they would see something that says like,
0:07:16 Fed is going to cut rate today.
0:07:17 I guess you would assume
0:07:19 that the index is going to go up like a couple of percent.
0:07:22 So you would bet on the index or bet on what?
0:07:24 – It’s basically the S&P 500,
0:07:27 or it’s the 30-year Treasury.
0:07:29 So you bet on one of those two things.
0:07:31 You bet, you buy the S&P index for that day,
0:07:32 or you could-
0:07:34 – Short a bond.
0:07:35 – You go long or short,
0:07:37 or you can go long or short the bond, right?
0:07:39 So that’s, let me just show you an example.
0:07:41 So like, this is the Wall Street Journal
0:07:43 that they, the article that they show you.
0:07:45 So it’s Obama does something.
0:07:48 And then you see this business and finance section
0:07:49 and talks about Rupert Murdoch.
0:07:53 Chesapeake Energy says that their CEO is going to step down.
0:07:54 Auto sales are up.
0:07:57 Homeland security says blah, blah, blah, right?
0:07:59 So there’s all this news.
0:08:02 And so you then go here and you place a trade.
0:08:03 So you say, all right, I’m going to go
0:08:04 in this little game here.
0:08:07 You can see my screen where it gave me a million dollars.
0:08:08 So I’m going to trade.
0:08:12 And it says today’s movement, I bet a million dollars.
0:08:15 So I used my full stack with no leverage
0:08:17 and the day was up 0.62%.
0:08:20 So I got an extra $6,200, right?
0:08:22 Then it gives me the next one.
0:08:26 And it says, oh, there’s a deadly plane crash.
0:08:28 Iran is doing some shit.
0:08:29 Okay, cool, blah, blah, blah.
0:08:32 Kraft is in talks to acquire this Brazilian company.
0:08:35 And they’re just blacking out any of the stock price news,
0:08:36 right?
0:08:37 So you read this and you can decide what you want to do
0:08:38 and you do that over and over and over again.
0:08:41 So 15 days in history.
0:08:44 And of the 15 days, a third
0:08:48 were, like, Fed quarterly meeting days.
0:08:52 A third was jobs reports and a third was complete randomness.
0:08:54 And they didn’t, they’re like, we’re not trying to trick you.
0:08:56 There’s no, we’re not like cherry picking,
0:08:57 like misleading days.
0:08:59 These are just actually like random front pages.
0:09:00 Okay.
0:09:02 This is an awesome experiment.
0:09:05 And so, okay, so back to the results now.
0:09:08 So like I said, half the people lost money.
0:09:10 One out of six lost everything ’cause they got overleveraged.
0:09:14 The average person was only able to gain 3.2%.
0:09:16 So even being given the news.
0:09:19 During that era, the market was up on average,
0:09:21 I think 15% a year over the last 15 years.
0:09:22 I don’t know when this was done.
0:09:23 Correct, correct.
0:09:24 But again, these are like one day trades, right?
0:09:26 So, you know, you’re not just like buying and holding.
0:09:28 Oh, it was for 15 days.
0:09:30 The experiment was a 15 day experiment.
0:09:31 Exactly.
0:09:34 And so, okay, now why, right?
0:09:36 There’s two ways you can lose in investing.
0:09:39 One is you bet wrong, meaning you pick the wrong direction.
0:09:41 You think it’s going up, it actually goes down.
0:09:43 So basically, even given the news,
0:09:45 they were basically only able to bet the direction correctly,
0:09:46 51% of the time.
0:09:50 So it’s the same as if, you know, you just flipped a coin,
0:09:52 you would have been right the same amount of times
0:09:54 as you were being given the actual front page
0:09:56 of the Wall Street Journal, okay?
0:10:00 So information doesn’t lead to actual insight,
0:10:01 especially news.
0:10:05 The second reason they did poorly?
0:10:07 They sized their bets very poorly.
0:10:10 So when you had an event, even when you were correct,
0:10:12 people didn’t size up their bets enough.
0:10:14 And when they’re incorrect, they sized up their bets too much
0:10:16 for the level of conviction that they had.
0:10:20 And, you know, this doesn’t go in line with what people think.
0:10:22 So they surveyed people separately,
0:10:26 and basically 70% of people thought
0:10:33 that even four-week-old stale news would be predictive.
0:10:36 And, you know, 70% of people thought that,
0:10:39 but in this case, it just showed that even, you know,
0:10:42 one day fresh news doesn’t even really help you.
0:10:44 Okay, so then they went and they did an extra experiment.
0:10:47 They go, okay, maybe those 118, you know,
0:10:49 financial trained adults, maybe they’re just not
0:10:50 the best of the best.
0:10:51 So they went and tried to find the best of the best.
0:10:53 The best of the best actually did better.
0:10:55 So they went and found five people that were, you know,
0:11:00 hedge fund guy, the head of trading at a top five bank,
0:11:01 seasoned macro traders.
0:11:04 So they’re used to trading on this type of news.
0:11:07 They’re considered the best in the world at this.
0:11:08 And they actually did better.
0:11:11 So what they did was all of them finished with gains.
0:11:13 So all five finished with gains.
0:11:16 On average, they were up 130%.
0:11:20 And they also didn’t bet on one out of every three news things.
0:11:22 One of the big ways that they were better was
0:11:24 they just didn’t bet all the time.
0:11:26 Whereas the casual players were too active.
0:11:28 Okay, what else did the pro do differently?
0:11:32 They were only right 6% more.
0:11:37 So I think if the test group was right 51% of the time,
0:11:40 if it’s going up or down,
0:11:42 these guys were only right 57% of the time.
0:11:45 It wasn’t like they interpreted it that much more correctly.
0:11:48 But when they were right, they sized their bets properly
0:11:50 and they never risked too much of their bankroll
0:11:51 to where they couldn’t recover.
0:11:54 And so isn’t that amazing that, you know,
0:11:57 being only six to ten percent better
0:11:59 at your predictive ability
0:12:01 would yield a much bigger result, right?
0:12:03 3% average gain for the test group,
0:12:07 130% average gain for the pros.
0:12:10 So it’s these small edges that can make a huge difference
0:12:12 when you apply leverage properly,
0:12:14 which was the bet sizing.
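That bet-sizing point is essentially the Kelly criterion, which isn't named in the conversation; here is a minimal sketch (my framing, using the 51% and 57% hit rates mentioned above) of why a slightly better hit rate justifies a much larger stake and compounds far faster.

```python
import math

def kelly_fraction(p):
    """Kelly-optimal stake for an even-money bet won with probability p."""
    return 2 * p - 1

def log_growth(p, f):
    """Expected log growth per bet when staking fraction f of bankroll
    on an even-money bet won with probability p."""
    return p * math.log(1 + f) + (1 - p) * math.log(1 - f)

# A 51% hit rate justifies only a 2% stake; 57% justifies 14%.
for p in (0.51, 0.57):
    f = kelly_fraction(p)
    print(f"p={p:.2f}  Kelly stake={f:.0%}  "
          f"growth/bet at Kelly={log_growth(p, f):+.4f}  "
          f"growth/bet at a 90% stake={log_growth(p, 0.90):+.4f}")
```

At 51% the Kelly-justified stake is just 2% of bankroll and the edge compounds glacially, while over-staking (say 90% per bet) has sharply negative expected log growth, i.e. eventual ruin; at 57% the justified stake is seven times larger and compounds dozens of times faster. That is the small-edge, big-result dynamic the pros exploited.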
0:12:19 And what’s the takeaway that Nassim said, which is what?
0:12:21 Well, he was saying it in a polarizing way.
0:12:23 He goes, “I conjecture that if you gave an investor
0:12:26 the next day’s news, he would go bust in less than a year.”
0:12:27 Got it.
0:12:29 And this kind of, this basically showed
0:12:31 that one out of six would go bust
0:12:32 ’cause they would get overzealous
0:12:35 around this perceived edge that doesn’t exist.
0:12:39 And on the whole, most people would just do worse
0:12:41 than if they didn’t have the news.
0:12:43 It’s no better than random, right?
0:12:46 That’s actually one of his books, Fooled by Randomness.
0:12:48 And at the end, they use this quote by Ray Dalio in there.
0:12:49 This is a great quote.
0:12:52 It goes, “He who lives by the crystal ball
0:12:55 will die eating shattered glass.”
0:12:56 Dude, that’s insane.
0:12:57 So good.
0:12:59 That’s like, it’s weird that multiple smart people
0:13:02 come to that conclusion that I never would have come to.
0:13:03 Like I would have thought,
0:13:05 like I guess everyone would have thought that
0:13:07 if you know the future or you know the news,
0:13:10 you absolutely are going to outperform.
0:13:11 Exactly, exactly.
0:13:14 The counterintuitive wise conclusion.
0:13:16 Let me tell you one other related one.
0:13:18 So there was one other story here
0:13:20 that was kind of interesting.
0:13:22 There was a real world version of this
0:13:25 where a hacking group got access.
0:13:28 They hacked the press release system.
0:13:32 So they had access to the next day’s press releases
0:13:33 that companies put out
0:13:34 when they have like major announcements,
0:13:36 earnings, results, et cetera.
0:13:38 They got access to all of the press releases
0:13:39 that were coming out the next day
0:13:42 and they were using it as like their own form of like,
0:13:45 you know, homebrewed insider information, right?
0:13:46 They were able to get insider information
0:13:48 and so they could place a bet in the market
0:13:51 overnight or the next morning, instantaneously.
0:13:54 That’s like a 12 hour leading indicator maybe
0:13:56 because like if you’re gonna fire your CEO,
0:13:59 you submit the press release maybe at five or 6 p.m.
0:14:03 on a Thursday and then 9 a.m. on Friday,
0:14:07 you announce it or the wire goes live.
0:14:08 Something I don’t know the exact timing.
0:14:10 I would imagine it’s a tighter window than that
0:14:11 ’cause there’s too much leakage
0:14:16 but even a 12 second advantage would be a huge advantage.
0:14:18 If you knew 12 seconds ahead of time,
0:14:19 what the news was about to be,
0:14:20 you could just push the button, right?
0:14:21 That’s all you gotta do.
0:14:23 That’s like an interesting, like, you know,
0:14:24 like HubSpot for example,
0:14:26 where I’m a shareholder,
0:14:28 whenever they have like an earnings release,
0:14:30 I know when it’s gonna go live that day.
0:14:32 So someone is like writing that
0:14:33 and they’ve submitted it to,
0:14:36 I forget what the PR, what the thing’s called,
0:14:37 it’s the popular– – Yeah, PR Newswire or whatever.
0:14:39 – Yeah, like whatever the popular thing is,
0:14:41 that is kind of a leak, actually,
0:14:43 I never even realized that.
0:14:44 – Well, that’s why there’s rules, right?
0:14:47 When I was at Amazon, you couldn’t trade the stock
0:14:49 in a, there’s a window, there’s like a frozen window.
0:14:51 So X days before the announcement,
0:14:53 you can’t make any trades.
0:14:55 – Oh, I know that but I’m saying the employees
0:14:57 of the PR– – Oh, right, right, right.
0:14:59 I think they probably have the same, right?
0:15:01 – Yeah, it’s just like, I didn’t even think about that
0:15:02 as a leakage. – I tell you,
0:15:04 when I accidentally did that trade
0:15:06 and then I had to go to the–
0:15:09 – What were they like, you’re an idiot?
0:15:12 – So I’m like, I learned about all this afterwards, right?
0:15:14 I’m a startup kid, I don’t know anything about this.
0:15:17 We get acquired, I, you know, I make a trade
0:15:18 and then I’m like, oh shit.
0:15:19 – What was the trade? – And I’m like,
0:15:21 so we’re at a subsidiary of Amazon, right?
0:15:23 – Like you bought Amazon before you sold to Amazon?
0:15:26 – I bought more Amazon stock or something.
0:15:27 Or I sold some Amazon, I don’t remember what it was.
0:15:32 And I was like, oh shit, did I just inside our trade?
0:15:35 Did I just get my hands dirty
0:15:37 with the little big boy business?
0:15:39 And so I’m like, oh shit, what do I do?
0:15:41 They’re like, you need to go speak to the general counsel.
0:15:42 And I was like, what?
0:15:44 So I get a meeting with the general counsel, urgent,
0:15:48 urgent, possible, possible big money move made.
0:15:51 And I send the email, they get me a meeting, stat,
0:15:54 I go in and he’s like, so what happened?
0:15:57 And I was like, I went and made a trade, you know,
0:15:59 I’m in the window and you know, I’m an executive.
0:16:01 So handcuff me, take me away boys.
0:16:03 – Yeah, light me up, I’m a bad boy.
0:16:06 – Yeah, I’m a bad, I’m a bad boy, take me away.
0:16:08 And he’s like, so how much did you trade?
0:16:09 – Yeah, I’m a bad boy.
0:16:12 – And I was like, I was like, yeah, it was like 150 grand.
0:16:13 And he was like, it’s okay.
0:16:15 You just, my lunch is outside,
0:16:17 can you bring it in before you leave?
0:16:19 (laughing)
0:16:21 He was like, this is for actual execs
0:16:24 at the actual company who make actual trades.
0:16:26 I was like, okay, gotcha, gotcha, gotcha.
0:16:27 Let me go sit down.
0:16:28 (laughing)
0:16:29 – That’s actually hilarious.
0:16:31 He just like totally dismissed you.
0:16:33 But what about-
0:16:34 – Let me finish the story about the hackers.
0:16:38 So let’s put you, let’s test your criminal mastermind,
0:16:40 which I love, I love doing this by the way.
0:16:41 How would I, how would I cheat if I was gonna cheat?
0:16:43 – By the way, you know that it’s always,
0:16:44 the women always think to themselves,
0:16:47 how would I get away from this bad person trying to hurt me?
0:16:49 And the men always think the opposite,
0:16:51 and they align with the criminal.
0:16:52 – How would I be the bad person?
0:16:53 – Yeah, like how would I get away with this crime?
0:16:55 That’s like, I realized that
0:16:56 after watching a lot of true crime.
0:16:58 And so go ahead, I like this experiment.
0:17:01 – So you’re the hackers, you get this,
0:17:02 but here’s the problem.
0:17:04 You, they’re like, Sam, we got it.
0:17:05 We hacked him.
0:17:08 You know, Dave over here in the corner did it.
0:17:10 He got into this, he got root access.
0:17:13 And they print out all of the press releases coming out,
0:17:14 but they put it on your desk.
0:17:15 They’re like, hey, we got like an hour.
0:17:16 We gotta make a trade.
0:17:19 – And now there’s 60,000 press releases on your desk.
0:17:20 What do you do?
0:17:23 – I guess pick like a random five and hold,
0:17:26 and act on those as soon as, yeah,
0:17:28 I mean, that’s a very challenging situation.
0:17:29 I guess-
0:17:29 – It’s a challenging situation.
0:17:31 – Just like pick the first five.
0:17:34 And if it’s good news, buy the stock.
0:17:36 If it’s bad news, somehow short,
0:17:37 but I don’t even know how to do that.
0:17:39 So I guess I would only find like the five good news ones.
0:17:40 – Right.
0:17:42 You’re like, I think I would still just end up
0:17:43 holding the index fund, Vanguard.
0:17:47 Just doing exactly what I always do,
0:17:48 80, 20 stocks and bonds, baby.
0:17:50 So what they ended up doing was they were like,
0:17:53 all right, you sort of need to do a search function
0:17:56 to figure out what news affects the price the most
0:17:59 in a positive or negative direction.
0:18:02 And I think what they figured out was that
0:18:04 it was merger announcements
0:18:07 that would be the highest kind of like volatility
0:18:09 for the company that was getting acquired,
0:18:11 ’cause it almost always gets acquired at like a 50% premium
0:18:12 to the way the stock was trading.
0:18:14 And so I think what they realized
0:18:17 was we need to be able to quickly discard 98%
0:18:20 of the news and information, ’cause it’s noise, right?
0:18:22 Which goes back to the same experiment, right?
0:18:25 The most of the news information is noise.
0:18:28 The secret is figuring out what is actually signal.
0:18:29 And most of us can’t do that.
0:18:31 And we overestimate our ability
0:18:33 to figure out signal versus noise.
0:18:35 And so they figured out the signal.
0:18:36 It was these merger things.
0:18:39 And even they were only right in their predictive ability
0:18:41 about 70 something percent of time.
0:18:43 It was enough to make hundreds of millions of dollars
0:18:45 very quickly before they got caught for doing this.
0:18:46 And then they all went to jail.
0:18:49 But isn’t that cool also?
0:18:50 (laughing)
0:18:52 I think that’s the ending.
0:18:54 When we sold to HubSpot, I think the share price,
0:18:57 I think it was $350.
0:18:59 And then like the week they announced it,
0:19:02 it went to like $460, just something like this.
0:19:03 Anyone can go back and look at it.
0:19:05 It was February of ’21.
0:19:06 And…
0:19:08 Damn, Sam, the needle mover over here.
0:19:11 Well, so that stock price went up.
0:19:14 I guess it’s a market cap of like one or two billion dollars.
0:19:17 And I remember going to Kipp, the CMO, I go,
0:19:18 you’re welcome.
0:19:21 He’s like, oh yeah, it was this acquisition
0:19:24 that got mentioned one time in our earnings call.
0:19:26 It just barely, it wasn’t the fact
0:19:30 that we had just announced that we grew by 45%
0:19:34 and have been compounding growth of like this, this, this.
0:19:36 And I was like…
0:19:38 You know, yeah, causation is difficult to prove, I agree.
0:19:41 Yeah, I’m like, you don’t understand, man.
0:19:42 To each his own.
0:19:44 (laughing)
0:19:46 Can I tell you?
0:19:50 All right, so we, two or three years ago,
0:19:52 we talked about AI Girlfriends.
0:19:56 I sort of understood it because I like have actually
0:19:58 developed like pretty good friendships,
0:20:00 mostly via text messages.
0:20:02 I think a lot of people who have group messages here
0:20:03 feel the same way.
0:20:05 I didn’t entirely understand it.
0:20:08 But in the last two or three months,
0:20:12 I’ve been using chat GPT in a way that now I’m like,
0:20:13 yeah, this would go away.
0:20:15 I would be very upset.
0:20:17 And I understand why people were very upset
0:20:19 when they’re AI Girlfriends.
0:20:21 Replica got, when they did like a software update.
0:20:25 Yeah, and so basically I’ve been using chat GPT
0:20:30 as like my thought partner slash assistant slash therapist.
0:20:35 And you actually said something recently
0:20:36 that made it a lot better.
0:20:40 So I sat down and I’ll explain how I’ve used it.
0:20:43 But I sat down and I said, hey,
0:20:45 can you ask me all the questions that a therapist
0:20:48 or a life coach or an executive coach would ask?
0:20:50 And we could spend a few hours,
0:20:51 but just me downloading,
0:20:53 giving you a download of my life.
0:20:54 And I did that.
0:20:56 And since then it’s been magical.
0:20:58 And I’ve been using it for all types of purposes.
0:20:59 I use it all day.
0:21:01 And I wanna maybe explain to you how I’m using it.
0:21:03 Maybe you could explain to me if you are doing the same,
0:21:05 which I think you are and how you’re using it.
0:21:06 Right.
0:21:08 By the way, I’ll just give you a quick one.
0:21:10 My prompt that I used yesterday for this,
0:21:12 I said, I was explaining the situation.
0:21:15 I go, ask me a few questions one at a time.
0:21:17 Then when you feel you have enough info,
0:21:19 then try to give me a suggestion.
0:21:22 Because otherwise it just tries to like, you know,
0:21:24 you know, like mansplain, or what is it called
0:21:27 when, like, your girlfriend is explaining something
0:21:29 and you’re trying to fix the problem right away.
0:21:31 She’s like, no, I’m not trying to get the fix right now.
0:21:33 I just want you to hear me and understand me.
0:21:34 You’re like, what?
0:21:36 I thought you just want the answer as fast as possible,
0:21:37 shove it into your throat.
0:21:39 And like that’s what ChatGPT does by default.
0:21:40 It’s, yeah.
0:21:42 And there’s a bunch of other downsides
0:21:45 that I wanna explain to all of this
0:21:46 and how I’m working around it.
0:21:48 But first, I’m using it for a variety of things.
0:21:51 So I’m using it for personal finance stuff.
0:21:52 And I’ll give an example for each in a second.
0:21:54 I’m using it for business questions.
0:21:57 I’m using it as like a sparring thought partner of like,
0:21:59 I’m thinking about doing this.
0:22:00 What’s your opinion?
0:22:01 I’m using it as a therapist of like, you know,
0:22:03 I’m struggling with this person at work
0:22:04 or my personal life.
0:22:05 How should I handle this?
0:22:06 Or what should my life goals be?
0:22:08 And then I’m also using it to
0:22:10 help me decide which tasks to do.
0:22:10 So I’ll give an example.
0:22:12 So for net worth, I use Kubera.
0:22:14 Kubera is like a net worth tracker.
0:22:16 You just log in with your bank accounts
0:22:17 and all your other accounts in it.
0:22:19 Tells you your net worth, whatever.
0:22:20 Well, they actually have a feature
0:22:22 where you can download the information
0:22:26 specifically for ChatGPT and you upload it.
0:22:28 And it doesn’t have any identifying information.
0:22:29 It’s not like it has passwords.
0:22:30 It just has a bunch of numbers.
0:22:32 And so you can, I will upload this to ChatGPT
0:22:33 and I’ll say things like, you know,
0:22:35 I like to be conservative.
0:22:39 Like, what would you rate this portfolio out of 10 of risk?
0:22:42 Or, you know, like what’s your opinion on it?
0:22:43 Like what would Warren Buffett say?
0:22:45 You can ask it all types of questions like that.
0:22:47 Or you can also say like, you know,
0:22:49 how much should I spend on a house?
0:22:52 Or what will my net worth be in 20 years?
0:22:54 Like things like that.
0:22:55 And it’s been actually really amazing.
0:22:58 Another thing that I did was I took the main KPIs
0:23:00 from my company and I uploaded it to it.
0:23:02 And I’ll be like, what are the needle-moving things
0:23:03 that I can do for this company?
0:23:05 And you could do your KPIs,
0:23:08 which is typically like an Excel spreadsheet,
0:23:10 like your company’s churn, new users, things like that.
0:23:12 You can also do your company financials.
0:23:14 And then another thing that I’ve been doing
0:23:18 is I will actually take screenshots of my calendar
0:23:20 and I’ll upload it and be like,
0:23:23 what task should I be doing for the next week,
0:23:25 the next months, the next quarter,
0:23:26 to get to the goals that I’ve told you about,
0:23:28 you know, my life goals, which by the way,
0:23:33 you helped me create quarterly and annual goals.
0:23:35 How should I be spending my time today, tomorrow,
0:23:39 next week, and it gives me an agenda
0:23:43 that I literally print out and I work according to that.
0:23:45 It’s like pretty wild and that’s how I’ve been using it.
0:23:47 And then all day, I’ll be like,
0:23:49 how should I reply to this email?
0:23:50 What’s your opinion?
0:23:52 It’s kind of crazy.
0:23:54 So that’s how I’ve been using it.
0:23:56 – It’s like you have Neuralink.
0:23:57 They just never did the surgery.
0:23:59 All right, like you’re basically putting AI
0:24:03 like as the operator in your brain in many ways,
0:24:05 but you’re just like, you know,
0:24:07 we just haven’t reached that tech point
0:24:08 where the chip is already implanted.
0:24:10 – Well, the next step of that is,
0:24:11 here’s what’s gonna happen.
0:24:13 There’s gonna be software, it probably exists,
0:24:14 I’m tinkering with a few of them,
0:24:18 that records your computer screen, your phone screen,
0:24:20 the words that you say out loud, the things you type,
0:24:23 and it’s gonna give you feedback on how you spent your day.
0:24:24 It’s gonna give you feedback on what to do,
0:24:25 things like that.
0:24:28 It’s gonna, like, you know how there’s a book,
0:24:28 I forget what the book is,
0:24:31 but the premise is Google knows more than you
0:24:33 because you are more honest in your Google searches
0:24:34 than you are when you talk to your spouse
0:24:36 or your friends or whatever.
0:24:38 The same thing happens when it’s like, yeah, you know,
0:24:41 I spend this much time working on this, this, and this.
0:24:42 And it’s gonna just be like, no,
0:24:43 you did not spend that much time doing it.
0:24:45 And also you told me that you were trying to be nicer.
0:24:47 You wrote like eight really mean emails.
0:24:48 Do you know what I mean?
0:24:50 Like that’s how it’s going to be in the next six months,
0:24:52 I think there’s gonna be products
0:24:53 that are actually nailing that.
0:24:56 Yeah, I think the CEO of Microsoft,
0:24:59 I don’t know if you heard this story,
0:25:02 but I guess when Ballmer stepped down
0:25:04 and they needed a new CEO,
0:25:09 and at the time Microsoft was kind of trending
0:25:10 downward, downward to flat.
0:25:13 It was an uninspired stock and company at the time.
0:25:14 So they needed something.
0:25:15 And I don’t know if you heard the story.
0:25:18 So the guy who became the CEO, Satya Nadella,
0:25:20 actually wrote a memo,
0:25:23 kind of like a manifesto,
0:25:27 an internal manifesto about what Microsoft needs to do.
0:25:29 And he ends up getting the job.
0:25:30 And at the time he was like,
0:25:34 I never thought I’d be the CEO of Microsoft.
0:25:36 Like, you know, you joined when Bill Gates was CEO
0:25:37 or whatever, and then Ballmer,
0:25:40 and you just assumed they’re always gonna bring in somebody,
0:25:42 but they actually promoted him from within.
0:25:46 And he wrote this thing
0:25:48 and one of the key principles that he wrote in this,
0:25:49 this is a while back when he wrote it.
0:25:52 – You know what year, like ’05 or ’10 or something?
0:25:54 – This was in 2014.
0:25:58 So he wrote, he bet on two things.
0:25:59 I don’t remember the second one,
0:26:03 but I remember the first one he called ambient intelligence.
0:26:05 And ambient intelligence is kind of what you’re describing,
0:26:06 which is basically like,
0:26:10 how do you have, you know,
0:26:12 computer intelligence, artificial intelligence,
0:26:15 but just like kind of ambient,
0:26:17 like kind of in your environment
0:26:20 so that it can be helpful to you.
0:26:21 So it just knows what you need
0:26:23 without you having to go fetch it,
0:26:25 without you having to go ask specifically,
0:26:27 it can either anticipate it,
0:26:29 it can be aware of all of your context,
0:26:31 so that you don’t have to like,
0:26:33 first explain the whole situation
0:26:34 and then be able to just ask your question.
0:26:36 It already knows your situation,
0:26:38 so you could just ask the question, that sort of thing.
0:26:42 And so, isn’t that cool that he saw it so long before?
0:26:45 You know, OpenAI wasn’t even incorporated
0:26:47 at that point or something like that.
0:26:48 This is a very long time ago.
0:26:53 So to bet on that as like one of the two like ways
0:26:57 that the tech puck is going, pretty baller.
0:26:58 – Which is shockingly hard, by the way.
0:26:59 It’s hard to make these predictions
0:27:01 and remove like the limiter part of your brain
0:27:03 and just imagine like, yeah,
0:27:05 but what would be amazing, you know,
0:27:07 like what would be cool if,
0:27:08 and that’s actually, that sounds easy.
0:27:11 It’s really hard because you constantly think like,
0:27:13 well, I can’t do that, you know,
0:27:14 like because that’s impossible
0:27:15 or that would cost too much money.
0:27:17 Like there’s all these limiters,
0:27:20 But the way that I’ve been using this,
0:27:22 it doesn’t work perfectly yet, by the way.
0:27:25 This is like, there’s a few issues with this.
0:27:27 And I am like super not technical.
0:27:30 The first thing is contextual or context windows.
0:27:33 Like the more you talk to it, it doesn’t always learn more.
0:27:36 You actually, it runs out of memory in a weird way.
0:27:38 And so I’ve been testing like a variety
0:27:41 of different platforms, Gemini versus ChatGPT,
0:27:42 but I want to use ChatGPT
0:27:43 because I think it’s going to be around the longest
0:27:45 and they’re going to innovate the fastest,
0:27:47 but it’s not perfect at all.
0:27:50 But it’s like shocking how useful this is.
0:27:53 For a long time, I was like, yeah, AI is great.
0:27:56 Like I can just like Google a stat and it’s going to tell me.
0:27:59 But now it’s more like, this is my life.
0:28:02 Like I am using this more than anything.
0:28:04 And so like they had their new $200 a month thing come out.
0:28:06 And I don’t even think I need the features,
0:28:07 but I’m like, whatever, I’ll take it.
0:28:10 And so I’ve been contemplating,
0:28:12 like, should I like invest a little bit of money
0:28:13 into like building up these systems
0:28:15 just for my personal operating system
0:28:16 and like making my life great.
0:28:17 And keep in mind,
0:28:19 I don’t know anything about any of this shit.
0:28:21 I just know that it’s just effective.
0:28:24 Like it just literally is helping me get my day done better.
0:28:27 And it’s like a great bit of advice.
0:28:30 Here’s another really practical way.
0:28:32 I mean, I’ll upload my measurements for my body
0:28:35 and I’ll be like, find me clothes that fit.
0:28:38 Or like, does this pair of pants fit?
0:28:39 And you just like post a link.
0:28:41 Like I just, I’ve been using it constantly.
0:28:47 – How are you using it
0:28:50 to be like this sparring thought partner?
0:28:51 – Yeah, yeah.
0:28:51 Well, I think this is the key.
0:28:55 So what we’re saying is basically the way that
0:28:57 I think by default people will use this
0:29:00 is you ask a question, it gives an answer.
0:29:03 And actually an equally, if not more powerful way
0:29:05 is to do the exact opposite.
0:29:09 You basically say, I’m trying to think about this.
0:29:11 Ask me questions.
0:29:13 And then you get it to ask you the questions.
0:29:15 And then in that way, it’s your sparring partner.
0:29:18 It is your thought partner in like kind of fleshing out
0:29:21 or getting your own clarity around a situation.
0:29:23 And it’s available 24/7, it doesn’t judge.
0:29:27 It’s super, super intelligent, but also has like empathy.
0:29:29 You can go back and forth instantly.
0:29:33 It’s always available and there’s no lag time, right?
0:29:34 It’s better than a friend, right?
0:29:36 – You know, you have a friend who you bitch to
0:29:38 and you’re like, I just need to vent and like just give me,
0:29:39 like what should I do here?
0:29:41 But you kind of feel guilty like laying everything on them
0:29:42 or making it all about you.
0:29:44 And like, they don’t quite understand
0:29:45 exactly what you’re talking about.
0:29:47 This is just that person, but better.
0:29:49 – It’s one of the main reasons why coaches
0:29:51 and therapists are great because you’re like, cool,
0:29:53 we’re gonna have a completely one way conversation here.
0:29:56 Like I don’t gotta give you nothing.
0:29:59 I can come here and be a taker and that’s the arrangement.
0:30:00 And like, you know, I gave you the money.
0:30:02 That’s what that was for.
0:30:03 And now from there on out,
0:30:06 I don’t need to consider your feelings in this interaction.
0:30:07 It sounds ruthless, but it’s true.
0:30:09 It’s why it’s different than just talking to a friend,
0:30:11 where with a friend, you gotta be like, sorry,
0:30:12 am I taking up too much of your time?
0:30:13 I don’t mean to put all of this on you.
0:30:15 But you know, you’re like, you’re always trying to like,
0:30:18 kind of half apologize and then reciprocate.
0:30:20 And one of the cool things about a therapist or a coach
0:30:22 is like, that’s not the social contract.
0:30:24 That’s not what’s expected in that situation.
0:30:25 AI is even better.
0:30:27 It’s like, hey, I can start bugging it at 1 a.m.
0:30:29 I just, I’d like to talk right now
0:30:32 and have like instant responses with complete intelligence.
0:30:34 And I’ll just keep saying, no, tell me, you know,
0:30:37 no try again until I get something that’s satisfactory to me.
0:30:38 It’s like, you couldn’t even treat a human like that, right?
0:30:40 So it’s pretty great to be able to do that.
0:30:41 – It’s become strange.
0:30:42 I call it dude sometimes.
0:30:43 I’m like, dude, what’s your problem?
0:30:44 That’s wrong.
0:30:48 Stop getting these wrong. Like, it’s strange.
0:30:52 Because if you think about it, when you’re texting your friends,
0:30:54 like it’s, because it’s like in the same window
0:30:56 or next to the same window on your computer,
0:30:58 like you kind of forget that this is a machine
0:30:59 and you can train it how to talk.
0:31:03 It’s very strange, but it’s actually quite effective.
0:31:05 – Do you know how an LLM works?
0:31:07 Do you know what like deep learning is?
0:31:08 – No.
0:31:12 – I went and watched some videos the other day,
0:31:15 ’cause I was like, how is this magic magic-ing?
0:31:17 What is going on here?
0:31:20 There’s one by this guy, I think he’s called 3Blue1Brown,
0:31:23 that’s his username or something like that.
0:31:24 It’s got millions of views.
0:31:27 And he explains, you know, what is deep learning?
0:31:29 Which is like the technique that makes AI work.
0:31:31 And the second thing was, you know,
0:31:32 how large language models work.
0:31:33 What does it even mean?
0:31:34 What is large?
0:31:34 What is the language model?
0:31:36 What does that even do?
0:31:37 But check this out.
0:31:40 So okay, like here’s the example that it gave.
0:31:42 Okay, so this is me not even trying to explain to you
0:31:45 what it is cause my explanation is going to be pretty bad.
0:31:48 This is me just saying, I can’t believe
0:31:50 that this is what actually is happening.
0:31:53 I cannot fathom that this is the actual scenario.
0:31:55 Okay, so let’s take this example.
0:31:57 I wrote this, I put this on a card cause like,
0:31:58 I can’t forget this.
0:31:59 I’ll never forget what I learned.
0:32:02 All right, so imagine this number seven, right?
0:32:05 So let’s say you’re trying to train AI
0:32:07 to be able to see that this is seven.
0:32:08 How do you do that?
0:32:09 You can hard code it, but well,
0:32:11 every time you see the number seven,
0:32:12 it’s like a capture, right?
0:32:14 It’s like written a little bit differently.
0:32:18 So it’s like, you can’t just say this is exactly a seven
0:32:20 cause you write your seven slightly different than me.
0:32:21 Maybe you put the little line through it.
0:32:23 Maybe you have a little angle to it, whatever, right?
0:32:25 So you just want it to be able to recognize
0:32:28 anybody’s handwriting and figure out seven or not seven.
0:32:30 Right, what number is it?
0:32:32 So how does it work?
0:32:36 So imagine basically a classroom.
0:32:39 Okay, so here’s a row of kids.
0:32:42 So there’s 10 kids standing there.
0:32:46 And each of the 10 kids is like holding one of these cards
0:32:48 with a different number on it, right?
0:32:49 They have the whole number,
0:32:55 but at first it just says, all right,
0:32:56 there’s a whole index card.
0:32:58 We gotta figure out, we don’t even know if this is a seven
0:33:00 or a dog or a car.
0:33:01 It could be anything, right?
0:33:03 So it just zooms in and it says,
0:33:05 let’s look at this little section right here.
0:33:07 Like these 20 pixels, okay?
0:33:11 These 20 pixels, you know, on this area, it’s white.
0:33:14 So if you got color there, sit down kids.
0:33:15 Anybody who’s got color over here, sit down
0:33:18 cause this picture is white over here.
0:33:20 Can’t be, can’t be you, you’re eliminated.
0:33:23 And then over here, it’s like, hey, there’s some blue ink.
0:33:24 Something is here.
0:33:26 So if you got blue ink in this little section,
0:33:29 stay standing if you don’t sit down, right?
0:33:31 So that like eliminates a bunch of, you know,
0:33:33 like kind of thought processes.
0:33:35 So then it passes it to the next layer,
0:33:36 the next layer of 10 kids.
0:33:40 And it says, all right, who here’s got this flat line?
0:33:44 Okay, so the seven stay standing, the five stay standing.
0:33:45 You know, the threes are kind of like,
0:33:48 hey, we got some stuff up here up top, the eights,
0:33:49 but you know, the four,
0:33:51 the number four doesn’t have a little roof on top.
0:33:53 So it’s like, I’m out, I’m out.
0:33:54 And you’re like, okay, go sit down.
0:33:55 It’s like paintball, right?
0:33:57 You’re out, go sit on the side.
0:33:58 And then, so now you’re left with like, you know,
0:34:00 some of the numbers.
0:34:01 And then it says, all right,
0:34:03 we got a little, little stick over here.
0:34:04 Who’s got a stick over there?
0:34:06 And it’s like the threes are like, oh, I’m out now.
0:34:07 That’s not me.
0:34:08 But the sevens and the fives are like,
0:34:09 hey, we’re still in, it might be us, right?
0:34:10 Bingo.
0:34:13 And so you just keep passing it from layer to layer,
0:34:16 showing it like kind of more pixels on the screen.
0:34:19 And it’s trying to get with some level of confidence
0:34:20 at the end, right?
0:34:22 It’s going to be seven and maybe five at the end.
0:34:25 And the seven’s like, yo, I’m 90% sure it’s me.
0:34:27 And the five is like, ah, it’s maybe 10% that it’s me.
0:34:29 It’s just an ugly five.
0:34:32 And then that’s how the AI knows that this is a seven
0:34:35 ’cause it passes it from layer to layer to layer to layer,
0:34:36 looking at the pixels on the screen
0:34:39 and basically trying to figure out, trying to guess,
0:34:40 is it, is it one of you?
0:34:44 I think with some probability, it’s this.
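The layered elimination game described above is, loosely, how a network's layers narrow a handwritten digit down to a confident guess. Here's a toy sketch in Python, with hand-written feature rules ("top_bar", "diagonal") and vote weights that are invented purely for illustration; a real network learns its weights from data instead of using hard-coded rules.

```python
# Toy sketch of the "kids holding number cards" analogy: each layer
# checks one feature of the image and eliminates candidate digits.
# Features and weights here are made up for illustration only.

def classify(features):
    """features: dict of strokes observed in the handwritten image."""
    candidates = {str(d): 1.0 for d in range(10)}  # all ten "kids" standing

    # Layer 1: a flat bar across the top keeps 5 and 7 standing.
    if features.get("top_bar"):
        for d in list(candidates):
            if d not in ("5", "7"):
                candidates.pop(d)  # "sit down"

    # Layer 2: a diagonal stroke points strongly to 7 rather than 5.
    if features.get("diagonal"):
        candidates["7"] = candidates.get("7", 0) * 9.0

    # Normalize to confidences, like the 90% / 10% vote at the end.
    total = sum(candidates.values())
    return {d: v / total for d, v in candidates.items()}

scores = classify({"top_bar": True, "diagonal": True})
best = max(scores, key=scores.get)  # "7" wins with 90% confidence
```

The point of the sketch is just the shape of the computation: each layer prunes or reweights candidates, and the final answer comes out as probabilities, not a hard yes/no.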
0:34:48 Okay, so that’s just recognizing a number, okay?
0:34:49 Now imagine what you’re doing.
0:34:52 You’re giving it KPIs of your company.
0:34:54 It has to understand what a KPI is,
0:34:56 what a company is, that you were looking for strategy,
0:34:58 what strategy sounds like, it’s got to say something
0:35:00 that you, as a successful business person
0:35:03 who sold your companies for tens of millions of dollars,
0:35:06 that you will respect the output of this.
0:35:10 Isn’t that mind blowing that that’s even a thing?
0:35:12 And so now you take, how does that work?
0:35:14 So now you take, instead of the seven,
0:35:18 take an example where it’s like the dog-
0:35:19 – Blanked.
0:35:21 – Right, so it’s like, what’s it going to come after?
0:35:24 It basically sees a sentence, the dog, or the dog-
0:35:28 – It’s like, what’s a dog and what do they commonly do?
0:35:28 – It doesn’t even know that.
0:35:31 It has no idea what a dog is, it has no meaning.
0:35:33 It just has, it read the whole internet.
0:35:34 So what they did was they were like,
0:35:35 hey, go read the whole internet.
0:35:38 Which like, if you or I, we were like, yo, Sam,
0:35:40 I gotta like, let’s do this, man, we could do this.
0:35:43 We’re gonna take so much Adderall, we’ll stay up all night,
0:35:45 and we’re gonna read 24/7, all the text on the internet.
0:35:47 It would be like thousands of years
0:35:50 before we could ever ingest what, you know,
0:35:52 what they gave it in one training run, right?
0:35:56 So they said, go read all the internet, cool, done, all right.
0:36:01 Now, user puts in a sentence, the dog blank.
0:36:03 Guess what, guess what the next token is.
0:36:05 Guess what the next little word is
0:36:08 that comes after the dog, the dog.
0:36:10 It’s like, the dog barked, the dog jumped,
0:36:13 the dog, you know, is hungry, right, whatever.
0:36:15 It could be like one of many things.
0:36:16 So then it takes the next word,
0:36:19 which might be like, the dog barked.
0:36:21 And then it passes that phrase back through.
0:36:24 It’s like, now you’ve got the phrase, the dog barked.
0:36:25 What comes after that?
0:36:27 And it just loops that over and over again
0:36:28 to generate the next word.
0:36:31 So that’s when you see ChatGPT writing,
0:36:34 it’s literally taking like the next token
0:36:36 it thinks it should say, then it feeds it back through
0:36:39 and then says, okay, well, if I said the dog barked,
0:36:41 then I gotta say loudly, right?
0:36:43 Okay, loudly, period.
0:36:45 If I said the dog barked loudly, what would I say next?
0:36:48 And it keeps recursively doing that.
0:36:50 And that’s what’s actually,
0:36:52 that’s how it generates the text, right?
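The loop described above, predict the next token, append it, feed the longer phrase back in, can be sketched in a few lines of Python. The tiny lookup table here stands in for a model trained on the whole internet; the phrases in it are invented for illustration.

```python
# Toy sketch of autoregressive generation: pick the likely next word,
# append it, and loop the new phrase back through the predictor.
# NEXT_WORD is a stand-in for a trained model's probabilities.

NEXT_WORD = {
    "the dog": "barked",
    "the dog barked": "loudly",
    "the dog barked loudly": ".",
}

def generate(prompt, max_steps=10):
    text = prompt
    for _ in range(max_steps):
        nxt = NEXT_WORD.get(text)
        if nxt is None:  # nothing more to predict, stop
            break
        # Append the predicted token, then feed the phrase back through.
        text = text + ("." if nxt == "." else " " + nxt)
    return text

sentence = generate("the dog")  # "the dog barked loudly."
```

A real model scores every token in its vocabulary and samples from that distribution at each step, but the recursive append-and-repredict loop is exactly this shape.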
0:36:53 And that’s like, you know,
0:36:57 this is only part of it half explained correctly.
0:36:59 But let’s assume for a second
0:37:02 that I’m not like completely misinterpreting this.
0:37:05 Let’s assume for a second that this is only, you know,
0:37:08 a percentage of what is actually going on, right?
0:37:09 There’s still parameters and weights
0:37:11 and all this other stuff that I haven’t even talked about yet.
0:37:13 This is like God, right?
0:37:17 This is like, what, like, how is this even a thing?
0:37:18 It’s so mind blowing to me.
0:37:19 It’s mind blowing.
0:37:21 It’s absolutely mind blowing.
0:37:25 And I think that, you know, young people,
0:37:26 I don’t hang around like 18 year olds,
0:37:28 but I think they’re using it for school.
0:37:29 So I think they get it.
0:37:32 I think I know a little bit about it
0:37:33 ’cause I hang out with smart people
0:37:36 and I’m on the outskirts of like what these guys are doing.
0:37:38 So I kind of see it online and I play with it.
0:37:40 For the average Joe, for my mom and dad,
0:37:43 for a 35 year old who isn’t like tech savvy
0:37:45 who just works as a mechanic,
0:37:47 I don’t think that they’re using it this way.
0:37:48 I don’t think they’re using it at all.
0:37:51 And it’s gonna change everything.
0:37:52 It’s just like so like crazy.
0:37:55 Like when the average Joe starts getting into this,
0:37:57 I think young people, like a 21 year old or something,
0:38:00 I think it’s like changing schools, by the way.
0:38:03 It’s like the grading system is like totally F’d up, right?
0:38:06 Yeah, like when I like think about this, I’m like,
0:38:08 like there is no homework.
0:38:10 You can’t do homework anymore.
0:38:11 You know what I mean?
0:38:13 Someone DM’d me yesterday, it’s not just homework.
0:38:16 Someone DM’d me last night, they were showing me
0:38:19 this guy Oliver, Oliver Hahn.
0:38:21 He texted me this thing or DM’d me this thing.
0:38:23 He said, “Coding interviews.”
0:38:26 So okay, yeah, kids in school
0:38:28 are using ChatGPT to write essays and the teachers are like,
0:38:30 fuck, how do we stop this?
0:38:31 It’s a cat and mouse game to try to be like,
0:38:33 hey, how do I stop you from using AI
0:38:35 to just like do your assignments?
0:38:37 Well, the same thing is true for coding interviews.
0:38:40 So coding interviews, which are used to hire programmers,
0:38:43 there’s a website, leetcodewizard.io.
0:38:45 And basically it just helps you cheat
0:38:46 on your coding interview.
0:38:48 It’s like, oh, you got a coding test to get a job?
0:38:50 Just use this, watch, it’ll write all,
0:38:51 it’s the same thing as a student,
0:38:53 it’ll write the essay for you basically.
0:38:56 And it’s like, you know, doing 15 grand a month
0:38:57 in recurring revenue,
0:38:58 just helping people cheat on coding interviews.
0:38:59 – This is insane.
0:39:01 – It’s so difficult, right?
0:39:02 But it’s kind of amazing.
0:39:05 – How are you using this every day?
0:39:06 – Like, let me just go to ChatGPT
0:39:07 and just tell you like my last few.
0:39:10 – Is ChatGPT your tool of choice
0:39:11 or do you like any of the other ones?
0:39:13 – Yeah, it is my like default.
0:39:15 And then, you know, I play with everything else.
0:39:18 So usually if I’m like, how factually correct
0:39:19 does this need to be?
0:39:21 I’ll use Perplexity.
0:39:22 So I go to Perplexity.
0:39:24 If it’s analysis, I’ll use ChatGPT.
0:39:26 Like have you used like the o1 stuff,
0:39:27 like the deeper thinking stuff?
0:39:29 – Only for 24 or 48 hours.
0:39:31 Yeah, it’s brand new, but yeah, it’s wild.
0:39:33 It takes a long time, but it’s wild.
0:39:34 – Well, yeah, that’s the point of it.
0:39:36 It’s basically, if you told the computer,
0:39:38 hey, you don’t have to just quickly like,
0:39:41 again, shove an answer down my throat instantaneously
0:39:42 where you’re just predicting the next token
0:39:44 and good enough to go, right?
0:39:46 There’s 70% chance it’s this word.
0:39:47 Let’s just put it in.
0:39:48 They found they could get,
0:39:50 you could do more interesting tasks
0:39:54 if you just said, hey, take your time before your answer.
0:39:56 Let’s just give it more time to think
0:39:57 and then it’ll come up with a better answer.
0:39:58 – It’s temperamental.
0:39:59 – Which is amazing.
0:40:02 So I use that, but like check this out.
0:40:06 So there was this press release recently for,
0:40:08 we were talking about IVF, remember?
0:40:09 – Yeah.
0:40:10 – Well, it’s kind of this amazing thing.
0:40:10 I don’t know if you saw it.
0:40:11 It’s called Fertilo.
0:40:13 Did you see what happened with this thing called Fertilo?
0:40:16 – So basically it was like the first live birth
0:40:19 using eggs that matured outside the body.
0:40:22 So like, if you’ve done IVF, it’s like a pretty expensive
0:40:23 and pretty like harsh thing on the body.
0:40:26 Like the woman has to get like injections,
0:40:28 which are hormone injections to try to get your,
0:40:32 they’re trying to get your eggs to essentially mature,
0:40:35 be produced and mature inside your body.
0:40:37 And so what Fertilo did was they were like,
0:40:39 cool, instead of doing that like long, expensive,
0:40:41 sort of hard on your body process,
0:40:45 we can take an immature egg, take it out of the body
0:40:47 and let’s do the hormone stuff out of the body
0:40:50 and get it to mature and then we’ll put it back in the body.
0:40:55 And so it just like removes the pain from the process.
0:41:00 And the first like actual live birth happened of a baby
0:41:02 that was born using that procedure.
0:41:03 It’s kind of amazing.
0:41:06 If true, it’s gonna make, you know,
0:41:08 it’s gonna change IVF, you know,
0:41:11 it’s gonna make it where, I don’t know if it’ll just be called
0:41:13 a new procedure or what, but basically for a fraction
0:41:15 of the cost, a fraction of the time and a fraction
0:41:18 of the pain, we can do the thing
0:41:19 that we’ve been doing with IVF.
0:41:20 Okay, so.
0:41:24 – Dude, it makes me realize that I think that Sahil,
0:41:26 I forget his last name from Gumroad tweeted this like thing
0:41:28 out where everyone made fun of him,
0:41:30 where he talked about how, like, giving birth
0:41:31 is not gonna happen in the future.
0:41:34 You’re just gonna be in this sack
0:41:36 and that’s how you’re gonna grow.
0:41:39 This is, I’m like, oh shit, you’re right.
0:41:40 You know what I mean?
0:41:42 I remember we were at a dinner and Jess Ma just said it
0:41:44 casually in passing.
0:41:46 She was like, yeah, like, you know, I’m really excited
0:41:50 for and fascinated by basically like artificial wombs,
0:41:52 and basically, you know,
0:41:54 women won’t give birth
0:41:55 at a certain point, right?
0:41:57 It’ll be like riding horses for transport.
0:41:59 It’s like, you could do it if you want to go
0:42:02 have a unique experience, it won’t be necessary.
0:42:04 – And she’s like, pass some mashed potatoes
0:42:05 and you’re like, wait, wait, wait, wait, wait.
0:42:07 – Yeah, so no, like literally that’s exactly what happened.
0:42:09 And I was like, and at the table,
0:42:10 I looked around to be like,
0:42:11 was anybody else mind blown by that?
0:42:12 Well, what’s going on?
0:42:13 Like, don’t we all want more information about that?
0:42:15 But I was at this like far diagonal,
0:42:18 seven people away, but I heard her say it
0:42:20 and I’m stuck over here talking about Facebook ads
0:42:21 with some dork and I’m like,
0:42:22 I gotta get out of this side of the table,
0:42:24 get to that side of the table.
0:42:25 So after the dinner.
0:42:27 – Jess, what did you say about wombs?
0:42:28 – I literally, I flagged her down and I was like,
0:42:29 oh, you’re getting an Uber?
0:42:30 Hey, cancel that real quick.
0:42:32 And she canceled it and I was like,
0:42:34 what was that thing you were talking about?
0:42:36 And then she explained and she explained the companies
0:42:38 that she’s tracking and like where we are
0:42:40 in the scientific life cycle of like,
0:42:41 how real is that possibility?
0:42:44 And how, what are the laws of physics?
0:42:46 Is that inevitable or is it impossible?
0:42:49 Right? Cause basically if something is not impossible,
0:42:53 it’s inevitable, which in itself was kind of a dope idea.
0:42:56 But like that already kind of blows my mind.
0:42:58 And so she was explaining it.
0:42:59 So, you know, I’ve sort of been paying attention
0:43:02 to any signs of movement in that area.
0:43:03 Cause I think that’s really cool.
0:43:05 The world’s going to change pretty dramatically
0:43:07 when that happens.
0:43:08 But what I did back to the AI thing,
0:43:11 I just threw the press release into chat GPD and I said,
0:43:13 explain this article to me, tell me what they’re saying.
0:43:14 Tell me what this means in simple terms.
0:43:16 It’s a press release.
0:43:17 And so it might be misleading
0:43:19 or overstating the success of this.
0:43:21 So tell me about that too.
0:43:22 And then it just goes,
0:43:24 here’s what it means in simpler terms.
0:43:26 This company has achieved what they call the world’s first
0:43:28 a healthy baby born with a woman’s egg
0:43:29 that was matured outside the body.
0:43:33 Normally in IVF, the doctors are doing ABC.
0:43:35 In this scenario, what they’re doing is ABC.
0:43:38 And then it explains it and he goes in simpler terms,
0:43:40 the conventional path is X.
0:43:42 The new approach is Y.
0:43:45 Why it matters if this is true, blah, blah, blah, blah.
0:43:47 And then it says, here’s why it might be misleading.
0:43:49 It’s a press release, so it’s definitely spin.
0:43:51 Number two, one success doesn’t prove a trend.
0:43:52 It talks about the world’s first,
0:43:54 but it doesn’t mention how many others they’ve tried
0:43:56 that have failed in the hit rate of this procedure.
0:43:57 It’s not peer reviewed.
0:43:59 It might be exaggerating the future impact.
0:44:02 We would need to know clinical trials, blah, blah, blah.
0:44:03 And then, you know, then I asked it more.
0:44:04 I was like, cool, what does the,
0:44:06 what does the scientific literature say about this?
0:44:09 So all of a sudden I’m getting like a quick biology lesson.
0:44:13 Another one, brainstorming name ideas for a project.
0:44:13 I’m like, hey, here’s a project.
0:44:15 – Yeah, it’s great for that.
0:44:16 – Ask me questions about the project
0:44:17 and then come up with names.
0:44:18 Then it comes up with dorky names.
0:44:20 I’m like, no, make the names not dorky and long
0:44:22 and don’t make it feel like it’s written by chat GPT.
0:44:24 Make it feel like it’s written by David Ogilvy.
0:44:26 And then it like comes up with different answers.
0:44:27 – A lot of financial analysis.
0:44:30 So analyzing stocks or just like, you know,
0:44:32 I see Cathie Wood on my screen a lot.
0:44:36 Like, is she actually like great at investing?
0:44:39 And then the AI’s like, no. I feel like a monkey.
0:44:42 I see Cathie Wood on my screen.
0:44:43 (laughing)
0:44:45 – Is she just hot or good at trading, right?
0:44:46 It’s like, you know, asking these questions.
0:44:48 And again, no judgment.
0:44:50 Just gives me the answers, which was, spoiler,
0:44:53 no, she underperforms the indexes
0:44:54 over like a 15 year period
0:44:57 and makes $100 million a year to underperform the index.
0:45:00 It’s like, wow, good on you, Cathie Wood.
0:45:03 – Thank you for, you know, for doing that.
0:45:05 Let’s see, just other ones.
0:45:07 Hey, I’m trying to do this in Excel,
0:45:07 but I don’t know how to do it.
0:45:10 Can you just tell me the function I need to write in?
0:45:12 ‘Cause like, you know, if you go Google this stuff,
0:45:14 you get like YouTube videos you have to watch?
0:45:15 – Yeah.
0:45:16 – So now I’m like, all right, forget the YouTube video.
0:45:18 Just give me the like the exact type thing
0:45:20 I need to go type in.
0:45:22 Or I’ll screenshot the Excel window
0:45:25 and I’ll just say, I’m trying to figure out in column C,
0:45:26 what are the ones, blah, blah, blah, blah.
0:45:28 And it gives me this like complicated, you know,
0:45:30 whatever COUNTIFS formula
0:45:33 that has multiple like selectors, whatever.
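For what it's worth, a COUNTIFS with multiple selectors just counts the rows that match every condition at once. Here's that logic in plain Python; the `status`/`region` columns and the Excel formula shown in the comment are made up for illustration.

```python
# COUNTIFS with multiple criteria counts rows matching every condition.
# Two example "columns" (invented data, one entry per row):
status = ["open", "closed", "open", "open"]
region = ["west", "west", "east", "west"]

# Roughly what Excel would express as:
#   =COUNTIFS(A:A, "open", B:B, "west")
count = sum(1 for s, r in zip(status, region) if s == "open" and r == "west")
```

The formula ChatGPT hands back is doing exactly this: one criteria-range/criteria pair per condition, all of which must hold for a row to be counted.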
0:45:40 Hey, can I tell you a Steve Jobs story real quick?
0:45:41 So Jobs once said that design
0:45:44 is not just how something looks, it’s how it works.
0:45:47 And a great example of that is my new partner, Mercury.
0:45:48 Mercury has made a banking product
0:45:49 that just works beautifully.
0:45:51 I use it for not just one,
0:45:53 but all six of my companies right now.
0:45:53 It is my default.
0:45:55 If I start a company, it’s a no brainer.
0:45:57 I go and I open up a Mercury account.
0:45:58 The design is great.
0:45:59 It’s got all the features that you need.
0:46:02 And you could just tell it was made by a founder like me,
0:46:03 not by, you know, a bank somewhere
0:46:05 that hired a consultant and an agency
0:46:07 to try to make some tool.
0:46:08 So if you want to be like me
0:46:10 and 200,000 other ambitious founders,
0:46:14 head over to mercury.com and open up an account in minutes.
0:46:14 And here’s the fine print.
0:46:16 Mercury is a financial technology company,
0:46:18 not a bank, banking services provided
0:46:19 by Choice Financial Group
0:46:21 and Evolve Bank and Trust, members FDIC.
0:46:23 All right, back to the episode.
0:46:29 – Oh, I play games with my kids.
0:46:33 So we take pictures of like my son got all these sharks.
0:46:34 And so we just took a picture
0:46:35 ’cause he asks me questions, right?
0:46:37 Like, “Daddy, what is this shark?”
0:46:38 And I’m like, “Dude, shit if I know,” right?
0:46:40 Like, you know, it’s kind of like something
0:46:41 I always dreaded as a parent.
0:46:42 It’s like, “Oh, cool.
0:46:44 “My kid’s gonna ask me questions that I…”
0:46:45 You know, “Where does rain come from?”
0:46:46 And I’m like, “It’s in the clouds.”
0:46:47 It’s like, “How’d it get in the clouds?”
0:46:49 I’m like, “I think it was in the ocean.”
0:46:52 And then it just like zipped up there
0:46:54 ’cause it was hot or something.
0:46:55 And then I go, “Oh my God, this is gonna be terrible.”
0:46:57 I’m gonna expose myself.
0:47:01 And so I just do ChatGPT voice mode.
0:47:03 And I’ll be like, I’ll send it a picture
0:47:04 and I’ll go voice mode.
0:47:05 I’ll be like, “Hey, tell me what these sharks are
0:47:06 “from left to right.”
0:47:08 And it reads it out to my kids.
0:47:09 And then my kid can ask a question.
0:47:11 He’ll be like, “Which one is the strongest shark?”
0:47:13 And it’ll be like, “Actually, the great white shark
0:47:15 “is the strongest shark with the most powerful bite.”
0:47:18 And he’ll be like, “No, but what if it was with a cheetah?”
0:47:19 And he’ll be like, “Well, the cheetah wouldn’t be
0:47:20 “in the ocean, but if it was in the ocean.”
0:47:22 And it’ll like interact with my kids
0:47:23 and we have like a fun time.
0:47:26 They’ll tell me all the time, can we play with AI?
0:47:27 – Dude, that’s so good.
0:47:29 I've got a bunch of friends whose children
0:47:31 are like three, four, five, talking age
0:47:33 and they are doing the exact same thing.
0:47:34 Let me- – We’ll do Trivia,
0:47:35 another hack for parents.
0:47:38 You can go, “Hey, I’m sitting here with my two kids.
0:47:41 “Their names are,” whatever, “Timmy and Tommy.”
0:47:45 And we’re gonna, we wanna do Paw Patrol Trivia.
0:47:46 Ask us easy questions.
0:47:47 And when we’re right, say ding, ding, ding.
0:47:49 And when we’re wrong, say, “That’s not right.
0:47:49 “Try again.”
0:47:51 And keep track of the scores.
0:47:52 All right, go.
0:47:54 Literally, you could just say that to it in voice mode
0:47:56 and it’d be like, “All right, first question.
0:47:59 “Marshall is a pup known for what?”
0:48:00 And you’re like, “Fire.”
0:48:01 And it’s like, “Ding, ding, ding, correct.
0:48:02 “One point for you.” – Dude, your kids
0:48:05 are gonna like fall in love with her.
0:48:07 Like, it’s pretty crazy how they’ll,
0:48:10 like imagine being, you know, raised with this.
0:48:12 This is insane.
0:48:14 Let me give you three practical ways I’m using it.
0:48:15 So they have this new thing called,
0:48:17 I think it’s new-ish called projects.
0:48:19 And so I have three folders right now.
0:48:21 And the way it works is you have like a folder
0:48:24 as a project, and then you can upload files to the project.
0:48:26 And then you can have multiple conversations
0:48:29 within the project, and it refers back to the files
0:48:31 or whatever information you store there.
0:48:32 – Let me give you an example, this is a good question.
0:48:34 What’s like, what do you throw in there?
0:48:35 – I have a health folder.
0:48:38 And so you know how everyone has like their own health guru?
0:48:41 And it’s like usually based off of like one book they read.
0:48:42 Well, I go and download the book.
0:48:43 – Yeah, you’re mine.
0:48:44 – Yeah.
0:48:47 Well, I go and I download the book that I ascribe to
0:48:51 and I will upload, and if it’s a book that’s EPUB,
0:48:52 which is how I buy it on Kindle,
0:48:55 I convert it to a .txt file because that's easier to read.
0:48:58 And I upload the .txt file to the-
0:49:01 – Even though it’s like huge, ’cause it’s a book, that works?
0:49:03 – I give it a full book, the full book.
0:49:04 I download it and I convert it.
0:49:06 And then like, so for example,
0:49:09 we were going to the grocery store today
0:49:10 and I just said like, you know,
0:49:12 there’s like this interesting book I just read
0:49:15 and I’ve uploaded the book
0:49:17 and I’ll just say make the grocery list for me.
0:49:20 And then it'll tell me, actually,
0:49:24 and I’ll say which grocery store should I go to in my area?
0:49:25 And it knows where I live.
0:49:27 And it says, yeah, like these three grocery stores
0:49:28 will have exactly what you need.
0:49:30 I think they will have what you need
0:49:33 because like, you know, I’m on this like clean meat kick
0:49:34 or whatever.
0:49:35 And it was like, yeah, the author says
0:49:37 like to buy this cut of meat
0:49:39 and you should ask the butcher this, this and this.
0:49:41 And like, here’s three butchers
0:49:43 that appear to have what you need.
0:49:45 And it’s all based off of like the files
0:49:47 that I’ve uploaded for health.
0:49:49 But then within health, I can ask it,
0:49:51 it knows. I'll be like, hey, this quarter
0:49:55 I want to run a 5K in this particular time.
0:49:57 Give me like a good app to use
0:49:58 that can help track my running
0:50:00 and also tell me like what my goal should be.
0:50:02 So that’s like a couple health versions.
0:50:04 The second one is I’ve got a clothing one
0:50:06 where I literally took a photo of myself
0:50:08 and I used a tape measure
0:50:11 to measure various parts of my body
0:50:12 and I upload it to it.
0:50:12 And I’ll say, all right,
0:50:14 like make a chart with all of my measurements.
0:50:17 and, you know, remember that always.
0:50:19 Here’s some like clothing that I want to buy.
0:50:19 Here’s the links.
0:50:21 Can you like go and figure out what size it is
0:50:22 and let me know, will that fit?
0:50:24 And it's like, well, these pants,
0:50:27 it says that they’re the same width as your thigh
0:50:29 but you actually want like two inches,
0:50:32 usually extra width that will probably feel more comfortable.
0:50:35 Or what I’ll do is I’ll upload like a blog
0:50:36 that I like, the Die, Workwear! blog.
0:50:38 And I’ll say, hey, here’s a picture.
0:50:40 I’ll literally lay a tie next to a jacket
0:50:42 and I’ll take a picture of it and I’ll upload it.
0:50:44 And I’m like, does this tie match this jacket?
0:50:45 And it'll be like, no,
0:50:47 but that other tie that you showed me a picture of a while ago
0:50:49 that actually would look great here.
0:50:51 It’s like, that’s how I use it.
0:50:52 And then the final way that I use it,
0:50:54 and this is like my life coach folder,
0:50:56 which is like, it’s like partially like,
0:50:58 I’ll complain to it and I’ll be like, you know,
0:51:01 I noticed you’ve been complaining about this a lot
0:51:03 or I’ll upload business financials to it.
0:51:05 And that’s like more of like my sparring partner
0:51:06 throughout the day.
0:51:07 And so I have three folders right now,
0:51:10 health, clothing and like a life coach.
0:51:11 And so those are like the practical ways
0:51:12 and I’m using projects.
0:51:16 That's the term on ChatGPT.
0:51:18 And that’s how I’m using it as of now.
0:51:20 – People are just gonna replace their co-founder
0:51:21 with this, right?
0:51:23 Like you’re gonna see a lot more solo founders
0:51:26 because you could just have an AI co-founder.
0:51:28 – It's gonna say, well, you know,
0:51:30 you’ll reduce churn if you use this messaging
0:51:32 when you email your users.
0:51:35 And then you’re just gonna say, yeah,
0:51:37 well, you have my login to MailChimp,
0:51:40 like, go ahead, get it done.
0:51:43 Or you’ll be like, you know, my Shopify store
0:51:45 is like a 2.1 conversion rate.
0:51:48 And it’s like, hey, I, you know, we ran this A/B test.
0:51:50 It like increased your conversion rate to 3%.
0:51:52 And you’re like, get after it, you know, go do it.
0:51:53 And that’s what’s gonna happen.
0:51:55 And so anyway, we’ve had these intelligent people
0:51:59 like Dharmesh, whatever, explain to us all these things,
0:52:01 but it wasn’t until the last two months.
0:52:02 And in fact, recently actually,
0:52:04 since you told me to ask ChatGPT
0:52:07 that question, I'm like,
0:52:09 oh my God, this is my life now.
0:52:11 And in fact, you actually sent out a wonderful email
0:52:12 the other day where you said,
0:52:14 here’s how to ask powerful questions.
0:52:16 I uploaded that email to ChatGPT.
0:52:18 And I’m like, remember these questions
0:52:19 and like ask me them often
0:52:21 or ask yourself these questions often.
0:52:25 – Yeah, I mean, it’s just so, it’s incredible.
0:52:30 And it's also so obvious that I think that ChatGPT is,
0:52:33 I mean, it is the Google of our generation.
0:52:34 And I guess the only question is like,
0:52:38 why am I not a shareholder of OpenAI?
0:52:41 Like, how do I go to sleep at night?
0:52:44 – Well, I mean, Dharmesh had to buy a $10 million domain
0:52:45 and then convince them to buy it
0:52:46 in order to become a shareholder.
0:52:50 So it's not like you can just ask.
0:52:51 – There’s always a way though.
0:52:52 – There is always a way.
0:52:54 – Why haven’t I tried everything?
0:52:55 – But that’s like saying like,
0:52:56 why am I not a billionaire?
0:52:58 It’s like, well, like you could be,
0:53:00 but like here’s some of the barriers to entry
0:53:01 that you’ve got to overcome.
0:53:02 So there’s certainly,
0:53:03 you should ask ChatGPT that, by the way.
0:53:04 – It’s a good question, by the way.
0:53:06 Why am I not a billionaire?
0:53:07 – It is a great question.
0:53:08 But like there-
0:53:09 – Have you ever asked yourself that question?
0:53:13 I asked a friend that question and they weren’t even
0:53:14 really that close of a friend.
0:53:16 So it was kind of a, you know,
0:53:18 it was a blunt question to ask at a dinner.
0:53:21 I was like, why are you not already a billionaire?
0:53:25 And he gave a great answer.
0:53:28 And he goes, actually what he was saying was,
0:53:31 you know, I want to start a billion dollar company,
0:53:32 something something selling.
0:53:36 And I was like, why have you not already done that?
0:53:41 And he goes, I think when I was starting
0:53:43 these other companies that I started,
0:53:44 because I didn’t actually understand
0:53:46 what a billion dollar company looked like.
0:53:48 And if I had known that,
0:53:50 I would have built a different company.
0:53:53 And he was, he was correct.
0:53:55 And, and you know, as we dug in,
0:53:58 it’s like what makes a company a billion dollar company?
0:54:01 Like, you know, there’s really only a couple of paths to that.
0:54:03 And you know, one of them, for example,
0:54:06 is like building something that has network effects.
0:54:07 So he had been building companies
0:54:10 that could do like great revenues,
0:54:12 that could be even be profitable, they could grow fast.
0:54:14 Like, you know, like those are some of the things you need,
0:54:16 but there was no network effect.
0:54:19 There was no durability, there was no defensibility.
0:54:22 There was no like, win the category.
0:54:25 It was like, just go to a category
0:54:26 where you can win inside that category,
0:54:27 but there’ll be other winners and you all compete.
0:54:29 – Where is he now?
0:54:31 It was like, just as an example,
0:54:32 that was like a gaming company.
0:54:34 It’s like, there’s a lot of mobile gaming companies.
0:54:38 And at the time, like to build a billion dollar gaming company
0:54:43 you really had to be like one of the like, you know,
0:54:46 three that were gonna get built in a five year window, right?
0:54:48 Like you had to build, you know, Clash of Clans
0:54:50 or you had to build Candy Crush
0:54:52 or you had to build like one of those.
0:54:53 And even in one of those, it was like,
0:54:56 oh, actually, you know, I’m sitting here tinkering
0:54:58 on cool game designs.
0:54:59 And actually the thing I need to do
0:55:04 is build an enormous paid marketing team
0:55:07 that is like the top paid marketers in the world
0:55:10 to acquire hundreds of millions of customers
0:55:11 is what I need to do.
0:55:15 And like the cool artsy game design
0:55:17 that’s gonna win me awards is not gonna,
0:55:19 that’s not what a billion dollar gaming company looks like.
0:55:21 So he just didn’t understand the shape of something.
0:55:25 And I find that to be true about most of the goals.
0:55:28 So instead of how can I do this goal?
0:55:29 Another way of saying it is,
0:55:31 why have I not already done this goal?
0:55:33 Why is it not already true for me?
0:55:35 And then it points out some like, you know,
0:55:38 either knowledge gaps or execution gaps
0:57:43 that exist today, that are closer to your timeline
0:55:47 versus when you set like an ambitious goal
0:55:48 that’s like far in the future
0:55:51 and you sort of bake in that it’s gonna take a long time,
0:55:54 you sort of avoid the, maybe the harsh realities
0:55:56 that might be actually existing today
0:55:57 in your world about those.
0:55:58 – Yeah, you had a great email
0:56:00 with a bunch of those questions.
0:56:03 Here’s a bunch of decision making questions,
0:56:05 which is, I'm not sure what I should do.
0:56:09 Instead you should say, what would I do if I weren’t afraid?
0:56:11 One bad question is how can I make this succeed?
0:56:15 The better question is what would make this certainly fail?
0:56:17 One final example is I can’t decide
0:56:19 which path is the right one to pick.
0:56:20 A better question or a better version of that
0:56:23 is what path makes for the best story?
0:56:26 That was actually a pretty good email.
0:56:28 I think I replied, I said this was a 10,
0:56:29 but you have like a list of better questions
0:56:32 and I use those questions in ChatGPT
0:56:35 because what I'm learning with ChatGPT is
0:56:39 you have to get it to ask you better questions
0:56:44 in order to, its input is important for its output.
0:56:47 And so yeah, I pretty much stole that email.
0:56:49 – Yeah, I think the realization was,
0:56:51 Tim Ferriss had said something way back,
0:56:53 I think I put it in the email,
0:56:56 but he used this phrase, he goes,
0:56:58 he was talking about it in the podcasting realm,
0:57:01 but first he had this quote, you read it out.
0:57:03 He goes, if you want confusion and heartache,
0:57:05 ask vague questions.
0:57:07 If you want uncommon clarity and results,
0:57:09 ask uncommonly clear questions.
0:57:11 Often all that stands between you
0:57:14 and what you want is a better set of questions.
0:57:16 – Exactly, he said this about his podcast.
0:57:21 He goes, I view questions as like a pickaxe for the brain,
0:57:23 like a pickaxe when you’re summiting a mountain
0:57:28 and you use it to sort of like pierce the side of the mountain
0:57:30 and use it to pull yourself up.
0:57:33 And so in many ways you are excavating the brain
0:57:36 with this pickaxe, and your pickaxe is questions.
0:57:39 Another phrase I use all the time in businesses is,
0:57:41 ask a better question, get a better answer.
0:57:44 So often if somebody asks a bad question,
0:57:46 and I’ll call a bad question either a vague question,
0:57:48 open-ended question, or a question in the wrong direction,
0:57:53 I think the rookie move is just to answer
0:57:55 the question at face value.
0:57:59 Like you should not answer 100% of the questions asked.
0:58:02 Like a lot of the questions need to bounce back to sender.
0:58:03 This has the wrong address on it.
0:58:05 You got to write a better address on that.
0:58:07 This won’t get delivered.
0:58:08 The way you’ve written this address
0:58:09 is not going to get delivered.
0:58:12 And so you bounce back some questions and say,
0:58:15 maybe the better question to ask is blank.
0:58:19 For example, like, instead of how could we succeed,
0:58:21 which is like a million paths all unknown,
0:58:23 it’s what would make this certainly a failure?
0:58:25 That’s much more knowable.
0:58:27 And we can establish a few ground rules
0:58:30 from that question and get some momentum towards this.
0:58:31 And you could see this with your brain,
0:58:33 just like if you ask chat, you know,
0:58:34 they call it prompt engineering
0:58:36 when it comes to AI, right?
0:58:38 Being able to ask the AI in a certain way
0:58:40 that’s going to get you a better result.
0:58:42 Absolutely the same thing is true for yourself
0:58:45 and for people around you to ask better questions, right?
0:58:49 I do, I ask annoyingly stupid questions
0:58:50 to my team all the time.
0:58:54 Like it’ll be, one question I love to ask is,
0:58:56 what are we stupid for not doing right now?
0:59:01 And that question that comes loaded with a presumption
0:59:03 that there’s something stupid we’re doing.
0:59:04 Of course there is, we’re always doing stupid things.
0:59:06 And specifically, what are we stupid
0:59:08 for not doing right now?
0:59:11 Meaning, what is an obvious low hanging fruit
0:59:12 that’s in our face?
0:59:14 And we’re out here searching for the complex
0:59:18 when the simple, stupidly obvious thing is here.
0:59:21 And, you know, I would say more than 50% of the time,
0:59:23 there’s a useful answer to that question.
0:59:24 But if you didn’t ask that question,
0:59:26 it would just go unspoken in your company, right?
0:59:27 So like, how many of those are there?
0:59:29 Another one that I learned from Amazon
0:59:31 is Amazon asks this thing in the,
0:59:33 if you’re like, if you’re an exec that leads a team,
0:59:34 you have to like write this document
0:59:36 at the end of the year called the OP1.
0:59:38 I think it’s the operating plan one.
0:59:40 And you do it twice a year, right?
0:59:41 The operating plan one,
0:59:42 and then you have the operating plan two
0:59:43 halfway through the year.
0:59:43 Was that effective?
0:59:44 Yeah, it’s great.
0:59:48 I’m a fan of the Amazon writing culture.
0:59:51 It’s easy to make fun of also and easy to do wrong,
0:59:53 but when done right, it’s super effective.
0:59:54 So one of the things that they,
0:59:57 one of the like common questions that they ask in that is,
1:00:00 what are the dogs not barking?
1:00:02 And it’s back to that Sherlock Holmes story
1:00:05 where he solves the case because he’s like,
1:00:06 and they’re like, how did you know Sherlock?
1:00:08 And he’s like, ’cause there’s like a house break in,
1:00:09 they’re trying to figure out who did it.
1:00:11 And he’s like, well, it was the dog, of course.
1:00:13 And they're like, but the dog, the dog didn't do anything.
1:00:15 He goes, exactly.
1:00:16 The dog didn’t bark,
1:00:18 which means he must’ve recognized the person that broke in,
1:00:19 which means it must’ve been the, you know,
1:00:21 the housekeeper or whatever, right?
1:00:23 And so in your business,
1:00:25 there’s what are the dogs not barking
1:00:27 is a good way of asking.
1:00:28 What are the things that,
1:00:30 there’s really like, I interpret it in two ways.
1:00:33 One is, what are the things we should be hearing
1:00:34 that we’re not?
1:00:37 So for example, one week, I didn’t send out my Friday email,
1:00:39 and I just sat there and I was like,
1:00:41 should probably be getting some emails being like,
1:00:43 hey, where’s the Friday thing, man?
1:00:44 I love that.
1:00:45 Oh, I didn’t get that.
1:00:47 Okay, dog not barking, right?
1:00:49 And then I had changed how I did the Friday emails
1:00:50 because of that.
1:00:51 It’s like, well, why’d you make that pivot?
1:00:54 It’s like, because I did Jenga, dude,
1:00:56 I took a block out and the tower was fine.
1:00:58 Nothing, nothing fell down.
1:00:59 I’m trying to only have like,
1:01:01 I’m trying to be an email in your inbox
1:01:04 that if I remove that email, your life got worse, you know?
1:01:07 And you want to speak to the manager,
1:01:09 where’s my goddamn email, right?
1:01:10 Like if DoorDash doesn’t deliver your food,
1:01:12 you’re knocking on the door.
1:01:14 I want to be at least more powerful
1:01:15 than the DoorDash delivery, right?
1:01:17 Like that’s what I’m striving for.
1:01:19 And so that’s one way of interpreting it.
1:01:21 The other way is, what are the problems
1:01:24 that you don’t hear about yet, but are certainly there?
1:01:26 That’s another way to think about the dogs not barking
1:01:30 is like, you know, anticipate a problem around the corner
1:01:33 because we know it’s going to be there,
1:01:34 but we just haven’t heard it yet.
1:01:36 But, you know, we can anticipate it
1:01:37 and maybe get ahead of it.
1:01:39 Dude, I’m telling you, there’s going to be a world,
1:01:41 probably in three years where you’re going to like,
1:01:45 so the issue for a lot of smart people like you and me
1:01:47 and people listening is like, you’re like, well,
1:01:49 I’m really smart and I feel like I’m wise
1:01:50 and I feel like I know what to do,
1:01:53 but like, it’s a lot of work.
1:01:56 And then like, literally the idea guys
1:01:58 are going to thrive in five years
1:02:02 or the wise people because there’s going to be AI agents
1:02:03 doing all of this for you.
1:02:04 You know what I mean?
1:02:06 Like you’re not going to have to actually do that work.
1:02:08 Just your opinions or your taste will matter.
1:02:09 Yeah, but it is dangerous, right?
1:02:12 'Cause then, why can't the AI do the idea part, too?
1:02:13 Right?
1:02:14 Who’s the same person?
1:02:15 That’s going to happen too.
1:02:16 It’s not, you’re not.
1:02:16 I don’t think you are.
1:02:17 Right?
1:02:18 And then that’s when the brain breaks
1:02:21 and you’re like, I guess it’s over then.
1:02:23 And I’m not sure.
1:02:24 Wait, so are you actually afraid?
1:02:26 Yeah, kind of.
1:02:27 Like I don’t want to say afraid
1:02:28 ’cause I’m not like, you know,
1:02:29 quivering in my boots about it.
1:02:32 But I guess like, I don’t have a satisfying answer.
1:02:34 And for most things in my life,
1:02:36 I got a pretty satisfying answer.
1:02:38 Sometimes the answer is just,
1:02:40 I’ll deal with it when it happens, right?
1:02:41 I’ll just adjust, right?
1:02:44 And I could feel safe, I could feel comfortable with that.
1:02:46 That’s usually my fail safe.
1:02:48 With this one, it’s kind of like,
1:02:52 so when the AI can do everything, right?
1:02:56 Which is like, it seems like it’s a matter of when,
1:02:58 not if at this point.
1:03:01 Okay, and it’s like, seems like it’s in my lifetime.
1:03:06 Probably in the next 10 years, it could do the work,
1:03:09 but it can also figure out what the work to be done is.
1:03:10 All right, well, I guess like,
1:03:13 I’m less afraid of the like,
1:03:16 oh, and then it’s gonna crush humans and try to,
1:03:18 you know, that’ll go rogue and it’ll attack us.
1:03:21 Like, I’m not as afraid of that as I am.
1:03:23 Just like, what’s the point of all this?
1:03:24 What’s the point of doing any of this stuff?
1:03:25 If that’s gonna be true.
1:03:28 And that’s kind of just like a weird place to land.
1:03:33 So you want to end there?
1:03:34 (laughing)
1:03:36 What the fuck, right?
1:03:39 At least podcasts are safe, dude.
1:03:41 No, they’re not.
1:03:42 No, they’re not.
1:03:43 Perplexity has a daily podcast.
1:03:44 That’s really good.
1:03:47 They just take the news, and it's great.
1:03:48 And then they have, no, it’s not perplexity.
1:03:50 It’s a, what’s the 11 Labs?
1:03:52 11 Labs has, they use like a Stephen Fry voice
1:03:53 and they read the news.
1:03:54 I listened to it.
1:03:55 It’s awesome.
1:03:56 It’s not safe.
1:03:57 We’re not safe.
1:03:57 No one’s safe.
1:03:59 Maybe like a plumber, a plumber’s safe.
1:04:01 Well, I actually think our strategy is pretty genius
1:04:04 because we are getting stupider.
1:04:06 All right, just like we dumb ourselves down
1:04:08 and AI is trying to get smarter.
1:04:11 And so there’s actually a white space in the market
1:04:13 for some just imperfect knowledge.
1:04:18 Some half-baked ideas and some incorrectness.
1:04:19 I think we’ve really,
1:04:20 I think we’ve stumbled onto something.
1:04:21 I think we might be the last one standing
1:04:23 in this whole podcast game.
1:04:24 (laughing)
1:04:25 It's us and Theo Von.
1:04:27 It’s just like the dumbest conversations on earth.
1:04:28 There’s gonna be all that’s left
1:04:30 ’cause the AI is gonna do all the smart ones.
1:04:33 Maybe, I mean, I don’t know, maybe.
1:04:35 Marc Andreessen should be scared right now.
1:04:35 (laughing)
1:04:37 Dude, yeah, the smart guys are fucked.
1:04:38 Like the smart guys built,
1:04:40 the smart guys are digging their own graves.
1:04:42 They’re like, their shovels are clanking together
1:04:44 on accident as they’re digging the same grave.
1:04:46 They’re like, “Oh, sorry, my bad.”
1:04:47 It’s like, they don’t realize
1:04:50 that you guys are going into this grave in about a year.
1:04:51 – Is that my name on the tombstone?
1:04:52 – Yeah.
1:04:53 – That’s weird, there must be a problem.
1:04:55 – Is there another Mark here?
1:04:56 (laughing)
1:04:58 Two Marc Andreessens?
1:04:59 (laughing)
1:05:01 Like, they think that they’re like,
1:05:03 they’re like, “We’re putting the blue collar guy
1:05:06 “in this grave and we’re gonna outsource this fucking job.”
1:05:07 They’re like, “Huh.”
1:05:09 (laughing)
1:05:14 – I’ve never, Mr. Andreessen, are you here?
1:05:15 Like, you know what I mean?
1:05:20 – Dude, I found my new sick burn in the TikTok comments.
1:05:22 You know, there’s all these TikTok clips of podcasts.
1:05:23 Like, we should probably be doing this,
1:05:24 but we don’t really do it very much.
1:05:28 But like, people just clip, you know, podcast snippets
1:05:31 and that’s like a lot of TikToks.
1:05:33 And the more viral, the more, basically,
1:05:36 the more outrageous the comment in the podcast,
1:05:37 the more viral the TikTok clip,
1:05:38 ’cause you’re gonna get a bunch of comments being like,
1:05:42 “No, that’s wrong, that’s stupid, that’s whatever.”
1:05:46 And I saw the best one, it was just the top liked comment
1:05:47 on a podcast clip, which is,
1:05:50 it just said, “Podcasting equipment
1:05:52 “is way too readily available.”
1:05:54 (laughing)
1:05:58 This is like, “Damn, anybody can just get a microphone now?”
1:05:59 That’s how I feel when I see a lot of these clips.
1:06:01 I’m like, “Wow, this shit is,
1:06:04 “these microphones are way too easy to access.”
1:06:05 – Have you heard that song,
1:06:08 “Another White Boy” with a podcast?
1:06:10 – No, it’s not a song.
1:06:12 – Yes, it’s a song called “Another White Boy”
1:06:13 with a podcast.
1:06:15 – God damn, how did I not think of that?
1:05:19 It's sort of like that "finance, 6'5", blue eyes" song.
1:06:20 (laughing)
1:06:22 It just says like, “Joe Rogan.”
1:06:24 Like, it just says like a bunch of like random phrases,
1:06:25 but it’s called “Another White Boy”
1:06:26 with a podcast. – Maybe we should
1:06:27 just play that song on the way out of this.
1:06:29 That’ll be our outro.
1:06:30 All right, cue the music.
1:06:34 ♪ Ooh, another white boy with a podcast ♪
1:06:37 ♪ Crypto, Jim Bro ♪
1:06:41 ♪ Real prep, the sport’s been advanced ♪
1:06:45 ♪ So smart and funny, we should make a party ♪
1:06:47 ♪ We buy mics, we get chairs ♪
1:06:49 ♪ We sit down, we’re bling-stairs ♪
1:06:50 ♪ We’re gonna be billionaires ♪
1:06:54 ♪ Just don’t forget that I can share ♪
1:06:58 ♪ Ooh, another white boy with a podcast ♪
1:07:00 – Hey, Sean here.
1:07:01 A quick break to tell you an Ev Williams story.
1:07:03 So he started Twitter and before that,
1:07:05 he sold a company to Google for $100 million.
1:07:06 And somebody asked him, they said,
1:07:07 “Ev, what’s the secret, man?
1:07:10 “How do you create these huge businesses,
1:07:11 “billion-dollar businesses?”
1:07:12 And he says, “Well, I think the answer is
1:07:14 “that you take a human desire,
1:07:17 “preferably one that’s been around for thousands of years,
1:07:20 “and then you just use modern technology to take out steps.
1:07:22 “Just remove the friction that exists
1:07:24 “between people getting what they want.
1:07:26 “And that is what my partner Mercury does.
1:07:27 “They took one of the most basic needs
1:07:29 “any entrepreneur has, managing your money
1:07:31 “and being able to do your financial operations.
1:07:32 “So they’ve removed all the friction
1:07:34 “that has existed for decades.
1:07:35 “No more clunky interfaces,
1:07:38 “no more 10 tabs to get something done,
1:07:39 “no more having to drive to a bank,
1:07:41 “get out of your car just to send a wire transfer.
1:07:43 “They made it fast, they made it easy.
1:07:45 “You can actually just get back to running your business.
1:07:46 “You don’t have to worry about the rest of it.
1:07:48 “I use it for not one, not two,
1:07:50 “but six of my companies right now.
1:07:52 “And it’s used by also 200,000 other ambitious founders.
1:07:54 “So if you want to be like me,
1:07:57 “head to mercury.com, open an account in minutes.
1:07:59 “And remember, Mercury is a financial technology company,
1:08:00 “not a bank.
1:08:02 “Banking services provided by Choice Financial Group
1:08:04 “and Evolve Bank & Trust; Members FDIC.”
1:08:06 All right, back to the episode.

Get our Business Monetization Playbook: https://clickhubspot.com/monetization

Episode 664: Sam Parr ( https://x.com/theSamParr ) and Shaan Puri ( https://x.com/ShaanVP ) talk about investing wisdom from Nassim Taleb and how to use ChatGPT as a life coach. 

Show Notes: 

(0:00) No small boy stuff

(2:30) Squid Game for investors

(13:00) Noise v. signal

(20:36) Sam uses ChatGPT to plan his life

(31:50) Shaan explains how LLMs work

(39:50) How to write better prompts

(52:56) How to build a billion dollar company

(55:51) 13 Questions that will change your life

Links:

• Nassim Taleb books – https://tinyurl.com/2vnvz36f 

• Crystal Ball Trading Challenge – https://elmwealth.com/crystal-ball-challenge/ 

• Kubera – https://www.kubera.com/ 

• 3Blue1Brown – https://www.youtube.com/c/3blue1brown 

• Leetcode Wizard – https://leetcodewizard.io/ 

• Fertilo – https://www.gametogen.com/fertilo 

• “13 Questions That Will Change Your Life” – https://shaan.beehiiv.com/p/one-minute-blog-13-questions-that-will-change-your-life 

Check Out Shaan’s Stuff:

Need to hire? You should use the same service Shaan uses to hire developers, designers, & Virtual Assistants → it’s called Shepherd (tell ‘em Shaan sent you): https://bit.ly/SupportShepherd

Check Out Sam’s Stuff:

• Hampton – https://www.joinhampton.com/

• Ideation Bootcamp – https://www.ideationbootcamp.co/

• Copy That – https://copythat.com

• Hampton Wealth Survey – https://joinhampton.com/wealth

• Sam’s List – http://samslist.co/

My First Million is a HubSpot Original Podcast // Brought to you by The HubSpot Podcast Network // Production by Arie Desormeaux // Editing by Ezra Bakker Trupiano

Leave a Comment