AI transcript
0:00:11 Marc is a visionary tech leader and investor who fundamentally shaped the development of
0:00:15 the internet and the tech industry in general over the past 30 years.
0:00:22 He is the co-creator of Mosaic, the first widely used web browser, co-founder of Netscape,
0:00:28 co-founder of the legendary Silicon Valley Venture Capital firm Andreessen Horowitz, and
0:00:33 is one of the most influential voices in the tech world, including at the intersection
0:00:37 of technology and politics.
0:00:40 And now, a quick few-second mention of each sponsor.
0:00:43 Check them out in the description, it’s the best way to support this podcast.
0:00:49 We’ve got Encord for unifying your ML stack, GitHub for programming, Notion for team projects
0:00:56 and collaboration, Shopify for merch, and LMNT for hydration. Choose wisely, my friends.
0:01:02 Also, if you want to get in touch with me, for whatever reason, go to lexfridman.com/contact.
0:01:06 And now, onto the full ad reads, no ads in the middle, I try to make this interesting,
0:01:12 but if you skip them, please still check out the sponsors, I enjoy their stuff, maybe you
0:01:13 will too.
0:01:19 This episode is brought to you by Encord, a platform that provides data-focused AI tooling
0:01:25 for data annotation, curation, and management, and for model evaluation, once you train up
0:01:29 the model on the data that you curate.
0:01:34 In this conversation with Marc Andreessen, we actually discuss what he calls kind of
0:01:37 like the trillion dollar questions.
0:01:42 And one of them for AI is, how effective will synthetic data be?
0:01:45 It really is an open question.
0:01:51 What piece, what fraction of the intelligence of future models will be based on training
0:01:53 on synthetic data?
0:01:57 At the top AI labs, I’m hearing a lot of optimism.
0:02:02 As far as I can tell that optimism is not currently, at least in the general case, based
0:02:04 on any real evidence.
0:02:10 So I do think synthetic data will play a part, but how big a part?
0:02:14 There’s still going to be some curation from humans, there’s still going to need to be
0:02:15 a human in the loop.
0:02:23 I think the real question is, how do you effectively integrate the human in the loop, so that the
0:02:34 synthetic data, sort of 99% synthetic, 1% human, that combination can be most effective?
0:02:35 That’s a real question.
0:02:38 And companies like Encord are trying to solve that very problem.
0:02:44 First of all, they want to provide the tooling for the annotation, for the actual human-AI
0:02:50 collaboration, but also asking and answering the research question of how do you pull it
0:02:55 all off and make the resulting model more intelligent for very specific applications and for
0:02:57 general applications?
0:03:00 Yeah, so Encord does a really good job on the tooling side.
0:03:08 Go try them out to curate, annotate, and manage your AI data at encord.com/lex.
0:03:12 That’s encord.com/lex.
0:03:17 This episode is brought to you by GitHub and GitHub Copilot.
0:03:24 If you don’t know what that is, my friends, you’re in for a joyous, beautiful surprise.
0:03:32 I think a lot of people that program regularly know and love GitHub and know and love Copilot.
0:03:40 It’s the OG AI programming assistant, and it’s the one that’s really trying to win this
0:03:42 very competitive space.
0:03:44 It is not easy.
0:03:49 If you’re somebody that uses VS Code, obviously, well, maybe not obviously, but you can use
0:03:54 GitHub Copilot in VS Code, but you can use it also in other IDEs.
0:03:58 I’m going to be honest with you, it’s a very competitive space.
0:04:06 I’m trying all the different tools in the space, and I really love how much GitHub and
0:04:10 GitHub Copilot want to win in this competitive space.
0:04:17 I’m excitedly sitting back and just eating popcorn, like that Michael Jackson meme, and
0:04:20 just enjoying the hell out of it.
0:04:27 Like I said, I’m going to be doing a bunch of programming episodes, including with ThePrimeagen.
0:04:34 He, I think, has a love/hate relationship with AI and with AI agents, and with the role of
0:04:36 AI in the programming experience.
0:04:42 He’s really at the forefront of people that are playing with all these languages, with
0:04:48 all these different applications, with all the different use cases of code, and he is
0:04:53 a Neovim user, so he’s going to be skeptical in general of new technology.
0:04:58 He’s a curmudgeon sitting on a porch, on a rocking chair, screaming at the kids, throwing
0:05:04 stuff at them, but at the same time, he’s able to play with the kids as well, so I am more
0:05:09 on the kids’ side, with a childlike joy, enjoying the new technology.
0:05:18 For me, basically everything I do, programming-wise, has the possibility of AI either reviewing
0:05:21 it or assisting it.
0:05:23 It’s constantly in the loop.
0:05:29 Even if I’m writing stuff from scratch, I’m always just kind of one second away from asking
0:05:33 a question about the code, or asking it to generate, or rewrite a certain line, or to
0:05:39 add a few more lines, all that kind of stuff, so I’m constantly, constantly using it.
0:05:45 If you’re learning to code, or if you’re an advanced programmer, it is really important
0:05:49 that you get better and better at using AI as an assistant programmer.
0:05:55 Get started with GitHub Copilot for free today at gh.io/copilot.
0:06:00 This episode is also brought to you by Notion, a note-taking and team collaboration tool
0:06:05 that Marc Andreessen, on this very episode, sings a lot of praises to.
0:06:07 I believe he sings... was it on mic or off mic?
0:06:10 I don’t remember, but anyway, he loves it.
0:06:15 It’s one of the tools, one of the companies, one of the ecosystems that integrate AI really
0:06:19 effectively for team applications.
0:06:25 You have, let’s see, docs, and wikis, and projects, and all that kind of stuff.
0:06:30 You can have the AI load all of that in, and answer questions based on that.
0:06:36 You can connect a bunch of apps, like you can connect Slack, you can connect Google Drive.
0:06:43 I think in the context, we were talking about something like Notion for email, for Gmail.
0:06:47 I don’t know if Notion integrates email yet.
0:06:53 They’re just like this machine that’s constantly increasing the productivity of every aspect
0:06:57 of your life, so I’m sure they’re going to start integrating more and more apps.
0:07:02 I use it for Slack and Google Drive, but I use it primarily at the individual level for
0:07:07 note-taking, and even at the individual level, just incredible what Notion AI can do.
0:07:12 Try it out for free when you go to Notion.com/lex.
0:07:19 It’s all lowercase Notion.com/lex to try the power of Notion AI today.
0:07:23 This episode is also brought to you by Shopify, a platform designed for anyone to sell anywhere
0:07:26 with a great looking online store.
0:07:35 There are few people who embody the joy and the power of capitalism more than Marc Andreessen.
0:07:43 I was at a thing where Marc and Tobi were both there, and then we were chatting, and
0:07:47 they were very friendly, so I think they’re friends, and I got to hang out with Tobi.
0:07:50 He is, again, an incredible person.
0:07:56 I’ve said it again and again, and it’s almost becoming funny, that eventually we’ll do a
0:07:57 podcast.
0:07:59 I don’t know why we haven’t done a podcast.
0:08:05 There’s a few people in my life where it’s like... like, Geoffrey Hinton is one of those
0:08:06 people.
0:08:12 It’s like, we’ve agreed to do a podcast for so long, and we’ve just been kind of lazy
0:08:15 about it, and Tobi’s the same.
0:08:17 Anyway, he’s the CEO of Shopify.
0:08:21 I don’t even know if he knows that Shopify sponsors this podcast.
0:08:23 It doesn’t matter.
0:08:27 It goes without saying, it should be obvious to everybody, that one doesn’t affect the
0:08:28 other.
0:08:33 I’m very fortunate to have way more sponsors than we could possibly fit, so I could pick
0:08:39 whoever the hell I want, and whatever guests I choose will never have anything to do with
0:08:42 the companies that sponsor the podcast.
0:08:45 There’s not even like a tinge of influence.
0:08:50 In fact, if there’s anything, it’ll be the opposite direction, but I also try to avoid
0:08:51 that.
0:08:57 It’s possible I talk to the CEO of GitHub, for example, on this podcast, and GitHub sponsors
0:08:58 this podcast.
0:09:03 It’s possible I talk to the CEO of Shopify, Toby, and Shopify sponsors this podcast.
0:09:08 One doesn’t affect the other, and obviously, again, goes without saying, but let me say
0:09:16 it, make it explicit that nobody can buy their way onto the podcast, whether through sponsorships
0:09:19 or buying me dinner or whatever.
0:09:20 I don’t know.
0:09:27 It’s just, it’s impossible, and most likely, if that’s attempted, it’s going to backfire,
0:09:33 so I think people intuitively know not to attempt it, because it would really piss me off.
0:09:37 Anyway, this is a detour.
0:09:38 We’re supposed to talk about Shopify.
0:09:44 I have a Shopify store, lexfridman.com/store, that sells t-shirts, but you can sell more
0:09:49 sophisticated stuff, make a lot of money, and participate in this beautiful machinery
0:09:50 of capitalism.
0:09:55 Sign up for a $1 per month trial period at Shopify.com/lex.
0:09:56 That’s all lowercase.
0:10:01 Go to Shopify.com/lex to take your business to the next level today.
0:10:07 This episode is also brought to you by LMNT, my daily zero-sugar and delicious electrolyte
0:10:13 mix, of which I consume ridiculously large amounts.
0:10:17 You know, salt used to be currency in the ancient world.
0:10:18 How silly are humans?
0:10:26 They’re not silly; it’s sort of surprising, the things we converge on as being the store
0:10:31 of value, just value in general, the kind of things we assign value to together.
0:10:41 We just kind of all agree that this item, this material, this idea, this building is
0:10:49 extremely valuable, and then we compete over that resource, or that idea, or that building.
0:10:54 We fight, and sometimes there are wars, and sometimes there is complete destruction, and
0:10:59 the rise and fall of empires, all over some resource.
0:11:04 What a funny, strange little world.
0:11:12 Completely harmless, as The Hitchhiker’s Guide to the Galaxy summarizes humans.
0:11:15 For some reason, instead of that book, I was going to say Catcher in the Rye.
0:11:21 In my exhausted brain, the books kind of all morph together, but Catcher in the Rye is
0:11:23 a really damn good book.
0:11:28 All of the classics I return to often, the simple books, even like the first book I read
0:11:34 in English, called The Giver.
0:11:42 It’s like I return to it in its simplicity, maybe it has sentimental value, maybe that’s
0:11:46 what it is, but just the simplicity of words. Animal Farm I’ve read, I don’t know how
0:11:51 many times, probably over 50 times, I return to it over and over and over, the simplicity,
0:11:54 the poetry of that simplicity.
0:11:59 That’s something that just resonates with my brain, maybe it’s a peculiar kind of brain.
0:12:08 It is a peculiar kind of brain, and I have to thank you for being patient with this peculiar
0:12:09 kind of brain.
0:12:14 Get a sample pack for free with any purchase of whatever the thing I was talking about,
0:12:16 which I think is LMNT.
0:12:21 Try it at drinkLMNT.com/lex.
0:12:22 This is the Lex Fridman Podcast.
0:12:26 To support it, please check out our sponsors in the description.
0:12:39 And now, dear friends, here’s Marc Andreessen.
0:12:48 All right, let’s start with optimism.
0:12:57 If you were to imagine the best possible one to two years, 2025, ’26, for tech, for big
0:13:01 tech and small tech, what would it be, what would it look like, lay out your vision for
0:13:05 the best possible scenario trajectory for America?
0:13:06 The roaring 20s.
0:13:07 The roaring 20s.
0:13:08 The roaring 20s.
0:13:09 I mean, look, a couple of things.
0:13:14 It is remarkable over the last several years with all of the issues, including not just
0:13:17 everything in politics, but also COVID and every other thing that’s happened.
0:13:18 It’s really amazing.
0:13:19 The US just kept growing.
0:13:21 If you just look at economic growth charts, the US just kept growing.
0:13:23 And very significantly, many other countries stopped growing.
0:13:27 So Canada stopped growing, the UK stopped growing, Germany stopped growing.
0:13:31 And some of those countries may be actually going backwards at this point.
0:13:34 And there’s a very long discussion to be had about what’s wrong with those countries.
0:13:37 And there’s, of course, plenty of things that are wrong with our country.
0:13:41 But the US is just flat out primed for growth.
0:13:47 And I think that’s a consequence of many factors, some of which are luck and some of
0:13:48 which are hard work.
0:13:52 And so the lucky part is just, number one, we just have incredible physical security
0:13:54 by being our own continent.
0:13:56 We have incredible natural resources.
0:14:00 There’s this running joke now that whenever it looks like the US is going to run out of
0:14:04 some rare earth material, some farmer in North Dakota kicks over a hay bale and finds like
0:14:05 a $2 trillion deposit.
0:14:12 I mean, we’re just blessed with geography and the natural resources.
0:14:14 We can be energy independent anytime we want.
0:14:16 This last administration decided they didn’t want to be.
0:14:18 They wanted to turn off American energy.
0:14:22 This new administration has declared that they have a goal of turning it on in a dramatic
0:14:23 way.
0:14:24 There’s no question we can be energy independent.
0:14:26 We can be a giant net energy exporter.
0:14:28 It’s purely a question of choice.
0:14:31 And I think the new administration is going to do that.
0:14:33 And so, oh, and then I would say two other things.
0:14:38 One is, we are the beneficiaries, and you’re an example of this, we’re
0:14:43 the beneficiaries of 50, 100, 200 years of, like, basically the most aggressive, driven, smartest
0:14:46 people in the world, the most capable people, moving to the US and raising their kids here.
0:14:50 And so, we just have, you know, by far the
0:14:54 most dynamic population, the most aggressive set of
0:14:59 characters, certainly in any Western country, and have been for a long time and certainly
0:15:00 are today.
0:15:03 And then finally, I would just say, look, we are overwhelmingly the advanced technology
0:15:04 leader.
0:15:08 You know, we have our issues and we have, I would say, a particular issue with manufacturing,
0:15:12 which we could talk about, but for, you know, anything in software or anything in AI, anything
0:15:16 in, you know, all these, you know, advanced biotech, all these advanced areas of technology,
0:15:20 like we’re by far the leader, again, in part because many of the best scientists and engineers
0:15:23 in those fields, you know, come to the US.
0:15:29 And so, we just have all of the preconditions for just a monster boom, you know,
0:15:32 I could see economic growth going way up, I could see productivity growth going way
0:15:35 up, rate of technology adoption going way up, and then we can do a global
0:15:40 tour if you like, but like, basically all of our competitors have like profound issues
0:15:44 and, you know, we could kind of go through them one by one, but the competitive landscape
0:15:49 just is, it’s like, it’s remarkable how much better positioned we are
0:15:50 for growth.
0:15:54 What about the humans themselves, almost a philosophical question, you know, I travel across the world
0:16:00 and there’s something about the American spirit, the entrepreneurial spirit that’s uniquely
0:16:02 intense in America.
0:16:03 I don’t know what that is.
0:16:11 I’ve talked to Saagar, who claims it might be the Scots-Irish blood that runs through
0:16:12 the history of America.
0:16:13 What is it?
0:16:17 You’re at the heart of Silicon Valley; is there something in the water?
0:16:19 Why is there this entrepreneurial spirit?
0:16:20 Yeah.
0:16:22 So is this a family show or am I allowed to swear?
0:16:23 You can say whatever the fuck you want.
0:16:24 Okay.
0:16:28 So the great TV show Succession, the show, of course, in which you were
0:16:30 intended to root for exactly zero of the characters.
0:16:31 Yes.
0:16:34 In the show Succession, it was the final episode of the first season, when the whole family
0:16:39 is over in Logan Roy’s ancestral homeland of Scotland and they’re at this castle, you
0:16:40 know, for some wedding.
0:16:43 And Logan is just like completely miserable, you know, because he’s been
0:16:47 in New York for 50 years, he’s totally miserable being back in Scotland, and he gets in
0:16:51 some argument with somebody and he finally just says, my God, I cannot
0:16:58 wait to get out of here and go back to America where we could fuck without condoms.
0:17:01 So was that a metaphor, or... Okay, exactly, right?
0:17:04 And so no, but it’s exactly the thing, and everybody instantly knows what it means.
0:17:07 Everybody watching that instantly starts laughing because you know what it means, which is exactly
0:17:08 this.
0:17:09 I think there’s like an ethnographic view of it.
0:17:12 There’s a bunch of books on, like you said, the Scots-Irish, like all the different
0:17:15 derivations of all the different ethnic groups that have come to the U.S. over the course
0:17:17 of the last 400 years, right?
0:17:22 But what we have is this sort of amalgamation of like, you know, the northeast Yankees who
0:17:26 were like super tough and hardcore, yeah, the Scots-Irish are super aggressive.
0:17:31 You know, we’ve got the Southerners and the Texans, you know, and the sort of whole kind
0:17:35 of blended, you know, kind of Anglo-Hispanic thing, super incredibly tough, strong driven,
0:17:40 you know, capable characters, you know, the Texas Rangers, you know, we’ve got the, yeah,
0:17:43 we’ve got the California, you know, we’ve got the, you know, the wild, we’ve got the
0:17:47 incredibly, you know, inventive hippies, but we also have the hardcore engineers, we’ve
0:17:50 got, you know, the best, you know, rocket scientists in the world, we’ve got the best,
0:17:53 you know, artists in the world, you know, creative professionals, you know, the best
0:17:54 movies.
0:18:00 And so, yeah, there is, you know, all of our problems, I think, are basically, you know,
0:18:04 in my view, to some extent, you know, attempts to basically sand all that off and make everything
0:18:09 basically boring and mediocre, but there is something in the national spirit that basically
0:18:10 keeps bouncing back.
0:18:14 And basically what we discover over time is we basically just need people to stand up
0:18:17 at a certain point and say, you know, it’s time to, you know, it’s time to build, it’s
0:18:20 time to grow, you know, it’s time to do things.
0:18:23 And so, and there’s something in the American spirit that just like, we’re just right back
0:18:24 to life.
0:18:28 And I’ve seen this before; you know, I saw it as a kid here in the early 80s, you know,
0:18:34 because the 70s were like horribly depressing, right, in the U.S., like they were a nightmare
0:18:35 on many fronts.
0:18:40 And in a lot of ways, the last decade to me has felt a lot like the 70s, just being mired
0:18:45 in misery and just this self-defeating, you know, negative attitude and everybody’s upset
0:18:46 about everything.
0:18:50 And, you know, and then by the way, like energy crisis and hostage crisis and foreign wars
0:18:56 and just demoralization, right, you know; the low point in the 70s was, you know,
0:18:59 Jimmy Carter, who just passed away, he went on TV and he gave this speech known as the
0:19:00 malaise speech.
0:19:04 And it was like the weakest possible attempt to, like, rouse people back to a sense of, like,
0:19:05 passion; it completely failed.
0:19:10 And, you know, we had the hostages in Iran for, I think, 444 days, and
0:19:14 every night on the nightly news, it was, you know, lines around the block, energy crisis,
0:19:16 depression, inflation.
0:19:19 And then, you know, Reagan came in and, you know, Reagan was a very controversial character
0:19:23 at the time and, you know, he came in and he’s like, nope, it’s morning in America.
0:19:25 And we’re the shining city on the hill and we’re going to do it.
0:19:26 And he did it.
0:19:27 And we did it.
0:19:29 And the national spirit came roaring back and, you know, worked really hard for a full
0:19:30 decade.
0:19:33 And I think that’s exactly what, I think, you know, we’ll see, but I think that’s what
0:19:34 could happen here.
0:19:39 And I just did a super long podcast on Milton Friedman with Jennifer Burns, who’s this incredible
0:19:41 professor at Stanford.
0:19:42 And he was part of the Reagan revolution.
0:19:46 So there’s a bunch of components to that, one of which is economic.
0:19:47 Yes.
0:19:52 And one of which, maybe you can put a word on it of not to be romantic or anything, but
0:19:58 freedom, individual freedom, economic freedom, political freedom, and just in general, individualism.
0:20:00 Yeah, that’s right.
0:20:01 Yeah.
0:20:05 And as you know, as America has this incredible streak of individualism, you know, and individualism
0:20:09 in America probably peaked, I think, between roughly, call it the end of the Civil War,
0:20:14 1865 through to probably call it 1931 or something, you know, and there was this like incredible
0:20:15 run.
0:20:17 I mean, that period, you know, we now know that period as the Second Industrial Revolution.
0:20:21 And it’s when the United States basically assumed global leadership and basically took
0:20:24 over technological and economic leadership from England.
0:20:27 And then, you know, that led to, you know, ultimately then, therefore being able to,
0:20:30 you know, not only industrialize the world, but also win World War II and then win the
0:20:31 Cold War.
0:20:36 And yeah, you know, there’s a massive, you know, massive individualistic streak.
0:20:39 By the way, you know, Milton Friedman’s old videos are all on YouTube.
0:20:46 They are every bit as compelling and inspiring as they were then, you know, he’s a singular
0:20:51 figure and many of us, you know, I never knew him, but he was actually at Stanford for many
0:20:52 years at the Hoover Institution.
0:20:53 But I never met him.
0:20:57 But I know a lot of people who worked with him and, you know, he was a singular figure,
0:21:02 but all of his lessons, you know, live on and are fully available.
0:21:05 But I would also say it’s not just individualism and this is, you know, this is one of the
0:21:08 big things that’s like playing out in a lot of our culture and kind of political fights
0:21:12 right now, which is, you know, basically this feeling, you know, certainly that I have and
0:21:16 I share with a lot of people, which is it’s not enough for America to just be an economic
0:21:20 zone and it’s not enough for us to just be individuals and it’s not enough to just have
0:21:23 line go up and it’s not enough to just have economic success.
0:21:29 There are deeper questions at play and also, you know, there’s more to a country than just
0:21:30 that.
0:21:32 And, you know, quite frankly, a lot of it is intangible.
0:21:37 A lot of it is, you know, involved spirit and passion and, you know, like I said, we
0:21:41 have more of it than anybody else, but, you know, we have to choose to want it.
0:21:43 The way I look at it is like all of our problems are self-inflicted.
0:21:46 Like they’re, you know, decline is a choice.
0:21:50 You know, all of our problems are basically demoralization campaigns, you know, basically
0:21:53 people telling us, people in positions of authority telling us that, you know,
0:21:55 we shouldn’t, you know, stand out.
0:21:56 We shouldn’t be adventurous.
0:21:57 We shouldn’t be exciting.
0:21:58 We shouldn’t be exploratory.
0:22:01 You know, we shouldn’t, you know, this, that and the other thing and we should feel bad
0:22:02 about everything that we do.
0:22:06 And I think we’ve lived through a decade where that’s been the prevailing theme and I think
0:22:10 quite honestly, as of November, I think people are done with it.
0:22:14 If we could go on a tangent of a tangent, since we’re talking about individualism and
0:22:19 that’s not all that it takes, you’ve mentioned in the past the book The Ancient City by,
0:22:24 if I could only pronounce the name, the French historian Numa Denis Fustel de Coulanges.
0:22:25 I don’t know.
0:22:26 That was amazing.
0:22:27 Okay.
0:22:28 All right.
0:22:29 From the 19th century.
0:22:30 Anyway, you said this is an important book to understand who we are and where we come
0:22:31 from.
0:22:34 So what that book does, it’s actually quite a striking book.
0:22:40 So the book is written by this guy... and that was our foreign language
0:22:42 pronunciation for the day.
0:22:50 He was a professor of classics at the Sorbonne in Paris, you know, the top university in
0:22:51 the world, actually, in the 1860s.
0:22:57 So actually right around after the U.S. Civil War and he was a savant of a particular kind,
0:23:00 which is, and you can see this in the book, he had apparently read and sort of absorbed
0:23:06 and memorized every possible scrap of Greek and Roman literature, and so he’s like a walking
0:23:09 index on basically everything we know about Greek and Roman culture.
0:23:11 And that’s significant.
0:23:13 The reason this matters is because basically none of that has changed, right?
0:23:17 And so he had access to the exact same materials that we have access to.
0:23:19 And so, you know, we’ve learned nothing since.
0:23:21 And then he talked about the Greeks and the Romans, but specifically
0:23:23 what he did is he went back further.
0:23:26 He reconstructed the people who came before the Greeks and the Romans and what their life
0:23:27 and society was like.
0:23:30 And these were the people who are now known as the Indo-Europeans.
0:23:33 And these were, you may have heard of these, the people who came down from the
0:23:34 steppes.
0:23:37 And so they came out of what’s now like Eastern Europe, like around sort of the outskirts of
0:23:38 what’s now Russia.
0:23:40 And then they sort of swept through Europe.
0:23:44 They ultimately took over all of Europe, by the way, and, you know, many of the ethnicities
0:23:48 in the Americas, in the hundreds of years to follow, you know, are Indo-European.
0:23:51 So like, you know, they were basically this warrior class that, like, came
0:23:55 down and swept through and, you know, essentially, you know, populated
0:23:56 much of the world.
0:23:58 And there’s a whole interesting saga there.
0:24:01 And then from there came basically what
0:24:04 we know as the Greeks and the Romans; they were kind of evolutions off of that.
0:24:08 And so what he reconstructs is sort of what life was like, at least
0:24:11 in the West for people in their kind of original social state.
0:24:15 And the significance of that is, the original social state is living in the state
0:24:20 of the absolute imperative for survival with absolutely no technology, right?
0:24:22 Like no modern systems, no nothing, right?
0:24:23 You’ve got the clothes on your back.
0:24:27 You’ve got your, you know, you’ve got whatever you can build with your bare hands, right?
0:24:30 This, you know, predates basically all concepts of technology as we understand
0:24:31 it today.
0:24:35 And so these are people under like maximum levels of physical survival pressure.
0:24:37 And so what social patterns did they evolve to be able to do that?
0:24:43 And then the social pattern basically was as follows, is a three part social structure,
0:24:50 family, tribe and city and zero concept of individual rights and essentially no concept
0:24:51 of individualism.
0:24:54 And so you were not an individual, you were a member of your family.
0:24:58 And then a set of families would aggregate into a tribe and then a set of tribes would
0:25:01 aggregate into a city.
0:25:05 And then the morality was, it was actually what Nietzsche
0:25:08 talks about: the morality was entirely master morality, not slave morality.
0:25:12 And so in their morality, anything that was strong was good and anything that was weak
0:25:13 was bad.
0:25:14 And it’s very clear why that is, right?
0:25:18 It’s because strong equals good equals survive, weak equals bad equals die.
0:25:22 And that led to what became known later as the master-slave dialectic, which is: is it
0:25:25 more important for you to live on your feet as a master, even at the risk of dying?
0:25:28 Or are you willing to, you know, live as a slave on your knees in order to not die?
0:25:32 And this is sort of the, the derivation of that moral framework.
0:25:35 Christianity later inverted that moral framework, but it, you know, the original framework lasted
0:25:38 for, you know, many, many thousands of years.
0:25:40 No concept of individualism: the head of the family had total life and death control over
0:25:44 the family; the head of the tribe, same thing; head of the city, same thing.
0:25:48 And then you were morally obligated to kill members of the other cities on contact.
0:25:49 Right?
0:25:52 You were morally required to, like if you didn’t do it, you were a bad person.
0:25:59 Um, and then the form of the society was basically maximum fascism combined with maximum communism.
0:26:00 Right?
0:26:04 And so it was maximum fascism in the form of this, like absolute top-down control where
0:26:07 the head of the family tribe or city could kill other members of the community at any
0:26:10 time with no repercussions at all.
0:26:14 So maximum hierarchy, but combined with maximum communism, which is no market economy.
0:26:16 And so everything gets shared, right?
0:26:19 And sort of the point of being in one of these collectives is that it’s a collective and,
0:26:21 and, and, you know, and people are sharing.
0:26:24 And of course that limited how big they could get cause, you know, the problem with communism
0:26:25 is it doesn’t scale.
0:26:26 Right?
0:26:27 It works at the level of a family.
0:26:31 It’s much harder to make it work at the level of a country, impossible, maximum fascism,
0:26:32 maximum communism.
0:26:37 And then it was all intricately tied into their religion, and their religion
0:26:39 was in two parts.
0:26:43 It was a veneration of ancestors and it was veneration of nature.
0:26:47 And the veneration of ancestors is extremely important because, basically,
0:26:50 the ancestors were the people who got you to where you were; the ancestors were the people
0:26:52 who had everything to teach you.
0:26:53 Right?
0:26:55 And then it was veneration of nature cause of course nature is the thing that’s trying
0:26:56 to kill you.
0:27:00 Um, and then every family, tribe, or city had their ancestor gods,
0:27:02 and then they had their nature gods.
0:27:03 Okay.
0:27:04 So fast forward to today.
0:27:07 Like we live in a world that is like radically different, but the book takes you through
0:27:11 kind of what happened from that through the Greeks and Romans through to Christianity.
0:27:14 But it’s very helpful to kind of think in these terms because the
0:27:19 conventional view of the progress through time is, you know, the cliche is the
0:27:22 arc of the, you know, moral universe, you know, bends toward justice, right?
0:27:25 Or so-called Whig history, which is, you know, that the arc of progress is positive, right?
0:27:29 And so we, you know, what you hear all the time, what you’re taught in school and everything
0:27:32 is, you know, every year that goes by, we get better and better and more and more moral
0:27:35 and more and more we’re a better version of ourselves.
0:27:39 Our Indo-European ancestors would say, oh no, like you people have like fallen to shit.
0:27:43 Like you people took all of the principles of basically your civilization and you have
0:27:47 diluted them down to the point where they barely even matter, you know, and you’re having,
0:27:50 you know, children out of wedlock, and you, you know, regularly encounter people of
0:27:54 other cities and you don’t try to kill them and like, how crazy is that?
0:27:58 And they would basically consider us to be living like an incredibly diluted version of
0:28:01 this sort of highly religious, highly cult-like, right?
0:28:04 Highly organized, highly fascist, fascist communist society.
0:28:10 I can’t resist noting that as a consequence of basically going through all the transitions
0:28:14 we’ve been through, going all the way through Christianity, coming out the other end of Christianity,
0:28:18 Nietzsche declares God is dead, we’re in a secular society, you know, that still has,
0:28:21 you know, tinges of Christianity, but, you know, largely prides itself on no longer being
0:28:27 religious in that way, you know, we being the sort of most fully evolved, modern, secular,
0:28:32 you know, expert scientists and so forth have basically re-evolved or fallen back on the
0:28:36 exact same religious structure that the Indo-Europeans had, specifically ancestor worship,
0:28:42 which is identity politics, and nature worship, which is environmentalism.
0:28:45 And so we have actually like worked our way all the way back to their cult religions without
0:28:46 realizing it.
0:28:49 And it just goes to show that, like, you know, in some ways we have fallen far
0:28:53 from the family tree, but in some cases we’re exactly the same.
0:29:00 You kind of described this progressive idea of wokeism and so on as worshipping ancestors.
0:29:02 Identity politics is worshipping ancestors, right?
0:29:07 It’s tagging newborn infants with either, you know, benefits or responsibilities or, you
0:29:10 know, levels of condemnation based on who their ancestors were.
0:29:13 The Indo-Europeans would have recognized it on sight.
0:29:15 We somehow think it’s like super socially progressive.
0:29:16 Yeah.
0:29:17 And it is not.
0:29:19 I mean, I would say obviously not.
0:29:23 Let’s, you know, get to the nuances, which is where I think you’re headed, which is, look,
0:29:27 is the idea that you can like completely reinvent society every generation and have no regard
0:29:28 whatsoever for what came before you?
0:29:30 That seems like a really bad idea, right?
0:29:33 That’s like the Cambodians with Year Zero under Pol Pot and, you know, death, you know,
0:29:34 follows.
0:29:40 Obviously the Soviets tried that, you know. The utopian fantasists
0:29:43 who think that they can just rip up everything that came before and create something new
0:29:44 in the human condition
0:29:47 and human society have a very bad history of causing, you know, enormous destruction.
0:29:51 So on the one hand, it’s like, okay, there is like a deeply important role for tradition.
0:29:56 And the way I think about that is it’s the process of evolutionary learning, right?
0:30:00 Which is, what tradition ought to be is the distilled wisdom of all, and, you know, this
0:30:01 is not even what the Indo-Europeans thought about it.
0:30:04 It should be the distilled wisdom of everybody who came before you, right?
0:30:07 All those important and powerful lessons learned.
0:30:09 And that’s why I think it’s fascinating to go back and study how these people lived is
0:30:12 because that’s part of the history and, you know, part of the learning that got us
0:30:14 to where we are today.
0:30:17 Having said that, there are many cultures around the world that are, you know, mired
0:30:20 in tradition to the point of not being able to progress.
0:30:23 And in fact, you might even say globally, that’s the default human condition, which
0:30:26 is, you know, a lot of people are in societies in which, you know, there’s like absolute
0:30:30 seniority by age, kids are completely, you know... whereas in the U.S., like, for some
0:30:32 reason, we decided kids are in charge of everything, right?
0:30:35 And like, you know, they’re the trendsetters and they’re allowed to like set all the agendas
0:30:39 and like set all the politics and set all the culture and maybe that’s a little bit crazy.
0:30:42 But like in a lot of other cultures, kids have no voice at all, no role at all, because
0:30:46 it’s the old people who are in charge of everything, you know, they’re gerontocracies.
0:30:50 And it’s all a bunch of 80-year-olds running everything, which by the way, we have a little
0:30:52 bit of that too, right?
0:30:57 And so what I would say is, like, there’s a real downside, you know: full traditionalism
0:31:02 is communitarianism, you know, it’s ethnic particularism, you know, it’s ethnic chauvinism,
0:31:07 it’s, you know, this incredible level of resistance to change, you know, that’s, I mean, it just
0:31:08 doesn’t get you anywhere.
0:31:12 It may be good and fine at the level of an individual tribe, but as a society living
0:31:15 in the modern world, you can’t evolve, you can’t advance, you can’t participate in
0:31:18 all the good things that, you know, that have happened.
0:31:21 And so, you know, I think probably this is one of those things where extremeness on either
0:31:23 side is probably a bad idea.
0:31:29 But, you know, this needs to be approached in a sophisticated and nuanced way.
0:31:35 So the beautiful picture you painted of the roaring 20s, how can the Trump administration
0:31:37 play a part in making that future happen?
0:31:38 Yeah.
0:31:42 So look, a big part of this is getting the government boot off the neck of the American
0:31:47 economy, the American technology industry, the American people, you know, and then again,
0:31:50 this is a replay of what happened in the 60s and 70s, which, you know, started
0:31:54 out with, I’m sure, good and virtuous purposes, and, you know, we ended
0:31:57 up, both then and now, with what I describe as sort of a form of soft
0:32:01 authoritarianism, you know, the good news is it’s not like a military dictatorship.
0:32:05 It’s not like, you know, you get thrown into the Lubyanka; you know, for the most part, they’re
0:32:07 not coming at four in the morning, you’re not getting dragged off to a cell.
0:32:10 So it’s not hard authoritarianism, but it is soft authoritarianism.
0:32:15 And so it’s this, you know, incredibly suppressive blanket of regulations and rules, you know, this
0:32:17 concept of a vetocracy, right?
0:32:20 What’s required to get anything done, you know, you need to get 40 people to sign off
0:32:24 on anything, any one of them can veto it, you know; that’s a lot of how our political
0:32:26 system now works.
0:32:30 And then, you know, just this general idea of, you know, progress is bad and technology
0:32:34 is bad and capitalism is bad and building businesses is bad and success is bad.
0:32:39 You know, tall poppy syndrome, you know, basically anybody who sticks their head up,
0:32:41 you know, deserves to get it, you know, chopped off.
0:32:44 Anybody who’s wrong about anything deserves to get condemned forever.
0:32:49 You know, just this very kind of, you know, grinding, you know, repression and then coupled
0:32:55 with specific government actions such as censorship regimes, right and debanking, right?
0:33:00 And you know, draconian, you know, deliberately kneecapping, you know, critical American industries.
0:33:03 And then, you know, patting yourself on the back for doing it or, you know, having
0:33:06 these horrible social policies like let’s let all the criminals out of jail and see what
0:33:07 happens.
0:33:08 Right.
0:33:11 And so like, we’ve just been through this period, you know, I call it a demoralization
0:33:14 campaign, where, you know, whether it started
0:33:17 that way or not, it ended up basically being this comprehensive message that says you’re
0:33:22 terrible and if you try to do anything, you’re terrible and fuck you.
0:33:25 And the Biden administration reached kind of the full pinnacle of that in our time.
0:33:29 They got really bad on many fronts at the same time.
0:33:34 And so just like relieving that and getting kind of back to a reasonably, you know, kind
0:33:40 of optimistic, constructive, you know, pro-growth frame of mind; there’s just so much
0:33:43 pent-up energy and potential in the American system that that alone is gonna, I think, cause,
0:33:46 you know, growth and spirit to take off.
0:33:49 And then, yeah, there’s a lot of things
0:33:50 proactively that could be done.
0:33:52 So how do you relieve that?
0:33:59 To what degree has the thing you described ideologically permeated government and permeated
0:34:00 big companies?
0:34:03 Disclaimer at first, which is I don’t want to predict anything on any of this stuff because
0:34:08 I’ve learned the hard way that I can’t predict politics or Washington at all.
0:34:11 But I would just say that the plans and intentions are clear and the staffing supports it.
0:34:15 And all the conversations with the new administration are consistent, and they plan
0:34:19 to take, you know, very rapid action on a lot of these fronts very quickly.
0:34:21 They’re gonna do as much as they can through executive orders and then they’re gonna do
0:34:24 legislation and regulatory changes for the rest.
0:34:26 And so they’re gonna move, I think, quickly on a whole bunch of stuff.
0:34:29 You can already feel, I think, a shift in the national spirit, or at least, let’s put
0:34:30 it this way.
0:34:33 I feel it for sure in Silicon Valley, like, you know, I mean, we just
0:34:36 saw a great example of this with what Mark Zuckerberg is doing.
0:34:39 You know, obviously I’m involved with his company, but, you know, we just saw it kind
0:34:44 of in public, the scope and speed of the changes, you know, are reflective of sort of this, of
0:34:45 a lot of these shifts.
0:34:49 But I would say that that same conversation, those same kinds of things are happening throughout
0:34:50 the industry, right.
0:34:54 And so the tech industry itself, whether people were pro-Trump or anti-Trump, like there’s
0:34:57 just like a giant vibe shift, mood shift, that’s like kicked in already.
0:35:02 And then I was with a group of Hollywood people about two weeks ago, and they were still,
0:35:04 you know, people who, at least vocally, were still very anti-Trump.
0:35:08 But I said, you know, has anything changed since November 6th?
0:35:10 And they immediately said, oh, it’s completely different.
0:35:15 It feels like the ice has thawed, you know, woke is over, you know; they said that all kinds
0:35:18 of projects are going to be able to get made now that couldn’t before, that, you know, they’re
0:35:20 probably going to start making comedies again.
0:35:24 You know, like, they were just like... it’s just like an incredible,
0:35:26 immediate environmental change.
0:35:30 And I’m, as I talk to people kind of throughout, you know, certainly throughout the economy,
0:35:33 people who run businesses, I hear that all the time, which is just: this last 10
0:35:34 years of misery is just over.
0:35:38 I mean, the one that I’m watching that’s really funny... I mean, Facebook is the one that
0:35:39 is getting a lot of attention.
0:35:42 But the other funny one is BlackRock, which, you know... I don’t know him,
0:35:44 but I’ve watched him for a long time.
0:35:48 And so, you know, Larry Fink, the CEO of BlackRock, was like first in as a major, you
0:35:56 know, investment CEO on like every dumb social trend and rule set, like every... all right,
0:36:03 I’m going for it: every retarded thing you can imagine, every ESG, and every,
0:36:08 like, every possible way of saddling companies with every aspect of just these crazed ideological
0:36:09 positions.
0:36:12 And, you know, he was coming in, he literally had aggregated together trillions
0:36:17 of dollars of shareholdings that he did not own, that were, you know, that were
0:36:21 his customers’, right, and he, you know, seized the voting control of their shares
0:36:24 and was using it to force all these companies to do all of this, like crazy ideological
0:36:25 stuff.
0:36:27 And he was like the Typhoid Mary of all this stuff in corporate America.
0:36:31 And he, in the last year, has been like backpedaling from that stuff, like as fast as he possibly
0:36:32 can.
0:36:35 And I saw just an example last week, he pulled out of the, whatever the corporate net zero
0:36:39 alliance, you know, he pulled out of the crazy energy stuff.
0:36:42 And so like, you know, he’s backing away as fast as he can.
0:36:43 He’s doing it.
0:36:46 Remember the Richard Pryor backwards walk? Richard Pryor had this way where he could
0:36:50 back out of a room while looking like he was walking forward.
0:36:54 And so, you know, even they’re doing that.
0:36:58 And just the whole thing, I mean, if you saw the court recently ruled that NASDAQ had these
0:37:03 crazy board of directors composition rules, one of the funniest moments of my life is
0:37:07 when my friend Peter Thiel and I were on the Meta board and these NASDAQ rules
0:37:10 came down mandating diversity on corporate boards.
0:37:13 And so we sat around the table and had to figure out, you know, which of us counted as diverse
0:37:19 and the very professional attorneys at Meta explained with a 100% complete straight
0:37:24 face that Peter Thiel counts as diverse by virtue of being LGBT.
0:37:27 And this is a guy who literally wrote a book called The Diversity Myth.
0:37:33 And he literally looked like he swallowed a live goldfish and this was imposed.
0:37:36 I mean, this was like so incredibly offensive to him that like, it just like, it was just
0:37:37 absolutely appalling.
0:37:40 And I felt terrible for him, but the look in his face was very funny.
0:37:44 It was imposed by NASDAQ, you know, your stock exchange is imposing this stuff on you.
0:37:48 And then the court, whatever the court of appeals just nuked that, you know, it’s like
0:37:51 these things basically are being like ripped down one by one.
0:37:55 And what’s on the other side of it is basically, you know, finally being able to get back to,
0:37:58 you know, everything that, you know, everybody always wanted to do, which is like run their
0:38:03 companies, have great products, have happy customers, you know, like succeed,
0:38:07 achieve, outperform and, you know, work with the best and the brightest and not be made
0:38:08 to feel bad about it.
0:38:10 And I think that’s happening in many areas of American society.
0:38:15 It’s great to hear that Peter Thiel is fundamentally a diversity hire.
0:38:18 Well, so it was very, you know, there was a moment.
0:38:22 So Peter, of course, you know, is publicly gay, has been
0:38:26 for a long time, you know, but, you know, there are other men on the board, right?
0:38:28 And you know, we’re sitting there and we’re all looking at it and we’re like, all right,
0:38:32 like, okay, LGBT and we just, we keep coming back to the B, right?
0:38:32 And it’s like, you know, all right, I’m willing to do a lot for this
0:38:44 company, but it’s all about sacrifice for diversity.
0:38:45 Well, yeah.
0:38:47 And then it’s like, okay, like, is there a test?
0:38:48 Right.
0:38:49 You know?
0:38:50 Oh, yeah.
0:38:51 Exactly.
0:38:52 How do you prove it?
0:38:56 The questions that got asked, you know, what are you willing to do?
0:38:57 Yeah.
0:39:03 I think I’m very good at asking lawyers completely absurd questions with a totally straight face.
0:39:05 And do they answer with a straight face?
0:39:06 Sometimes.
0:39:07 Okay.
0:39:09 I think in fairness, they have trouble telling when I’m joking.
0:39:15 So you mentioned the Hollywood folks, maybe people in Silicon Valley and vibe shift.
0:39:19 Maybe you can speak to preference falsification.
0:39:21 What do they actually believe?
0:39:23 How many of them actually hate Trump?
0:39:31 But like what percent of them are feeling this vibe shift and are interested in creating
0:39:34 the roaring twenties in the way you’ve described?
0:39:36 So first we should maybe talk population.
0:39:40 So there’s like all of Silicon Valley and the way to just measure that is just look
0:39:41 at voting records.
0:39:42 Right.
0:39:44 And what that shows consistently is Silicon Valley, you know, at least historically,
0:39:49 my entire time there, has been overwhelmingly majority just straight-up Democrat.
0:39:51 The other way to look at that is political donation records.
0:39:57 And again, you know, the political donations in the Valley range from 90 to 99% to one side.
0:39:59 And so, you know, I just bring it up because, like, we’ll see what happens with
0:40:03 the voting and with donations going forward.
0:40:06 We can maybe talk about the fires later, but I can tell you there is a very big question of
0:40:08 what’s happening in Los Angeles right now.
0:40:11 I don’t want to get into the fire, but like it’s catastrophic and, you know, there was
0:40:14 already a rightward shift in the big cities in California.
0:40:18 And I think a lot of people in LA are really thinking about things right now as they’re
0:40:21 trying to, you know, literally save their houses and save their families.
0:40:24 But you know, even in San Francisco, there was a big shift to the
0:40:26 right in the voting in 24.
0:40:30 So we’ll see where that goes, but, you know, you observe that by just looking at the numbers
0:40:32 over time.
0:40:35 The part that I’m more focused on is, you know, and I don’t know how to exactly describe
0:40:39 this, but it’s like the top thousand or the top 10,000 people, right?
0:40:43 And you know, I don’t have a list, but like it’s the, you know, it’s all the top founders,
0:40:47 top CEOs, top executives, top engineers, top VCs, you know, and then kind of into the
0:40:51 ranks, you know, the people who kind of built and run the companies and they’re, you know,
0:40:58 I don’t have numbers, but I have a much more tactile feel, you know, for what’s happening.
0:41:04 So I, the big thing I have now come to believe is that the idea that people have beliefs
0:41:07 is mostly wrong.
0:41:11 I think that most people just go along.
0:41:13 And I think even most high status people just go along.
0:41:17 And I think maybe the most high status people are the most prone to just go along because
0:41:19 they’re the most focused on status.
0:41:24 And the way I would describe that is, you know, one of the great forbidden philosophers
0:41:29 of our time is the Unabomber, Ted Kaczynski, and amidst his madness, he had this extremely
0:41:30 interesting articulation.
0:41:35 You know, he was a, he was an insane lunatic murderer, but he was also a, you know, Harvard
0:41:44 super genius, not that those are in conflict, but he was a very bright guy and he did this
0:41:49 whole thing where he talked about, basically he was very right-wing and talked about leftism
0:41:50 a lot.
0:41:53 And he had this great concept that’s just stuck in my mind ever since I read it, which
0:41:57 is he had this concept he just called oversocialization.
0:42:00 And so, you know, most people are socialized; like, most
0:42:04 people are, you know... we live in a society, most people learn how to be part of a society,
0:42:06 they give some deference to the society.
0:42:10 There’s something about modern Western elites where they’re oversocialized and they’re just
0:42:16 like overly oriented towards what other people like themselves, you know, think and believe
0:42:20 and you can get a real sense of that if you have a little bit of an outside perspective,
0:42:25 which I just do, I think as a consequence of where I grew up, like even before I had
0:42:28 the views that I have today, there was always just this weird thing where it’s like, why
0:42:31 does every dinner party have the exact same conversation?
0:42:34 Why does everybody agree on every single issue?
0:42:39 Why is that agreement precisely what was in the New York Times today?
0:42:44 Why are these positions not the same as they were five years ago, right?
0:42:47 But why does everybody like snap into agreement every step of the way?
0:42:51 And that was true when I came to Silicon Valley, and it’s just as true today, 30 years later.
0:42:55 And so I think most people are just literally taking their cues from
0:42:59 some combination of the press, the universities, the big foundations, so it’s like basically
0:43:04 it’s like the New York Times, Harvard, the Ford Foundation, and you know, I don’t know,
0:43:08 you know, a few CEOs and a few public figures and, you know, maybe the president,
0:43:13 if your party’s in power, and like whatever that is, everybody who’s sort
0:43:18 of good and proper and elite and good standing and in charge of things and a sort of correct
0:43:21 member of, you know, let’s call it coastal American society, everybody just believes
0:43:22 those things.
0:43:26 And then, you know, the two interesting things about that is number one, there’s no divergence
0:43:28 among the organs of power, right?
0:43:31 So Harvard and Yale believe the exact same thing, the New York Times, the Washington
0:43:34 Post believe the exact same thing, the Ford Foundation, the Rockefeller Foundation believe
0:43:38 the exact same thing, Google and you know, whatever, you know, Microsoft believe the
0:43:40 exact same thing.
0:43:43 But those things change over time.
0:43:46 But there’s never conflict in the moment, right?
0:43:50 And so, you know, the New York Times and the Washington Post agreed on exactly everything
0:43:58 in 1970, 1980, 1990, 2000, 2010 and 2020, despite the fact that the specifics changed radically,
0:43:59 the lockstep was what mattered.
0:44:03 And so I think basically we in the Valley, we’re on the tail end of that in the same
0:44:05 way, Hollywood’s the tail end of that in the same way, New York’s the tail end of that,
0:44:08 the same way the media is on the tail end of that.
0:44:10 It’s like some sort of collective hive mind thing.
0:44:13 And I just go through that to say like, I don’t think most people in my orbit, or you
0:44:18 know, say the top 10,000 people in the Valley, or the top 10,000 people in LA, I don’t think
0:44:21 they’re sitting there thinking, basically, I have... I mean, they probably think
0:44:25 they have rock-solid beliefs, but they don’t actually have like some inner core of rock-solid
0:44:26 beliefs.
0:44:28 And then they kind of watch reality change around them and try to figure out how to keep
0:44:30 their beliefs, like correct, I don’t think that’s what happens.
0:44:34 I think what happens is they conform to the belief system around them.
0:44:37 And I think most of the time they’re not even aware that they’re basically part of
0:44:38 a herd.
0:44:45 Is it possible that the surface chatter of dinner parties, underneath that there is
0:44:50 a turmoil of ideas and thoughts and beliefs that’s going on, but you’re just talking to
0:44:55 people really close to you or in your own mind, and the socialization happens at the
0:45:01 dinner parties, like when you go outside the inner circle of one, two, three, four people
0:45:03 who you really trust, then you start to conform.
0:45:09 But inside there, inside the mind, there is an actual belief or a struggle, attention
0:45:17 within New York Times or with the listener, there’s a slow smile that overtook Marc Andreessen’s
0:45:18 face.
0:45:21 So look, I’ll just tell you what I think, which is at the dinner parties and at the
0:45:24 conferences, no, there’s none of that.
0:45:27 What there is is this: all of the heretical conversations, anything that challenges
0:45:33 the status quo, any heretical idea, and any new idea is a heretical idea,
0:45:36 any deviation, it’s either discussed one-on-one, face-to-face,
0:45:40 like a whisper network, or it’s like a real-life social network.
0:45:43 There’s a secret handshake, which is like, okay, you meet somebody and you know each
0:45:47 other a little bit, but not well, and you’re both trying to figure out if you can talk
0:45:50 to the other person openly or whether you have to be fully conformist.
0:45:51 It’s a joke.
0:45:52 Oh, yeah.
0:45:53 Humor.
0:45:54 I’m sorry.
0:45:55 Somebody cracks a joke.
0:45:56 Somebody cracks a joke.
0:45:59 If the other person laughs, the conversation is on.
0:46:05 If the other person doesn’t laugh, you back slowly away from the scene, I didn’t mean anything
0:46:06 by it.
0:46:08 And by the way, it doesn’t have to be like a super offensive joke.
0:46:12 It just has to be a joke that’s just up against the edge of one of the, to use the Sam Bankman-Fried
0:46:18 term, one of the shibboleths, it has to be up against one of
0:46:21 the things that you’re absolutely required to believe to be at the dinner parties.
0:46:24 And then at that point, what happens is you have a peer-to-peer network.
0:46:30 You have a one-to-one connection with somebody, and then you have your little conspiracy of
0:46:32 a thought criminality.
0:46:35 And then you have your network, you’ve probably been through this, you have your network of
0:46:37 thought criminals, and then they have their network of thought criminals, and then you
0:46:41 have this like delicate mating dance as to whether you should bring the thought criminals together.
0:46:42 Right?
0:46:46 And the dance, the fundamental mechanism of the dance is humor.
0:46:47 Yeah, it’s humor.
0:46:48 Right.
0:46:49 Well, of course.
0:46:50 Memes.
0:46:51 Yeah.
0:46:52 Well, for two reasons.
0:46:53 Number one, humor is a way to have deniability.
0:46:55 It’s a way to discuss these things while still having deniability.
0:46:56 Oh, I’m sorry.
0:46:57 It was just a joke, right?
0:46:58 So that’s part of it.
0:47:00 Which is one of the reasons why comedians can get away with saying things the rest of
0:47:01 us can’t.
0:47:04 Because they can always fall back on, “Oh, yeah, I was just going for the laugh.”
0:47:08 But the other key thing about humor, right, is that laughter is involuntary, right?
0:47:09 Like you either laugh or you don’t.
0:47:12 And it’s not like a conscious decision whether you’re going to laugh, and everybody can tell
0:47:14 when somebody’s fake laughing, right?
0:47:16 And every professional comedian knows this, right?
0:47:18 The laughter is the clue that you’re onto something truthful.
0:47:21 Like people don’t laugh at like made up bullshit stories.
0:47:24 They laugh because like you’re revealing something that they either have not been allowed to
0:47:27 think about or have not been allowed to talk about, right?
0:47:28 Or is off limits.
0:47:31 And all of a sudden, it’s like the ice breaks and it’s like, “Oh, yeah, that’s the thing.
0:47:32 And it’s funny.”
0:47:33 And like I laugh.
0:47:36 And then of course, this is why live comedy is so powerful: because
0:47:37 you’re all doing that at the same time.
0:47:38 So you start to have, right,
0:47:39 the safety of, you know, the safety of numbers.
0:47:43 And so the comedians have, like, it’s no surprise to me, for example, that
0:47:46 Joe has been as successful as he has, because they have this hack that the, you
0:47:50 know, the rest of us who are not professional comedians don’t have, but you have your in-person
0:47:51 version of it.
0:47:52 Yeah.
0:47:53 And then you’ve got the question of whether you can sort of join the networks
0:47:54 together.
0:47:57 And then you’ve probably been to this, as, you know, at some point there’s like,
0:48:00 there’s like the alt dinner party, the thought criminal dinner party, and you get six or eight
0:48:02 people together and you join the networks.
0:48:05 And those are like the happiest moments, at least in the last decade, those are like the
0:48:08 happiest moments of everybody’s lives because they’re just like, everybody’s just ecstatic
0:48:12 because they’re like, “I don’t have to worry about getting yelled at and shamed like for
0:48:16 every third sentence that comes out of my mouth and we can actually talk about real things.”
0:48:17 So that’s the live version of it.
0:48:22 And then of course the other side of it is the, you know, the group chat phenomenon, right?
0:48:26 And then basically the same thing played out, you know, until Elon bought X and until
0:48:30 Substack took off, you know, which were really the two big breakthroughs in free speech online.
0:48:33 The same dynamic played out online, which is you had absolute conformity on the social
0:48:37 networks, like literally enforced by the social networks themselves through censorship and
0:48:41 then also through cancellation campaigns and mobbing and shaming, right?
0:48:45 But then you had, but then group chats grew up to be the equivalent of samizdat, right?
0:48:50 Anybody who grew up in the Soviet Union under communism, you know, they had the hard version
0:48:51 of this, right?
0:48:53 It’s like, how do you know who you could talk to and then how do you distribute information
0:48:58 and, you know, like, you know, again, that was the hard authoritarian version of this.
0:49:01 And then we’ve been living through this weird mutant, you know, softer authoritarian version
0:49:03 but with, you know, with some of the same patterns.
0:49:10 And WhatsApp allows you to scale and make it more efficient to build these groups
0:49:13 of heretical ideas bonded by humor.
0:49:14 Yeah, exactly.
0:49:15 Well, and this is the thing.
0:49:16 This is kind of the running joke about group chats, right?
0:49:20 The running kind of thing about group chats, it’s not even a joke, it’s like, if
0:49:23 you’ve noticed this, this principle of group chats: every group
0:49:26 chat ends up being about memes and humor.
0:49:29 And the goal of the game, the game of the group chat is to get as close to the line
0:49:34 of being actually objectionable as you can get without actually tripping it, right?
0:49:38 And like literally every group chat that I have been in for the last decade, even if
0:49:42 it starts in some other direction, what ends up happening is it becomes the absolute comedy
0:49:47 fest, where they walk right at the line and they’re constantly testing.
0:49:49 And every once in a while, somebody will trip the line and people will freak out and it’s
0:49:50 like, oh, too soon.
0:49:53 Okay, you know, we got to wait until next year to talk about that, you know, they walk
0:49:54 it back.
0:49:55 And so it’s that same thing.
0:49:57 And yeah, and then group chats as a technological phenomenon.
0:50:00 It was amazing to see, because basically it was, number one, you know, obviously
0:50:05 the rise of smartphones, then it was the rise of the new messaging services, then it was
0:50:09 the rise specifically of, I would say, the combination of WhatsApp and Signal.
0:50:13 And the reason for that is those were the two big systems that did the full encryption.
0:50:15 So you actually felt safe.
0:50:20 And then the real breakthrough, I think, was disappearing messages, which hit Signal probably
0:50:25 four or five years ago and hit WhatsApp three or four years ago.
0:50:31 And then the combination of encryption and disappearing messages, I think really unleashed
0:50:32 it.
0:50:35 Well, then there’s the fight over the length of the disappearing messages, right?
0:50:38 And so it’s like, you know, I often get behind on my messages.
0:50:43 So I set it to seven-day, you know, disappearing messages, and my friends are like, no,
0:50:44 that’s way too much risk.
0:50:45 Yeah.
0:50:46 It’s got to be a day.
0:50:48 And then every once in a while, somebody will set to five minutes before they send something
0:50:49 like particularly inflammatory.
0:50:50 Yeah.
0:50:51 100%.
0:50:54 Well, I mean, one of the things that bothers me about WhatsApp: the choice is
0:50:58 between, you know, 24 hours and seven days, one day or seven days.
0:51:04 And I have to have an existential crisis about deciding whether I can last for seven days
0:51:06 with what I’m about to say.
0:51:07 Exactly.
0:51:09 Now, of course, what’s happening right now is the big thaw, right?
0:51:10 And so the vibe shift.
0:51:14 So what’s happening on the other side of the election is, you know, Elon on Twitter two
0:51:17 years ago and now Mark with Facebook and Instagram.
0:51:20 And by the way, with the continued growth of Substack and with other, you know, new platforms
0:51:24 that are emerging, you know, like, I don’t know that everything
0:51:29 just shifts back into public, but like a tremendous amount of the
0:51:33 verboten conversations, you know, can now shift back into public view.
0:51:36 And I mean, quite frankly, this is one of those things, you know, quite frankly, even
0:51:40 if I was opposed to what those people are saying, and I’m sure I am in some cases, you
0:51:43 know, I would argue it’s still like net better for society that those things happen in public
0:51:49 instead of private, you know, do you really want, like, yeah, like, don’t you want to
0:51:50 know?
0:51:53 And so, look, it’s just, I think, clearly much healthier to live
0:51:56 in a society in which people are not literally scared about what they’re saying.
0:52:01 I mean, to push back, to come back to this idea that we’re talking about, I do believe
0:52:05 that people have beliefs and thoughts that are heretical, like a lot of people.
0:52:09 I wonder what fraction of people have that.
0:52:12 To me, this is why preference falsification is really interesting.
0:52:18 What is the landscape of ideas that human civilization has in private as compared to
0:52:25 what’s out in public? Because the dynamical system that is the difference
0:52:30 between those two is fascinating. Like, throughout history, the fall of communism
0:52:36 in multiple regimes throughout Europe is really interesting, because everybody was following,
0:52:43 you know, the line until they weren’t. But for sure, privately, there was a huge number
0:52:49 of boiling conversations happening, where the bureaucracy of communism,
0:52:53 the corruption of communism, all of that was really bothering people more and more and
0:52:54 more and more.
0:52:58 And all of a sudden, there’s a trigger that allows the vibe shift to happen.
0:53:05 So to me, the interesting question here is what is the landscape of private thoughts
0:53:12 and ideas and conversations that are happening under the surface among Americans. Especially,
0:53:17 my question is how much dormant energy is there for this roaring twenties where people
0:53:18 are like, no more bullshit.
0:53:19 Let’s get shit done.
0:53:20 Yeah.
0:53:21 So let’s go through that.
0:53:22 We’ll go through the theory of preference falsification.
0:53:23 Yeah.
0:53:24 Just, by the way, amazing.
0:53:26 The books on this are fascinating.
0:53:27 Yeah.
0:53:28 Yeah.
0:53:29 Great books.
0:53:32 Incredible, it’s about a 20, 30 year old book, but it’s completely modern and current
0:53:36 in what it talks about, as well as very deeply historically informed.
0:53:42 So it’s called Private Truths, Public Lies, and it’s written by a social science professor
0:53:46 named Timur Kuran at, I think, Duke.
0:53:47 And it’s the definitive work on this.
0:53:50 And so he, he has this concept, he calls preference falsification.
0:53:53 And so preference falsification is two things.
0:53:56 And you get it from the title of the book, Private Truths, Public Lies.
0:54:00 So preference falsification is when you believe something and you can’t say it.
0:54:05 Or, and this is very important, you don’t believe something and you must say it, right?
0:54:10 And the commonality there is, in both cases, you’re lying: you believe
0:54:13 something internally and then you’re lying about it in public.
0:54:17 And there’s sort of two classic forms of it.
0:54:20 There’s, you know, for example, the I believe communism is rotten, but I
0:54:21 can’t say it, version of it.
0:54:26 But then there’s also the famous parable, the real-life example,
0:54:30 the thing that Václav Havel talks about in the other good book on this topic, which
0:54:34 is The Power of the Powerless. You know, he was an anti-communist resistance fighter
0:54:37 who ultimately became the, you know, the president of Czechoslovakia after the fall
0:54:38 of the wall.
0:54:42 But he wrote this book and he, he describes the other side of this, which is workers
0:54:44 of the world unite, right?
0:54:48 And so he describes what he calls the parable of the greengrocer, which is: you’re a greengrocer
0:54:51 in Prague in 1985.
0:54:54 And for the last 70 years, it has been, or it’s 50 years, it’s been absolutely mandatory
0:54:59 to have a sign in the window of your store that says workers of the world unite, right?
0:55:00 And it’s 1985.
0:55:04 It is like crystal clear that the workers of the world are not going to unite.
0:55:08 Like all the things that could happen in the world, that is not going to happen.
0:55:10 The commies have been at that for 70 years.
0:55:11 It is not happening.
0:55:13 But that slogan had better be in your window every morning, because if it’s not in your
0:55:16 window every morning, you are not a good communist.
0:55:19 The secret police are going to come by and they’re going to, they’re going to get you.
0:55:21 And so the first thing you do when you get to the store is you put that slogan in the
0:55:23 window and you make sure that it stays in the window all day long.
0:55:27 But he says the thing is every single person, the greengrocer knows the slogan is fake.
0:55:29 He knows it’s a lie.
0:55:32 Every single person walking past the slogan knows that it’s a lie.
0:55:35 Every single person walking past the store knows that the greengrocer is only putting
0:55:38 it up there because he has to lie in public.
0:55:42 And the greengrocer has to go through the humiliation of knowing that everybody knows
0:55:44 that he’s caving into the system and lying in public.
0:55:48 And so it turns into a demoralization campaign.
0:55:50 It’s not just ideological enforcement.
0:55:54 In fact, it’s not ideological enforcement anymore because everybody knows it’s fake.
0:55:55 The authorities know it’s fake.
0:55:56 Everybody knows it’s fake.
0:55:59 It’s not that they’re enforcing the actual ideology of the workers of the world
0:56:00 uniting.
0:56:05 It’s that they are enforcing compliance and compliance with the regime and fuck you, you
0:56:06 will comply.
0:56:09 And so anyway, that’s the other side of that.
0:56:13 And of course, we have lived in the last decade through a lot of both of those.
0:56:17 I think anybody listening to this could name a series of slogans that we’ve all been forced
0:56:20 to chant for the last decade that everybody knows at this point are just like simply not
0:56:21 true.
0:56:26 I’ll let the audience speculate on their own group chats.
0:56:29 Send Marc your memes online as well, please.
0:56:30 Yes, yes, exactly.
0:56:32 But okay, so anyway, so it’s the two sides of that, right?
0:56:36 So it’s private truth, it’s public lies.
0:56:39 So then what preference falsification does is it talks about extending that from the
0:56:42 idea of the individual experience of that to the idea of the entire society experiencing
0:56:43 that, right?
0:56:47 That’s just your percentages question, which is like, okay, what happens in a society in
0:56:49 which people are forced to lie in public about what they truly believe?
0:56:52 What happens, number one, is that individually they’re lying in public and that’s bad.
0:56:56 But the other thing that happens is they no longer have an accurate gauge at all or any
0:56:59 way to estimate how many people agree with them.
0:57:02 And this is how, again, this literally is like how you get something like the communist
0:57:08 system, which is like, okay, you end up in a situation in which 80 or 90 or 99% of society
0:57:11 can actually all be thinking individually, I really don’t buy this anymore.
0:57:14 And if anybody would just stand up and say it, I would be willing to go along with it,
0:57:17 but I’m not going to be the first one to put my head on the chopping block.
0:57:21 But because of the suppression and censorship, you have no way of knowing how
0:57:22 many other people agree with you.
0:57:26 And if the people who agree with you are 10% of the population and you become
0:57:29 part of a movement, you’re going to get killed.
0:57:33 If 90% of the people agree with you, you’re going to win the revolution, right?
0:57:37 And so the question of like what the percentage actually is, is like a really critical question.
0:57:41 And then basically, in any sort of authoritarian system, you can’t like run a survey to get
0:57:42 an accurate result.
0:57:45 And so you actually can’t know until you put it to the test.
0:57:47 And then what he describes in the book is it’s always put to the test in the same way.
0:57:51 And this is exactly what’s happened for the last two years, like 100% of exactly what’s
0:57:52 happened.
0:57:58 It’s like straight out of this book, which is somebody, Elon sticks his hand up and says,
0:58:02 the workers of the world are not going to unite, right, or the emperor is actually wearing
0:58:03 no clothes, right?
0:58:05 You know, that famous parable, right?
0:58:08 So one person stands up and does it and literally that person is standing there by themselves
0:58:12 and everybody else in the audience is like, ooh, I wonder what’s going to happen to that
0:58:13 guy.
0:58:14 Right.
0:58:15 But again, nobody knows.
0:58:16 Elon doesn’t know.
0:58:17 The first guy doesn’t know.
0:58:19 Other people don’t know, like, which way is this going to go?
0:58:22 And it may be that that’s a minority position and that’s a way to get yourself killed.
0:58:26 Or it may be that that’s the majority position and that and you are now the leader of a revolution.
0:58:29 And then basically, of course, what happens is, okay, the first guy does that. Does he get
0:58:30 killed?
0:58:33 Well, a lot of the time that guy does get killed, but when the
0:58:36 guy doesn’t get killed, then a second guy pops his head up, says the same thing.
0:58:37 All right.
0:58:40 Now you’ve got two, two leads to four, four leads to eight, eight leads to 16.
0:58:44 And then as we saw with the fall of the Berlin Wall, this is what happened in Russia and
0:58:47 Eastern Europe in ’89, when it goes, it can go, right?
0:58:49 And then it rips.
0:58:53 And then what happens is very, very quickly, if it turns out that you had a large percentage
0:58:56 of the population that actually believed the different thing, it turns out all of a sudden
0:59:00 everybody has this giant epiphany that says, oh, I’m actually part of the majority.
0:59:05 And at that point, like, you’re on the freight train of revolution, right, like, it is rolling,
0:59:06 right?
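The cascade described above, one heretic, then two, then four, until the hidden majority discovers itself, is essentially the threshold model from Kuran’s book, and it fits in a few lines of Python. What follows is a minimal sketch; the population size, the threshold values, and the 20/20/60 composition are illustrative assumptions, not numbers from the conversation.

```python
def cascade_size(thresholds):
    """Kuran/Granovetter-style preference cascade: person i dissents in
    public once at least thresholds[i] others are already visibly
    dissenting. Iterate to a fixed point and return the equilibrium
    number of public dissenters."""
    n_public = 0
    while True:
        n_next = sum(1 for t in thresholds if t <= n_public)
        if n_next == n_public:
            return n_public
        n_public = n_next

# Illustrative population of 100 (all numbers are assumptions):
loyalists     = [10**9] * 20         # 20 true believers: never flip
counter_elite = list(range(1, 21))   # 20 heretics who flip after seeing 1..20 peers
quiet_middle  = list(range(21, 81))  # 60 conformists who flip after seeing 21..80 peers

population = loyalists + counter_elite + quiet_middle
print(cascade_size(population))        # 0: nobody is willing to go first
print(cascade_size([0] + population))  # 81: one first mover tips everyone but the loyalists
```

Remove the single zero-threshold first mover and the identical population stays silent forever; the gap between those two equilibria, 0 versus 81, is the preference-falsification point, and it is also why nobody, including the first mover, can know in advance which way it will go.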
0:59:11 Now, the other part of this is the distinction between the role of the elites and the masses.
0:59:14 And here, the best book is called The True Believer, which is the Eric Hoffer book.
0:59:20 And so the nuance you have to put on this is the elites play a giant role in this, because
0:59:24 the elites do idea formation and communication, but the elites by definition are a small minority.
0:59:28 And so there’s also this giant role played by the masses, and the masses are not necessarily
0:59:32 thinking these things through in the same intellectualized, formal way that the elites
0:59:33 are.
0:59:36 But they are for sure experiencing these things in their daily lives, and they for sure have
0:59:38 at least very strong emotional views on them.
0:59:42 And so when you really get the revolution, it’s when you get the elites lined up with
0:59:46 or either the current elites change or the new set of elites, a new set of counter elites
0:59:50 basically come along and say, no, there’s actually a different and better way to live.
0:59:53 And then the people basically decide to follow the counter elite.
0:59:55 So that’s the other dimension to it.
0:59:57 And of course, that part is also happening right now.
1:00:00 And again, case study number one of that would be Elon and his, you know, it turns
1:00:03 out, truly massive following.
1:00:07 And he has done that over and over in different industries, not just saying crazy shit online,
1:00:13 but saying crazy shit in the realm of space, in the realm of autonomous driving, in the
1:00:17 realm of AI, just over and over and over again, turns out saying crazy shit is one of the
1:00:20 ways to do a revolution and to actually make progress.
1:00:21 Yeah.
1:00:22 And it’s like, well, but then there’s the test.
1:00:23 Is it crazy shit?
1:00:24 Or is it the truth?
1:00:25 Yeah.
1:00:27 And, you know, and this is where, you know, many, there are many more specific things
1:00:31 about Elon’s genius, but one of the, one of the really core ones is an absolute dedication
1:00:32 to the truth.
1:00:36 And so when Elon says something, it sounds like crazy shit, but in his mind, it’s true.
1:00:37 Now is he always right?
1:00:38 No.
1:00:39 Sometimes the rockets crash.
1:00:40 Like, you know, sometimes he’s wrong.
1:00:41 He’s human.
1:00:42 He’s like anybody else.
1:00:43 He’s not right all the time.
1:00:46 But at least my, my through line with him, both in what he says in public and what he
1:00:49 says in private, which by the way, are the exact same things.
1:00:50 He does not do this.
1:00:52 He doesn’t lie in public about what he believes in private, or at least he doesn’t do that
1:00:53 anymore.
1:00:56 But it’s 100% consistent in my, in my experience.
1:01:00 By the way, there’s two guys who are 100% consistent like that, that I know, um, Elon
1:01:01 and Trump.
1:01:02 Yeah.
1:01:06 Whatever you think of them, what they say in private is 100% identical to what they
1:01:07 say in public.
1:01:08 Like they are completely transparent.
1:01:10 They’re completely honest in that way, right?
1:01:13 Which is like, and again, it’s not like they’re perfect people, but they’re honest in that
1:01:14 way.
1:01:17 And it makes them potentially both, as they have been very powerful leaders of these
1:01:21 movements, because they’re both willing to stand up and say the thing that if it’s true,
1:01:25 it turns out to be the thing in many cases that, you know, many or most or almost everyone
1:01:28 else actually believes, but nobody was actually willing to say out loud.
1:01:29 And so they can actually catalyze these shifts.
1:01:33 And I mean, I think this framework is exactly why Trump took over the Republican Party.
1:01:36 I think Trump stood up there on stage with all these other kind of conventional Republicans
1:01:39 and he started saying things out loud that it turned out the base really was, they were
1:01:42 either already believing or they were prone to believe.
1:01:43 And he was the only one who was saying them.
1:01:47 And so, again, elites and masses: he was the elite, the voters were the masses, and the voters
1:01:52 decided, you know, no more Bushes, like we’re going this other direction.
1:01:53 That’s the mechanism of social change.
1:01:56 Like what we just described is like the actual mechanism of the social change.
1:01:59 It is fascinating to me that we have been living through exactly this.
1:02:03 We’ve been moving through exactly what Timur Kuran describes, everything that
1:02:08 Václav Havel described, you know, black squares on Instagram, like the whole thing, right?
1:02:09 All of it.
1:02:14 And we’ve been living through the, you know, The True Believer elites-and-masses, you know,
1:02:17 thing, with a set of basically incredibly corrupt elites wondering
1:02:19 why they don’t have the masses anymore and a set of new elites that are running away
1:02:20 with things.
1:02:24 And so like we’re, we’re living through this like incredible applied case study of these
1:02:25 ideas.
1:02:28 And, you know, if there’s a moral of the story, it is, you know, I think fairly obvious, which
1:02:33 is it is a really bad idea for a society to wedge itself into a position in which most
1:02:36 people don’t believe the fundamental precepts of what they’re told they have to do, you
1:02:40 know, to be good people. Like, that is just not a good state to be in.
1:02:44 So one of the ways to avoid that in the future, maybe is to keep the delta between what’s
1:02:47 said in private and what’s said in public small.
1:02:48 Yeah.
1:02:50 It’s like, well, this is sort of the, the siren song of censorship is we can keep people
1:02:54 from saying things, which means we can keep people from thinking things.
1:02:57 And you know, by the way, that may work for a while, right?
1:03:00 Like, you know, I mean, again, the hard form: the Soviet
1:03:05 Union. Pre-photocopiers, there were mimeograph machines that were
1:03:08 used to make samizdat underground newspapers, which was the mechanism of written communication
1:03:12 of radical ideas.
1:03:14 Ownership of a mimeograph machine was punishable by death.
1:03:15 Right?
1:03:18 So that’s the hard version, right?
1:03:21 You know, the soft version is somebody clicks a button in Washington and you are erased
1:03:22 from the internet.
1:03:23 Right?
1:03:25 Like, which, you know, good news, you’re still alive.
1:03:28 Bad news is, you know, shame about not being able to get a job, you know, too bad your
1:03:31 family now, you know, hates you and won’t talk to you, you know, whatever, whatever the,
1:03:34 you know, whatever the version of cancellation has been.
1:03:36 And so, so, so like, does that work?
1:03:40 Like, maybe it works for a while, like it worked for the Soviet Union for a while, you
1:03:43 know, in its way, especially when it was coupled with, you know, official state power, but when
1:03:48 it unwinds, it can only wind with like incredible speed and ferocity because to your point, there’s
1:03:49 all this bottled up energy.
1:03:52 Now, your question was like, what are the percentages?
1:03:53 Like what’s the breakdown?
1:03:58 And so my, my rough guess, just based on what I’ve seen in my world is it’s something
1:04:01 like 20, 60, 20.
1:04:05 It’s like you’ve got 20% like true believers in whatever is, you know, the current thing,
1:04:08 you know, 20% of people who are just like true believers of
1:04:12 whatever’s in the New York Times, Harvard professors and
1:04:16 the Ford Foundation. By the way, maybe it’s 10, maybe it’s five,
1:04:18 but let’s say generously it’s 20.
1:04:22 So it’s a, you know, 20% kind of full on revolutionaries.
1:04:26 And then you’ve got, let’s call it 20% on the other side that are like, no, I’m not
1:04:27 on board with this.
1:04:28 This is, this is crazy.
1:04:31 I’m not, I’m not signing up for this, but, you know, you know, they, their view of themselves
1:04:32 is they’re in a small minority.
1:04:35 And in fact, they start out in a small minority because what happens is the 60% go with the
1:04:38 first 20%, not the second 20%.
1:04:41 So you’ve got this large middle of people and it’s not that there’s anything like, it’s
1:04:44 not that people in the middle are not smart or anything like that.
1:04:47 It’s that they just have like normal lives and they’re just trying to get by and they’re
1:04:51 just trying to go to work each day and do a good job and be a good person and raise their
1:04:55 kids and, you know, have a little bit of time to watch the game.
1:04:59 And they’re just not engaged in the cut and thrust of, you know, political activism or
1:05:01 any of this stuff is just not their thing.
1:05:05 But then, that’s where the oversocialization comes in, which is just like, okay, by default, the
1:05:11 60% will go along with the 20% of the radical revolutionaries at least for a while.
1:05:14 And then the counter elite is in this other 20%.
1:05:19 And over time, they build up a theory and network and ability to resist.
1:05:22 And a new set of representatives and a new set of ideas.
1:05:24 And then at some point, there’s a contest.
1:05:27 And then, right, the question is what happens in the middle,
1:05:30 what happens in the 60%. And this is kind of my point.
1:05:34 It’s not even really, does the 60% change their beliefs, as much as it’s like, okay, what
1:05:39 is the thing that that 60% now decides to basically fall into step with?
1:05:44 And I think that 60% in the Valley, for the last decade, decided to be woke.
1:05:49 And you know, extremely, I would say on edge on a lot of things.
1:05:52 And I, you know, that 60% is pivoting in real time.
1:05:53 They’re just done.
1:05:54 They’ve just had it.
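The 20/60/20 dynamic can be sketched the same way. The claim is that the middle never changes its beliefs, it only changes which committed 20% it falls into step with, and that shows up as two stable equilibria of one simple rule. Again a minimal illustrative model; the majority-following rule and every number in it are assumptions, not anything stated in the conversation.

```python
def settle(middle_with_A):
    """100 people: 20 committed to camp A, 20 committed to camp B, and 60
    conformists who side with A only while A's visible support is at
    least 50. Iterate to a fixed point; return A's final visible support."""
    a_visible = 20 + middle_with_A
    while True:
        middle = 60 if a_visible >= 50 else 0
        new_a = 20 + middle
        if new_a == a_visible:
            return a_visible
        a_visible = new_a

print(settle(60))  # 80: the middle starts in step with A and stays there
print(settle(0))   # 20: identical population, but the middle stays with B
```

Both answers are fixed points of the same rule; which one you land on depends only on where the visible consensus starts, which is the vibe shift in miniature.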
1:05:59 And I would love to see where that pivot goes because there’s internal battles happening
1:06:00 right now.
1:06:01 Right.
1:06:02 So this is the other thing.
1:06:03 Okay.
1:06:04 So there’s two forms of this.
1:06:07 And Timur has actually talked about this, Professor Kuran has talked about this.
1:06:10 And so one is, he said, this is the kind of unwind where what you’re going
1:06:11 to have is people falsifying in the other direction.
1:06:14 You’re going to have people who claim that they supported Trump all along who actually
1:06:15 didn’t.
1:06:16 Right.
1:06:17 Right.
1:06:19 So it’s going to swing the other way.
1:06:21 And by the way, Trump’s not the only part of this, but you know, he’s just a convenient
1:06:23 shorthand for, you know, for, for a lot of this.
1:06:26 But you know, whatever it is, you’ll, you’ll have people who will say, well, I never supported
1:06:30 the right or I never supported ESG or I never thought we should have canceled that person.
1:06:31 Right.
1:06:34 Whereas of course they were full-on part of the mob, like, you know, kind of at that
1:06:35 moment.
1:06:36 Right.
1:06:39 So you’ll have preference falsification happening in the other direction and his prediction,
1:06:43 I think basically is you’ll end up with the same quote problem on the, on the other side.
1:06:44 Now, will that happen here?
1:06:48 I don’t know, you know, how far is American society willing to go at any of these things?
1:06:49 I don’t know.
1:06:51 But like there is some, some question there.
1:06:55 And then, and then the other part of it is, okay, now you have this, you know, elite that
1:06:58 is used to being in power for the last decade.
1:07:01 And by the way, many of those people are still in power and they’re in very, you know, important
1:07:03 positions and the New York Times is still the New York Times and Harvard is still Harvard
1:07:07 and like those people haven’t changed like at all, right.
1:07:10 And then, you know, there’s the bureaucrats in the government and, you know, senior Democratic,
1:07:12 you know, politicians and so forth.
1:07:15 And they’re sitting there, you know, right now feeling like reality has just smacked them
1:07:18 hard in the face because they lost the election so badly.
1:07:22 But they’re now going into, and specifically the Democratic Party is going into, a civil
1:07:23 war.
1:07:24 Right.
1:07:27 And that form of the civil war is completely predictable.
1:07:30 And it’s exactly what’s happening, which is half of them are saying, we need to go back
1:07:31 to the center.
1:07:34 And we need to de-radicalize because we’ve lost the people.
1:07:35 We’ve lost the people in the middle.
1:07:39 And so we need to go back to the middle in order to be able to get 50% plus one in an
1:07:40 election.
1:07:41 Right.
1:07:43 And then the other half of them are saying, no, we weren’t true to our principles.
1:07:44 We were too weak.
1:07:45 We were too soft.
1:07:46 You know, we must become more revolutionary.
1:07:48 We must double down and we must, you know, celebrate, you know, murders of health insurance
1:07:50 executives in the street.
1:07:52 And that’s, and that right now is like a real fight.
1:07:57 If I can tell you a little personal story that breaks my heart a little bit, there’s a, there’s
1:08:02 a professor, a historian, I won’t say who, who I admire deeply, love his work.
1:08:05 He’s a kind of a heretical thinker.
1:08:12 And we were talking about having a podcast or doing a podcast and he eventually said
1:08:18 that, you know what, at this time, given your guest list, I just don’t want the headache
1:08:24 of being in the faculty meetings in my particular institution.
1:08:28 And I asked who are the particular figures in this guest list.
1:08:31 He said, Trump.
1:08:37 And the second one, he said, was that you announced your interest in talking to Vladimir Putin.
1:08:39 So I just don’t want the headache.
1:08:45 Now, I fully believe him. It would surprise a lot of people if I said who it is, but you
1:08:50 know, this is a person who’s not himself bothered by the guest list.
1:08:55 And I should also say that 80 plus percent of the guest list is left wing.
1:08:56 Okay.
1:08:59 Nevertheless, he just doesn’t want the headache.
1:09:04 And that speaks to the, the thing that you’ve kind of mentioned that you just don’t, don’t
1:09:05 want the headache.
1:09:10 You just want to just have a pleasant morning with some coffee and talk to your fellow professors.
1:09:14 And I think a lot of people are feeling that in universities and in other contexts in tech
1:09:16 companies.
1:09:20 And I wonder if that shifts how quickly that shifts.
1:09:26 And there, the percentages you mentioned, 20, 60, 20, matter, and the contents
1:09:30 of the private groups matter, and the dynamics of how that shifts matter.
1:09:32 Because it’s very possible
1:09:36 nothing really changes in universities and major tech companies. There’s a kind
1:09:45 of excitement right now for potential revolution, for these new ideas, these new vibes, to reverberate
1:09:51 through these companies and universities, but it’s possible the wall will hold.
1:09:52 Yeah.
1:09:53 So he’s a friend of yours.
1:09:55 I respect that you don’t want to name him.
1:09:56 I also respect you don’t want to beat on him.
1:09:59 So I would like to beat on him on your behalf.
1:10:00 Does he have tenure?
1:10:01 Yes.
1:10:04 He should use it.
1:10:07 So this is the thing, right?
1:10:10 This is the ultimate indictment of the corruption and the rot at the heart of our education
1:10:12 system at the heart of these universities.
1:10:14 And it’s by the way, it’s like across the board.
1:10:16 It’s like all the, all the top universities.
1:10:20 It’s like, because the siren song, for what, it’s been 70 years, whatever: the
1:10:25 peer review system, the tenure system, which is like, yeah, you work your butt
1:10:29 off as an academic to get a professorship and then to get tenure, because then you can
1:10:32 say what you actually think, right?
1:10:37 Then you can do your work and your research and your speaking and your teaching without
1:10:40 fear of being fired, right?
1:10:43 Without fear of being canceled, um, like academic freedom.
1:10:48 I mean, think of the term academic freedom and then think of what these people have done
1:10:49 to it.
1:10:52 Like it’s gone.
1:11:02 Like that entire thing was fake and is completely rotten, and these people are completely
1:11:06 giving up the entire moral foundation of the system that has been built for them, which by the
1:11:12 way is paid for virtually 100% by taxpayer money.
1:11:16 So what’s the inkling of hope in this? Like, for this particular person and
1:11:22 others who hear this, what can give them strength, inspiration, and courage?
1:11:25 Um, that the population at large is going to realize the corruption in their industry and it’s going to withdraw
1:11:26 the funding.
1:11:27 Oh, okay.
1:11:28 So, desperation.
1:11:30 No, no, no, no, no, think about what happens next.
1:11:31 Okay.
1:11:32 So let’s go, let’s go through it.
1:11:35 So the universities are funded by four primary sources
1:11:36 of federal funding.
1:11:39 The big one is the federal student loan program, which is, you know, in the many trillions of
1:11:43 dollars at this point and only spiraling, you know, way faster than inflation.
1:11:44 That’s number one.
1:11:48 Number two is federal research funding, which is also very large and you probably know that
1:11:53 when a scientist at the university gets a research grant, the university rakes as much
1:11:58 as 70% of the money for central uses.
1:12:01 Number three is tax exemption at the operating level, which is based in the idea that these
1:12:06 are nonprofit institutions as opposed to let’s say political institutions.
1:12:11 Number four is tax exemptions at the endowment level, you know, which is the financial buffer
1:12:15 that these places have.
1:12:18 Anybody who’s been close to a university budget will basically see what would happen
1:12:20 if you withdrew those sources of federal taxpayer money,
1:12:24 and then for the state schools, the state money: they’d all be legally bankrupt.
1:12:28 And then you could rebuild.
1:12:30 Then you could rebuild because the problem right now, you know, like the folks at University
1:12:32 of Austin are like mounting a very valiant effort.
1:12:34 And I hope that they succeed, and I’m cheering for them.
1:12:38 But the problem is, suppose you and I want to start a new university
1:12:41 and we want to hire all the free-thinking professors and we want to have the place that
1:12:42 fixes all this.
1:12:45 Practically speaking, we can’t do it, because we can’t get access to that money.
1:12:48 Here’s the most direct reason we can’t get access to that money.
1:12:50 We can’t get access to federal student funding.
1:12:54 Do you know how universities are accredited for the purpose of getting access to federal
1:12:57 student funding, federal student loans?
1:13:00 They’re accredited by the government, but not directly, indirectly.
1:13:02 They’re not accredited by the Department of Education.
1:13:07 Instead what happens is the Department of Education accredits accreditation bureaus
1:13:09 that are non-profits that do the accreditation.
1:13:12 Guess what the composition of the accreditation bureaus is?
1:13:16 The existing universities, they’re in complete control.
1:13:20 The incumbents are in complete control as to who gets access to federal
1:13:21 student loan money.
1:13:26 Guess how enthusiastic they are about accrediting a new university, right?
1:13:32 And so we have a government-funded and supported cartel that has gone, I mean, it’s just obvious
1:13:36 now, it’s just gone sideways in basically any possible way it could go sideways, including,
1:13:40 I mean, literally, as you know, students getting beaten up on campus for being the wrong religion.
1:13:43 They’re just wrong in every possible way at this point.
1:13:45 And it’s all on the federal taxpayer’s back.
1:13:50 And there is no way, I mean, my opinion, there is no way to fix these things without replacing
1:13:51 them.
1:13:54 And there’s no way to replace them without letting them fail.
1:13:56 And by the way, it’s like everything else in life.
1:13:59 I mean, in a sense, this is like the most obvious conclusion of all time, which is what
1:14:04 happens in the business world when a company does a bad job is they go bankrupt and another
1:14:05 company takes its place, right?
1:14:07 And that’s how you get progress.
1:14:11 And of course, below that is what happens is this is the process of evolution, right?
1:14:12 Why does anything ever get better?
1:14:16 Because things are tested and tried and then you know, the things that are good survive.
1:14:18 And so these places have cut themselves off.
1:14:21 They’ve been allowed to cut themselves off both from evolution at the institutional
1:14:28 level and evolution at the individual level, as shown by the just widespread abuse of tenure.
1:14:33 And so we’ve just stalled out, we built an ossified system, an ossified centralized corrupt
1:14:34 system.
1:14:36 We’re surprised by the results.
1:14:38 They are not fixable in their current form.
1:14:40 I disagree with you on that.
1:14:44 Maybe it’s grounded in hope that I believe you can revolutionize the system from within
1:14:48 because I do believe Stanford and MIT are important.
1:14:51 Oh, but that logic doesn’t follow at all.
1:14:53 That’s underpants gnome logic.
1:14:55 Underpants gnome, can you explain what that means?
1:14:56 Underpants gnomes logic.
1:14:59 I just started watching a key touchstone of American culture with my nine-year-old, which
1:15:00 of course is South Park.
1:15:01 Yes.
1:15:02 Wow.
1:15:05 Which, by the way, is a little aggressive for a nine-year-old.
1:15:06 Very aggressive.
1:15:07 But he likes it.
1:15:10 So he’s learning all kinds of new words.
1:15:11 All kinds of new ideas.
1:15:12 But yeah.
1:15:14 I told him, I said, “You’re going to hear words on here that you are not allowed to
1:15:15 use.”
1:15:16 Right.
1:15:17 Education.
1:15:22 And I said, “Do you know how we have an agreement that we never lie to mommy?”
1:15:27 I said, “Not using a word that you learn in here does not count as lying.”
1:15:28 Wow.
1:15:29 And keep that in mind.
1:15:32 Orwellian redefinition of lying, but yes, go ahead.
1:15:35 Of course, in the very opening episode, in the first 30 seconds, one of the kids calls
1:15:36 the other kid a dildo.
1:15:37 Right?
1:15:38 We’re off to the races.
1:15:39 Yep.
1:15:40 Let’s go.
1:15:41 Daddy, what’s a dildo?
1:15:42 Yep.
1:15:48 You know, I’m sorry, I don’t know.
1:15:56 So, famous episode of South Park, the underpants gnomes, and so there’s all the kids basically
1:15:59 realize that their underpants are going missing from their dresser drawers.
1:16:02 Somebody’s stealing the underpants, and it’s just like, “Well, who on earth would steal
1:16:03 the underpants?”
1:16:05 And it turns out it’s the underpants gnomes.
1:16:07 And it turns out the underpants gnomes have come to town, and they’ve got this little
1:16:10 underground warren of tunnels and storage places for all the underpants.
1:16:14 And so they go out at night, they steal the underpants, and the kids discover the underpants
1:16:16 gnomes, and they’re, “What are you doing?
1:16:17 What’s the point of this?”
1:16:21 And so the underpants gnomes present their master plan, which is a three-part plan, which
1:16:24 is step one, collect underpants.
1:16:26 Step three, profit.
1:16:30 Step two, question mark.
1:16:34 So you just proposed the underpants gnomes, which is very common in politics.
1:16:37 So the form of this in politics is, we must do something.
1:16:41 This is something, therefore we must do this.
1:16:45 But there’s no causal logic chain in there at all to expect that that’s actually going
1:16:48 to succeed, because there’s no reason to believe that it is.
1:16:49 It’s the same thing.
1:16:50 But this is what I hear all the time.
1:16:56 I will let you talk as the host of the show in a moment, but I hear this all the time.
1:17:00 I have friends who are on these boards, very involved with these places, and I hear this
1:17:02 all the time, which is like, “Oh, these are very important.
1:17:07 We must fix them, and so therefore they are fixable.”
1:17:09 There’s no logic chain there at all.
1:17:14 If there’s that pressure that you described in terms of cutting funding, then you have
1:17:22 the leverage to fire a lot of the administration and have new leadership that steps up, that
1:17:27 aligns with this vision that things really need to change at the heads of the universities,
1:17:33 and they put students and faculty first, fire a lot of the administration, and realign
1:17:40 and reinvigorate this idea of freedom of thought and intellectual freedom.
1:17:45 Because there is already a framework of great institutions that’s there, and the way they
1:17:50 talk about what it means to be a great institution is aligned with this very idea that you’re
1:17:51 talking about.
1:17:56 It has this meaning, like intellectual freedom, the idea of tenure, right?
1:18:00 On the surface, it’s aligned; underneath, it’s become corrupted.
1:18:03 If we say free speech and academic freedom often enough, sooner or later these tenured
1:18:04 professors will get brave.
1:18:07 Well, do you think the universities are fundamentally broken?
1:18:09 Okay, so how do you fix it?
1:18:19 How do you have institutions for educating 20-year-olds and institutions that host researchers
1:18:24 that have the freedom to do epic shit, like research-type shit that’s outside the scope
1:18:27 of R&D departments inside companies?
1:18:29 So how do you create an institution like that?
1:18:31 How do you create a good restaurant when the one down the street sucks?
1:18:34 All right, you invent something new?
1:18:36 You open a new restaurant?
1:18:37 Yeah.
1:18:38 Okay.
1:18:41 How often in your life have you experienced a restaurant that’s just absolutely horrible
1:18:43 and it’s poisoning all of its customers and the food tastes terrible?
1:18:46 And then three years later, you go back and it’s fantastic.
1:18:49 Charlie Munger, the great investor, actually had the
1:18:50 best comment on this.
1:18:52 He was once asked, you know, General Electric was going through
1:18:55 all these challenges, and he was asked at a Q&A, “How would you fix the culture
1:18:56 of General Electric?”
1:18:58 And he said, “Fix the culture of General Electric?
1:19:02 I couldn’t even fix the culture at a restaurant.”
1:19:03 Like it’s insane.
1:19:04 Like obviously you can’t do it.
1:19:07 I mean, nobody in business thinks you can do that.
1:19:09 Like, it’s impossible.
1:19:13 No, no, look, having said all that, I should also express this,
1:19:17 because I have a lot of friends who work at these places and are involved in various attempts
1:19:18 to fix them.
1:19:19 I hope that I’m wrong.
1:19:20 I would love to be wrong.
1:19:23 I would love for the underpants gnome step two to be something
1:19:26 clear and straightforward that they can figure out how to do.
1:19:27 I would love to, love to fix it.
1:19:29 I’d love to see them come back to their spoken principles.
1:19:30 I think that’d be great.
1:19:33 I’d love to see the professors with tenure get bravery.
1:19:34 I would love to see.
1:19:38 I mean, it’d be fantastic, you know, my partner and I’ve done like a lot of public speaking
1:19:39 on this topic.
1:19:42 It’s, it’s been intended to not just be harsh, but also be like, okay, like these, these
1:19:44 challenges have to be confronted directly.
1:19:48 By the way, let me also say something positive, you know, especially post October 7th, there
1:19:52 are a bunch of very smart people who are major donors and board members of these institutions
1:19:56 like Marc Rowan, you know, who are really coming in, you know, I think legitimately
1:19:57 trying to fix these places.
1:20:00 I have a friend on the executive committee at one of the top technical universities.
1:20:02 He’s working over time to try to do this.
1:20:05 Man, I hope they can figure it out.
1:20:08 But I, but the counter question would just be like, do you see it actually happening
1:20:10 at a single one of these places?
1:20:13 I’m a person that believes in leadership.
1:20:18 If you have the right leadership, the whole system can be changed.
1:20:21 So here’s a question for your friends who have tenure at one of these places, which is who
1:20:23 runs the university?
1:20:28 I think, you know, you know, I think runs it whoever the fuck says they run it.
1:20:29 That’s what great leadership is.
1:20:31 Like a president has that power.
1:20:36 But how does he have the leverage? Because they can mouth off, like Elon. Can he fire the professors?
1:20:39 They can fire them through being vocal publicly.
1:20:40 Yes.
1:20:41 Fire the professors.
1:20:42 What do you mean, legally?
1:20:44 No, they cannot fire the professors.
1:20:45 Then we know who runs the university.
1:20:46 The professors.
1:20:47 Yeah.
1:20:49 Professors, the professors and the students, the professors and the feral students.
1:20:53 And they’re of course in a radicalization feedback cycle, driving each other crazy.
1:20:54 The feral students.
1:20:55 Yeah, the feral students.
1:20:56 Yeah, the feral students.
1:20:59 What happens when you’re put in charge of a bureaucracy, where the thing
1:21:02 that the bureaucracy knows is that they can outlast you?
1:21:05 The thing that the tenured professors at all these places know is it doesn’t matter who
1:21:09 the president is because they can outlast them because they cannot get fired.
1:21:12 By the way, it’s the same thing that bureaucrats in the government know.
1:21:14 It’s the same thing that the bureaucrats in the Department of Education know.
1:21:16 They know the exact same thing.
1:21:17 They can outlast you.
1:21:20 It’s, I mean, it’s the whole thing that the resistance, like they can be the resistance.
1:21:23 They can just sit there and resist, which is what they do.
1:21:24 They’re not fireable.
1:21:26 That’s definitely a crisis that needs to be solved.
1:21:27 It’s a huge problem.
1:21:30 And I also don’t like that I’m defending academia here.
1:21:37 I agree with you that the situation is dire, but I just think that institutions
1:21:38 are important.
1:21:41 And I should also add context, since you’ve been grilling me a little bit.
1:21:45 You were using restaurants as an analogy and earlier offline in this conversation, you
1:21:47 said the Dairy Queen is a great restaurant.
1:21:51 So let’s, let’s let the listener take that.
1:21:52 Dairy Queen is the best restaurant.
1:21:53 The best restaurant.
1:21:54 There you go.
1:21:57 I think what Marc Andreessen is saying today, and I don’t want it to be cut, is:
1:21:58 you should go order a Blizzard.
1:22:00 Just one day you should walk down there and order a blizzard.
1:22:01 Yeah.
1:22:03 They can get like 4,000 calories in a cup.
1:22:04 They can.
1:22:05 And they’re delicious.
1:22:06 Amazing.
1:22:07 They are truly delicious.
1:22:08 And they’ll put, they’ll put anything in there you want.
1:22:09 All right.
1:22:10 Okay.
1:22:12 So, but anyway, let me just close by saying, look, to my friends in the university system,
1:22:14 I would just say, look, like this is the challenge.
1:22:16 I would just pose this as the challenge.
1:22:19 Like to me, like this is having had a lot of these conversations.
1:22:20 Like this is the bar.
1:22:22 In my view, this is the conversation that actually has to happen.
1:22:24 This is the bar that actually has to be hit.
1:22:27 These problems need to be confronted directly because I think there’s just, I think there’s
1:22:28 been way too much.
1:22:31 I mean, I’m actually worried kind of on the other side, there’s too much happy talk in
1:22:32 these conversations.
1:22:35 I think the taxpayers do not understand this level of crisis.
1:22:39 And I think if the taxpayers come to understand it, I think the funding evaporates.
1:22:43 And so I think the fuse is going, through, you know, no fault of any of ours, but like
1:22:44 the fuse is going.
1:22:47 And there’s some window of time here to fix this and address it and justify the money.
1:22:53 Because just normal taxpayers, sitting in normal towns, in normal jobs, are not going
1:22:56 to tolerate this for that much longer.
1:23:00 You mentioned censorship a few times. Let us, if we can, go deeper into the darkness of
1:23:04 the past and how the censorship mechanism was used.
1:23:09 So you are a good person to speak about the history of this because you were there on
1:23:14 the ground floor in 2013 ish Facebook.
1:23:23 I heard that you were there when they invented or maybe developed the term hate speech in
1:23:28 the context of censorship on social media.
1:23:33 So take me to through that history, if you can, the use of censorship.
1:23:37 So I was there on the ground floor in 1993.
1:23:39 There’s multiple floors to this building apparently.
1:23:40 There are.
1:23:41 Yeah.
1:23:45 So I was first asked to implement censorship on the internet, which was in the web browser.
1:23:46 That is fast.
1:23:47 Yeah.
1:23:48 Yeah.
1:23:51 Actually, in 1992, I was asked to implement a nudity filter.
1:23:53 Did you have the courage to speak up back then?
1:23:56 I didn’t have any problem speaking up back then.
1:23:58 I was making six dollars and 25 cents an hour.
1:23:59 I did not have a lot to lose.
1:24:03 No, I was asked at the time, and look, it was, you know, in some sense
1:24:07 a legitimate request; I was working on a research project actually funded by the
1:24:09 federal government at a public university.
1:24:12 So you know, I don’t think my boss was like in any way out of line, but it was like, yeah,
1:24:15 like this web browser thing is great, but like, could it just make sure to not have
1:24:17 any photos of naked people show up?
1:24:21 But if you think about this for a second as a technologist, I had an issue, which is this
1:24:22 was like pre-ImageNet, right?
1:24:26 And so I had a brief period where I tried to imagine an algorithm that I referred to
1:24:32 as the breast detection algorithm that I was going to have to design.
1:24:36 And then apparently a variety of other apparently body parts people are also sensitive about.
1:24:41 And and then I politely declined to do this for just the technical difficulties.
1:24:43 Well, number one, I didn’t actually didn’t know how to do it, but number two is just
1:24:46 like, no, I’m not, I’m not building, I’m just not building a censorship engine.
1:24:48 Like I’m, you know, I’m just not doing it.
1:24:51 And in those days, it was, you know, in those days, the internet generally was, you know,
1:24:55 free fire zone for everything is actually interesting as sort of pre-93.
1:24:57 The internet was such a specific niche community.
1:25:02 Like it was like the million kind of highest IQ nerds in the world.
1:25:06 And so it actually like didn’t really have a lot of issues that people were like super
1:25:10 interested in talking about like astrophysics and not very interested in, you know, even
1:25:11 politics at that time.
1:25:16 So there really was not an issue there, but yeah, I didn’t want to start the process.
1:25:19 So I think the way to think about this, so first of all, you know, yeah, so I was involved
1:25:22 in this at Facebook every step, by the way, I’ve been involved this at Facebook every
1:25:24 step of the way I joined the board there in 2007.
1:25:28 So I saw, I’ve seen everything in the last, you know, almost 20 years every step of the
1:25:29 way.
1:25:31 But also I’ve been involved in most of the other companies over time.
1:25:33 So I was an angel investor in Twitter, I knew them really well.
1:25:38 We were the founding investor in Substack, I’m part of the Elon takeover of Twitter
1:25:40 with X, I was an angel at LinkedIn.
1:25:44 So I’ve been in these, we were the funder of Pinterest, we were one of the main investors
1:25:46 there, Reddit as well.
1:25:48 And I was having these conversations with all these guys all the way through.
1:25:52 So I can talk specifically about Facebook, but I can just tell you like the general pattern,
1:25:55 and for quite a while it was kind of all the same across these companies.
1:26:00 Yeah, so basically the way to think about this, the true kind of nuanced view of this
1:26:05 is that there is practically speaking no internet service that can have zero censorship.
1:26:09 And by the way, that also mirrors countries: there is no country that actually has unlimited free
1:26:11 speech either.
1:26:15 The US First Amendment actually has 12 or 13 formal carve-outs from the Supreme Court
1:26:21 over time, you know, so incitement to violence and terrorist recruitment and child abuse
1:26:23 and child pornography and so forth; they’re not covered by
1:26:25 the First Amendment.
1:26:28 And just practically speaking, if you and I are going to start an internet company and
1:26:32 have a service, we can’t have that stuff either, right, because it’s illegal or it will just
1:26:33 clearly, you know, destroy the whole thing.
1:26:36 So you’re always going to have a censorship engine.
1:26:39 I mean, hopefully it’s not actually in the browser, but like you’re going to have it
1:26:42 for sure at the level of an internet service.
1:26:45 But then what happens is now you have a machine, right?
1:26:50 Now you have a system where you can put in rules saying we allow this, we don’t allow
1:26:51 that.
1:26:54 You have enforcement, you have consequences, right?
1:26:59 And once that system is in place, it becomes the ring of power, right? Which is
1:27:03 like, okay, now anybody in that company, or anybody associated with the company, or anybody
1:27:06 who wants to pressure that company, will just start to say, okay, you should use that machine
1:27:11 for more than just terrorist recruitment and child pornography, you should use it for XYZ.
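To make the "machine" concrete: at its core, a moderation system is a rule list plus enforcement, and the scope creep described here is literally a one-line change, appending another rule. A minimal sketch in Python (the rule names, matchers, and consequences are illustrative assumptions, not any company’s actual policy engine):

```python
# Minimal sketch of the "machine": rules in, enforcement decisions out.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Rule:
    name: str
    matches: Callable[[str], bool]  # does this post violate the rule?
    consequence: str                # e.g. "remove", "label", "ban"

def moderate(post: str, rules: List[Rule]) -> List[str]:
    """Return the enforcement actions triggered by a post."""
    return [f"{r.name}: {r.consequence}" for r in rules if r.matches(post)]

# The original, narrow mandate: clearly illegal content only.
rules: List[Rule] = [
    Rule("csam", lambda p: "<csam-match>" in p, "remove+report"),
    Rule("terror_recruitment", lambda p: "<terror-match>" in p, "remove"),
]

# The "ring of power" problem: expanding scope is a one-line append, and
# nothing structural distinguishes this rule from the two above it.
rules.append(Rule("uncomfortable_speech", lambda p: "<policy-match>" in p, "remove"))
```

The design observation is that the pipeline (rules, enforcement, consequences) is content-neutral, so every fight is over what goes into the rule list, which is exactly the dynamic described here.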
1:27:17 And basically that transition happened in, call it, 2012 to 2013, when there was this
1:27:19 very, very kind of rapid pivot.
1:27:22 I think the kickoff to it, for some reason, was the beginning of the second
1:27:24 Obama term.
1:27:29 I think it also coincided with the sort of arrival of the first kind of super woke kids
1:27:34 out of the schools, you know, the kids that were in school
1:27:37 between, you know, the Iraq war and then the global financial crisis, and
1:27:40 they came out super radicalized, they came into these companies and they immediately
1:27:45 started mounting these social crusades to ban and censor lots of things.
1:27:48 And then, you know, quite frankly, the Democratic Party figured this out; they figured out
1:27:51 that these companies were very subject to being controlled. The
1:27:55 executive teams and boards of directors are almost all Democrats, and there’s
1:27:58 tremendous circulation: a lot of Obama people from the first term actually came and worked
1:28:02 in these companies, and a lot of FBI people and other law enforcement and intelligence
1:28:07 people came in and worked there, and they were almost all Democrats as well.
1:28:10 And so, you know, the ring of power was lying on the table.
1:28:15 It had been built, and they picked it up and put it on, and then they just ran with it.
1:28:18 And the original discussions were basically always on two topics.
1:28:21 It was hate speech and misinformation.
1:28:23 Hate speech was the original one.
1:28:26 And the hate speech conversation started exactly like you’d expect, which is we can’t have
1:28:29 the n-word, to which the answer is fair enough.
1:28:30 Let’s not have the n-word.
1:28:31 Okay.
1:28:34 Now we’ve set a precedent, right?
1:28:37 And then, Jordan Peterson has talked a lot about this, the definition of hate speech
1:28:41 ended up being things that make people uncomfortable, right?
1:28:43 So we can’t have things that make people uncomfortable.
1:28:46 Of course, people like me that are disagreeable raise their hands and say,
1:28:49 well, that idea right there makes me uncomfortable.
1:28:51 But of course, that doesn’t count as hate speech, right?
1:28:56 So, you know, the ring of power is on one hand and not on the other hand.
1:29:01 And then basically that began this slide where it ended up being, and this is the point
1:29:05 that Mark has been making recently, that completely anodyne comments that
1:29:08 are completely legitimate on television or on the Senate floor
1:29:10 all of a sudden are hate speech and can’t be said online.
1:29:14 So that, you know, the ring of power was wielded in grossly irresponsible ways.
1:29:16 We can talk about all the stuff that happened there.
1:29:17 And then the other one was misinformation.
1:29:20 And there was a little bit of that early on.
1:29:23 But of course, that really kicked in with Trump.
1:29:28 So the hate speech stuff predated Trump by like three or four years.
1:29:32 The misinformation stuff was basically, it was a little bit later, and it was the consequence
1:29:33 of the Russiagate hoax.
1:29:38 And then that was, you know, a ring of power that was even more powerful, right?
1:29:42 Because, you know, hate speech is like, okay, whether something is offensive
1:29:44 or not, at least you can have a question as to whether that’s the case.
1:29:48 But the problem with misinformation is like, is it the truth or not?
1:29:52 You know, what have we known for 800 years or whatever of Western civilization?
1:29:56 It’s that there are only a few entities that can determine the truth on every topic.
1:29:58 There’s God, there’s the king.
1:29:59 We don’t have those anymore.
1:30:02 And the rest of us are all imperfect and flawed.
1:30:05 And so the idea that any group of experts is going to sit around the table and decide
1:30:08 on the truth is, you know, deeply anti-Western and deeply authoritarian.
1:30:14 And somehow the misinformation kind of crusade went from the Russiagate hoax into just full-blown,
1:30:17 we’re going to use that weapon for whatever we want.
1:30:20 And then, of course, the culminating moment on that, the one that really was the straw that
1:30:25 broke the camel’s back was we’re going to censor all theories that the COVID virus might
1:30:28 have been manufactured in a lab as misinformation.
1:30:32 And inside these companies, that was the point, this
1:30:36 is like what, three years ago, where for the first time it sunk
1:30:39 in that, okay, this has spun completely out of control.
1:30:42 But anyway, that’s how we got to where we are.
1:30:47 And then basically that spell lasted, that complex existed and got expanded,
1:30:51 basically from, call it, 2013 to 2023.
1:30:54 I think basically two things broke it.
1:30:55 One is Substack.
1:31:00 And I’m super proud of those guys, because they started from scratch and declared
1:31:04 right up front that they were going to be a free speech platform.
1:31:09 And they came under intense pressure, including from the press, which tried to
1:31:12 just beat them to the ground and kill them, and intense pressure, by the way, from,
1:31:16 let’s say, certain of the platform companies basically threatening them.
1:31:17 And they stood up to it.
1:31:21 And, you know, sitting here today, they have the widest spectrum of speech and conversation
1:31:24 of, you know, anywhere on planet Earth, and they’ve done a great job and it’s worked.
1:31:25 By the way, it’s great.
1:31:30 And then obviously Elon, you know, with X was the hammer blow.
1:31:34 And then the third one now is what Mark is doing at Facebook.
1:31:39 And there’s also like singular moments, I think you’ve spoken about this, like
1:31:45 Jon Stewart going on Stephen Colbert and talking about the lab leak theory.
1:31:46 Yes.
1:31:50 I just, there’s certain moments that just kind of shake everybody up.
1:31:54 The right person, the right time; it’s just a wake-up call.
1:31:58 So there, and I will tell you, I should say Jon Stewart attacked me recently,
1:32:03 so I’m not that thrilled about him, but I would say I was a long-run fan of Jon Stewart.
1:32:08 I watched probably every episode of The Daily Show when he was on it, for probably 20 years.
1:32:11 But he did a very important public service and it was that appearance on the Colbert
1:32:12 show.
1:32:15 And I don’t know how broadly this registered; you know, at the time it was in the news briefly,
1:32:18 but I don’t know if people remember this. But I will tell you, in the rooms where
1:32:22 people discuss what is misinformation and these policies, that was a very big moment.
1:32:23 That was probably actually the key catalyzing moment.
1:32:28 And I think he exhibited, I would say conspicuous bravery and had a big impact with that.
1:32:31 And yeah, for people who don’t recall what he did: this was in the full-blown,
1:32:35 you absolutely must lock down for two years, you absolutely
1:32:38 must keep all the schools closed, you absolutely must have everybody work from home,
1:32:41 you absolutely must wear a mask era, like the whole thing.
1:32:46 And one of those was you absolutely must believe that COVID was completely natural.
1:32:51 You must believe that, and not believing that means you’re a fascist Nazi Trump supporter,
1:32:53 MAGA, evil QAnon person, right?
1:32:57 And that was like uniform and that was enforced by the social media companies.
1:33:01 And like I said, that was the peak, and Jon Stewart went on the Colbert show, and I don’t
1:33:04 know if they planned it or not, because Colbert looked shocked; I don’t know how much it was
1:33:09 a bit, but he went on there and he just had one of these, like, the emperor’s wearing no
1:33:13 clothes things, where he said, it’s just not plausible that you had the COVID super virus
1:33:20 appear 300 yards down the street from the Wuhan institute of lethal coronaviruses; it’s
1:33:23 just not plausible, certainly not plausible that you could just rule that out.
1:33:26 And then there was another key moment, actually; the more serious version was, I think, the author
1:33:30 Nicholson Baker wrote a big piece for New York magazine, and Nicholson Baker is like
1:33:34 one of our great novelists of our time, and he wrote the piece and he did a complete
1:33:35 treatment of it.
1:33:39 And I think that was the first legit one; there had been, you know, alt,
1:33:42 renegade people running around saying this, but getting
1:33:43 censored all over the place.
1:33:46 That was the first one that was in the mainstream press, where he talked to
1:33:49 all the heretics and he just laid the whole thing out.
1:33:52 And that was a moment, and I remember, let’s say, a board meeting at one of these companies
1:33:56 after that where basically, you know, everybody looked around the table and it was like, all
1:34:01 right, I guess we don’t need to censor that anymore.
1:34:03 And you know, and then of course, what immediately follows from that is, well, wait a minute,
1:34:06 why were we censoring that in the first place?
1:34:09 And then the downstream conversations, not that day, but the downstream conversations
1:34:14 were like, okay, if we all made such a giant, in retrospect,
1:34:17 collective mistake censoring that, then what does that say about the rest of our regime?
1:34:21 And I think that was the thread in the sweater that started to unravel it.
1:34:24 I should say it again, I do think that the Jon Stewart appearance and the statement he
1:34:26 made was a courageous act.
1:34:27 Yeah, I agree.
1:34:30 I think we need to have more of that in the world.
1:34:38 And like you said, Elon, everything he did with X is a series of courageous acts.
1:34:45 And I think what Zuck, what Mark Zuckerberg did on Rogan a few days ago is a courageous
1:34:46 act.
1:34:49 Can you just speak to that?
1:34:51 He has become, I think, an outstanding communicator, right?
1:34:54 And he’s, you know, somebody who came in for a lot of criticism earlier in his career
1:34:55 on that front.
1:35:00 And I think he’s one of these guys who can sit down and talk for three hours and make
1:35:01 complete sense.
1:35:05 And, you know, as you do with all of your episodes, like when somebody sits and talks
1:35:09 for three hours, you really get a sense of somebody, because it’s really hard to be
1:35:10 artificial for that long.
1:35:12 And, you know, he’s now done that repeatedly.
1:35:13 He’s really good at it.
1:35:16 And then look, again, certainly
1:35:20 after that appearance, I would put him up there now with, you know, kind of
1:35:23 Elon and Trump in the sense that the public and the private are now synchronized.
1:35:24 I guess I’d say that.
1:35:27 Like, he said on that show what he really believes.
1:35:28 He said all the same things that he says in private.
1:35:31 Like I don’t think there’s really any discrepancy anymore.
1:35:38 I would say he has always taken upon himself a level of obligation and responsibility to running
1:35:43 a company the size of Meta and to running services that are that large.
1:35:46 And I think, you know, his conception of what he’s doing, which I think is correct, is that he’s
1:35:48 running services that are bigger than any country, right?
1:35:52 He’s running, you know, over 3 billion people use those services.
1:35:55 And so, and then, you know, the company has, you know, many tens of thousands of employees
1:35:57 and many investors and it’s a public company.
1:36:01 And he thinks very deeply and seriously about his responsibilities.
1:36:05 And so, you know, he has not felt like he has had, let’s just say the complete flexibility
1:36:07 that Elon has had.
1:36:10 And, you know, people could argue that one way or the other, but
1:36:12 he’s talked about this a lot.
1:36:14 He’s evolved a lot.
1:36:15 A lot of it was he learned a lot.
1:36:17 And by the way, I’ll put myself right in there too.
1:36:20 Like I’m not claiming any huge foresight or heroism on any of this.
1:36:22 Like I’ve also learned a lot.
1:36:26 Like, like my views on things are very different than they were 10 years ago on lots of topics.
1:36:29 And so, you know, I’ve been on a learning journey.
1:36:31 He’s been on a learning journey.
1:36:33 He is a really, really good learner.
1:36:39 He assimilates information, you know, as well as or better than anybody else I know.
1:36:42 The other thing I guess I would just say is he talked on that show about something very
1:36:46 important, which is when you’re in a role where you’re running a company like that, there
1:36:50 are a set of decisions that you get to make and you deserve to be criticized for those
1:36:53 decisions and so forth and it’s valid.
1:36:57 But you are under tremendous external pressure as well.
1:36:59 And by the way, you’re under tremendous internal pressure.
1:37:01 You’ve got your employees coming at you.
1:37:03 You’ve got your executives in some cases coming at you.
1:37:06 You’ve got your board in some cases coming at you.
1:37:08 You’ve got your shareholders coming at you.
1:37:11 So you’ve got your internal pressures, but you also have the press coming at you.
1:37:13 You’ve got academia coming at you.
1:37:17 You’ve got the entire non-profit complex coming, activist complex coming at you.
1:37:21 And then really critically, you know, he talked about this on Rogan, and these companies all went
1:37:27 through this in these last, especially, five years: you had the government coming at you.
1:37:31 And you know, that’s the really stinky end of the pool, where the
1:37:35 government was, in my view, illegally exerting pressure, in flagrant violation
1:37:40 of the First Amendment and federal laws on speech and coercion and conspiracy, forcing
1:37:44 these companies to engage in activities that, again, in some cases they may
1:37:46 have wanted to do, but in other cases they clearly didn’t want to do and felt like they
1:37:48 had to do.
1:37:54 And the level of pressure, like I said, I’ve known every CEO of Twitter; they’ve
1:37:58 all had the exact same experience, which when they were in the job, it was just daily beatings.
1:38:02 Like it’s just getting punched in the face every single day, constantly.
1:38:10 And you know, Mark is very good at getting physically punched in the face and he’s very
1:38:13 good at, you know, taking a punch and he has taken many, many punches.
1:38:17 So I would encourage people to have a level of sympathy: these are not kings.
1:38:20 These are people who operate with like, I would say, extraordinary levels of external
1:38:21 pressure.
1:38:26 I think if I had been in his job for the last decade, I would be a little puddle on the floor.
1:38:30 And so it says a lot about him, I think, that he has, you know, risen to this occasion the
1:38:31 way that he has.
1:38:33 And by the way, I should also say, you know, the cynicism, of course, is immediately out there.
1:38:37 And, you know, it’s a legitimate thing for people to raise, but it’s like, oh,
1:38:39 you’re only doing this because of Trump or, you know, whatever.
1:38:43 And it’s just like, no, like he has been thinking about and working on these things and trying
1:38:45 to figure them out for a very long time.
1:38:50 And so I think what you saw are legitimate, deeply held beliefs, not some, you know, sort
1:38:52 of just in the moment thing that could change at any time.
1:38:59 So what do you think it’s like to be him, and other leaders of companies, and to be you, withstanding
1:39:01 internal pressure and external pressure?
1:39:02 What’s that life like?
1:39:04 Is it deeply lonely?
1:39:05 That’s a great question.
1:39:07 Leaders are lonely to start with.
1:39:10 And this is one of those things where almost nobody has sympathy, right?
1:39:11 Nobody feels sorry for a CEO, right?
1:39:13 Like, it’s not a thing, right?
1:39:17 And, you know, and again, legitimately so, like CEOs get paid a lot, like the whole thing.
1:39:18 There’s a lot of great things about it.
1:39:21 So it’s not like they should be out there asking for a lot of sympathy, but it is the
1:39:23 case that they are human beings.
1:39:24 And it is the case that it is a lonely job.
1:39:30 And the reason it’s a lonely job is because your words carry tremendous weight.
1:39:33 And you are dealing with extremely complicated issues and you’re under a tremendous amount
1:39:36 of emotional, you know, personal emotional stress.
1:39:40 And, you know, you often end up not being able to sleep well and you end up not being
1:39:43 able to, like, keep up an exercise routine and all those things and, you know, you come
1:39:45 under family stress because you’re working all the time.
1:39:48 My partner Ben, you know, was CEO of our last company before we started
1:39:49 the venture firm.
1:39:52 He said, you know, the problem he had with his family life was that even when
1:39:57 he was home at night, he wasn’t home because he was in his head trying to solve all the
1:39:58 business problems.
1:40:00 And so he was like supposed to be like having dinner with his kids and he was physically
1:40:01 there, but he wasn’t mentally there.
1:40:05 So, you know, you get that a lot, but the key thing is you can’t
1:40:06 talk to people, right?
1:40:08 I mean, you can talk to your spouse and your kids, but they’re
1:40:11 not working in your company; they don’t have the context
1:40:13 to really help you.
1:40:16 If you talk to your executives, they all have agendas, right?
1:40:20 And they can’t resist them; it’s just human nature.
1:40:23 And so you can’t necessarily rely on what they say.
1:40:28 It’s very hard in most companies to talk to your board because they can fire you.
1:40:29 Right.
1:40:32 Now, Mark has a different situation: because he has control, it actually turns out he can talk
1:40:35 to his board, and Mark talks to us about many things that most CEOs won’t
1:40:39 talk to their boards about, literally because we can’t fire him.
1:40:42 But in the general case, including all the CEOs of Twitter, none of them had control,
1:40:44 and so they could all get fired.
1:40:47 So you can’t talk to the board members, they’re going to fire you.
1:40:51 You can’t talk to the shareholders because they’ll just like dump your stock, right?
1:40:54 Like, okay, so who can they talk to? Basically,
1:40:58 the best-case scenario they have is they can talk to other CEOs, and there are these little
1:41:00 organizations where they kind of pair up and do that.
1:41:03 And so they maybe get a little bit out of that, but even that’s fraught with peril,
1:41:08 because can you really talk about confidential information with another CEO? There’s insider trading
1:41:09 risk.
1:41:13 And so it’s just a very lonely and isolating thing to start with.
1:41:16 And then on top of that, you apply pressure, right?
1:41:17 And that’s where it gets painful.
1:41:22 And then maybe I’ll just spend a moment on this internal, external pressure thing.
1:41:28 My general experience with companies is that they can withstand most forms of external
1:41:32 pressure as long as they retain internal coherence, right?
1:41:39 So as long as the internal team is really bonded together and supporting each other,
1:41:41 most forms of external pressure you can withstand.
1:41:46 And by that, I mean investor stuff, your stock drops, you lose your biggest customers,
1:41:51 whatever negative article, negative headline; you can withstand
1:41:52 all that.
1:41:54 In fact, many of those forms of pressure can be bonding experiences for
1:41:57 the team, where they come out stronger.
1:42:01 What you 100% cannot withstand is the internal crack.
1:42:05 And what I always look for in high pressure corporate situations now is the moment when
1:42:07 the internal team cracks.
1:42:13 Because I know the minute that happens, we’re in a different regime; it’s like the
1:42:16 solid just turned into liquid, we’re in a different regime and the whole
1:42:17 thing can unravel in the next week.
1:42:20 Because then people turn on each other. I mean, this is what’s happening in Los Angeles right
1:42:21 now.
1:42:26 The mayor and the fire chief turned on each other and that’s it.
1:42:27 That government is dysfunctional.
1:42:29 It is never going to get put back together again.
1:42:30 It is over.
1:42:32 It is not going to work ever again.
1:42:34 And that’s what happens inside companies.
1:42:40 And so somebody like Mark is under profound internal pressure and external
1:42:41 pressure at the same time.
1:42:45 Now he’s been very good at maintaining the coherence of his executive team, but he has
1:42:50 had over the years a lot of activist employees as a lot of these companies have had.
1:42:52 And so that’s been continuous pressure.
1:42:55 And then the final thing I’d say is, I said that companies can withstand most forms of
1:43:00 external pressure, but not all, and the special one is government pressure.
1:43:05 When your government comes for you, yeah, any CEO who thinks that they’re bigger
1:43:09 than the government has that notion beaten out of them in short order.
1:43:16 Can you just linger on that? Because it is maybe educational and deeply disturbing.
1:43:21 You’ve spoken about it before, but we’re speaking about it again, this government pressure.
1:43:27 So you think they’ve crossed the line into essentially criminal levels of pressure?
1:43:32 Flagrant criminality, felonies, like obvious felonies, and I can actually cite
1:43:33 the laws.
1:43:36 But yes, absolute criminality.
1:43:43 Can you explain how that was possible to happen, and maybe, on a hopeful note, how we can avoid
1:43:44 it happening again?
1:43:49 So to start with, a lot of this now is in the public record, which is good, because
1:43:50 it needs to be in the public record.
1:43:52 And there are three forms of things in the public record that people
1:43:53 can look at.
1:43:57 So one is the Twitter files, right, which Elon put out with a set of journalists when
1:43:58 he took over.
1:44:01 And I will just tell you, the Twitter files are 100% representative of what I’ve seen
1:44:03 at every other one of these companies.
1:44:05 And so you can just see what happened in Twitter.
1:44:08 And you can just assume that that happened in these other companies, you know, for the
1:44:11 most part, certainly in terms of the kind of pressure that they got.
1:44:15 So that’s number one; that stuff you can just read, and you should if you haven’t.
1:44:19 The second is Mark referenced this in the Rogan podcast.
1:44:22 There’s a congressman, Jim Jordan, who has a congressional committee called
1:44:23 the Weaponization Committee.
1:44:27 And they in the last, you know, whatever three years have done a full scale investigation
1:44:28 of this.
1:44:31 And Facebook produced a lot of documents into that investigation.
1:44:35 And many of those have now been made public, and you can download those reports.
1:44:38 And there’s like 2,000 pages’ worth of material on that.
1:44:41 And that’s essentially the Facebook version of the Twitter files just arrived at with
1:44:43 a different mechanism.
1:44:45 And then third is Mark himself talking about this on Rogan.
1:44:47 So, you know, I’d just defer to his comments there.
1:44:53 But yeah, basically what those three forms of information show you is basically the government,
1:44:58 you know, over time, and then culminating in 2020, 2021, you know, in the last four years
1:45:01 just decided that the First Amendment didn’t apply to them.
1:45:06 And they just decided that federal laws around free speech and around conspiracies to take
1:45:10 away the rights of citizens just don’t apply.
1:45:14 And they just decided that they can just arbitrarily pressure, literally arbitrarily
1:45:19 call up companies and threaten and bully and yell and scream and, you know, threaten repercussions
1:45:22 and force them to censor.
1:45:25 And you know, there’s this old thing of like, well, the First Amendment only applies to
1:45:27 the government, it doesn’t apply to companies.
1:45:30 It’s like, well, there’s actually a little bit of nuance to that.
1:45:34 First of all, it definitely applies to the government like 100%.
1:45:36 The First Amendment applies to the government.
1:45:39 By the way, so does the Fourth Amendment and the Fifth Amendment, including the right to
1:45:41 due process also applies to the government.
1:45:45 There was no due process at all to any of the censorship regime that was put in place.
1:45:48 There was no due process put in place, by the way, for debanking either.
1:45:52 Those are just as serious violations as the free speech violations.
1:45:55 So this is just like flagrant, flagrant unconstitutional behavior.
1:45:57 And then there are specific federal statutes.
1:46:00 It’s 18 U.S.C. 241 and 18 U.S.C. 242.
1:46:04 And one of them applies to federal employees, government employees, and the other one applies
1:46:10 to private actors around what’s called deprivation of rights and conspiracy to deprive rights.
1:46:14 And it is not legal, according to the United States Criminal Code, for government employees
1:46:19 or in a conspiracy private entities to take away constitutional rights.
1:46:23 And interestingly, some of those constitutional rights are enumerated, for example, in the
1:46:24 First Amendment, freedom of speech.
1:46:28 And then some of those rights actually do not need to be enumerated.
1:46:32 If the government takes away rights that you have, they don’t need to be specifically
1:46:36 enumerated rights in the Constitution for it to still be a felony.
1:46:40 The Constitution very specifically does not say you only have the rights that
1:46:41 it gives you.
1:46:44 It says you have all the rights that have not been specifically defined as being taken
1:46:45 away from you.
1:46:46 Right.
1:46:49 And so debanking qualifies: the right to access the financial system
1:46:53 is every bit as subject to these laws as free speech.
1:46:54 And so yeah, this has happened.
1:46:57 And then I’ll just add one final thing, which is we’ve talked about two parties so far:
1:47:01 the government employees, and then the companies.
1:47:04 The government employees, for sure, have misbehaved.
1:47:07 The companies, there’s a very interesting question there as to whether they are victims
1:47:12 or perpetrators or both. You know, they will argue, and I believe they
1:47:15 have a good case, that they are victims, not perpetrators, right?
1:47:19 They are the downstream subjects of pressure, not the cause of pressure.
1:47:23 But there’s a big swath of people who are in the middle and specifically the ones that
1:47:26 are funded by the government that I think are in possibly pretty big trouble.
1:47:29 And that’s all of these third party censorship bureaus.
1:47:35 I mean, the one that sort of is most obvious is the so-called Stanford Internet Observatory
1:47:37 that got booted up there over the last several years.
1:47:43 And they basically were funded by the federal government to be third party censorship operations.
1:47:47 And they’re private sector actors, but acting with federal funding.
1:47:52 And so it puts them in this very interesting spot where there’s a very obvious theory
1:47:55 under which they’re basically acting as agents of the government.
1:47:59 And so I think they’re also very exposed on this and have behaved in just flagrantly illegal
1:48:00 ways.
1:48:06 Obviously government should not do any kind of pressure, even soft pressure on companies
1:48:07 to censor.
1:48:08 Can’t.
1:48:09 Not allowed.
1:48:11 It really is disturbing.
1:48:20 I mean, it probably started soft, lightly, slowly, and then it escalates, as the old will
1:48:27 to power instructs them to. I mean, that’s why there’s
1:48:31 protection, because otherwise you can’t put a check on the power of government, right?
1:48:34 There are so many ways that they can get you; there are so many ways they can come
1:48:35 at you and get you.
1:48:39 And, you know, the thing here to think about is, a lot of times when we think about government
1:48:40 action,
1:48:41 we think about legislation, right?
1:48:45 So when I was a kid, we got trained in how does government work?
1:48:49 There was this famous animated short; the thing we got shown was just a cartoon of how a bill
1:48:50 becomes a law.
1:48:52 It’s the little bill singing, you know, I’m just
1:48:53 a bill.
1:48:54 Yeah.
1:48:55 Exactly.
1:48:56 Like it’s like, all right.
1:48:57 If it works at all.
1:48:58 Like that doesn’t actually happen.
1:48:59 We could talk about that.
1:49:03 But even beyond that, mostly what we’re dealing with is not legislation.
1:49:06 When we talk about government power these days, mostly it’s not legislation.
1:49:10 Mostly it’s either regulation, which is basically the equivalent of legislation, but having not
1:49:14 gone through the legislative process, which is a very big open legal issue and one of
1:49:16 the things that DOGE is very focused on.
1:49:20 Most government rules are not legislated, they’re regulated, and there are tons and tons
1:49:24 of regulations that these companies are subject to. So this is another cliche you’ll hear a lot, which
1:49:25 is, oh, private companies can do whatever they want.
1:49:27 It’s like, oh, no, they can’t.
1:49:32 They’re subject to tens of thousands of regulations that they have to comply with, and the hammer
1:49:35 that comes down when you don’t comply with regulations is profound, like they can completely
1:49:38 wreck your company with no ability for you to do anything about it.
1:49:41 So regulation is a big part of the way the power gets exercised.
1:49:45 And then there’s what’s called just flat out administrative power, the term that you’ll
1:49:46 hear.
1:49:48 And administrative power is just literally the government telling you, calling you and
1:49:49 telling you what to do.
1:49:50 Here’s an example of how this works.
1:49:55 So Facebook had this whole program a few years back to do a global cryptocurrency for payments
1:49:56 called Libra.
1:49:59 And they built the entire system, and it was this high-scale sort of new cryptocurrency,
1:50:01 and they were going to build it into every product, and there were going to be 3 billion people
1:50:05 who could transact with Libra, and they went to the government, they went to all these
1:50:06 different agencies, to try to figure out how to make it work.
1:50:09 So it was fully compliant with anti-money laundering and all these controls and everything,
1:50:11 and they had the whole thing ready to go.
1:50:16 Two senators wrote letters to the big banks saying, we’re not telling you that you can’t
1:50:21 work with Facebook on this, but if you do, you should know that every aspect of your business
1:50:26 is going to come under greatly increased level of regulatory scrutiny.
1:50:29 Which is, of course, the exact equivalent of, it sure is a nice corner restaurant you
1:50:33 have here, it would be a shame if somebody tossed a Molotov cocktail through the window
1:50:34 and burned it down tonight.
1:50:37 And so what is that letter?
1:50:42 It’s not a law, it’s not even a regulation, it’s just like straight direct state power.
1:50:47 And then it culminates in literally calls from the White House where they’re just flat
1:50:50 out telling you what to do, which is, of course, what a king gets to do, but not what
1:50:52 a president gets to do.
1:50:57 And so anyway, so what these companies experienced was, they experienced the full panoply of
1:51:00 this, but the level of intensity was in that order.
1:51:03 Actually, legislation was the least important part.
1:51:06 Regulation was more important, administrative power was more important, and then just flat
1:51:10 out demands and flat out threats were ultimately the most important.
1:51:11 How do you fix it?
1:51:15 Well, first of all, you have to elect people who don’t do it.
1:51:19 So as with all these things, ultimately, the fault lies with the voters.
1:51:21 And so you have to decide you don’t want to live in that regime.
1:51:24 I have no idea what part of this recent election mapped to the censorship regime.
1:51:28 I do know a lot of people on the right got very angry about the censorship, but I think
1:51:32 it probably at least helped with enthusiasm on that side.
1:51:37 Maybe some people on the left will now not want their Democratic nominees to be so pro-censorship.
1:51:40 So the voters definitely get a vote.
1:51:45 Number one, number two, I think you need transparency, you need to know what happened.
1:51:46 We know some of what happened.
1:51:50 Peter Thiel has written in the FT just now saying, after what we’ve
1:51:55 been through in the last decade, we need broad-based truth and reconciliation efforts to really
1:51:57 get to the root of things.
1:51:59 So maybe that’s part of it.
1:52:02 We need investigations for sure.
1:52:03 Ultimately we need prosecutions.
1:52:06 Ultimately, we need people to go to jail, because we need to set object lessons
1:52:09 that say you don’t get to do this.
1:52:13 And on those last two, I would say that those are both up to the new administration and I
1:52:15 don’t want to speak for them and I don’t want to predict what they’re going to do.
1:52:19 But they have, they for sure have the ability to do both of those things and we’ll see
1:52:20 where they take it.
1:52:21 Yeah, it’s truly disturbing.
1:52:26 I don’t think anybody wants this kind of overreach of power by government, including perhaps
1:52:28 people that are participating in it.
1:52:35 It’s like this dark momentum of power, you just get caught up in it, and that’s the
1:52:36 reason there’s that kind of protection.
1:52:38 Nobody wants that.
1:52:41 So I use the metaphor of the ring of power, and for people who don’t catch the reference,
1:52:44 it’s Lord of the Rings, and the thing with the ring of power in Lord of the Rings is it’s the
1:52:48 ring Gollum has in the beginning, and it turns you invisible, and it turns out it
1:52:52 unlocks all this fearsome power; it’s the most powerful thing in the world, the key to
1:52:53 everything.
1:52:56 And basically the moral lesson of Lord of the Rings, which was written by a guy who thought
1:53:00 very deeply about these things is, yeah, the ring of power is inherently corrupting.
1:53:03 The characters at one point, they’re like, Gandalf, just put on the ring and fix
1:53:04 this.
1:53:05 Right.
1:53:10 And he will not put the ring on, even to end the war, because he knows
1:53:11 that it will corrupt him.
1:53:17 And the character of Gollum, he starts as a normal character who
1:53:20 ultimately becomes this incredibly corrupt and deranged version of himself.
1:53:24 And so, I mean, I think you said something actually quite profound there, which is the
1:53:27 ring of power is infinitely tempting.
1:53:29 The censorship machine is infinitely tempting.
1:53:32 If you have it, like you are going to use it.
1:53:37 It’s overwhelmingly tempting because it’s so powerful and that it will corrupt you.
1:53:41 And yeah, I don’t know whether any of these people feel any of this today.
1:53:42 They should.
1:53:43 I don’t know if they do.
1:53:47 But yeah, you go out five or 10 years later, you know, you would hope that you would realize
1:53:51 that your soul has been corroded and you probably started out thinking that you were a patriot
1:53:55 and you were trying to defend democracy and you ended up being, you know, extremely authoritarian
1:53:57 and anti-democratic and anti-western.
1:54:05 Can I ask you a tough question here, staying on the ring of power? Elon is quickly becoming
1:54:11 the most powerful human on Earth.
1:54:13 I’m not sure about that.
1:54:14 You don’t think he is?
1:54:16 Well, he doesn’t have the nukes, so.
1:54:17 Nukes.
1:54:22 Yeah, there’s different definitions and perspectives on power, right?
1:54:30 How can he and/or Donald Trump avoid the corrupting aspects of this power?
1:54:31 I mean, I think the danger is there with power.
1:54:32 It’s just, it’s flat out there.
1:54:36 I would say with Elon, I mean, you know, we’ll see. I would say,
1:54:40 by the way, overwhelmingly, so far so good. I’m extremely, extremely thrilled
1:54:45 by what he’s done on almost every front for, you know, the last 30 years, including
1:54:48 all this stuff recently; I think he’s been a real hero on a lot of topics where
1:54:50 we needed to see heroism.
1:54:53 But look, I would say I guess the sort of case that he has this level of power is some
1:54:57 combination of the money and the proximity to the president.
1:55:00 And obviously both of those are instruments of power.
1:55:05 The counterargument to that is I do think a lot of how Elon is causing change in the
1:55:06 world right now.
1:55:08 I mean, there are the companies he’s running directly, where I think he’s doing
1:55:13 very well, and we’re investors in multiple of them, and they’re doing very well.
1:55:17 But I think like a lot of the stuff that gets people mad at him is like, it’s the social
1:55:20 and political stuff and it’s, you know, it’s his statements and then it’s the downstream
1:55:21 effects of his statements.
1:55:25 So, for example, for the last couple of weeks, it’s been him, you
1:55:28 know, kind of weighing in on this rape gang scandal, this organized child
1:55:30 rape thing in the UK.
1:55:34 And, you know, it’s actually a preference cascade.
1:55:36 It’s one of these things where people knew there was a problem.
1:55:37 They weren’t willing to talk about it.
1:55:39 It kind of got suppressed.
1:55:43 And then Elon brought it up and then all of a sudden there’s now in the UK, this like
1:55:46 massive explosion of basically open conversation about it for the first time.
1:55:49 And, you know, it’s like this catalyzing thing; all of a sudden everybody’s kind of woken
1:55:52 up and is like, oh my God, you know, this is really bad.
1:55:55 And there will now be, I’m pretty sure, pretty clearly, big changes
1:55:56 as a result.
1:56:00 And Elon was, you know, he played the role of the boy who said, the emperor has no clothes,
1:56:01 right?
1:56:02 But here’s the thing.
1:56:03 Here’s my point.
1:56:05 Like he said it about something that was true, right?
1:56:09 And so had he said it about something that was false, you know, he would get no credit
1:56:10 for it.
1:56:11 He wouldn’t deserve any credit for it.
1:56:12 But he said something that was true.
1:56:16 And by the way, everybody over there instantly, they were like, Oh yeah, he’s right.
1:56:17 Right.
1:56:20 Nobody seriously disputed it; they’re just arguing the details now.
1:56:22 So number one, it’s like, okay, he says true things.
1:56:26 And so it’s like, okay, how worried are we
1:56:30 about somebody becoming corrupt by virtue of their power being that they get to speak
1:56:31 the truth?
1:56:34 And I guess I would say, especially after the last decade of what we’ve been through, where
1:56:37 everybody’s been lying all the time about everything, I think we should run
1:56:39 this experiment as hard as we can to get people to tell the truth.
1:56:42 And so I don’t feel that bad about that.
1:56:47 And then the money side, you know, this rapidly gets into the money and politics question.
1:56:51 And the money and politics question is this very interesting question because it seems
1:56:55 like there’s a clear-cut case that the more money in politics, the worse things are and
1:56:58 the more corrupted the system is.
1:57:02 That was a very popular topic of public conversation up until 2016, when Hillary outspent Trump
1:57:05 three to one and lost.
1:57:09 You’ll notice that money in politics has almost vanished as a topic in the last eight
1:57:10 years.
1:57:14 And once again, Trump was far outspent; you know, Kamala raised and spent 1.5 billion on top
1:57:16 of what Biden spent.
1:57:18 So they were at, I don’t know, something like three billion total, and Trump,
1:57:22 I think, spent again like a third or a fourth of that.
1:57:26 And so the money and politics kind of topic has kind of vanished from the popular conversation
1:57:27 the last eight years.
1:57:34 It has come back a little bit now that Elon is spending, you know, but again,
1:57:37 it’s like, okay, he’s spending, but the data would seem to indicate, at least
1:57:39 in the last eight years, that money doesn’t win the political battles.
1:57:43 It’s actually that the voters have a voice and they actually exercise it and
1:57:44 they don’t just listen to ads.
1:57:47 And so again, there I would say, yeah, clearly there’s some power there, but I don’t
1:57:50 know if it’s some weapon that
1:57:54 he can just turn on and use in a definitive way.
1:57:59 And I don’t know if there’s parallels there, but I could also say just on a human level,
1:58:04 he has a good heart and I interact with a lot of powerful people and that’s not always
1:58:05 the case.
1:58:07 So that’s a good thing there.
1:58:08 Yeah.
1:58:13 If we can draw parallels to the hobbit, or whoever, who gets to put on the ring.
1:58:14 Frodo.
1:58:15 Frodo, yeah.
1:58:17 Yeah, maybe one of the lessons of Lord of the Rings, right, is even Frodo would
1:58:19 have been corrupted, right?
1:58:23 But, you know, nevertheless, you had somebody who could do what it took at the time.
1:58:27 The thing that I find just so amazing about the Elon phenomenon and all the critiques
1:58:31 is, you know, the one thing that everybody in our societies universally agrees on, because
1:58:36 of our sort of post-Christian egalitarianism; you know, we live in this sort of
1:58:42 secularized post-Christian context in the West now, where we consider Christianity
1:58:45 kind of, you know, backwards, but we still believe essentially all the same things.
1:58:49 We just dress them up in sort of fake science.
1:58:53 So the one thing that we’re all taught is that the best
1:58:55 people in the world are the people who care about all of humanity, right?
1:58:59 And we venerate, you know, all of our figures are people who care about all of humanity:
1:59:02 Jesus cared about all of humanity, Gandhi cared about all of humanity, Martin Luther
1:59:05 King cared about all of humanity; it’s the person who cares the most
1:59:07 about everybody.
1:59:11 And with Elon, you have a guy who literally, he talks about this
1:59:15 constantly and he talks about it exactly the same in private, is literally operating
1:59:18 on behalf of all of humanity: to get us
1:59:21 to a multi-planetary civilization so that we can survive a strike on any one planet, so
1:59:25 that we can extend the light of human consciousness into the universe
1:59:27 and have it persist, you know, for the good of the whole thing.
1:59:31 And like literally the critique is, yeah, we want you to care about all of humanity,
1:59:32 but not like that.
1:59:39 Yeah, all the surface turmoil, the critics, will be forgotten.
1:59:42 Yeah, I think that’s, yeah, that’s clear.
1:59:47 You said that we always end up being ruled by the elites of some kind.
1:59:50 Can you explain this law, this idea?
1:59:55 So this comes from an Italian political philosopher from about a hundred years ago named Robert
2:00:02 Michels; I’m going to mangle the Italian pronunciation, Michels or Michaels.
2:00:06 And I learned about it through a famous book on politics, probably the best
2:00:10 book on politics written in the 20th century, called The Machiavellians, by this guy James
2:00:12 Burnham, who has had a big impact on me.
2:00:16 But in The Machiavellians, he resurrects what he calls this sort of Italian realist school
2:00:19 of political philosophy from the 1910s and ’20s.
2:00:21 And these were people, to be clear, this was not like a Mussolini thing.
2:00:26 These were people who were trying to understand the actual mechanics of how politics actually
2:00:27 works.
2:00:31 So to get to the actual sort of mechanical substance of like how the political machine
2:00:32 operates.
2:00:38 And this guy Michels had this concept he ended up with, called the iron law of oligarchy.
2:00:42 So what is the iron law of oligarchy? Let me take a step back and say what he meant
2:00:44 by oligarchy, because it has multiple meanings.
2:00:47 So basically, in classic political theory, there’s basically three forms of government
2:00:48 at core.
2:00:51 There’s democracy, which is rule of the many.
2:00:53 There’s oligarchy, which is rule of the few.
2:00:55 And there’s monarchy, which is rule of the one.
2:00:58 And you can just use that as a general framework of any government you’re going to be under
2:01:01 is going to be one of those, just a mechanical observation, without even saying which ones
2:01:05 are good or bad, just a structural observation.
2:01:08 And so the question that Michels asked was, like, is there such a thing as democracy?
2:01:10 Like, is there actually such a thing as democracy?
2:01:13 Is there ever actually direct government?
2:01:17 And what he did was he mounted this sort of incredible historical exploration of whether
2:01:19 democracies had ever existed in the world.
2:01:22 And the answer basically is almost never, and we could talk about that.
2:01:27 But the other thing he did was he sought out the most democratic private organization in
2:01:31 the world that he could find at that point, which he concluded was some basically communist
2:01:35 German Auto Workers Union that was like wholly devoted to the workers of the world uniting,
2:01:37 you know, back when that was like the hot thing.
2:01:40 And he went in there and he’s like, okay, this is the organization out of all organizations
2:01:43 on planet Earth that must be operating as a direct democracy.
2:01:46 And he went in there and he’s like, oh, nope, there’s a leadership class.
2:01:49 You know, there’s like six guys at the top and they control everything and they lead
2:01:53 the rest of the membership along, you know, by the nose, which is of course the story
2:01:54 of every union.
2:01:58 The story of every union is always the story of, you know, there’s a Jimmy Hoffa in there,
2:01:59 you know, kind of running the thing.
2:02:04 You know, we just saw that with the Dock Workers Union, right, like, you know, there’s a guy.
2:02:05 And he’s in charge.
2:02:09 And by the way, the number two is his son, right, like that’s not like a, you know, an
2:02:10 accident, right?
2:02:14 So the iron law of oligarchy basically says democracy is fake.
2:02:17 There’s always a ruling class, there’s always a ruling elite, structurally.
2:02:21 And he said the reason for that is because the masses can’t organize, right?
2:02:22 What’s the fundamental problem?
2:02:26 Whether the mass is 25,000 people in a union or 250 million people in a country, the masses
2:02:31 can’t organize, the majority cannot organize, only a minority can organize and to be effective
2:02:33 in politics, you must organize.
2:02:38 And therefore every political structure in human history has been some form of a small
2:02:44 organized elite ruling, a large and dispersed majority, every single one.
2:02:51 The Greeks and the Florentines had brief experiments in direct democracy and they were total disasters.
2:02:54 In Florence, I forget the name of it, it was called like the workers revolt or something
2:02:55 like that.
2:02:59 There was like a two year period where they basically experimented with direct democracy
2:03:02 during the Renaissance and it was a complete disaster.
2:03:04 And they never tried it again.
2:03:08 In the state of California, we have our own experiment on this, which is the proposition
2:03:13 system, which is an overlay on top of the legislature and anybody who looks at it for
2:03:15 two seconds concludes it’s been a complete disaster.
2:03:19 It’s just a catastrophe and it’s caused enormous damage to the state.
2:03:23 And so basically the presumption that we are in a democracy is just sort of by definition
2:03:24 fake.
2:03:27 Now, good news for the US, it turns out the founders understood this and so of course they
2:03:30 didn’t give us a direct democracy, they gave us a representative democracy, right?
2:03:34 And so they built the oligarchy into the system in the form of Congress, the executive
2:03:37 branch, and the judicial branch.
2:03:40 So anyway, as a consequence, democracy is always and everywhere fake.
2:03:43 There is always a ruling elite.
2:03:47 And basically the lesson of the Machiavellians is you can deny that if you want, but you’re
2:03:48 fooling yourself.
2:03:52 The way to actually think about how to make a system work and maintain any sort of shred
2:03:56 of freedom is to actually understand that that is actually what’s happening.
2:04:02 And lucky for us, the founders saw this and figured out, given that there’s
2:04:09 going to be a ruling elite, how to create a balance of power among that elite so it
2:04:10 doesn’t get out of hand.
2:04:11 It was very clever, right?
2:04:13 And some of this was based on earlier experiments.
2:04:16 Some of this, by the way, these were very, very smart people, right?
2:04:18 And so they knew tremendous amounts of like Greek and Roman history.
2:04:23 They knew the Renaissance history; in the Federalist Papers, they argued this at great length.
2:04:24 You can read it all.
2:04:29 They ran like one of the best seminars in world history, trying to figure this out.
2:04:30 And they went through all this.
2:04:33 And yeah, and so they thought through it very carefully, but just to give you an example,
2:04:34 which continues to be a hot topic.
2:04:38 So one way they did it is through the three branches of government, right?
2:04:42 Executive legislative and judicial sort of balance of powers.
2:04:45 But the other way they did it was, sort of echoing what had been done earlier, I think,
2:04:50 in the UK Parliament, they created the two different bodies of the legislature, right?
2:04:54 And so the House and the Senate, and as you know, the House is apportioned on the basis
2:04:56 of population and the Senate is not, right?
2:05:00 The small states have just as many senators as the big states.
2:05:02 And then they made the deliberate decision to have the House get reelected every two
2:05:05 years to make it very responsive to the will of the people.
2:05:09 And they made the decision to have the Senate get reelected every six years so that it had
2:05:12 more buffer from the passions at the moment.
2:05:14 But what’s interesting is they didn’t choose one or the other, right?
2:05:16 They did them both.
2:05:18 And then to get legislation passed, you have to get through both of them.
2:05:22 And so they built in like a second layer of checks and balances.
2:05:26 And then there’s 1,000 observations we could make about like how well the system is working
2:05:30 today and like how much does it live up to the ideal and how much are we actually complying
2:05:31 with the Constitution?
2:05:34 And there’s lots of, you know, there’s lots of open questions there.
2:05:39 But you know, this system has survived for coming on 250 years with a country that has
2:05:42 been spectacularly successful, such that I don’t think, you know, any
2:05:44 of us would trade this system for any other one.
2:05:46 And so it’s one of the great all-time achievements.
2:05:47 Yeah, it’s incredible.
2:05:52 And we should say they were all pretty young relative to our current set of leaders.
2:05:53 Many in their 20s at the time.
2:05:54 And like super geniuses.
2:05:57 This is one of those things where it’s just like, all right, something happened where
2:06:01 there was a group of people where, you know, nobody ever tested their IQs, but like, these
2:06:02 are the Einsteins of politics.
2:06:03 Yeah.
2:06:04 The amazing thing.
2:06:07 But anyway, I go through all that to say they were very keen students of the
2:06:12 actual mechanical practice of democracy, not fixated on what was desirable.
2:06:16 They were incredibly focused on what would actually work, which is, you know, I think
2:06:17 the way to think about these things.
2:06:22 They were engineers of sorts, not fuzzy humanities students.
2:06:24 They were shape rotators, not word cells.
2:06:26 I remember that.
2:06:27 Wow.
2:06:29 That meme came and went.
2:06:30 I think you were central to them.
2:06:31 You’re central to a lot of memes.
2:06:32 I was.
2:06:36 You’re the meme dealer and the meme popularizer.
2:06:37 That meme I guess I get credited for.
2:06:39 And then the current thing is the other one I get some credit for.
2:06:42 I don’t know that I invented either one, but I popularized them.
2:06:44 Take credit and run with it.
2:06:52 If you can just linger on the Machiavellians, it’s a study of power and power dynamics.
2:06:59 Like you mentioned, looking at the actual reality of the machinery of power from everything
2:07:04 you’ve seen now in government, but also in companies, what are some interesting things
2:07:08 you can sort of continue to say about the dynamics of power, the jostling for power that
2:07:10 happens inside these institutions?
2:07:11 Yeah.
2:07:15 So, a lot of it, you know, we already talked about this a bit with the universities, which
2:07:19 is you can apply a Machiavellian-style lens to them; it's why I posed the question to you
2:07:24 that I did, which is, okay, who runs the university: the trustees, the administration, the students,
2:07:25 or the faculty?
2:07:28 And then, you know, the true answer is some combination of the
2:07:33 four, plus the donors, by the way, plus the government, plus the press, et cetera, right.
2:07:36 And so there, you know, there's a mechanical interpretation of that.
2:07:41 I mean, companies operate under the exact same, you know, set of questions, who runs a company,
2:07:44 you know, the CEO, but like the CEO runs the company basically up to the day that either
2:07:47 the shareholders or the management team revolt.
2:07:50 If the shareholders revolt, it’s very hard for the CEO to stay in the seat.
2:07:53 If the management team revolts, it’s very hard for the CEO to stay in the seat.
2:07:56 By the way, if the employees revolt, it’s also hard to stay in the seat.
2:07:59 By the way, if the New York Times comes at you, it’s also very hard to stay in the seat.
2:08:02 If the Senate comes at you, it’s very hard to stay in the seat.
2:08:07 So, you know, a reductionist version of this that is a good shorthand is: who can
2:08:09 get whom fired?
2:08:13 So who has more power: the newspaper columnist who makes, you know,
2:08:17 $200,000 a year, or the CEO who makes $200 million a year?
2:08:20 And it’s like, well, I know for sure that the columnist can get the CEO fired.
2:08:21 I’ve seen that happen before.
2:08:25 I have yet to see a CEO get a columnist fired.
2:08:32 Did anyone ever get fired from the Bill Ackman assault on journalism?
2:08:36 So Bill really showed the bullshit that happens in journalism.
2:08:39 No, because what happens is they wear it, I mean, I would
2:08:41 say to their credit, they wear it as a badge of honor.
2:08:43 And then to their shame, they wear it as a badge of honor, right?
2:08:48 Which is if, you know, if they’re doing the right thing, then they are justifiably proud
2:08:50 of themselves for standing up under pressure.
2:08:53 But it also means that they can’t respond to legitimate criticism.
2:08:56 And, you know, they’re obviously terrible at that now.
2:09:01 As I recall, he went straight to the CEO of Axel Springer, which owns Insider.
2:09:04 And I happen to know the CEO, and I think he's quite a good CEO.
2:09:08 But a good question is: does the CEO of Axel Springer run his own
2:09:09 company.
2:09:10 Right.
2:09:12 Well, okay, so there's a fascinating thing playing out right
2:09:13 now.
2:09:18 Not to dwell on these fires, but, you see, pressure reveals things, right?
2:09:22 And so if you’ve been watching what’s happened with LA Times recently, so this guy, biotech
2:09:26 entrepreneur buys the LA Times, like whatever, eight years ago, it is just like the most
2:09:30 radical social revolutionary thing you can possibly imagine.
2:09:32 It endorses every crazy left-wing radical
2:09:36 you can imagine. It endorses Karen Bass, it endorses Gavin Newsom, it's just like a litany
2:09:39 of all the people who are currently burning the city to the ground.
2:09:42 It’s just like endorsed every single bad person, every step of the way.
2:09:44 He’s owned it the entire time.
2:09:47 You know, he put his foot down for the first time, I think,
2:09:50 right before the November election, and said, we're
2:09:52 going to get out of this thing where we just always endorse the Democrat.
2:09:53 He said, we're not endorsing.
2:09:57 I think he said, we’re not endorsing for the presidency and like the paper flipped out.
2:09:58 Right.
2:10:01 It’s like our billionaire backer who’s, I don’t know what he spends, but like, he must
2:10:05 be burning 50 or 100 million dollars a year out of his pocket to keep this thing running.
2:10:09 He paid 500 million for it, which is amazing.
2:10:13 Back when people still thought these things were businesses.
2:10:17 And then he’s probably burned another 500 million over the last decade, keeping it running.
2:10:20 And he burns probably another 50, a hundred million a year to do this.
2:10:24 And the journalists at the LA Times hate him with the fury of a thousand suns.
2:10:27 Like they just like absolutely freaking despise him.
2:10:29 And they have been attacking him, and, you know, the ones that can get jobs elsewhere
2:10:32 quit and do, and the rest just stay and say the most horrible things
2:10:33 about him.
2:10:36 And they want to constantly run these stories, attacking him.
2:10:40 And so he has had this reaction that a lot of people in LA are having right now to this
2:10:44 fire and to this just incredibly vivid collapse of leadership, and all these people
2:10:48 that he had his paper endorse are just disasters.
2:10:50 And he’s on this tour.
2:10:54 He's basically just decided to be the boy who says the emperor
2:10:57 has no clothes, but he’s doing it to his own newspaper.
2:10:58 Very smart guy.
2:11:01 And he's basically saying, yes, we did all that and we endorsed these
2:11:04 people and it was a huge mistake and we’re going to completely change.
2:11:08 And his paper is, you know, in a complete internal revolt.
2:11:09 But I go through all that because okay,
2:11:12 now we have a very interesting question, which is: who runs the LA Times?
2:11:17 Because for the last eight years, it hasn’t been him.
2:11:19 It’s been the reporters.
2:11:23 Now for the first time, the owner is showing up saying, oh no, I’m actually in charge and
2:11:25 the reporters are saying, no, you’re not.
2:11:28 And like, like it is freaking on.
2:11:32 And so again, the Machiavellian mindset on this is like, okay, how is power actually
2:11:33 exercised here?
2:11:37 Can a guy who's like even super rich and super powerful, who even owns his
2:11:39 own newspaper, can he stand up to a full-scale assault?
2:11:43 Not only by his own reporters, but by every other journalism outlet who also now thinks
2:11:45 he’s the antichrist.
2:11:50 And he is trying to exercise power by speaking out publicly and so that’s the game of power
2:11:51 there.
2:11:52 And firing people.
2:11:54 And you know, he has removed people and he has set new rules.
2:11:57 I mean, I think he's saying that he's now at long
2:12:01 last actually exercising the prerogatives of an owner of a business, which is to decide on
2:12:02 the policies and staffing of the business.
2:12:06 There are certain other owners of these publications that are doing similar things right now.
2:12:08 He’s the one I don’t know.
2:12:10 So he’s the one I can talk about.
2:12:13 But there are others that are going through this same thing right now.
2:12:17 And I think it’s a really interesting open question, like, you know, in a fight between
2:12:20 the employees and the employer, like it’s not crystal clear that the employer wins that
2:12:21 one.
2:12:23 And just to stay on journalism for a second, we mentioned Bill Ackman.
2:12:28 I just want to say, put him in the category we mentioned before of a really courageous
2:12:29 person.
2:12:37 I don't think I've ever seen anybody so fearless in following what
2:12:40 he believes in publicly.
2:12:46 That's courage. Several things he's done publicly have been really inspiring,
2:12:47 just being courageous.
2:12:49 What do you think is like the most impressive example?
2:12:57 Where he went after a journalist, I mean, it's like
2:13:02 kicking the beehive; you know what's going to follow.
2:13:08 And to do that, I mean, that's why it's difficult to challenge journalistic organizations, because
2:13:12 there's just so many mechanisms they use, including writing
2:13:16 articles that get cited by Wikipedia, then driving the narrative, and then they can get
2:13:18 you fired, all this kind of stuff.
2:13:27 Bill Ackman, like a bad MFer, just tweets these essays and just goes after them legally
2:13:32 and also in the public eye and just, I don’t know, that was truly inspiring.
2:13:36 There’s not many people like that in public.
2:13:42 And hopefully that inspires not just me, but many others to be courageous themselves.
2:13:45 Did you know of him before he started doing this in public?
2:13:49 I knew of Neri, his wife, who's just a brilliant researcher and scientist, and so I admire
2:13:50 her and look up to her.
2:13:51 I think she’s amazing.
2:13:55 Well, the reason I ask if you knew about Bill is because a lot of people had not heard
2:13:58 of him before, especially like before October 7th and before some of the campaigns he’s
2:14:01 been running since in public, and with Harvard and so forth.
2:14:05 But he was very well known in the investment world before that.
2:14:10 So he was a famous, so-called activist investor, you know, very, very successful
2:14:15 and very widely respected, for probably 30 years before now.
2:14:19 And I bring that up because it turns out they weren’t for the most part battles that happened
2:14:20 in kind of full public view.
2:14:23 They weren't national stories, but in the business and investing world, the activist
2:14:30 investor has, like in the movie Taken, a very specific set of skills.
2:14:34 How to like really take control of situations and how to wreck the people who you’re going
2:14:36 up against.
2:14:41 And there's been controversy over the years on this topic,
2:14:44 and there's too much detail to go into, but the defense of activist investing, which
2:14:48 I think is valid, is, you know, these are the guys who basically go in and take stakes in
2:14:51 companies that are being poorly managed or under optimized.
2:14:55 And then generally, at least the theory is, that means the existing
2:15:00 management has become entrenched and lazy, mediocre, you know, whatever, not responding
2:15:04 to the needs of the shareholders, often not responding to the customers.
2:15:09 And the activists basically go in with a minority position and then they rally support among
2:15:11 other investors who are not activists.
2:15:16 And then they basically show up and they force change, but they are the aggressive version
2:15:17 of this.
2:15:19 And I've been involved in companies that have been on the receiving
2:15:24 end of these, where it is amazing how much somebody like that can exert pressure on situations
2:15:26 even when they don’t have formal control.
2:15:30 So it’s another, it would be another chess piece on the mechanical board of kind of how
2:15:31 power gets exercised.
2:15:34 And basically what happens is the effective activists, a large amount of the time, end
2:15:37 up taking over control of companies, even though they never own more
2:15:39 than like 5% of the stock.
2:15:42 And so anyway, Bill's been such a fascinating case because he has
2:15:48 that like complete skill set and he has now decided to bring it to bear in areas that
2:15:50 are not just companies.
2:15:53 And two interesting things about that. Some of these
2:15:57 battles are still ongoing, but number one, a lot of people who run
2:16:00 universities or newspapers are not used to being up against somebody like this.
2:16:04 And by the way, also now with infinitely deep pockets and lots of experience in courtrooms
2:16:06 and all the things that kind of go with that.
2:16:12 But the other is, through example, he is teaching a lot of the rest of us, the activist playbook,
2:16:13 like in real time.
2:16:17 And so the Liam Neeson skill set is getting more broadly diffused just by being able to
2:16:19 watch and learn from him.
2:16:22 So I think, you know, I would put him up there with Elon in
2:16:25 terms of somebody who’s really affecting how all this is playing out.
2:16:29 But even skill set aside, just courage and yes, including by the way, courage to go outside
2:16:30 of his own zone.
2:16:31 Yeah.
2:16:32 Right.
2:16:35 You know, because, I'll give you an example: my firm, a venture capital
2:16:36 firm, we have LPs.
2:16:40 There are things that I feel like I can’t do or say cause I feel like I would be bringing,
2:16:44 you know, I would be bringing embarrassment or other consequences to our LPs.
2:16:47 He has investors also where he worries about that.
2:16:50 And so a couple of things: one is his willingness to go out there and risk his
2:16:52 relationship with his own investors.
2:16:55 But I will tell you the other thing, which is, I know this for a fact, his
2:16:59 investors have been remarkably supportive of him doing that, because as it turns out,
2:17:02 a lot of them actually agree with him.
2:17:06 And so it's the same thing he does in his activism campaigns.
2:17:09 He is able to be the tip of the spear on something that actually a lot more people agree with.
2:17:10 Yeah.
2:17:14 It turns out if you have truth behind you, it helps.
2:17:18 And just again, you know, how I started is a lot of people are just fed up.
2:17:23 You’ve been spending a bunch of time in Mar-a-Lago and Palm Beach helping the new administration
2:17:26 in many ways, including interviewing people who might join.
2:17:31 So what’s your general sense about the talent about the people who are coming in into the
2:17:33 new administration?
2:17:36 So I should start by saying I’m not a member of the new administration.
2:17:40 I’m not, I’m not in the room, I’m not like in the room when a lot of these people are
2:17:41 being selected.
2:17:42 I believe you said unpaid intern.
2:17:43 I am an unpaid intern.
2:17:48 So I'm a volunteer and I help when helpful, but I'm not making the decisions,
2:17:50 nor am I in a position to, you know, speak for the administration.
2:17:53 So I don’t want to say anything that will cause people to think I’m doing that.
2:17:54 It’s a very unusual situation, right?
2:17:57 Where you had an incumbent president and then you had a four-year gap where he’s out of
2:17:59 office and then you have him coming back, right?
2:18:04 And as you’ll recall, there was a fair amount of controversy over the end of the first term.
2:18:05 Oh, yeah.
2:18:09 The fear, the specific concern was, you know, the first Trump administration,
2:18:12 they will all say this: they didn't come in with a team, right?
2:18:15 They didn't come in with a team, and most of the sort of institutional
2:18:19 base of the Republican party were Bush Republicans, and many of them had become
2:18:20 Never Trumpers.
2:18:22 And so they had a hard time putting the team together.
2:18:24 And then by the way, they had a hard time getting people confirmed.
2:18:27 And so if you talk to the people who were there in the first term, it took them two
2:18:30 to three years to kind of even get the government in place.
2:18:33 And then they basically only had the government in place for, you know, for basically like
2:18:37 18 months and then COVID hit, you know, and then sort of aftermath and everything and all
2:18:39 the drama and headlines and everything.
2:18:42 And so the concern, you know, including from some very smart people in the last two years
2:18:46 has been, boy, if Trump gets a second term, is he going to be able to get a team that
2:18:50 is as good as the team he had last time or a team that is actually not as good because
2:18:53 maybe people got burned out, maybe they’re more cynical now, maybe they’re not willing
2:18:55 to go through the drama.
2:18:57 By the way, a lot of people in the first term came under their
2:19:01 own withering legal assaults and, you know, some of them went to prison and, you
2:19:05 know, a lot of stuff happened, lots of investigations, lots of legal fees, lots
2:19:09 of bad press, lots of debanking, by the way.
2:19:14 A lot of the officials in the first term got debanked, including the president’s wife
2:19:15 and son.
2:19:16 Yeah.
2:19:17 I heard you tell that story.
2:19:18 It’s insane.
2:19:19 That’s just insane.
2:19:20 In the wake of the first term.
2:19:21 Yes.
2:19:25 We now take out spouses and children with our ring of power.
2:19:28 And so there’s like this legitimate question as to like whether, okay, what will the team
2:19:29 for the second term look like?
2:19:33 And at least what I've seen, and what you're seeing in the appointments, is it looks much,
2:19:34 much better.
2:19:37 First of all, it just looks better than the first term and not because the people in the
2:19:40 first term were not necessarily good, but just you just have this like influx of like
2:19:44 incredibly capable people that have shown up that want to be part of this.
2:19:46 And you just didn’t have that the first time.
2:19:49 And so they’re just drawing on a much deeper, richer talent pool than they had the first
2:19:50 time.
2:19:53 And they’re drawing on people who know what the game is, like they’re drawing on people
2:19:57 now who know what is going to happen and they’re still willing to do it.
2:20:00 And so they’re going to get, I think, you know, some of the best people from the first
2:20:05 term, but they’re bringing in a lot of people who they couldn’t get the first time around.
2:20:07 And then second is there’s a bunch of people, including people in the first term where they’re
2:20:09 just 10 years older.
2:20:13 And so they went through the first term and they just learned how everything works.
2:20:16 Or they’re young people who just had a different point of view, and now they’re 10 years older
2:20:19 and they’re ready to go serve in government.
2:20:21 And so there’s a generational shift happening.
2:20:25 And actually one of the interesting things about the team that’s forming up is it’s remarkably
2:20:26 young.
2:20:29 Some of the cabinet members and then many of the second and third level people are like
2:20:33 in their 30s and 40s, you know, which is a big change from the gerontocracy that, you
2:20:36 know, we’ve been under for the last 30 years.
2:20:39 And so I think the caliber has been outstanding, you know, and we could sit here and list tons
2:20:42 and tons of people, but, you know, it's everything
2:20:46 from the people who are running all the different departments at HHS, to, you know,
2:20:50 the number two at the Pentagon, Steve Feinberg, who's just an incredible
2:20:53 legend of private equity, an incredibly capable guy.
2:20:57 We’ve got two, actually two of my partners are going in, who I both think are amazing.
2:20:58 Yeah.
2:21:02 Like in many, many parts of the government, the people are really impressive.
2:21:10 Well, I think one of the concerns is actually that, given the human being Donald Trump is,
2:21:18 there would be more tendency towards, let's say, favoritism versus meritocracy,
2:21:22 that there's kind of circles of sycophancy that form.
2:21:30 And if you're able to be loyal and never oppose and just basically suck up to the
2:21:32 president, then you'll get a position.
2:21:33 So that’s one of the concerns.
2:21:40 And I think you’re in a good position to speak to the degree that’s happening versus
2:21:43 hiring based on merit and just getting great teams.
2:21:44 Yeah.
2:21:48 So look, I just start by saying any leader at that level, by the way, any CEO, there’s
2:21:49 always some risk of that.
2:21:50 Right.
2:21:53 So there's always some, you know, it's just natural: reality warps around powerful
2:21:54 leaders.
2:21:55 And so there’s always some risk to that.
2:21:57 Of course, the good and powerful leaders are, you know, very aware of that.
2:22:01 And Trump at this point in his life, I think, is highly aware of that, at least my interactions
2:22:03 with him, like he definitely seems very aware of that.
2:22:06 So that’s one thing.
2:22:09 I would just say that I think the way to look at that, I mean, and look like I said, I don’t
2:22:11 want to predict what’s going to happen once this whole thing starts unfolding.
2:22:14 But I would just say, again, the caliber of the people who are showing up and getting
2:22:18 the jobs and then the fact that these are some of the most accomplished people in the
2:22:24 business world and in the medical field, I just, you know, Jay Bhattacharya coming in
2:22:25 to run NIH.
2:22:27 So I was actually part of the interview team for a lot
2:22:29 of the HHS folks.
2:22:30 Nice.
2:22:31 Jay is amazing.
2:22:32 I was so happy to see that.
2:22:36 So, this is a story: I got to the transition office for one of the days
2:22:38 of the HHS interviews and I was on one of the interviewing teams. I didn't
2:22:41 know who the candidates were, and they gave us the sheet in the beginning, and I go down
2:22:46 the sheet and I saw Jay's name and I almost physically fell out of my chair.
2:22:51 And, you know, I happen to know Jay and I respect him
2:22:52 enormously.
2:22:55 And talk about a guy who proved himself under
2:23:01 extraordinary pressure over the last five years and didn't go radical under the pressure.
2:23:04 He maintained balance and thoughtfulness and depth.
2:23:05 I mean, incredibly.
2:23:10 Very serious, very analytical, very applied, and yes, 100% tested under
2:23:15 pressure. The more people look back at what he said and did, you know,
2:23:19 none of us are perfect, but he was overwhelmingly
2:23:21 insightful throughout that whole period.
2:23:24 And you know, we, you know, we would all be much better off today had he been in charge
2:23:26 of the response.
2:23:29 And so just like an incredibly capable guy and look, and then he learned from all that
2:23:30 right.
2:23:31 He learned a lot in the last five years.
2:23:35 And so the idea that somebody like that could be head of NIH, as compared to the people
2:23:37 we've had, is just breathtaking.
2:23:41 It's just a gigantic upgrade, you know, and then Marty Makary coming in to run FDA, exact
2:23:42 same thing.
2:23:47 The guy coming in to run the CDC, exact same thing.
2:23:49 I mean, I’ve been spending time with Dr. Oz.
2:23:52 So, again, I'm not on these teams.
2:23:56 I'm not in the room, but I've been spending enough time trying to help that, like, his level
2:24:00 of insight into the healthcare system is astounding, and it comes from being
2:24:03 a guy who’s been like in the middle of the whole thing and been talking to people about
2:24:07 this stuff and working on it and serving as a doctor himself and in medical systems for,
2:24:11 you know, his entire life and it’s just like, you know, he’s like a walking encyclopedia
2:24:12 on these things.
2:24:17 And so, and you know, very dynamic, you know, very charismatic, very smart, organized, effective.
2:24:20 So, you know, to have somebody like that in there.
2:24:24 And so anyway, I have like 30 of these stories now across
2:24:25 all these different positions.
2:24:29 And then, I'll be quite honest, I do do the compare and contrast to the last
2:24:30 four years.
2:24:32 And it's not even close; these people are not in the same ballpark.
2:24:36 They’re just like wildly better.
2:24:40 And so, you know, pound for pound it's maybe the best team in the White House since, you
2:24:48 know, I don't even know, maybe the 90s, maybe the 30s, maybe the 50s, you know,
2:24:52 maybe Eisenhower had a team like this or something, but there's a lot of really
2:24:53 good people in there now.
2:24:56 Yeah, the potential for change is certainly extremely high.
2:24:59 Well, can you speak to Doge?
2:25:04 What’s the most wildly successful next two years for Doge?
2:25:06 Can you imagine?
2:25:11 Maybe also, can you think about the trajectory that’s the most likely and what kind of challenges
2:25:12 would it be facing?
2:25:13 Yeah.
2:25:18 So, and start by saying again, I’m not disclaimer after disclaimer, I’m not on Doge.
2:25:19 I’m not a member of Doge.
2:25:25 We should say there’s about 10 lawyers in the room staring now, I’m just kidding.
2:25:27 Both the angels and the devils on my shoulder.
2:25:28 Okay.
2:25:29 Yeah.
2:25:30 So I’m not speaking for Doge.
2:25:32 I’m not in charge of Doge.
2:25:33 Those guys are doing it.
2:25:34 I’m not doing it.
2:25:38 But I am, you know, again, I’m volunteering to help as much as I can and I’m 100% supportive.
2:25:39 Yeah.
2:25:43 So look, I think the basic outlines are in public, right?
2:25:47 Which is, it's a time-limited, you know, basically a commission.
2:25:48 It's not a formal government agency.
2:25:51 It's time limited, 18 months.
2:25:55 In terms of implementation, it will advise the executive branch, right?
2:26:00 And so the implementation will happen through the White House, and the president
2:26:02 has total latitude on what he wants to implement.
2:26:07 And then basically the way I think about it is three kind of streams, you know, kind
2:26:09 of target sets, and they're related but different.
2:26:12 So: money, people, and regulations.
2:26:16 And so, you know, on the money side, the headline number they've put out is the two trillion
2:26:19 dollar number, and there's already disputes over that and whatever.
2:26:23 There are open questions there, but then there's the people thing, and the people thing is interesting
2:26:26 because you get into these very fascinating questions.
2:26:30 And I've been doing this, I won't do this to you as a pop quiz, but I do this
2:26:34 for people in government as a pop quiz and I can stump them every time, which is: how
2:26:36 many federal agencies are there?
2:26:41 And the answer is somewhere between 450 and 520 and nobody’s quite sure.
2:26:43 And then the other is: how many people work for the federal government?
2:26:47 And the answer is, you know, something on the order of, I forget, but like 4 million
2:26:52 full-time employees and maybe up to 20 million contractors, and nobody is quite sure.
2:26:54 And so there’s a large people component to this.
2:26:57 And then, by the way, there's a related component to that, which is how many of them
2:27:01 are actually in the office and the answer is not many.
2:27:03 Most of the federal buildings are still empty, right?
2:27:06 And then there's the question of whether the people who are
2:27:08 working from home are actually working from home.
2:27:11 So there's the people dimension, and of course the money and the people are connected, and
2:27:13 then there's the third, which is the regulation thing, right?
2:27:17 And I described earlier how basically our system of government is much more now based
2:27:20 on regulations than legislation, right?
2:27:24 Most of the rules that we all live under are not from a bill that went through Congress.
2:27:27 They’re from an agency that created a regulation.
2:27:28 That turns out to be very, very important.
2:27:32 So one is, as already described, Doge wants to do broad-based
2:27:33 regulatory relief.
2:27:36 And Trump has talked about this: basically get the government off people's backs and liberate
2:27:39 the American people to be able to do things again.
2:27:40 So that's part of it.
2:27:43 But there’s also something else that’s happened, which is very interesting, which was there
2:27:47 were a set of Supreme Court decisions about two years ago that went directly after
2:27:53 the idea that the executive branch can create regulatory agencies and issue regulations
2:27:57 and enforce those regulations without corresponding congressional legislation.
2:28:03 And most of the federal government that exists today, including most of the departments
2:28:07 and most of the rules and most of the money and most of the people, most of it is not
2:28:09 enforcing laws that Congress passed.
2:28:11 Most of it is, is regulation.
2:28:16 And the Supreme Court basically said large parts, maybe all, of that
2:28:20 regulation that did not directly result from a bill that went through Congress, the way
2:28:25 that the cartoon said it should, may not actually be legal.
2:28:30 Now, the previous White House, of course, was super in favor of big government.
2:28:31 They had no desire to act.
2:28:32 They did nothing based on this.
2:28:34 They didn’t, you know, pull anything back in.
2:28:39 But the new regime, if they choose to, could say, look, the thing that we're doing here
2:28:43 is not, you know, challenging the laws; we're actually complying with the Supreme Court decision
2:28:46 that basically says we have to unwind a lot of this.
2:28:50 We have to unwind the regulations, which are no longer legal or constitutional.
2:28:53 We have to unwind the spend and we have to unwind the people.
2:28:56 And that's how you connect the thread from the regulation part
2:28:59 back to the money part, back to the people part.
2:29:01 They have work going on all three of these threads.
2:29:05 They have, I would say, incredibly creative ideas on how to deal with this.
2:29:09 I know lots of former government people, and 100% of them are super cynical on this
2:29:10 topic.
2:29:11 And they’re like, this is impossible.
2:29:12 This can never possibly work.
2:29:17 And I'm like, well, I can't tell you what the secret plans are, but they blow
2:29:21 my mind; on all three of those, they have ideas that are really quite
2:29:24 amazing, as you'd expect from, you know, the people involved.
2:29:28 And so over the course of the next few months, you know, that’ll start to become visible.
2:29:33 And then the final thing I would say is this is going to be very different than past attempts
2:29:34 like this.
2:29:38 There have been other programs like this in the past; the Clinton-Gore administration
2:29:42 had one, and there were others before that. Reagan had one.
2:29:46 The difference is this time, there’s social media.
2:29:52 It's interesting; one of the reasons people in Washington are
2:29:57 so cynical is because they know all the bullshit, like they know all the bad spending and all
2:30:01 the bad rules and all the like, you know, I mean, look, we’re adding a trillion dollars
2:30:04 to the national debt every 100 days right now.
2:30:08 And that’s compounding and it’s now passing the size of the Defense Department budget and
2:30:10 it's compounding, and pretty soon it's going to be adding a trillion dollars every
2:30:13 90 days, and then it's going to be adding a trillion dollars every 80 days, and then it's
2:30:15 going to be a trillion dollars every 70 days.
2:30:18 And then if this doesn’t get fixed at some point, we enter a hyperinflationary spiral
2:30:23 and we become Argentina or Brazil and Kablooey, right?
2:30:26 And so like everybody in DC knows that something has to be done.
2:30:30 And then everybody in DC knows for a fact that it’s impossible to do anything.
2:30:31 Right.
2:30:34 They know all the problems and they also know the sheer impossibility of fixing it.
2:30:37 But I think what they're not taking into account, what the critics are not taking into account,
2:30:42 is these guys can do this in the full light of day and they can do it on social media.
2:30:44 They can completely bypass the press.
2:30:46 They can completely bypass the cynicism.
2:30:51 They can expose any element of unconstitutional or silly government spending.
2:30:54 They can run victory laps every single day on what they’re doing.
2:30:56 They can bring the people into the process.
2:30:59 And again, if you think about it, this goes back to our Machiavellian structure, which
2:31:05 is if you think about, again, you’ve got democracy, oligarchy, monarchy, rule of the many, rule
2:31:07 of the few, rule of the one.
2:31:10 You could think about what's happening here as a little bit of a sandwich, which is:
2:31:15 we don't have a monarch, but we have a president, rule of the one with some power.
2:31:19 And then we have the people who can’t organize, but they can be informed and they can be aware
2:31:22 and they can express themselves through voting and polling.
2:31:26 And so there’s a sandwich happening right now is the way to think about it, which is
2:31:30 you’ve got basically monarchy, rule of one, combining with rule of many, right?
2:31:32 And rule of many is that you get to vote, right?
2:31:34 The people do get to vote, basically.
2:31:38 And then there's essentially Congress, as in the sort of permanent bureaucratic class in Washington,
2:31:40 as the oligarchy in the middle.
2:31:45 And so the White House plus the people, I think have the power to do all kinds of things
2:31:46 here.
2:31:48 And I think that would be the way I would watch it.
2:31:56 The transparency, I mean, Elon just by who he is, is incentivized to be transparent and
2:32:00 show the bullshit in the system and to celebrate the victories.
2:32:02 So it’s going to be so exciting.
2:32:08 I mean, honestly, it just makes government more exciting, which is a win for everybody.
2:32:11 These people are spending our money.
2:32:14 These people have enormous contempt for the taxpayer.
2:32:16 Okay, here’s the thing you hear in Washington.
2:32:17 Here’s one of the things.
2:32:18 So the first thing you hear is this is impossible.
2:32:19 They’ll be able to do nothing.
2:32:21 And then yeah, I walk them through this and it starts
2:32:24 to dawn on them that this is a new kind of thing.
2:32:27 And then they’re like, well, it doesn’t matter because all the money is in entitlements and
2:32:32 the debt and the military.
2:32:34 And so, yeah, you’ve got like this silly fake, whatever, NPR funding or whatever, and it
2:32:36 just, it’s a rounding error and it doesn’t matter.
2:32:41 And you look it up in the budget and it’s like, whatever, $500 million or $5 billion.
2:32:44 Or it’s the charging stations that don’t exist.
2:32:47 It’s the $40 billion of charging stations and they build eight charging stations.
2:32:52 Or it’s the broadband internet plan that delivered broadband to nobody, right?
2:32:53 And cost you $30 billion.
2:32:57 So these boondoggles and what everybody in Washington says is the $30 billion is a rounding
2:32:58 error on the federal budget.
2:32:59 It doesn’t matter.
2:33:00 Who cares if they make it go away.
2:33:05 And of course, any taxpayer is like, what the?
2:33:06 What do you mean?
2:33:07 It’s $30 billion.
2:33:08 Yeah.
2:33:09 Right.
2:33:12 And then the experts are like, and the press is in on this too, the experts are like,
2:33:14 well, it doesn't matter because it's a rounding error.
2:33:15 No, it’s $30 billion.
2:33:20 And if you’re this cavalier about $30 billion, imagine how cavalier you are about the $3
2:33:21 trillion.
2:33:22 Yeah.
2:33:23 Okay.
2:33:24 $30 billion is $30 billion.
2:33:27 Is it a lot of the federal budget in percentage terms? No, it's not. But do the math:
2:33:31 $30 billion divided by, let's say, 300 million taxpayers, right?
2:33:36 Like, what's that, math expert? $100 per taxpayer per year.
2:33:37 Okay.
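(A quick back-of-the-envelope check of the arithmetic in this exchange, as a minimal sketch in Python; the $30 billion program cost and the 300 million taxpayer count are the speakers' round conversational numbers, not official budget or IRS figures.)

```python
# Speakers' round numbers from the conversation, not official figures.
program_cost = 30_000_000_000   # one $30 billion program
taxpayers = 300_000_000         # assumed round count of US taxpayers

per_taxpayer = program_cost / taxpayers
print(per_taxpayer)             # 100.0 -> $100 per taxpayer per year
```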
2:33:43 So $100 to an ordinary person working hard every day to make money and provide for their
2:33:44 kids.
2:33:46 $100 is a meal out.
2:33:48 It’s a trip to the amusement park.
2:33:51 It’s the ability to, you know, buy additional educational materials.
2:33:54 It’s the ability to have a babysitter, to be able to have a romantic relationship with
2:33:55 your wife.
2:33:59 There's like a hundred things that that person can do with $100 that they're not doing
2:34:03 because it's going to some bullshit program where the money's
2:34:07 being looted out in the form of just ridiculousness and graft.
2:34:11 And so the idea that that $30 billion program is not a very important
2:34:17 thing to go after, like, the level of contempt for the taxpayer there is just off the charts.
2:34:21 And then that’s just one of those programs and there’s like a hundred of those programs
2:34:22 and they’re all just like that.
2:34:24 Like it’s not like any of this stuff is running well.
2:34:26 Like the one thing we know is that none of this stuff is running well.
2:34:27 Like we know that for sure.
2:34:28 Right.
2:34:31 And we know these people aren't showing up to work, and we know that all this crazy
2:34:32 stuff is happening.
2:34:33 Right.
2:34:37 And like, you know, do you remember Elon's story
2:34:39 of what got the Amish to turn out to vote in Pennsylvania?
2:34:40 Oh, okay.
2:34:41 So like Pennsylvania.
2:34:42 Okay.
2:34:43 So Pennsylvania is like a wonderful state, great history.
2:34:46 It has these cities like Philadelphia that have descended like other cities into just
2:34:49 like complete chaos, violence, madness and death, right?
2:34:53 And the federal government has just let it happen; these are incredibly violent places.
2:34:56 And so the Biden administration decided that the big pressing law enforcement thing that
2:35:00 they needed to do in Pennsylvania was that they needed to start raiding Amish farms to
2:35:04 prevent them from selling raw milk with armed raids.
2:35:05 Right.
2:35:10 And it turns out it really pissed off the Amish and it turns out they weren’t willing to drive
2:35:14 to the polling places because they don’t have cars, but if you came and got them, they would
2:35:15 go and they would vote.
2:35:17 That’s one of the reasons why Trump won anyway.
2:35:21 So like the law enforcement agencies are off working on like crazy things.
2:35:23 Like the system’s not working.
2:35:26 And so you add it up: pick a hundred $30 billion programs.
2:35:27 All right.
2:35:28 Now, okay.
2:35:30 Math major, a hundred times a hundred.
2:35:31 Ten thousand.
2:35:32 Ten thousand dollars.
2:35:33 Okay.
2:35:34 Ten thousand dollars per taxpayer per year.
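(Extending the same back-of-the-envelope sketch to the aggregate claim, again using the round numbers from the conversation rather than audited figures.)

```python
# A hundred such $30 billion programs, each costing roughly $100
# per taxpayer per year (from the sketch above).
programs = 100
per_taxpayer_per_program = 100        # dollars per taxpayer per year

total_per_taxpayer = programs * per_taxpayer_per_program
print(total_per_taxpayer)             # 10000 -> $10,000 per taxpayer per year
```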
2:35:36 And but it’s also not just about money.
2:35:40 That’s really obviously money is a hugely important thing, but it’s the cavalier attitude.
2:35:41 Yes.
2:35:48 And the ripple effect of that is it makes it so nobody wants to work in government
2:35:49 and be productive.
2:35:53 It breeds corruption.
2:35:55 It breeds laziness.
2:35:59 It breeds secrecy because you don’t want to be transparent about having done nothing all
2:36:00 year.
2:36:01 All those kinds of stuff.
2:36:02 And you want to reverse that.
2:36:08 So it would be exciting in the future to work in government, because the amazing
2:36:13 thing, if you steelman government, is you can do shit at scale.
2:36:20 You have money and you can directly impact people’s lives in a positive sense at scale.
2:36:22 That’s super exciting.
2:36:28 As long as there's no bureaucracy that slows you down, or at least not huge amounts of bureaucracy
2:36:30 that slow you down significantly.
2:36:31 So here’s the trick.
2:36:36 This blew my mind, because once you open the hellmouth of looking into
2:36:40 the federal budget, you learn all kinds of things.
2:36:44 So there is a term of art in government called impoundment.
2:36:48 And so you, if you’re like me, you’ve learned this the hard way when your car has been impounded.
2:36:52 The government meaning of impoundment, the federal budget meaning is a different meaning.
2:36:54 Impoundment is as follows.
2:36:58 The constitution requires Congress to authorize money to be spent by the executive branch,
2:36:59 right?
2:37:02 So the executive branch goes to Congress and says, we need money X.
2:37:03 Congress does their thing.
2:37:05 They come back and they say, you can have money Y.
2:37:08 The money’s appropriated from Congress, the executive branch spends it on the military
2:37:11 or whatever they spend it on, or on roads to nowhere or charging stations to nowhere
2:37:14 or whatever.
2:37:18 And what’s in the constitution is the Congress appropriates the money.
2:37:23 Over the last 60 years, there has been an additional interpretation of appropriations
2:37:29 applied by the courts and by the system, which is the executive branch not only needs Congress
2:37:33 to appropriate X amount of money, the executive branch is not allowed to underspend.
2:37:37 Yeah, I’m aware of this, I’m aware of this.
2:37:40 And so there’s this thing that happens in Washington at the end of every fiscal year,
2:37:45 which is September 30th, and it's the great budget flush: any remaining money that's
2:37:47 in the system that they don't know how to productively spend, they deliberately spend
2:37:53 unproductively, to the tune of hundreds and hundreds of billions of dollars.
2:37:57 A president that doesn’t want to spend the money can’t not spend it.
2:37:58 Yeah.
2:38:02 Like, okay, A, that's not what's in the constitution, and there's actually quite a good Wikipedia
2:38:05 page that goes through how the great debate on this has played out in the legal world over
2:38:06 the last 60 years.
2:38:10 And basically, if you look at this with anything resembling an open mind, you're like,
2:38:13 "All right, this is not what the founders meant."
2:38:16 And then number two, again, we go back to this thing of contempt.
2:38:21 Can you imagine showing up and running the government like that, and thinking that you’re
2:38:24 doing the right thing, and not going home at night, and thinking that you’ve sold your
2:38:25 soul?
2:38:29 I actually think you sort of had a really good point, which is it’s even unfair to the
2:38:31 people who have to execute this.
2:38:32 Yeah.
2:38:35 It makes them bad people, and they didn’t start out wanting to be bad people.
2:38:37 And so, there is stuff like this, like…
2:38:38 Yeah.
2:38:39 Everywhere.
2:38:40 Everywhere.
2:38:42 And so, we’ll see how far these guys get.
2:38:44 I am extremely encouraged what I’ve seen so far.
2:38:48 It seems like a lot of people will try to slow them down, but yeah, I hope they get far.
2:38:50 Another difficult topic, immigration.
2:38:56 What’s your take on the, let’s say, heated H-1B visa debate that’s going on online and
2:38:58 legal immigration in general?
2:38:59 Yeah.
2:39:04 I'll start by saying I am not involved in any aspect of government policy on this, and I am not planning
2:39:05 to be.
2:39:07 This is not an issue that I'm working on, or that I'm going to work on.
2:39:08 We're not.
2:39:11 This is not part of the agenda of what my firm is doing.
2:39:17 I'm not in the new administration or the government, and I'm not planning to be,
2:39:19 so purely just personal opinion.
2:39:25 So, I would describe what I have as a complex or hopefully nuanced view on this issue that’s
2:39:28 maybe a little bit different than what a lot of my peers have.
2:39:32 And I kind of thought about this; I didn't say anything about it all the way
2:39:36 through the big kind of debate over Christmas, but I thought about it a lot and read everything.
2:39:39 I think what I realized is that I just have a very different perspective on some of these
2:39:44 things and the reason is because of the combination of where I came from and then where I ended
2:39:45 up.
2:39:50 And so, let’s start with this, where I ended up in Silicon Valley.
2:39:54 And I have made the pro high-skilled immigration argument many, many times, the H-1B argument
2:40:00 many times, in past lives, I’ve been in DC many times arguing with prior administrations
2:40:03 about this, always on the side of trying to get more H-1Bs and trying to get more high-skilled
2:40:04 immigration.
2:40:11 And I think that argument is very strong and very solid, and it has paid off for the
2:40:15 US in many, many ways and we can go through it, but I think it’s the argument everybody
2:40:16 already knows, right?
2:40:17 It’s like the stock.
2:40:19 You take any Silicon Valley person, you press the button and they tell you why we need to
2:40:21 drain the world to get more H-1Bs, right?
2:40:23 So, everybody kind of gets that argument.
2:40:27 So, it’s basically just to summarize, it’s a mechanism by which you can get super smart
2:40:33 people from the rest of the world, import them in, keep them here to increase the productivity
2:40:35 of the US companies.
2:40:36 Yeah.
2:40:40 And then it’s not just good for them and it’s not just good for Silicon Valley or the tech
2:40:41 industry.
2:40:44 It’s good for the country because they then create new companies and create new technologies
2:40:49 and create new industries that then create many more jobs for native-born Americans than
2:40:53 would have previously existed and so you’ve got a, it’s a positive sum, flywheel thing
2:40:54 where everybody wins.
2:40:56 Like everybody wins, there are no trade-offs.
2:40:59 It’s all absolutely glorious in all directions.
2:41:04 There cannot possibly be a moral argument against it under any circumstances.
2:41:08 Anybody who argues against it is obviously doing so from a position of racism is probably
2:41:10 a fascist and a Nazi, right?
2:41:11 Right.
2:41:12 I mean, that’s the thing.
2:41:13 And like I said, I’ve made that argument many times.
2:41:16 I’m very comfortable with that argument and then I’d also say, look, I would say number
2:41:20 one, I believe a lot of it, I’ll talk about the parts I don’t believe, but I believe a
2:41:21 lot of it.
2:41:23 And then the other part is, look, I benefit every day.
2:41:28 I always describe it as: I work in the United Nations. My own firm and our founders
2:41:35 and our companies and the industry and my friends, you know, are just this amazing
2:41:40 panoply, a cornucopia of people from all over the world.
2:41:43 And you know, I've worked at this point with people from, it's
2:41:45 got to be, I don't know, 80 countries or something.
2:41:47 And hopefully over time, it’ll be, you know, the rest as well.
2:41:50 And, you know, it’s just, it’s been amazing and they’ve done many of the most important
2:41:52 things in my industry and it’s been really remarkable.
2:41:55 So that’s all good.
2:41:58 And then, you know, there's just the practical version of the argument, which is
2:41:59 we are the main place
2:42:00 these people get educated anyway.
2:42:01 Right.
2:42:03 The best and the brightest tend to come here to get educated.
2:42:06 And so, you know, this is the old kind of Mitt Romney "staple a green card" to every,
2:42:11 you know, at least, maybe not every university degree, but every technical degree.
2:42:15 The sociologists we could quibble about, but, you know, the roboticists for sure.
2:42:16 For sure.
2:42:17 For sure.
2:42:18 We can all agree that.
2:42:19 At least I won you over on something today.
2:42:21 Well, no, I'm exaggerating for effect.
2:42:23 So, and I lost you.
2:42:25 I had you for half a second.
2:42:27 I haven’t gotten to the other side of the argument yet.
2:42:28 Okay.
2:42:29 Thank you.
2:42:31 So surely we can all agree that we need to staple a green card.
2:42:33 The rollercoaster is going up.
2:42:35 The rollercoaster is ratcheting slowly up.
2:42:36 So, yeah.
2:42:38 So surely we can all agree that the roboticists should all get green cards.
2:42:41 And again, like there’s a lot of merit to that, obviously, like, look, we want the U.S.
2:42:43 to be the world leader in robotics.
2:42:46 What's step one to being the world leader in robotics? Have all the great robotics
2:42:47 people, right?
2:42:50 Like, you know, very unlike the underpants gnomes, it's like a very straightforward formula.
2:42:51 Right.
2:42:52 Yeah.
2:42:53 All right.
2:42:54 That’s all well and good.
2:42:55 All right.
2:42:57 But it gets a little bit more complicated because there is a kind of argument that’s sort
2:43:00 of right underneath that that you also hear from, you know, these same people.
2:43:04 And I have made this argument myself many times, which is we need to do this because we don’t
2:43:06 have enough people in the U.S. who can do it otherwise.
2:43:07 Right.
2:43:08 We have all these unfilled jobs.
2:43:10 We’ve got all these, you know, all these companies that wouldn’t exist.
2:43:11 We don’t have enough good founders.
2:43:12 We don’t have enough engineers.
2:43:16 We don’t have enough scientists or then the next version of the argument below that is
2:43:20 our education system is not good enough to generate those people.
2:43:23 And which is a weird argument, by the way, because, like, our education system is good
2:43:27 enough for foreigners to be able to come here preferentially in, like, a very large number
2:43:31 of cases, but somehow not good enough to educate our own native born people.
2:43:34 So there’s like a weird, these little cracks in the matrix that you can kind of stick your
2:43:38 fingernail into and kind of wonder about and we’ll come back to that one.
2:43:41 Like, at least, yes, our education system has its flaws.
2:43:45 And then underneath that is the argument that Vivek made, you know, which is, you know,
2:43:50 we have a cultural rot in the country and native born people in the country don’t work hard
2:43:53 enough and spend too much time watching TV and TikTok and don’t spend enough time studying
2:43:54 differential equations.
2:43:59 And again, it’s like, all right, like, you know, yeah, there’s a fair amount to that.
2:44:04 Like there’s a lot of American culture that is, you know, there’s a lot of frivolity.
2:44:07 There’s a lot of, you know, look, I mean, we have well documented social issues in many
2:44:11 fronts, many things that cut against having a culture of just like straightforward high
2:44:13 achievement and effort and striving.
2:44:16 Anyway, like, you know, those are the basic arguments.
2:44:19 But then I have this kind of other side of my, you know, kind of personality and thought
2:44:23 process, which is, well, I grew up in a small farming town in rural Wisconsin, the rural
2:44:24 Midwest.
2:44:27 And, you know, it’s interesting, there’s not a lot of people who make it from rural
2:44:31 Wisconsin to, you know, high tech.
2:44:33 And so it’s like, all right, why is that exactly, right?
2:44:37 And I know I’m an aberration, like I was the only one from anybody I ever knew who ever
2:44:38 did this, right?
2:44:40 I know what an aberration I am, and I know exactly how that aberration happened.
2:44:46 And it’s a very unusual set of steps, including, you know, many that were just luck.
2:44:51 But there is in no sense a talent flow from rural Wisconsin into high tech,
2:44:55 like not at all.
2:44:59 There is also like in no sense a talent flow from the rest of the Midwest into high tech.
2:45:01 There is no talent flow from the South into high tech.
2:45:03 There is no flow from the Sun Belt into high tech.
2:45:08 There is no flow from, you know, the deep South into high tech; like, literally
2:45:12 it's blank. There's this whole section of the country where the
2:45:15 people just, for some reason, don't end up in tech.
2:45:20 Now, that’s a little bit strange because these are the people who put a man on the moon.
2:45:23 These are the people who built the World War II war machine.
2:45:27 These are the people, at least their ancestors are the people who built the Second Industrial
2:45:32 Revolution and built the railroads and built the telephone network and built, you know,
2:45:36 logistics and transportation and the auto industry was built in Cleveland and Detroit.
2:45:40 And so at least these people’s parents and grandparents and great grandparents somehow
2:45:44 had the wherewithal to build all of these amazing things and invent all these things.
2:45:48 And then there’s many, many, many, many stories in the history of American invention and innovation
2:45:52 and capitalism where you had people who grew up in the middle of nowhere, Philo Farnsworth
2:45:55 who invented the television and just like, you know, tons and tons of others, endless
2:45:57 stories like this.
2:46:00 Now you have a puzzle, right, a conundrum, which is like, okay,
2:46:03 what is happening in the blank spot of the map?
2:46:07 And then of course, you also can’t help noticing that the blank spot on the map, the Midwest,
2:46:12 the South, you’ve also just defined Trump country, the Trump voter base, right?
2:46:13 And it’s like, oh, well, that’s interesting.
2:46:15 Like how did that happen?
2:46:16 Right.
2:46:19 And so either you really, really have to believe the very strong version of
2:46:22 like the Vivek thesis or something, where you have to believe that basically the
2:46:26 culture, the whole sort of civilization in the middle of the country,
2:46:31 is so deeply flawed, either inherently flawed or culturally flawed, such that for
2:46:35 whatever reason they are not able to do the things that their, you know, parents and
2:46:38 grandparents were able to do and that their peers are able to do, or something else
2:46:39 is happening.
2:46:40 Would you care to guess on what else is happening?
2:46:41 I mean, what?
2:46:42 Affirmative action?
2:46:43 Affirmative action.
2:46:44 Okay.
2:46:48 Think about this; this is very entertaining, right?
2:46:51 What are the three things that we know about affirmative action?
2:46:55 It is absolutely 100% necessary.
2:47:00 However, it cannot explain the success of any one individual, nor does it have any
2:47:01 victims at all.
2:47:07 It could maybe explain disproportionate representation, but surely it doesn't explain why you're
2:47:11 probably the only person in Silicon Valley from Wisconsin.
2:47:15 What educational institution in the last 60 years has wanted farm boys from Wisconsin?
2:47:18 But what institution rejected farm boys from Wisconsin?
2:47:19 All of them.
2:47:20 All of them.
2:47:21 Of course.
2:47:22 Okay.
2:47:23 So we know this.
2:47:26 This is the Harvard and UNC Supreme Court cases.
2:47:28 So this was like three years ago.
2:47:31 These were big court cases, you know, because the idea of affirmative action has been litigated
2:47:35 for many, many, many years and through many court cases and the Supreme Court repeatedly
2:47:38 in the past had upheld that it was a completely legitimate thing to do.
2:47:41 And there's basically two categories of affirmative action that
2:47:43 like really matter, right?
2:47:47 One is admissions into educational institutions and then the other is jobs, right, getting
2:47:48 hired.
2:47:49 Like those are the two biggest areas.
2:47:53 The education one has been a super potent political issue for a
2:47:56 very long time; you know, people have written and talked about this for many decades.
2:47:57 I don’t need to go through it.
2:47:59 There’s many arguments for why it’s important.
2:48:01 There’s many arguments as to how it could backfire.
2:48:02 It’s been this thing.
2:48:06 But the Supreme Court upheld it for a very long time.
2:48:08 The most recent ruling, I'm not a lawyer, I don't have the exact reference in my head,
2:48:15 but there was a case in 2003 in which Sandra Day O'Connor famously wrote that, you
2:48:20 know, although it had been 30 years of affirmative action and although it was not working remotely
2:48:24 as it had been intended, basically we need to try it for
2:48:25 another 25 years.
2:48:29 But she said basically as a message to future Supreme Court justices, if it hasn’t resolved
2:48:33 basically the issues it’s intended to resolve within 25 years, then we should probably call
2:48:34 it off.
2:48:36 By the way, we’re coming up on the 25 years.
2:48:39 It’s a couple years away.
2:52:43 The Supreme Court just had these cases, the Harvard case and, I think, a University of
2:52:44 North Carolina case.
2:48:48 And what’s interesting about those cases is the lawyers in those cases put a tremendous
2:48:53 amount of evidence into the record of how the admissions decisions actually happen at
2:48:59 Harvard and happen at UNC, and it is like every bit as cartoonishly garish and racist
2:49:04 as you could possibly imagine, because it’s a ring of power.
2:49:07 And if you’re an admissions officer at a private university or an administrator, you
2:49:11 have unlimited power to do what you want, and you can justify any of it under any of
2:49:14 these rules or systems.
2:49:17 And up until these cases, it had been a black box where you didn’t have to explain yourself
2:49:19 and show your work.
2:49:23 And what the Harvard and UNC cases did is they basically required showing the work.
2:49:26 And there was all kinds of phenomenal detail.
2:49:29 Number one is there were text messages in there that will just curl your hair, of students
2:49:33 being spoken of in just crude racial stereotypes that would just make you want to jump out
2:49:34 the window.
2:49:35 It’s horrible stuff.
2:49:38 But also, there was statistical information.
2:49:41 And of course, the big statistical kicker to the whole thing is that at top institutions,
2:49:46 it’s common for different ethnic groups to have different cutoffs for SAT that are as
2:49:48 wide as 400 points.
2:49:52 So different groups.
2:49:57 So specifically, Asians need to perform at 400 SAT points higher than other ethnicities
2:50:00 in order to actually get admitted into these– I mean, it’s not even about– I mean, white
2:50:02 people are a part of this, but Asians are a very big part of this.
2:50:06 And actually, the Harvard case was actually brought by an activist on behalf of actually
2:50:09 the Asian students who are being turned away.
2:50:12 And it’s basically– I mean, it’s the cliche now in the valley and in the medical community,
2:50:16 which is if you want a super genius, you hire an Asian from Harvard, because they are guaranteed
2:50:21 to be freaking Einstein, because if they weren’t, they were never getting admitted, right?
2:50:24 Almost all the qualified ones get turned away.
2:50:29 So they’ve been running this– it’s a very, very explicit, very, very clear program.
2:50:32 This of course has been a third rail of things that people are not supposed to discuss under
2:50:34 any circumstances.
2:50:37 The thing that has really changed the tenor on this is, I think, two things.
2:50:40 Number one, those Supreme Court cases, the Supreme Court ruled that they can no longer
2:50:42 do that.
2:50:45 I will tell you, I don’t believe there’s a single education institution in America that
2:50:48 is conforming with the Supreme Court ruling.
2:50:51 I think they are all flagrantly ignoring it, and we could talk about that.
2:50:53 Mostly because of momentum, probably, or what?
2:50:55 They are trying to make the world a better place.
2:50:57 They are trying to solve all these social problems.
2:50:59 They are trying to have diverse student populations.
2:51:02 They are trying to live up to the expectations of their donors.
2:51:04 They are trying to make their faculty happy.
2:51:09 They are trying to have their friends and family think that they’re good people.
2:51:13 They’re trying to have the press write nice things about them.
2:51:18 It’s nearly impossible for them. And to be clear, nobody has been fired from an admissions
2:51:20 office for 25 years of doing
2:51:24 what we now know, what the Supreme Court has now ruled, to be illegal.
2:51:28 They’re all the same people under the exact same pressures.
2:51:32 The numbers are moving a little bit, but I don’t know anybody in the system who thinks
2:51:35 that they’re complying with the Supreme Court.
2:51:36 Who’s in charge?
2:51:39 In the rank ordering of who rules who, the universities rule the Supreme Court way
2:51:42 more than the Supreme Court rules the universities.
2:51:45 Another example of that is that every sitting member of the Supreme Court went to either
2:51:48 Harvard or Yale.
2:51:53 The level of incestuousness here is like … Anyway, so there’s that.
2:51:54 This has been running for a very long time.
2:51:58 One is the Harvard and UNC cases gave up the game, number one, or at least showed what
2:51:59 the mechanism was.
2:52:04 And then number two, the other thing is obviously the aftermath of October 7th, and what we
2:52:08 discovered was happening with Jewish applicants, and what was happening at all the top institutions
2:52:14 for Jewish applicants was they were being actively managed down as a percentage of the
2:52:17 base.
2:52:23 I’ve heard reports of extremely explicit, basically, plans to manage the Jewish admissions
2:52:28 down to their representative percentage of the US population, which is 2%.
2:52:31 There’s a whole backstory here, which is 100 years ago, Jews were not admitted into a lot
2:52:34 of these institutions, and then there was a big campaign to get them in.
2:52:37 Once they could get in, they immediately became 30% of these institutions because there’s
2:52:39 so many smart, talented Jews.
2:52:43 So it went from 0% to 30%, and then the most recent generation of leadership has been trying
2:52:45 to get it down to 2%.
2:52:49 And a lot of Jewish people, at least a lot of Jewish people I know, sort of, they kind
2:52:53 of knew this was happening, but they discovered it the hard way after October 7th, right?
2:52:57 And so all of a sudden … So basically, the Supreme Court case meant that you could address
2:53:00 this in terms of the Asian victims.
2:53:04 The October 7th meant that you could address it in terms of the Jewish victims, and for
2:53:07 sure both of those groups are being systematically excluded, right?
2:53:10 And then, of course, there’s the thing that you basically can’t talk about, which is all
2:53:13 the white people are being excluded.
2:53:17 And then it turns out it’s also happening to black people.
2:53:21 And this is the thing that blew my freaking mind when I found out about it.
2:53:28 So I just assumed that this was great news for American blacks, because obviously if
2:53:31 whites, Asians, and Jews are being excluded, then the whole point to this in the beginning
2:53:35 was to get the black population up, and so this must be great for American blacks.
2:53:41 So then I discovered this New York Times article from 2004 called, “Top Colleges Take More
2:53:44 Blacks, but Which Ones?”
2:53:45 Uh-oh.
2:53:48 And again, and by the way, this is in the New York Times.
2:53:53 This is not in, like, you know, whatever, National Review; this is the New York Times, 2004.
2:53:57 And the two authorities that were quoted in the story are Henry Louis Gates, who’s the
2:54:01 dean of the African-American studies community in the United States, super brilliant guy,
2:54:07 and then Lani Guinier, who was a potential Supreme Court appointee and, I think, a
2:54:10 close friend of Hillary Clinton, and for a long time she was on
2:54:12 the shortlist for the Supreme Court.
2:54:18 So one of the top, you know, jurists, lawyers in the country; both were sort of legendarily
2:54:22 successful in the academic and legal worlds, and black.
2:54:24 And they are quoted as the authorities in this story.
2:54:29 And the story that they tell is actually very, it’s amazing.
2:54:33 By the way, it’s happening today in education institutions, and it’s happening in companies,
2:54:38 and you can see it all over the place and in the government, which is, at least at that
2:54:44 time, the number was half of the black admits into a place like Harvard were not American
2:54:45 born blacks.
2:54:53 They were foreign born blacks, specifically, northern African, generally Nigerian, or West
2:54:54 Indian.
2:54:55 Right.
2:54:59 And by the way, many Nigerians and northern Africans have come to the U.S. and have been
2:55:00 very successful.
2:55:03 Nigerian Americans as a group, like way outperform, they’re, you know, this is a super smart cohort
2:55:04 of people.
2:55:07 And then West Indian blacks in the U.S. are incredibly successful.
2:55:12 Most recently, by the way, Kamala Harris, as well as Colin Powell, like just two sort
2:55:13 of examples of that.
2:55:18 And so basically what Henry Louis Gates and Lani Guinier said in the story is Harvard is basically
2:55:23 struggling to, whatever it was, identify, recruit, make successful, whatever it was,
2:55:25 American-born native blacks.
2:55:30 And so therefore, they were using high-skill immigration as an escape hatch to go get blacks
2:55:31 from other countries.
2:55:35 And then this was 2004 when you could discuss such things.
2:55:39 Obviously, that is a topic that nobody has discussed since.
2:55:40 It has sailed on.
2:55:45 All of the DEI programs of the last 20 years have had this exact characteristic.
2:55:48 There’s large numbers of black people in America who are fully aware of this and are like,
2:55:51 “It’s obviously not us that are getting these slots.
2:55:54 We’re obviously, we’re literally competing with people who are being imported.”
2:55:58 And if you believe in the basis of affirmative action, you are trying to make up for historical
2:56:00 injustice of American black slavery.
2:56:06 So the idea that you import somebody from Nigeria who never experienced that is tremendously
2:56:08 insulting to black Americans.
2:56:11 Anyway, so you can see where I’m heading with this.
2:56:16 We have been in a 60-year social engineering experiment to exclude native-born people from
2:56:20 the educational slots and jobs that high-skill immigration has been funneling foreigners
2:56:21 into.
2:56:22 Right.
2:56:24 And so it turns out it’s not a victim-free thing.
2:56:27 There’s like 100% there’s victims because why?
2:56:28 There’s only so many.
2:56:30 For sure, there’s only so many education slots and then for sure, there’s only so many of
2:56:31 these jobs.
2:56:32 Right.
2:56:35 Google only hires so many, you know, whatever level seven engineers.
2:56:36 Right.
2:56:38 And so that’s the other side of it.
2:56:39 Right.
2:56:44 And so you’re a farm boy in Wisconsin, right, or a black American whose ancestors arrived
2:56:53 here on a slave ship 300 years ago in Louisiana, or the kid of a Cambodian immigrant in the Bronx,
2:56:58 or a kid from a very successful Jewish family
2:57:02 where, you know, for three generations, you and your parents and grandparents went
2:57:03 to Harvard.
2:57:07 And what all of those groups know is the system that has been created is not for them.
2:57:08 Right.
2:57:11 It’s designed specifically to exclude them.
2:57:14 And then what happens is all of these tech people show up in public and say, “Yeah, let’s
2:57:15 bring in more foreigners.”
2:57:16 Right.
2:57:21 And so anyway, so the short version of it is you can’t anymore, I don’t think, just
2:57:29 have the “high-skill immigration” conversation for either education or for employment without
2:57:32 also having the DEI conversation.
2:57:34 And then DEI is just another word for affirmative action.
2:57:36 So it’s the affirmative action conversation.
2:57:39 And you need to actually deal with this at the level of substance, and to see what’s actually happening
2:57:42 to people, you need to join these topics.
2:57:46 And I think it is much harder to make the moral claim for high-skill immigration given
2:57:52 the extent to which DEI took over both the education process and the hiring process.
2:57:53 Okay.
2:57:57 So first of all, that was brilliantly laid out, the nuance of it.
2:58:02 So just to understand, it’s not so much a criticism of H1B, high-skill immigration,
2:58:08 it’s that there needs to be more people saying, “Yay, we need more American-born hires.”
2:58:12 So I spent the entire Christmas holiday reading every message on this and not saying anything.
2:58:17 And what I was – which you know me well enough to know that’s a serious level of –
2:58:18 Yeah, that’s very zen.
2:58:19 Yes, thank you.
2:58:20 Thank you.
2:58:21 No, it wasn’t.
2:58:25 There was tremendous rage on the other side of it, but I suppressed it.
2:58:29 So I was waiting for the dog that didn’t bark, right?
2:58:33 And the dog that didn’t bark was I did not – and tell me if you saw one, I did not see
2:58:36 a single example of somebody pounding the table for more high-skill immigration who
2:58:40 was also pounding the table to go get more smart kids who are already here into these
2:58:42 educational institutions and into these jobs.
2:58:44 I didn’t see a single one.
2:58:45 That’s true.
2:58:47 I think I agree with that.
2:58:49 There really was a divide.
2:58:51 But it was like literally, it was like the proponents of high-skill immigration.
2:58:53 And again, this was me for a very long time.
2:58:57 I mean, I kind of took myself by surprise on this because I was on – you know, I had
2:58:59 the much simpler version of this story for a very long time.
2:59:03 Like I said, I’ve been in Washington many times under past presidents lobbying for this.
2:59:05 By the way, never made any progress, which we could talk about.
2:59:08 Like it never actually worked.
2:59:10 But you know, I’ve been on the other side of this one.
2:59:14 But I was literally sitting there being like, all right, which of these like super geniuses
2:59:17 who many of whom by the way are very successful high-skill immigrants or children of high-skill
2:59:23 immigrants, which of these super geniuses are going to like say, actually we have this
2:59:25 like incredible talent source here in the country, which again, to be clear, I’m not
2:59:26 talking about white people.
2:59:30 I’m talking about native-born Americans, whites, Asians, Jews, blacks, for sure.
2:59:31 For sure.
2:59:32 For sure.
2:59:33 Those four groups.
2:59:34 But also white people.
2:59:35 Yeah.
2:59:36 And also white people.
2:59:44 Those making the case for American-born hires are usually not also supporting H1B.
2:59:50 It’s an extreme divide, and those people that are making that case are often
2:59:55 making it in quite a radical way.
2:59:56 Yeah.
2:59:57 Let’s put it this way.
2:59:58 Yeah.
2:59:59 But you have this interesting thing.
3:00:01 You have a split between the sides that I’ve noticed, which is one side has all of the
3:00:02 experts.
3:00:03 Right.
3:00:04 Right.
3:00:05 And I’m using scare quotes, for people listening to the audio.
3:00:08 I’m making quotes in the air with my fingers as vigorously as I can.
3:00:11 One side has all the certified experts.
3:00:13 The other side just has a bunch of people who are like, they know that something is wrong
3:00:16 and they don’t quite know how to explain it.
3:00:19 What’s so unusual about the Harvard and UNC cases, by the way, in front of the Supreme Court, is they
3:00:22 actually had sophisticated lawyers for the first time in a long time actually put all
3:00:25 the pieces together and actually put it in the public record.
3:00:28 They actually had experts, which is just really rare.
3:00:31 Generally what you get is you get, because if you don’t have experts, what do you have?
3:00:35 You know something is wrong, but you have primarily an emotional response.
3:00:42 You feel it, but can you put it in the words and tables and charts that a certified expert
3:00:43 can?
3:00:44 No, you can’t.
3:00:45 That’s not who you are.
3:00:48 That doesn’t mean that you’re wrong and it also doesn’t mean that you have less of a
3:00:49 moral stance.
3:00:50 Yeah.
3:00:51 And so it’s just like, all right.
3:00:54 Now, by the way, look, I think there are ways to square the circle.
3:00:56 I think there’s a way to have our cake and eat it too.
3:00:58 Like I think there’d be many ways to resolve this.
3:01:04 I think, again, I think the way to do it is to look at these issues combined, at DEI combined
3:01:05 with high-skill immigration.
3:01:12 It so happens that DEI is under much more scrutiny today than it has been for probably 20 years.
3:01:18 Affirmative action is, the Supreme Court did just rule that it is not legal for universities
3:01:19 to do that.
3:01:23 They are still doing it, but they should stop.
3:01:28 And then there are more and more, you’ve seen more companies now also ditching their DEI
3:01:29 programs.
3:01:33 In part, that’s happening for a bunch of reasons, but it’s happening in part because a lot of
3:01:37 corporate lawyers will tell you that the Supreme Court rulings on education either already
3:01:43 apply to businesses or are a clear foreshadowing that the Supreme Court will rule on new cases that
3:01:44 will ban it in businesses.
3:01:51 And so there is a moment here to be able to look at this on both sides.
3:01:55 Let me add one more nuance to it that makes it even more complicated.
3:01:57 So the cliche is we’re going to brain-drain the world, right?
3:01:58 You’ve heard that?
3:02:00 We’re going to take all the smart people from all over the world.
3:02:01 We’re going to bring them here.
3:02:02 We’re going to educate them.
3:02:04 And then they’re going to raise their families here, create businesses here, create jobs
3:02:05 here, right?
3:02:07 In the cliche, that’s a super positive thing.
3:02:08 Yeah.
3:02:09 Okay.
3:02:12 So what happens to the rest of the world?
3:02:13 They lose?
3:02:18 Well, how fungible are people?
3:02:24 How many highly ambitious, highly conscientious, highly energetic, high achieving, high IQ
3:02:28 super geniuses are there in the world?
3:02:30 And if there’s a lot, that’s great.
3:02:34 But if there just aren’t that many, and they all come here, and they all aren’t where
3:02:39 they would be otherwise, what happens to all those other places?
3:02:43 So it’s almost impossible for us here to have that conversation in part because we become
3:02:46 incredibly uncomfortable as a society talking about the fact that people aren’t just simply
3:02:50 all the same, which is the whole thing we could talk about.
3:02:54 But also we are purely the beneficiary of this effect, right?
3:02:57 We are brain draining the world, not the other way around.
3:02:58 There’s only four.
3:03:02 So if you look at the flow of high-skill immigration over time, there’s only four permanent sinks
3:03:05 of high-skill immigration in places people go.
3:03:07 It’s the US, Canada, the UK, and Australia.
3:03:10 It’s four of the Five Eyes.
3:03:12 It’s the major Anglosphere countries.
3:03:16 And so for those countries, this seems like a no-lose proposition.
3:03:20 It’s all the other countries that basically what we four countries have been doing is
3:03:21 draining all those smart people up.
3:03:25 It’s actually much easier for people in Europe to talk about this I’ve discovered because
3:03:27 the Eurozone is whatever, 28 countries.
3:03:31 And within the Eurozone, the high-skill people over time have been migrating to originally
3:03:36 the UK, but also specifically, I think it’s the Netherlands, Germany, and France.
3:03:40 But specifically, they’ve been migrating out of the peripheral Eurozone countries.
3:03:43 And the one where this really hit the fan was in Greece, right?
3:03:47 So Greece falls into chaos, disaster, and then you’re running the government in Greece
3:03:51 and you’re trying to figure out how to put an economic development plan together.
3:03:54 All of your smart young kids have left.
3:03:56 Like what are you going to do, right?
3:04:01 By the way, this is a potential, I know you care a lot about Ukraine, this is a potential
3:04:02 crisis for Ukraine.
3:04:06 In part because of this, because we enthusiastically recruit Ukrainians, of
3:04:07 course.
3:04:09 And so we’ve been brain draining Ukraine for a long time.
3:04:12 But also, of course, war does tend to cause people to migrate out.
3:04:18 And so when it comes time for Ukraine to rebuild as a peaceful country, is it going to have
3:04:20 the talent base even that it had five years ago?
3:04:22 It’s like a very big and important question.
3:04:25 By the way, Russia, like we have brain-drained a lot of really smart people out of Russia.
3:04:29 A lot of them are here over the last 30 years.
3:04:31 And so there’s this thing.
3:04:33 It’s actually really funny if you think about it.
3:04:37 The one thing that we know to be the height of absolute evil that the West ever did was
3:04:40 colonization and resource extraction.
3:04:44 So we know the height of absolute evil was when the Portuguese and the English and everybody
3:04:47 else went and had these colonies and then went in and we took all the oil and we took
3:04:51 all the diamonds and we took all the whatever lithium or whatever it is, right?
3:04:55 Well, for some reason, we realized that that’s a deeply evil thing to do when it’s a physical
3:04:58 resource, when it’s a non-conscious physical matter.
3:05:02 For some reason, we think it’s completely morally acceptable to do it with human capital.
3:05:08 In fact, we think it’s glorious and beautiful and wonderful and the great flowering of peace
3:05:10 and harmony and moral justice of our time to do it.
3:05:13 And we don’t think for one second what we’re doing to the countries that we’re pulling
3:05:15 all these people out of.
3:05:18 And this is one of these things like I don’t know, like maybe we’re just going to live
3:05:22 in this delusional state forever and we’ll just keep doing it and it’ll keep benefiting
3:05:23 us and we just won’t care what happens.
3:05:27 But like, I think there may come, this is one of these, this is like one of these submarines
3:05:28 under 10 feet under the waterline.
3:05:32 Like, I think it’s just a matter of time until people suddenly realize, “Oh my god, what
3:05:33 are we doing?”
3:05:37 Because like, we need the rest of the world to succeed too, right?
3:05:39 Like, we need these other countries to like flourish.
3:05:42 Like we don’t want to be the only successful country in the middle of just like complete
3:05:46 chaos and disaster and we just extract and we extract and we extract and we don’t think
3:05:47 twice about it.
3:05:51 Well, this is so deeply profound actually.
3:05:55 So what is the cost of winning, quote unquote?
3:06:01 If these countries are drained in terms of human capital on the level of geopolitics,
3:06:02 what does that lead to?
3:06:08 Even if we talk about wars and conflict and all of this, we actually want them to be strong
3:06:13 in the way we understand it’s strong, not just in every way.
3:06:19 So that cooperation and competition can build a better world for all of humanity.
3:06:22 It’s interesting.
3:06:27 This is one of those truths where you just speak and it resonates and I didn’t even
3:06:28 think about it.
3:06:29 Yeah, exactly.
3:05:34 So this is what you were sitting on through the holidays, just boiling over.
3:05:39 So all that said, there’s still, to you, some good to the H1B.
3:06:40 Okay.
3:06:42 So then you get this other, okay.
3:05:43 So then, to come all the way around, there’s another nuance,
3:05:48 which is mostly in the Valley we don’t use H1Bs anymore.
3:06:49 Mostly we use O1s.
3:06:55 So there’s a separate class of visa, and the O1 is like this.
3:06:57 It turns out the O1 is the super-genius visa.
3:06:59 So the O1 is basically our founder visa.
3:07:02 Like when we have somebody from anywhere in the world and they’ve like
3:07:06 invented a breakthrough new technology and they want to come to the U.S. to start a company,
3:07:11 they come in through an O1 visa, and that actually is, like, a fairly high bar.
3:07:13 It’s a high acceptance rate, but it’s like a pretty high bar, and they do a lot
3:07:17 of work; you have to put real work into it and really, really
3:07:19 prove your case.
3:07:24 Mostly what’s happened with the H1B visa program is that it has gone to basically two categories
3:07:25 of employers.
3:07:29 One is the basically a small set of big tech companies that hire in volume, which is exactly
3:07:31 the companies that you would think.
3:07:34 And then the other is it goes to these, what they call kind of the mills, the consulting
3:07:35 mills, right.
3:07:38 And so there’s this set of companies, and I don’t want to pick on companies,
3:07:43 but, you know, names like Cognizant, whose business model
3:07:47 is primarily bringing in Indians in large numbers.
3:07:51 And you know, they often have, you know, offices next to company-owned housing, and they’ll
3:07:53 have, you know, organizations
3:07:56 that are literally thousands of Indians, you know, living and working in the U.S., and
3:08:01 they do what you’d basically call mid-tier IT consulting.
3:08:04 So you know, these folks, they’re making good wages, but they’re making
3:08:11 $60,000, $80,000 a year, $100,000 a year, not the, you know, $300,000 that you’d make in the Valley.
3:08:15 And so like in practice, the startups, basically little tech, as we call it, or the startup
3:08:20 world, mainly doesn’t use H1Bs at this point, and mainly can’t, because the system is kind
3:08:23 of rigged in a way that we really can’t.
3:08:26 And then again, you get to the sort of underlying morality here, which
3:08:30 is, it’s like, well, you know, Amazon, I love Amazon, but like
3:08:33 they’re a big powerful company, you know, they’ve got, you know, more money than God,
3:08:37 they’ve got resources, they’ve got a long-term planning horizon, they do big, you know, profound
3:08:42 things over, you know, decades at a time; you know, they, or any of
3:08:45 these other companies, could launch massively effective programs to go recruit the best
3:08:48 and brightest from all throughout the country.
3:08:52 And, you know, you’ll notice they don’t do that, you know, they bring in, you know, 10,000,
3:08:55 20,000 H1Bs a year.
3:08:57 And so you’ve got a question there.
3:09:00 And then these mills, like there’s lots of questions around them and whether they should,
3:09:03 you know, whether that’s even an ethical way, you know, I don’t want to say they’re unethical,
3:09:08 but there’s questions around like exactly what the trade-offs are there.
3:09:11 And so, you know, this, yeah, and this is like a Pandora’s box that really, you know,
3:09:16 nobody really wanted to be opened, you know, to play devil’s advocate on all this in terms
3:09:19 of like national immigration issues, you know, none of this is like a top-end issue just
3:09:21 because the numbers are small, right.
3:09:24 And so, you know, I don’t think, you know, the administration has said like, this is
3:09:27 not like a priority of theirs for right now.
3:09:30 But I guess what I would say is like there is actually a lot of complexity and nuance
3:09:32 here.
3:09:35 I have a lot of friends, like I said, I have a lot of friends and colleagues who are,
3:09:39 you know, who came over on H1Bs or O1s, green cards, many are now citizens.
3:09:42 And you know, not every single one of them, but
3:09:45 a lot of them were enthusiastic to, you know, defend the honor of immigrants throughout
3:09:46 this whole period.
3:09:48 And they said to me, it’s like, well, Mark, how can we, you know, how can
3:09:51 we more clearly express, you know, the importance of high-skill immigration to the U.S.?
3:09:57 And I was like, I think you can do it by advocating for also developing our native born talent.
3:10:01 Like, do you want to inflame the issue or do you want to defuse the issue?
3:10:02 Right.
3:10:04 And I think the answer is to defuse the issue.
3:10:09 Let me give you one more positive scenario, and then I’ll also beat up on the universities
3:10:10 some more.
3:10:14 Do you know about the National Merit Scholarship System?
3:10:16 Have you heard about this?
3:10:17 Not really.
3:10:22 So there’s a system that was created during the Cold War called the National Merit Scholars.
3:10:27 And it was created, I forget, in the 1950s or ’60s, back when people
3:10:31 in government actually wanted to identify the best and the brightest, as heretical an
3:10:33 idea as that sounds today.
3:10:39 And so it’s basically a national talent search for basically IQ.
3:10:44 Its goal is to identify basically the top 0.5% of the IQ in the country.
3:10:46 By the way, completely regardless of other characteristics.
3:10:51 So there’s no race, gender, or any other aspect to it is just going for straight intelligence.
3:10:57 It uses the, first the PSAT, which is the preparatory SAT that you take, and then the SAT.
3:11:02 So it uses those scores, that is the scoring, it’s a straight PSAT-SAT scoring system.
3:11:09 So they use the SAT as a proxy for IQ, which it is.
3:11:13 They run this every year; they get down to like the 1% of the population
3:11:17 of the kids, the 18-year-olds in a given year, who score highest on the PSAT, and then they
3:11:22 further qualify down to the 0.5% that also replicate on the SAT.
3:11:25 And then it’s like the scholarship amount is like $2,500, right?
3:11:30 So it’s like, it was a lot of money 50 years ago, not as much today, but it’s a national
3:11:33 system being run literally to find the best and the brightest.
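A rough sketch of the two-stage filter described above: top ~1% on the PSAT, then qualification down to the ~0.5% that replicate on the SAT. Everything here is a hypothetical stand-in (simulated scores, simple quantile cutoffs), not the program’s actual qualifying rules:

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical cohort: one PSAT and one SAT score per student.
    n_students = 1_000_000
    psat = rng.normal(1000, 200, n_students).clip(320, 1520)
    sat = rng.normal(1050, 210, n_students).clip(400, 1600)

    # Stage 1: semifinalists are the top ~1% of the cohort by PSAT score.
    psat_cutoff = np.quantile(psat, 0.99)
    semifinalists = psat >= psat_cutoff

    # Stage 2: keep only those whose SAT score "replicates," i.e. also
    # lands in the top ~0.5% of the whole cohort.
    sat_cutoff = np.quantile(sat, 0.995)
    finalists = semifinalists & (sat >= sat_cutoff)

    print(f"semifinalists: {semifinalists.sum():,}, finalists: {finalists.sum():,}")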
3:11:37 How many of our great and powerful universities use this as a scouting system?
3:11:39 Like, our universities all have sports teams.
3:11:44 They all have national scouting, full-time scouts who go out and they go to every high
3:11:47 school and they try to find all the great basketball players and bring them into the
3:11:50 NCAA, into all these leagues.
3:11:53 How many of our great and powerful and enlightened universities use the national merit system
3:11:58 to go do a talent search for the smartest kids and just bring them in?
3:12:02 Let me guess, very few, zero.
3:12:03 As you say it, that’s brilliant.
3:12:07 There should be that same level of scouting for talent internally.
3:12:08 Go get the smartest ones.
3:12:11 I’ll give you one more kicker on this topic, if I haven’t beaten it to
3:12:12 death.
3:12:16 The SAT has changed.
3:12:22 The SAT used to be a highly accurate proxy for IQ; that caused a bunch of problems.
3:12:25 People really don’t like the whole idea of IQ.
3:12:29 The SAT has been actively managed over the last 50 years by the College Board that runs
3:12:32 it, and it has gone the way of essentially everything else.
3:12:37 It’s been dumbed down in two ways.
3:12:42 Number one, it’s been dumbed down such that an 800 from 40 years ago does not mean what an 800
3:12:43 means today.
3:12:48 40 years ago it was almost impossible to get an 800.
3:12:53 Today there’s so many 800s that you could stock the entire Ivy League with 800s.
3:12:55 It’s been deliberately dumbed down.
3:12:59 Then two is they have tried to pull out a lot of what’s called the g-loading.
3:13:03 They’ve tried to detach it from being an IQ proxy because IQ is such an inflammatory
3:13:04 concept.
3:13:07 The consequence of that is, and this is sort of perverse, they’ve made it more coachable.
3:13:13 Right, so with the SAT 40 years ago, coaching didn’t really work, and more recently it has really
3:13:14 started to work.
3:13:18 One of the things you see is that the Asian spike, you see this giant leap upward in
3:13:21 Asian performance over the last decade and I think looking at the data, I think a lot
3:13:26 of that is because it’s more coachable now and the Asians do the most coaching.
3:13:28 There’s a bunch of issues with this.
3:13:31 The coaching thing is really difficult because the coaching thing is a subsidy then to the
3:13:34 kids whose parents can afford coaching.
3:13:37 I don’t know about you, but where I grew up there was no SAT coaching.
3:13:38 There’s like an issue there.
3:13:41 I didn’t even know what the SAT was until the day I took it, much less that there was
3:13:45 coaching, much less that it could work, so much less we could afford it.
3:13:46 So number one, there’s issues there.
3:13:50 But the other issue there is, think about what’s happened with the dumbing down.
3:13:55 800 no longer captures all the smart kids; 800 is too crude of a test.
3:13:57 It’s like the AI benchmarking problem.
3:13:59 It’s the same problem they have in AI benchmarking right now.
3:14:02 800 is too low of a threshold.
3:14:06 There are too many kids scoring 800 because what you want is you want whatever, if it’s
3:14:09 going to be 100,000 kids, I don’t know what it is, it’s going to be 50,000 kids a year
3:14:10 scoring 800.
3:14:15 You also then want kids to be able to score 900 and 1100 and 1200 and you want to ultimately
3:14:19 get to, you’d like to ultimately identify the top 100 kids and make sure that you get
3:14:21 them into MIT.
3:14:25 And the resolution of the test has been reduced so that it actually is not useful for doing
3:14:26 that.
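The “resolution” problem being described here is a ceiling effect, and a toy simulation makes it concrete. Everything below is hypothetical: a latent ability score gets clipped at a maximum reported score of 800, and everyone above the ceiling becomes indistinguishable:

    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical latent ability on an uncapped scale.
    latent = rng.normal(500, 110, 2_000_000)

    # The reported score is rounded to the nearest 10 and capped at 800.
    reported = np.minimum(np.round(latent / 10) * 10, 800)

    # Everyone at the ceiling collapses into one bucket, even though
    # their latent abilities span a wide range.
    at_ceiling = reported == 800
    print(f"students reported at 800: {at_ceiling.sum():,}")
    print(f"latent ability among them spans {latent[at_ceiling].min():.0f}"
          f" to {latent[at_ceiling].max():.0f}")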
3:14:29 And again, I would say this is like part of the generalized corruption that’s taken
3:14:33 place throughout this entire system where we have been heading in the reverse direction
3:14:37 from wanting to actually go get the best and brightest and actually put them in the places
3:14:38 where they should be.
3:14:41 And then just the final comment would be the great thing about standardized testing and
3:14:45 the national merit system is, like I said, it’s completely race blind, it’s gender blind,
3:14:47 it’s blind on every other characteristic.
3:14:49 It’s only done on test scores.
3:14:54 And you can make an argument about whether that’s good or bad, but it is, for sure,
3:14:57 the closest thing that we had to merit.
3:15:00 It was the thing that they did when they thought they needed merit to win the Cold War.
3:15:03 And of course, we could choose to do that anytime we want.
3:15:07 And I just say, I find it like incredibly striking and an enormous moral indictment
3:15:10 of the current system that there are no universities to do this today.
3:15:13 So back to the immigration thing just real quick, it’s like, okay, we aren’t even trying
3:15:16 to go get the smart kids out of the center or south.
3:15:19 And even if they think that they can get into these places, they get turned down.
3:15:21 And the same thing for the smart Asians and the same thing for the smart Jews and the
3:15:23 same thing for the smart black people.
3:15:29 And like, it’s just like, I don’t know how, like, I don’t know how that’s moral.
3:15:31 Like, I don’t get it at all.
3:15:37 As you said about the 800, so I took the SAT and the ACT many times and I’ve always gotten
3:15:39 a perfect 800 on math.
3:15:47 And I’m not that special; like, it doesn’t identify genius.
3:15:54 I think you want to search for genius and you want to create measures that find genius
3:15:57 of all different kinds, speaking of diversity.
3:16:06 And I guess we should reiterate and say over and over and over, defend immigrants, yes,
3:16:09 but say we should hire more and more native born.
3:16:13 Well, you asked me in the beginning, like, what’s the most optimistic forecast, right,
3:16:21 that we could have, and the most optimistic forecast would be, my God, what if we did both?
3:16:25 So that’s the reasonable, the rational, the smart thing to say here.
3:16:26 In fact, we don’t have to have a war.
3:16:30 Well, it would defuse, it would defuse the entire issue.
3:16:32 If everybody in the center and the south of the country and every Jewish family, Asian
3:16:37 family, black family knew they were getting a fair shake, like it would defuse the issue.
3:16:38 Like how about defusing the issue?
3:16:43 Like what a crazy radical, sorry, I don’t mean to really get out over my skis here, but
3:16:47 I think your profile on X states it’s time to build.
3:16:52 It feels like 25, 2025 is a good year to build.
3:17:02 So I wanted to ask your advice and maybe for advice for anybody who’s trying to build,
3:17:08 who’s trying to build something useful in the world, maybe launch a startup or maybe just
3:17:14 launch apps, services, whatever, ship software products.
3:17:21 So maybe by way of advice, how do you actually get to shipping?
3:17:24 So I mean, a big part of the answer I think is we’re in the middle of a legit revolution
3:17:29 and I know you’ve been talking about this on your show, but like AI coding, I mean,
3:17:34 this is the biggest earthquake to hit software in certainly my life, maybe since the invention
3:17:35 of software.
3:17:39 And I’m sure we’re involved in various of these companies, but these tools from a variety
3:17:46 of companies are absolutely revolutionary and they’re getting better by leaps and bounds,
3:17:47 right, every day.
3:17:52 You know all this, but the thing with coding, there’s open questions of whether AI can get
3:17:57 better at understanding philosophy or creative writing or whatever, but for sure we can make
3:18:01 it much better at coding because you can validate the results of coding.
3:18:05 And so there’s all these methods of synthetic data and self-training and reinforcement learning
3:18:07 that for sure you can do with coding.
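The reason coding is special here is that a unit test is a free, ground-truth verifier: you can sample candidate solutions, keep only the ones that pass, and feed those back in as training signal. A minimal sketch of that generate-and-verify loop, where generate_candidates is a hypothetical stand-in for sampling from a model, not any specific lab’s API:

    from typing import Callable, List, Tuple

    def generate_candidates(prompt: str, n: int) -> List[str]:
        """Hypothetical stand-in: sample n candidate programs from a model."""
        raise NotImplementedError

    def passes_tests(source: str, tests: Callable[[dict], bool]) -> bool:
        # CAUTION: executing model output needs a real sandbox in practice.
        namespace: dict = {}
        try:
            exec(source, namespace)
            return tests(namespace)
        except Exception:
            return False

    def make_synthetic_data(prompt: str, tests, n: int = 64) -> List[Tuple[str, str]]:
        # Keep only verified solutions; the surviving (prompt, solution)
        # pairs can then be used for fine-tuning or as reward signal in RL.
        kept = [c for c in generate_candidates(prompt, n) if passes_tests(c, tests)]
        return [(prompt, c) for c in kept]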
3:18:12 And so everybody I know who works in the field says AI coding is going to get to be phenomenally
3:18:14 good and it’s already great.
3:18:17 And you can, I mean, anybody wants to see this just go on YouTube and look at AI coding
3:18:21 demos, you know, little kids making apps in 10 minutes working with an AI coding system.
3:18:23 And so I think it’s the golden age.
3:18:25 I mean, I think this is an area where it’s clearly the golden age.
3:18:29 The tool set is extraordinary; you know, as a coder, for sure in a day, you can
3:18:34 retrain yourself, you know, start using these things, get a huge boost in productivity; as
3:18:37 a non-coder, you can learn much more quickly than you could before.
3:18:41 That’s actually a tricky one in terms of learning as a non-coder to build stuff.
3:18:45 But still, I feel like you still need to learn how to code.
3:18:47 It becomes a superpower.
3:18:49 It helps you be much more productive.
3:18:56 Like you could legitimately be a one person company and get quite far.
3:18:57 I agree with that up to a point.
3:19:03 So I think, for sure, for quite a long time, the people who are good at coding are going
3:19:06 to be the best at actually having AIs code things, because they’re going to understand
3:19:08 what, I mean, very basically, they’re going to understand what’s happening, right?
3:19:11 And they’re going to be able to evaluate the work and they’re going to be able to, you
3:19:13 know, literally like manage AIs better.
3:19:16 Like even if they’re not literally handwriting the code, they’re just going to have a much
3:19:17 better sense of what’s going on.
3:19:21 So I definitely think, like 100%, my nine-year-old is like doing all kinds of coding classes
3:19:24 and he’ll keep doing that certainly through 18.
3:19:26 We’ll see after that.
3:19:29 And so for sure that’s the case.
3:19:32 But look, having said that, one of the things you can do with an AI is say, teach me how
3:19:35 to code, right?
3:19:40 And so, and you know, there’s a whole bunch of, you know, I’ll name names, you know,
3:19:43 like there’s a whole bunch of work that Khan Academy is doing for free.
3:19:47 And then, you know, we have this company, Replit, which was originally specifically built
3:19:52 for kids for coding, that has AI built in that’s just absolutely extraordinary now.
3:19:56 And then, you know, there’s a variety of other systems like this.
3:20:00 And yeah, I mean, the AI is going to be able to teach you to code. AI, by the way, is, as
3:20:04 you know, spectacularly good at explaining code, right?
3:20:08 And so, you know, the tools have these features now where you can talk to the code base.
3:20:12 So you can like literally ask the code base questions about itself.
3:20:15 And you can also just do the simple form, which is you can copy and paste code into
3:20:20 ChatGPT and just ask it to explain it, what’s going on, rewrite it, improve it, make recommendations.
3:20:23 And so there’s, yeah, there’s dozens of ways to do this.
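The copy-paste workflow is a one-call script with any chat-completions API. A minimal sketch using OpenAI’s Python client; the model name and file path are assumptions, and any capable model works:

    from openai import OpenAI

    client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

    snippet = open("mystery_module.py").read()  # hypothetical file to explain

    resp = client.chat.completions.create(
        model="gpt-4o",  # assumption: substitute whatever model you use
        messages=[
            {"role": "system", "content": "You are a senior code reviewer."},
            {"role": "user", "content": "Explain what this code does, then suggest "
                                        "improvements:\n\n" + snippet},
        ],
    )
    print(resp.choices[0].message.content)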
3:20:26 By the way, you can also, I mean, even more broadly than code, like, you know, okay, you
3:20:31 want to make a video game, okay, now you can do AI art generation, sound generation, dialogue
3:20:34 generation, voice generation, right?
3:20:37 And so all of a sudden, like you don’t need designers, you know, you don’t need, you know,
3:20:38 voice actors.
3:20:43 You know, so yeah, so there’s just like unlimited, and then, you know, because, you know, a big
3:20:47 part of coding is so-called glue, you know, it’s interfacing into other systems.
3:20:50 So it’s interfacing into, you know, Stripe to take payments or something like that.
3:20:54 And you know, AI is fantastic at writing glue code.
3:20:57 So you know, really, really good at making sure that you can plug everything together,
3:21:01 really good at helping you figure out how to deploy, you know, it’ll even write a business
3:21:03 plan for you.
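The glue-code point is worth making concrete: that kind of code is short and formulaic, which is exactly why models are good at writing it. For example, the Stripe call that takes a payment is just a few lines (the API key and amount below are placeholders):

    import stripe

    stripe.api_key = "sk_test_..."  # placeholder test key

    # Typical glue: one call that hands payment processing off to Stripe.
    intent = stripe.PaymentIntent.create(
        amount=2000,  # amount in cents, i.e. $20.00
        currency="usd",
        automatic_payment_methods={"enabled": True},
    )
    print(intent.client_secret)  # handed to the frontend to confirm the payment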
3:21:06 So it’s just this, it’s like everything happening with AI right now, it’s just it’s like this
3:21:10 latent superpower, and there’s this incredible spectrum of people who have really figured
3:21:14 out massive performance increases, productivity increases with it already.
3:21:16 There’s other people who aren’t even aware it’s happening.
3:21:21 And there’s some gearing to whether you’re a coder or not, but I think there are lots
3:21:23 of non-coders that are off to the races.
3:21:27 And I think there are lots of professional coders who are still like, you know, the blacksmiths
3:21:32 were not necessarily in favor of, you know, the car business.
3:21:36 So yeah, there’s the old William Gibson quote, the future is here, it’s just not evenly
3:21:37 distributed yet.
3:21:41 And this is maybe the most potent version of that that I’ve ever seen.
3:21:48 Yeah, there’s a, you know, the old meme with the, with the bell curve, the people on both
3:21:51 extremes say AI coding is the future.
3:21:52 Right.
3:21:56 It’s very common for programmers to say, you know, if you’re any good as a programmer,
3:21:57 you’re not going to be using it.
3:21:58 That’s just not true.
3:22:04 No, I consider myself a reasonably good programmer, and my productivity has just skyrocketed,
3:22:12 and the joy of programming has skyrocketed; every aspect of programming is more efficient,
3:22:15 more productive, more fun, all that kind of stuff.
3:22:19 I would also say code, you know, of anything in like industrial society,
3:22:24 code has the highest elasticity, which is to say the easier it is to make it, the more
3:22:25 it gets made.
3:22:29 I think effectively there’s unlimited demand for code, like in other words, like there’s
3:22:34 always some other idea for a thing that you can do, a feature that you can add or a thing
3:22:36 that you can optimize.
3:22:40 And so, and so like overwhelmingly, you know, the amount of code that exists in the world
3:22:43 is a fraction of even the ideas we have today and then we come up with new ideas all the
3:22:44 time.
3:22:50 And so I think that, like, you know, it was late 80s, early 90s, when sort of automated
3:22:53 coding systems started to come out; expert systems were a big deal in those days.
3:22:56 And there were all these, there was a famous book called The Decline and Fall of the American
3:22:59 Programmer, you know, that predicted that these new coding systems were going to mean
3:23:00 we wouldn’t have programmers in the future.
3:23:04 And of course, the number of programming jobs exploded by like a factor of 100.
3:23:07 My guess is we’ll have more coding jobs, probably
3:23:11 by like an order of magnitude, 10 years from now.
3:23:12 That will be different.
3:23:13 There’ll be different jobs.
3:23:17 They’ll involve orchestrating AI, but there will be, we will be creating so much more
3:23:21 software that the whole industry will just explode in size.
3:23:26 Are you seeing the size of companies decrease in terms of startups?
3:23:28 What’s the landscapes of little tech?
3:23:31 All we’re seeing right now is the AI hiring boom of all time.
3:23:37 Oh, for the big tech people and little tech, everybody’s trying to hire as many engineers
3:23:38 as they can to build AI systems.
3:23:40 It’s just, it’s a hundred percent.
3:23:44 I mean, there’s a handful of companies, you know, there’s a little bit, there’s customer
3:23:45 service.
3:23:48 You know, we have some companies, and others, I think it’s Klarna that’s publicizing
3:23:55 a lot of this in Europe, where, you know, there are jobs that can be optimized and jobs that
3:23:56 can be automated.
3:24:02 But like for engineering jobs, like it’s just an explosion of hiring, that at least so far
3:24:05 there’s no trace of any sort of diminishing effect.
3:24:07 Now, having said that, I am looking forward to the day.
3:24:12 I am waiting for the first company to walk in saying, yes, like the more radical form
3:24:13 of it.
3:24:16 So basically the companies that we see are basically one of two kinds.
3:24:20 We see the companies that are basically, we sometimes use weak form and strong form.
3:24:25 So for the weak form companies, we sometimes use the term, it’s called the sixth bullet point.
3:24:28 AI is the sixth bullet point on whatever they’re doing.
3:24:29 Sure.
3:24:30 Right.
3:24:31 And it’s on the slide, right?
3:24:33 So they’ve got the, you know, whatever, dot, dot, dot, dot, and then AI is the sixth thing.
3:24:35 And the reason AI is the sixth thing is because they had already previously written the slide
3:24:37 before the AI revolution started.
3:24:40 And so they just added the sixth bullet point on the slide, which is how you’re getting
3:24:44 all these products that have like the AI button up in the corner, right, the little sparkly
3:24:45 button.
3:24:46 Right.
3:24:48 And all of a sudden, Gmail is offering to summarize your email, which I’m like, I don’t
3:24:49 need that.
3:24:53 Like I need you to answer my email, not summarize it, like, what the hell.
3:24:54 Okay.
3:24:55 So we see those.
3:24:56 And that’s fine.
3:24:59 That’s like, I don’t know, putting sugar on the cake or something.
3:25:02 But then we see the strong form, which is the companies that are building from scratch
3:25:03 for AI.
3:25:04 Right.
3:25:05 And they’re building it.
3:25:08 I actually just met with a company that is building literally an AI email system as an
3:25:09 example.
3:25:10 Oh, nice.
3:25:11 I can’t wait.
3:25:12 Yeah.
3:25:13 They’re going to completely, right.
3:25:14 It’s going to be an obvious idea.
3:25:15 Very smart team.
3:25:17 You know, it’s going to be great.
3:25:20 And then, you know, Notion just, you know, another, not one of our companies, but just
3:25:21 came out with a product.
3:25:24 And so now companies are going to basically come through, sweep through, and they’re going
3:25:27 to do basically AI first versions of basically everything.
3:25:31 And those are like companies built with, you know, AI as the first bullet point; that’s the strong
3:25:32 form of the argument.
3:25:33 Yeah.
3:25:34 Cursor is an example of that.
3:25:38 They basically said, okay, we’re going to rebuild the thing with AI as a first-class citizen.
3:25:41 What if we built it from scratch, knowing we could build on this?
3:25:45 And again, this is like, this is part of the full employment act for startups and VCs, which
3:25:50 is just: if a technology transformation is sufficiently powerful, then you actually
3:25:54 need to start the product development process over from scratch because you need to reconceptualize
3:25:55 the product.
3:25:58 And then usually what that means is you need a new company because most incumbents just
3:25:59 won’t do that.
3:26:02 And so, yeah, so that’s underway across many categories.
3:26:07 What I’m waiting for is the company where it’s like, no, our org chart is redesigned
3:26:08 as a result of AI, right?
3:26:12 And so, I’m looking, I’m waiting for the company where it’s like, no, we’re going to have like,
3:26:15 you know, and the cliche, here’s a thought experiment, right?
3:26:18 The cliche would be we’re going to have like the human executive team and then we’re going
3:26:20 to have the AI’s be the workers, right?
3:26:25 So we’ll have VP of engineering supervising 100 instances of coding agents, right?
3:26:26 Okay, maybe.
3:26:27 Right.
3:26:31 By the way, or maybe, maybe the VP of engineering should be the AI.
3:26:34 Maybe supervising human coders who are supervising AIs, right?
3:26:39 Because one of the things that AI should be pretty good at is managing, because it’s,
3:26:41 you know, process-driven.
3:26:43 It’s the kind of thing that AI is actually pretty good at, right?
3:26:46 Performance evaluation coaching.
3:26:49 And so, should it be an AI executive team?
3:26:54 And then, you know, and then of course the ultimate question, which is AI CEO, right?
3:26:57 And then, you know, and then there’s, and then maybe the most futuristic version of it would
3:27:00 be an actual AI agent that actually goes fully autonomous.
3:27:01 Yeah.
3:27:04 What if you really set one of these things loose and let it, let it basically build itself
3:27:05 a business?
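The thought experiment is easy to sketch even though the systems aren’t there yet: a supervisor loop that decomposes a goal, farms subtasks out to worker agents, and reviews the results before accepting them. Every function below is a hypothetical stand-in, not a description of any shipping system:

    def plan_subtasks(goal: str) -> list[str]:
        """Supervisor model decomposes a goal into subtasks (stand-in)."""
        raise NotImplementedError

    def run_worker(subtask: str) -> str:
        """A worker agent, e.g. a coding agent, attempts one subtask (stand-in)."""
        raise NotImplementedError

    def review(subtask: str, result: str) -> bool:
        """Supervisor evaluates the work, like a performance review (stand-in)."""
        raise NotImplementedError

    def supervise(goal: str) -> dict[str, str]:
        accepted: dict[str, str] = {}
        for subtask in plan_subtasks(goal):
            result = run_worker(subtask)
            while not review(subtask, result):  # send it back until it passes
                result = run_worker(subtask)
            accepted[subtask] = result
        return accepted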
3:27:08 And so I will say like, we’re not yet seeing those.
3:27:13 And I think there’s a little bit of the systems aren’t quite ready for that yet.
3:27:16 And then I think it’s a little bit of, you really do need at that point, like a founder
3:27:21 who’s really willing to break all the rules and really willing to take the swing.
3:27:22 And those people exist.
3:27:23 And so I’m sure we’ll see that.
3:27:27 And some of it is, as you know, with all the startups, this is the execution.
3:27:34 The idea of an AI-first email client seems like an obvious idea, but actually creating
3:27:38 one, executing it and then taking on Gmail is really, is really difficult.
3:27:45 I mean, Gmail, it’s fascinating to see Google can’t do it, because, because why?
3:27:49 Because of the momentum, because it’s hard to re-engineer the entirety of the system. It feels
3:27:52 like Google is perfectly positioned to do it.
3:27:59 Same with, like, Perplexity, which I love; like, Google could technically take on Perplexity
3:28:02 and do it much better, but they haven’t, not yet.
3:28:06 So it’s fascinating why that is for large companies.
3:28:08 I mean, that, that is an advantage for little tech.
3:28:09 They could be agile.
3:28:10 Yeah, that’s right.
3:28:11 They could move fast.
3:28:12 Yeah.
3:28:14 Little companies can break glass in a way big companies can’t.
3:28:15 Right.
3:28:18 This is sort of the big breakthrough that Clay Christensen had in The Innovator’s Dilemma,
3:28:21 which is sometimes when big companies don’t do things, it’s because they’re screwing up
3:28:23 and that certainly happens.
3:28:26 But a lot of times they don’t do things because it would break too much glass.
3:28:30 Specifically, it would interfere with their existing customers and their existing
3:28:31 businesses.
3:28:32 And they just simply won’t do that.
3:28:34 And by the way, responsibly, they shouldn’t do that.
3:28:35 Right.
3:28:41 And so, Clay Christensen’s big thing is they often don’t adapt
3:28:46 because they are well run, not because they’re poorly run; they’re optimizing machines.
3:28:49 They’re optimizing against the existing business, and as
3:28:54 you kind of just said, this is like a permanent state of affairs for large organizations, like
3:28:56 every once in a while, one breaks the pattern and actually does it.
3:28:59 But for the most part, like this is a very predictable form of human behavior.
3:29:03 And this fundamentally is why startups exist.
3:29:08 It feels like 2025 is when the race for dominance and AI will see some winners.
3:29:10 Like it’s a big year.
3:29:12 So who do you think wins the race?
3:29:16 Open AI, Meta, Google, XAI, who do you think wins the AI race?
3:29:18 I would say, I’m not going to predict, I’m going to say there’s questions all over the
3:29:19 place.
3:29:22 And then we have, we have this category of question we call the trillion dollar question,
3:29:26 which is like literally depending on how it’s answered, people make or lose a trillion dollars.
3:29:30 And I think there’s like, I don’t know, five or six trillion-dollar questions right now that are
3:29:33 hanging out there, which is an unusually large number.
3:29:36 And I just, you know, I’ll just hit a few of them and we can talk about them.
3:29:38 So one is big models versus small models.
3:29:40 Another is open models versus closed models.
3:29:44 Another is whether you can use synthetic data or not.
3:29:45 Another is chain of thought.
3:29:48 How far can you push that in reinforcement learning?
3:29:52 And then another one is political trillion dollar questions, policy questions, which,
3:29:57 you know, the U.S. and the EU have both been flunking dramatically and the U.S. hopefully
3:29:59 is about to really succeed at.
3:30:00 Yeah.
3:30:03 And then there’s probably another, you know, half dozen big important questions after that.
3:30:08 And so these are all just, like I say, this is an industry that’s in flux in a way that
3:30:11 is even more dramatic, I think, than the ones I’ve seen before.
3:30:15 And look, the most obvious example of the flux is, sitting here
3:30:19 less than three years ago, sitting here in December
3:30:23 ’22, we would have said that OpenAI is just running away with everything.
3:30:27 And sitting here today, it’s like, you know, there’s at least six, you know, world-class
3:30:33 God model companies and teams that are, by the way, generating remarkably similar results.
3:30:36 That’s actually been one of the most shocking things to me is like, it turns out that once
3:30:40 you know that it’s possible to build one incredibly smart Turing test passing large
3:30:44 language model, which was a complete shock and surprise to the world.
3:30:48 It turns out within, you know, a year you can have five more.
3:30:51 There’s also a money component thing to it, which is to get the money to scale one of
3:30:53 these things into the billions of dollars.
3:30:56 There’s basically right now only two sources of money that will do that for you.
3:31:00 One is the hyperscalers giving you the money, which you turn around and round trip back
3:31:01 to them.
3:31:05 Or, you know, foreign sovereigns, you know, other countries’ sovereign
3:31:10 wealth funds, which can be, you know, difficult in some cases for companies to access.
3:31:14 So there’s a, there’s another, there’s maybe another trillion dollar question is the financing
3:31:15 question.
3:31:16 Here’s one.
3:31:19 Sam Altman has been public about the fact that he wants to transition OpenAI from being
3:31:21 a nonprofit to being a for-profit.
3:31:25 The way that that is legally done is that, and there is a way to do it, there is a way
3:31:30 in U.S. law to do it, the IRS and other legal entities, government entities scrutinize this
3:31:34 very carefully because the U.S. takes foundation nonprofit law very seriously because of the
3:31:36 tax exemption.
3:31:40 And so the way that, historically the way that you do it is you start a for-profit and
3:31:44 then you, you raise money with the for-profit to buy the assets of the nonprofit at fair
3:31:47 market value.
3:31:51 And you know, the last financing round at OpenAI was at, you know, 150-some billion dollars.
3:31:56 And so logically, if the flip is going to happen, the for-profit has to
3:32:02 go raise 150 billion dollars out of the chute to buy the assets, and, you know, raising 150 billion
3:32:03 is a challenge.
3:32:06 Um, so, you know, is that even possible?
3:32:09 If that is possible, then open AI maybe as often the races as a for-profit company.
3:32:13 If not, you know, I don’t know, and then, you know, obviously there’s the Elon lawsuit.
3:32:17 So, so just because they’re the market leader today, you know, there’s big important questions
3:32:18 there.
3:32:20 You know, Microsoft has this kind of love-hate relationship with them.
3:32:21 Where does that go?
3:32:25 Apple’s, you know, lagging badly behind, but, you know, they’re very good at catching up.
3:32:29 Amazon, you know, is primarily a hyperscaler, but they now have their own models.
3:32:33 And then there’s the other questions, like you laid out, briefly and brilliantly:
3:32:39 open versus closed, big versus little models, synthetic data, that’s a huge, huge question.
3:32:45 And then test-time compute with chain of thought, the role of that, and this is fascinating.
3:32:48 And these are, I think it’s fair to say, trillion-dollar questions.
3:32:49 You know, these are big.
3:32:51 Like, look, you know, it’s like, here’s a trillion-dollar question, which is kind of
3:32:54 embedded in that, which is just hallucinations, right?
3:32:58 Like, so if you are trying to use these tools creatively, you’re thrilled because they can
3:33:02 draw new images and they can make new music and they can do all this incredible stuff,
3:33:03 right?
3:33:04 They’re creative.
3:33:07 The flip side of that is if you need them to be correct, they can’t be creative.
3:33:11 That’s, you know, where the term hallucination comes from, and these things do hallucinate.
3:33:16 And you know, there have been, you know, court cases already where lawyers have submitted
3:33:20 legal briefs that contained made-up case citations, and the judge is like, wait a
3:33:21 minute, this doesn’t exist.
3:33:24 And the very next question is, did you write this yourself?
3:33:26 And the lawyer goes, “Uh…”
3:33:30 I mean, that’s what you, along with Grok, are looking for: truth.
3:33:33 I mean, that’s an open, technical question.
3:33:35 How close can you get to truth with LLMs?
3:33:36 Yeah, that’s right.
3:33:43 And my sense, and this is a very contentious topic in the industry, my sense is, to the extent
3:33:47 that there is a domain in which there is a definitive and checkable and provable answer,
3:33:51 and you might say math satisfies that, coding satisfies that, and maybe some other fields,
3:33:54 then you should be able to generate synthetic data.
3:33:55 You should be able to do chain-of-thought reasoning.
3:33:57 You should be able to do reinforcement learning.
3:34:02 And you should be able to ultimately, you know, eliminate hallucinations there, but by the way,
3:34:05 that’s a trillion-dollar question right there as to whether that’s true.
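For the technically curious, here is a minimal Python sketch of the verify-and-filter loop Marc is describing, with a toy arithmetic domain standing in for math or coding. The function names and the simulated error rate are illustrative assumptions for this sketch, not any lab’s actual pipeline:

import random
import re

def generate_problem(rng: random.Random) -> str:
    # Sample a problem whose ground truth we can compute exactly.
    a, b = rng.randint(2, 99), rng.randint(2, 99)
    return f"What is {a} * {b}?"

def ground_truth(problem: str) -> int:
    # The "checkable domain" property: correctness is decidable.
    a, b = map(int, re.findall(r"\d+", problem))
    return a * b

def propose_solution(problem: str, rng: random.Random) -> int:
    # Stand-in for a model's answer; occasionally "hallucinates" a wrong one.
    return ground_truth(problem) + rng.choice([0, 0, 0, 7])

def build_synthetic_dataset(n: int, seed: int = 0) -> list:
    # Keep only proposals the verifier confirms; survivors become training data.
    rng = random.Random(seed)
    dataset = []
    while len(dataset) < n:
        problem = generate_problem(rng)
        answer = propose_solution(problem, rng)
        if answer == ground_truth(problem):
            dataset.append((problem, answer))
    return dataset

if __name__ == "__main__":
    for problem, answer in build_synthetic_dataset(5):
        print(problem, "->", answer)

The open question flagged in the conversation is exactly whether any such verifier exists outside domains like math and code, where there is no ground_truth function to call.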
3:34:08 But then there’s the question: okay, is that going to work in the more general
3:34:09 domain?
3:34:12 Like, so for example, one possibility is these things are going to get truly superhuman at
3:34:17 things like math and coding, but at discussing philosophy, they’re basically as
3:34:19 smart as they’re ever going to be.
3:34:23 And they’re going to be kind of, you know, say midwit grad student level.
3:34:26 And the theory there would just be that they’re already out of training data.
3:34:30 Like, literally, you talk to these people, and the big models
3:34:33 are like within a factor of 2x of consuming all the human-generated
3:34:36 training data, to the point that some of these big companies are literally hiring people
3:34:39 like doctors and lawyers to sit and write new training data by hand.
3:34:42 And so does this mean that like you have to, if you want your model to be better at philosophy,
3:34:45 you have to go hire like a thousand philosophers and have them write new content?
3:34:47 And is anybody going to do that?
3:34:50 And so, you know, maybe, maybe these things are topping out in certain ways and they’re
3:34:52 going to leap way ahead in other ways.
3:34:57 And so anyway, we just don’t know. I guess this is maybe my main conclusion:
3:35:02 anybody telling you big sweeping conclusions,
3:35:05 you know, all of this abstract, generalized superintelligence
3:35:09 AGI stuff, like, maybe it’s the engineer in me, but no, that’s
3:35:16 too abstract, like, it’s got to actually work.
3:35:18 And then, by the way, it has to actually pay for itself.
3:35:22 I mean, this is a problem right now with the, you know, the big models, the big models that
3:35:25 are like really good at coding and math, they’re actually very expensive to run.
3:35:28 You know, they’re quite slow.
3:35:33 Another trillion dollar question, future chips, which I know you’ve talked a lot about.
3:35:37 Another trillion-dollar question, yeah, I mean, all the global issues. Oh, another trillion-
3:35:43 dollar question: censorship, right, and all the, as they say, all
3:35:48 the human feedback training process.
3:35:49 Exactly what are you training these things to do?
3:35:51 What are they allowed to talk about?
3:35:55 How often do they give you these incredibly preachy moral
3:35:56 lectures?
3:35:59 Here’s a good trillion-dollar question.
3:36:05 How many other countries want to run their education system, healthcare system,
3:36:08 news system, political system on the basis of an AI that’s been trained according to
3:36:13 the most extreme left-wing California politics, right? Because that’s kind of what’s
3:36:15 on offer right now.
3:36:17 And I think the answer to that is not very many.
3:36:22 So there’s like massive open questions there about, you know, by the way,
3:36:25 what morality these things are going to get trained on.
3:36:32 And that one we’re cracking wide open with what’s been happening over the past few months:
3:36:38 censorship at every level of these companies, and just the very idea of what truth means, and
3:36:45 what it means to expand the Overton window of LLMs, or the Overton window of human discourse.
3:36:47 So what I experienced, you know, going back to how we started, what I experienced
3:36:53 was, all right, a social media censorship regime from hell, debanking, right, at large
3:36:58 scale, and then the war on the crypto industry, trying to kill it, and then a basically declared
3:37:03 intent to do the same thing to AI and to put AI under the same kind of censorship and control
3:37:06 regime as social media and the banks.
3:37:11 And I think this election, in America, tips us from a timeline
3:37:15 in which things were going to get really bad on that front to a timeline in which I think
3:37:17 things are going to be quite good.
3:37:21 But look, those same questions also apply outside the US and, you know, the EU is doing
3:37:25 their thing, they’re being extremely draconian, and they’re trying to lock in a political
3:37:27 censorship regime on AI right now.
3:37:29 That’s so harsh that even American AI companies are not even willing to launch new products
3:37:31 in the EU right now.
3:37:35 Like, that’s not going to last, but like what happens there, right?
3:37:38 And what are the tradeoffs, you know, what levels of censorship are American companies
3:37:42 going to have to sign up for if they want to operate in the EU? Or is the EU still capable
3:37:50 of generating its own AI companies, or have we brain-drained them so that they can’t?
3:37:52 So big questions.
3:37:53 Quick questions.
3:38:03 So you’re very active on X, a very unique character, flamboyant, exciting, bold.
3:38:05 You post a lot.
3:38:10 I think there’s a meme, I don’t remember it exactly, but Elon posted something like inside
3:38:12 Elon there are two wolves.
3:38:16 One is please be kind or more positive.
3:38:22 And the other one is, I think, you know, doing the, take a big step back and fuck yourself
3:38:24 in the face guy.
3:38:28 How many wolves are inside your mind when you’re tweeting?
3:38:30 To be clear, a reference from the comedy classic, “Tropic Thunder.”
3:38:31 “Tropic Thunder.”
3:38:32 Yeah.
3:38:33 Legendary movie.
3:38:34 Yes.
3:38:39 Any zoomers listening to this who haven’t seen that movie, go watch it immediately.
3:38:40 Yeah.
3:38:41 There’s nothing offensive about it.
3:38:50 It’s Tom Cruise’s greatest performance.
3:38:55 So yeah, no, look, let me just start by saying, like, I’m not supposed to be tweeting at all.
3:38:56 So yeah.
3:38:57 Yes.
3:38:58 Yes.
3:38:59 Yes.
3:39:00 But you know.
3:39:01 So how do you approach that?
3:39:02 Like, how do you approach what to tweet?
3:39:03 I mean, I don’t.
3:39:08 Like, I don’t, I don’t do it well enough.
3:39:10 It’s mostly an exercise in frustration.
3:39:13 Look, there’s a glory to it and there’s an issue with it. And the glory of
3:39:18 it is, like, you know, instantaneous global communication; X in particular
3:39:21 is like the, you know, the town square on all these, you know, social issues, political
3:39:24 issues, everything else, current events.
3:39:26 But I mean, look, there’s no question, the format, the format of at least the original
3:39:29 tweet is, you know, prone to be inflammatory.
3:39:34 You know, I’m the guy who, at one point, the entire nation of India hated because I
3:39:38 once tweeted something that turned out to be politically sensitive across the entire
3:39:39 continent.
3:39:43 I stayed up all night that night as I became the front-page headline and the leading television
3:39:46 news story in each time zone in India, for a single tweet.
3:39:50 So like the single tweet out of context is a very dangerous thing.
3:39:55 Obviously, X now has the middle ground where, you know, they now have the longer-form
3:39:56 essays.
3:40:01 And so, you know, probably the most productive thing I can do is longer-form
3:40:02 things.
3:40:05 You’re not going to do it though, are you?
3:40:06 I do, I do from time to time.
3:40:07 Sometimes.
3:40:08 I should, I should do more of them.
3:40:11 And then, yeah, I mean, look, obviously X is doing great.
3:40:14 And then, like I said, Substack, you know, has become the center for a lot of,
3:40:15 you know, a lot of it.
3:40:19 I think the best kind of, you know, deeply thought-through, you know, certainly intellectual
3:40:23 content, and, you know, tons of current-events stuff there as well.
3:40:26 And then, yeah, so, and then there’s a bunch of other, you know, a bunch of new systems
3:40:27 that are very exciting.
3:40:30 So I think one of the things we can look forward to in the next four years is, number one, just
3:40:34 like a massive reinvigoration of social media as a consequence of the changes that are happening
3:40:35 right now.
3:40:37 And I’m very excited to see what’s going to happen with that.
3:40:42 And then, I mean, it’s happening on X, but it’s now going to happen on other platforms.
3:40:42 And then the other is, you know, crypto is going to come right back
3:40:48 to life.
3:40:49 And actually, that’s very exciting.
3:40:54 Actually, that’s worth noting is that’s another trillion dollar question on AI, which is in
3:40:58 a world of pervasive AI, and especially in a world of AI agents, imagine a world of billions
3:41:03 or trillions of AI agents running around, they need an economy.
3:41:07 And crypto, in our view, happens to be the ideal economic system for that, right?
3:41:08 Because it’s programmable money.
3:41:10 It’s a very easy way to plug in and do that.
3:41:13 And there’s this transaction processing system that can do that.
3:41:16 And so I think the crypto-AI intersection, you know, is potentially a very, very
3:41:17 big deal.
3:41:22 And so that was going to be impossible under the prior regime.
3:41:25 And I think under the new regime, hopefully, it’ll be something we can do.
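As a toy illustration of what “programmable money” between AI agents could mean, here is a minimal Python sketch: an in-memory ledger where agents hold balances and pay each other for completed tasks. The Ledger and Agent classes are invented for this sketch; in a real agent economy these would be signed, on-chain transactions rather than dictionary updates:

from dataclasses import dataclass, field

class InsufficientFunds(Exception):
    """Raised when a sender tries to spend more than it holds."""

@dataclass
class Ledger:
    # account name -> balance, plus an append-only transaction log
    balances: dict = field(default_factory=dict)
    history: list = field(default_factory=list)

    def mint(self, account: str, amount: int) -> None:
        # Seed an account with funds (stand-in for an on/off ramp).
        self.balances[account] = self.balances.get(account, 0) + amount

    def transfer(self, sender: str, receiver: str, amount: int, memo: str) -> None:
        # Balance update; a real chain would verify a signature here.
        if self.balances.get(sender, 0) < amount:
            raise InsufficientFunds(f"{sender} cannot pay {amount}")
        self.balances[sender] -= amount
        self.balances[receiver] = self.balances.get(receiver, 0) + amount
        self.history.append((sender, receiver, amount, memo))

@dataclass
class Agent:
    name: str
    ledger: Ledger

    def pay_for_task(self, provider: "Agent", price: int, task: str) -> None:
        # Agent-to-agent settlement: one ledger entry per completed task.
        self.ledger.transfer(self.name, provider.name, price, memo=task)

if __name__ == "__main__":
    ledger = Ledger()
    ledger.mint("research-agent", 100)
    buyer = Agent("research-agent", ledger)
    seller = Agent("summarizer-agent", ledger)
    buyer.pay_for_task(seller, price=5, task="summarize 10 papers")
    print(ledger.balances)  # {'research-agent': 95, 'summarizer-agent': 5}
    print(ledger.history)   # [('research-agent', 'summarizer-agent', 5, 'summarize 10 papers')]

The point of the sketch is only the plumbing: any agent can pay any other agent programmatically, without a human opening an account for each one.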
3:41:30 Almost for fun, let me ask about a friend of yours, Yann LeCun. What are your top 10 favorite things
3:41:33 about Yann LeCun?
3:41:37 He’s a, I think he’s a, he’s a brilliant guy.
3:41:38 I think he’s important to the world.
3:41:41 I think you guys disagree on a lot of things.
3:41:44 But I personally like vigorous disagreement.
3:41:48 I, as a person in the stands, like to watch the gladiators go at it.
3:41:50 No, he’s a super genius.
3:41:53 I mean, look, I wouldn’t say we’re super close, but, you know, casual, casual friends.
3:41:56 I worked with him at Meta, you know, he was the chief scientist at Meta for a long time
3:42:02 and he still, you know, works with us, and, you know, he’s obviously a legendary
3:42:06 figure in the field and one of the main people responsible for what’s happening.
3:42:10 My serious observation would be, and it’s the thing I’ve talked to him
3:42:13 about for a long time, and I keep trying to read and follow everything he does:
3:42:19 he is, I think, see if you agree with this, the smartest and most credible
3:42:23 critic of LLMs as the path for AI.
3:42:26 And he’s not, you know, like certain, I would say, troll-like characters who are
3:42:30 just like crapping on everything; he has, like, very deeply thought-through
3:42:35 theories as to why LLMs are an evolutionary dead end.
3:42:40 And I actually, like, I try to do this thing where I try to have
3:42:43 a mental model of the two different sides of a serious argument.
3:42:46 So I’ve tried to internalize his argument as much as I can, which is difficult
3:42:49 because, like, we’re investing behind LLMs as aggressively as we can.
3:42:54 So if he’s right, like, that could be a big problem, but we should also know that.
3:42:59 And then I sort of use his ideas to challenge all the bullish people, you know, to really
3:43:01 kind of test their level of knowledge.
3:43:06 So I like to kind of grill people. Like, I’m not, you
3:43:09 know, I got my CS degree 35 years ago.
3:43:12 So I’m not like deep in the technology, but to the extent I can understand
3:43:16 Yann’s points, I can use them to, you know, really surface a lot of the questions for
3:43:18 the people who are more bullish.
3:43:20 And that’s been, I think, very productive.
3:43:21 Yeah.
3:43:24 So, yeah, it’s just very striking that you have somebody who is that central
3:43:28 in the space who is actually a full-on skeptic.
3:43:31 And you know, and again, you could, this could go different ways.
3:43:33 He could end up being very wrong.
3:43:37 He could end up being totally right, or it could be that he will provoke the evolution
3:43:39 of these systems to be much better than they would have been.
3:43:40 Yeah.
3:43:41 He could be both right and wrong.
3:43:44 And first of all, I do, I do agree with that.
3:43:51 He’s one of the most legit and rigorous and deep critics of the LLM path to AGI, you know.
3:43:56 His basic notion is that AI needs to have some understanding of the
3:43:57 physical world.
3:44:01 And that’s very difficult to achieve with LLMs.
3:44:05 And that is a really good way to challenge the limitations of LLMs and so on.
3:44:11 He’s also been a vocal and huge proponent of open source, which is a whole other topic,
3:44:12 and which you have been as well.
3:44:13 Which is very useful.
3:44:14 Yeah.
3:44:15 And that’s been just fascinating to watch.
3:44:16 And anti-doomer.
3:44:17 Anti-doomer.
3:44:18 Yeah.
3:44:19 Yeah.
3:44:20 He’s, he’s very anti-doomer.
3:44:21 He embodies it.
3:44:22 He also has many wolves.
3:44:23 He does.
3:44:24 He does.
3:44:27 So it’s been really, really fun to watch.
3:44:28 The other two.
3:44:29 Okay.
3:44:30 Here’s my other wolf coming out.
3:44:31 Yeah.
3:44:36 The other two of the three godfathers of AI are like radicals, like full-on left,
3:44:40 you know, far left, like, I would say, either Marxists or borderline
3:44:41 Marxists.
3:44:44 And they’re, I think, quite extreme in their social and political views.
3:44:47 And I think that feeds into their doomerism.
3:44:50 And I think, you know, they are lobbying for draconian,
3:44:54 what I think would be ruinously destructive, government legislation and regulation.
3:44:58 And so it’s actually super, super helpful to have Yann as a counterpoint
3:44:59 to those two.
3:45:00 Another fun question.
3:45:02 Our mutual friend, Andrew Huberman.
3:45:03 Yes.
3:45:08 First, maybe, what do you love most about Andrew, and second, what score, on a scale of one to
3:45:09 ten,
3:45:11 do you think he would give you on your approach to health?
3:45:12 Oh, three.
3:45:13 Physically, a three.
3:45:15 You think you score that high, huh?
3:45:16 Okay.
3:45:17 That’s good.
3:45:18 Exactly.
3:45:23 Well, so he did, he convinced me to stop drinking alcohol, which was a big deal.
3:45:24 Successfully.
3:45:27 Well, it was like my, other than my family, it was my favorite thing in the world.
3:45:29 And so it was a major, major reduction.
3:45:32 Like having like a glass of scotch at night was like a major, like it was like the thing
3:45:33 I would do to relax.
3:45:38 So he has profoundly negatively impacted my emotional health.
3:45:43 I blame him for making me much less happy as a person, but much, much, much healthier.
3:45:44 Physically healthier.
3:45:46 So that, that I credit him with that.
3:45:48 I’m glad I did that.
3:45:50 But then his sleep stuff, like, yeah, I’m not doing any of that.
3:45:51 Yeah.
3:45:52 I have no interest in his sleep shit.
3:45:54 Like, no.
3:45:57 This whole light, natural light, no, we’re not doing that.
3:45:58 Too hardcore for this.
3:46:01 I don’t see any natural light in here.
3:46:02 It’s all covered.
3:46:03 It’s all horrible.
3:46:04 And I’m very happy.
3:46:09 I would be very happy living and working here because I’m totally happy without natural
3:46:10 light.
3:46:11 In darkness.
3:46:12 It must be a metaphor for something.
3:46:13 Yes.
3:46:14 It’s a test.
3:46:16 Look, it’s a test of manhood as to whether you can have a blue screen in your face for
3:46:17 three hours and then go right to sleep.
3:46:22 Like, I don’t understand why you’d want to take shortcuts.
3:46:25 I now understand what they mean by toxic masculinity.
3:46:29 All right.
3:46:37 So let’s see, you’re exceptionally successful by most measures, but what to you is the definition
3:46:39 of success?
3:46:43 I would probably say it is a combination of two things.
3:46:48 I think it is contribution.
3:46:56 So have you done something that mattered ultimately and specifically a matter to people?
3:47:01 And then the other thing is, I think happiness is either overrated or almost a complete myth.
3:47:05 And in fact, interestingly, Thomas Jefferson did not mean happiness the way that we understand
3:47:08 it when he wrote “pursuit of happiness” in the Declaration of Independence.
3:47:16 He meant it more in the Greek sense, which is closer to satisfaction or fulfillment.
3:47:23 So I think about happiness as the first ice cream cone makes you super happy, the first
3:47:27 mile of the walk in the park during sunset makes you super happy.
3:47:33 The first kiss makes you super happy, the thousandth ice cream cone, not so much.
3:47:38 The thousandth mile of the walk through the park, the thousandth kiss can still be good,
3:47:42 but maybe just not right in a row.
3:47:46 And so happiness is this very fleeting concept and the people who anchor on happiness seem
3:47:48 to go off the rails pretty often.
3:47:54 It’s sort of the deep sense of having been, I don’t know how to put it, useful.
3:48:00 So that’s a good place to arrive at in life.
3:48:01 Yeah, I think so.
3:48:02 Yeah.
3:48:03 I mean, can you sit?
3:48:04 Yeah.
3:48:07 Who was it who said the source of all the ills in the world was man’s inability to sit in
3:48:11 a room by himself doing nothing?
3:48:14 But if you’re sitting in a room by yourself at four in the
3:48:18 morning and it’s like, “All right, have I lived up to my expectation of myself?”
3:48:24 Like, if you have, the people I know who feel that way are pretty centered and generally
3:48:33 seem very, I don’t know how to put it, pleased, proud, calm, at peace.
3:48:40 The people who are sensation seekers, by the way, there are certain entrepreneurs, for example,
3:48:45 who are like into every form of extreme sport, and they get huge satisfaction out of that.
3:48:48 So there’s sensation seeking in sort of useful and productive ways.
3:48:52 Larry Ellison was always like that; Zuckerberg is like that.
3:49:00 And then there’s a lot of entrepreneurs who end up in, you know, drugs, or sexual escapades, things that
3:49:02 seem like they’ll be fun at first and then backfire.
3:49:07 Yeah, but at the end of the day, if you’re able to be at peace by yourself in a room
3:49:08 at 4 a.m.
3:49:09 Yeah.
3:49:15 I would even say happy, but I know, I understand Thomas Jefferson didn’t mean it the way maybe
3:49:20 I mean it, but I can be happy by myself at 4 a.m. with a blue screen.
3:49:21 That’s good.
3:49:22 Exactly.
3:49:23 Staring at cursor.
3:49:24 Exactly.
3:49:31 As a small tangent, a quick shout-out to an amazing interview you did with Bari Weiss,
3:49:34 and just to her in general, Bari Weiss of The Free Press.
3:49:37 She has a podcast called “Honestly with Bari Weiss.”
3:49:38 She’s great.
3:49:39 People should go listen.
3:49:45 You were asked if you believe in God.
3:49:49 One of the joys, see we talked about happiness, one of the things that makes me happy is making
3:49:50 you uncomfortable.
3:49:51 Thank you.
3:49:55 So this question is designed for that, as many of the questions today are.
3:50:01 You were asked if you believe in God and you said after a pause, you’re not sure.
3:50:09 So it felt like in the pause, in the uncertainty, there was some kind of ongoing search for wisdom
3:50:11 and meaning.
3:50:14 Are you in fact searching for wisdom and meaning?
3:50:15 I guess I’d put it this way.
3:50:21 There’s a lot to just understand about people, and I feel like I’m only starting to
3:50:29 understand that, and that’s certainly a simpler concept than God.
3:50:33 So that’s what I’ve spent a lot of the last 15 years trying to figure out.
3:50:37 I feel like I spent my first like whatever 30 years figuring out machines and then now
3:50:41 I’m spending 30 years figuring out people, which turns out to be quite a bit more complicated.
3:50:47 And then I don’t know, maybe God’s the last 30 years or something.
3:50:52 And then look, I mean, just like Elon says, the known universe is very complicated
3:50:53 and mystifying.
3:50:58 I mean, every time I pull up anything on astronomy, I get super into astronomy, and it’s like, “Daddy,
3:51:03 how many galaxies are there in the universe?”
3:51:04 A hundred billion.
3:51:05 Okay.
3:51:06 Like, how?
3:51:07 Yeah.
3:51:08 Yeah.
3:51:11 Like, how is that freaking possible?
3:51:16 Like what, like it’s just, it’s such a staggering concept that I-
3:51:21 I actually wanted to show you a tweet that blew my mind from Elon from a while back.
3:51:25 Elon said, as a friend called it, this is the ultimate skill tree.
3:51:31 This is a wall of galaxies, a billion light years across.
3:51:32 So these are all galaxies.
3:51:33 Yeah.
3:51:36 Like what the, like how, how is it that big?
3:51:37 Like how the hell?
3:51:40 I’m like, you know, I can read the textbook on this and that, and the whatever, eight
3:51:42 billion years, and the big bang and the whole thing.
3:51:44 And then it’s just like, all right, wow.
3:51:48 And then it’s like, all right, the big bang, all right, like what was, what was before the
3:51:49 big bang?
3:51:56 Do you think we’ll ever, we humans will ever colonize like a galaxy and maybe even go beyond?
3:51:57 Sure.
3:51:58 I mean, yeah.
3:51:59 I mean, in the fullness of time.
3:52:00 Yeah.
3:52:01 So you have that kind of optimism.
3:52:02 You have that kind of hope that extends across thousands of years.
3:52:03 In the fullness of time.
3:52:04 I mean, yeah.
3:52:06 I mean, yeah, you know, all the problems, all the challenges with it notwithstanding, but
3:52:07 like, yeah, why not?
3:52:10 I mean, again, in the fullness of time, it’ll, it’ll take a long time.
3:52:12 You don’t think we’ll destroy ourselves?
3:52:13 No.
3:52:14 I doubt it.
3:52:15 I doubt it.
3:52:18 And, you know, fortunately we have Elon giving us, giving us the backup plan.
3:52:19 So I don’t know.
3:52:21 Like I grew up, you know, real Midwest sort of just like conventionally kind of Protestant
3:52:22 Christian.
3:52:25 It never made that much sense to me.
3:52:26 Got trained as an engineer and a scientist.
3:52:27 I’m like, oh, that definitely doesn’t make sense.
3:52:31 I’m like, I know, I’ll spend my life as an empirical, you know, rationalist and I’ll figure
3:52:32 everything out.
3:52:37 You know, and then again, you bump up against
3:52:40 these things and you’re just like, all right, okay, I guess there’s a scientific
3:52:44 explanation for this, but like, wow.
3:52:46 And then there’s like, all right, where did that come from?
3:52:47 Right.
3:52:50 And then how far back can you go on the causality chain?
3:52:51 Yeah.
3:52:54 And then, yeah, I mean, even just, you know, the experiences that we all have on
3:52:56 earth, it’s hard to rationally explain it all.
3:53:01 And then, you know, so yeah, I guess I just say I’m kind of radically open-minded at peace
3:53:04 with the fact that I’ll probably never know.
3:53:07 The other thing that has happened, and maybe the more practical answer to the question,
3:53:12 is I think I have a much better understanding now of the role that religion plays in society
3:53:14 than I had when I was younger.
3:53:18 And my partner Ben has a great line; I think he quotes his father on this.
3:53:22 He’s like, if man does not have a real religion, he makes up a fake one.
3:53:25 And the fake ones go very, very badly.
3:53:30 And so there’s this class, it’s actually really funny, there’s this class of intellectual
3:53:33 that has what appears to be a very patronizing point
3:53:37 of view, which is: yes, I’m an atheist, but it’s very important that the people believe
3:53:40 in something, right?
3:53:43 And Marx had the negative view on that, which is religion is the opiate of the masses,
3:53:46 but there’s a lot of right-wing intellectuals who are themselves, I think, pretty atheist
3:53:49 or agnostic who are like, it’s deeply important that the people be Christian or something
3:53:50 like that.
3:53:53 And on the one hand, it’s like, wow, that’s arrogant and presumptive.
3:53:58 But on the other hand, you know, maybe it’s right because, you know, what have we learned
3:54:02 in the last hundred years is in the absence of a real religion, people will make up fake
3:54:03 ones.
3:54:07 There’s this writer, this political philosopher who’s super interesting on this,
3:54:08 named Eric Voegelin.
3:54:12 And he wrote in that sort of mid and late part
3:54:13 of the 20th century.
3:54:17 He was born, I think, around 1900 and died in, like, ’85.
3:54:23 So he saw the complete run of communism and Nazism and himself, you know, fled, I think
3:54:26 he fled Europe and, you know, the whole thing.
3:54:30 And, you know, his sort of big conclusion was basically that communism and Nazism
3:54:36 and fascism were basically religions, but in the deep way of religions;
3:54:39 you know, we call them political religions, but they were like actual religions.
3:54:43 And, you know, they were what Nietzsche forecasted when he said, you know,
3:54:47 God is dead, we’ve killed him, and we won’t wash the blood off our hands for a thousand
3:54:48 years, right?
3:54:53 That is: we will come up with new religions that will just cause mass murder and death.
3:54:57 And like, you read his stuff now and you’re like, yep, that happened, right.
3:55:00 And then of course, as fully, you know, elite moderns, of course, we couldn’t possibly
3:55:02 be doing that ourselves right now.
3:55:04 But of course we are.
3:55:08 And you know, I would argue, and Eric Voegelin for sure would argue, that the last 10 years,
3:55:11 you know, we have been in a religious frenzy, that woke has been a full-
3:55:16 scale religious frenzy and has had all of the characteristics of a religion, including
3:55:21 everything from patron saints to holy texts to, you know, sin.
3:55:26 Wokeness has had, I think,
3:55:31 every single aspect of an actual religion other than redemption, right, which is maybe
3:55:34 like the most dangerous religion you could ever come up with: the one where there’s
3:55:35 no forgiveness.
3:55:36 Right.
3:55:39 And so I think if Voegelin were alive, I think he would have zeroed right in on that and would
3:55:40 have said so.
3:55:43 And, you know, we just like sailed right off.
3:55:46 I mentioned earlier, like, we somehow rediscovered the religions of the Indo-Europeans,
3:55:49 who were all into identity politics and environmentalism.
3:55:52 Like, I don’t think that’s an accident.
3:55:58 So anyway, like, there is something very deep going on in the human psyche around
3:56:07 religion that is not dismissible and needs to be taken seriously, even if one struggles
3:56:10 with the specifics of it.
3:56:15 I think I speak for a lot of people when I say it has been a real joy, and for me an honor, to get
3:56:21 to watch you seek to understand the human psyche, as you described, in that 30-year
3:56:24 part of your life.
3:56:26 And it’s been an honor to talk with you today.
3:56:27 Thank you, Marc.
3:56:28 Thank you, Lex.
3:56:29 Is that it?
3:56:31 That’s only, only how long is that?
3:56:36 Four hours with Marc Andreessen is like 40 hours of actual content.
3:56:41 So I’ll accept this being one of the short ones.
3:56:47 For the listener: Marc looks like he’s ready to go for 20 more hours, and I need a nap.
3:56:48 Thank you, Marc.
3:56:49 Thank you, Lex.
3:56:52 Thanks for listening to this conversation with Marc Andreessen.
3:56:57 To support this podcast, please check out our sponsors in the description.
3:57:02 And now let me leave you with some words from Thomas Sowell.
3:57:09 It takes considerable knowledge just to realize the extent of your own ignorance.
3:57:21 Thank you for listening and hope to see you next time.
Marc Andreessen is an entrepreneur, investor, co-creator of Mosaic, co-founder of Netscape, and co-founder of the venture capital firm Andreessen Horowitz.
Thank you for listening ❤ Check out our sponsors: https://lexfridman.com/sponsors/ep458-sc
See below for timestamps, transcript, and to give feedback, submit questions, contact Lex, etc.
Transcript:
https://lexfridman.com/marc-andreessen-2-transcript
CONTACT LEX:
Feedback – give feedback to Lex: https://lexfridman.com/survey
AMA – submit questions, videos or call-in: https://lexfridman.com/ama
Hiring – join our team: https://lexfridman.com/hiring
Other – other ways to get in touch: https://lexfridman.com/contact
EPISODE LINKS:
Marc’s X: https://x.com/pmarca
Marc’s Substack: https://pmarca.substack.com
Marc’s YouTube: https://www.youtube.com/@a16z
Andreessen Horowitz: https://a16z.com
SPONSORS:
To support this podcast, check out our sponsors & get discounts:
Encord: AI tooling for annotation & data management.
Go to https://encord.com/lex
GitHub: Developer platform and AI code editor.
Go to https://gh.io/copilot
Notion: Note-taking and team collaboration.
Go to https://notion.com/lex
Shopify: Sell stuff online.
Go to https://shopify.com/lex
LMNT: Zero-sugar electrolyte drink mix.
Go to https://drinkLMNT.com/lex
OUTLINE:
(00:00) – Introduction
(12:46) – Best possible future
(22:09) – History of Western Civilization
(31:28) – Trump in 2025
(39:09) – TDS in tech
(51:56) – Preference falsification
(1:07:52) – Self-censorship
(1:22:55) – Censorship
(1:31:34) – Jon Stewart
(1:34:20) – Mark Zuckerberg on Joe Rogan
(1:43:09) – Government pressure
(1:53:57) – Nature of power
(2:06:45) – Journalism
(2:12:20) – Bill Ackman
(2:17:17) – Trump administration
(2:24:56) – DOGE
(2:38:48) – H1B and immigration
(3:16:42) – Little tech
(3:29:02) – AI race
(3:37:52) – X
(3:41:24) – Yann LeCun
(3:44:59) – Andrew Huberman
(3:46:30) – Success
(3:49:26) – God and humanity
PODCAST LINKS:
– Podcast Website: https://lexfridman.com/podcast
– Apple Podcasts: https://apple.co/2lwqZIr
– Spotify: https://spoti.fi/2nEwCF8
– RSS: https://lexfridman.com/feed/podcast/
– Podcast Playlist: https://www.youtube.com/playlist?list=PLrAXtmErZgOdP_8GztsuKi9nrraNbKKp4
– Clips Channel: https://www.youtube.com/lexclips