AI transcript
0:00:07 We want to build Googles and Facebooks and AI, giant companies, giant cryptocurrencies, and now internet communities.
0:00:11 What they want to do is they want to exert authority over others.
0:00:16 What is the reason for the hostility between media and tech?
0:00:20 So for them, the best thing they can do is to put a man out of work.
0:00:22 And for us, the best thing we can do is we can put a man on the moon.
0:00:35 Today on the podcast, I’m joined by Balaji Srinivasan, entrepreneur, investor, and author of The Network State, to talk about one of our longest-running shared topics, the conflict between tech and legacy media.
0:00:45 We get into the rise of GoDirect, the financial collapse of journalism, the media’s political capture, and how crypto and AI might offer a path to rebuild trust, on-chain, and on our own terms.
0:00:50 Balaji calls for a total rethink of what truth infrastructure looks like and why tech needs to play offense.
0:00:52 Let’s get into it.
0:01:09 As a reminder, the content here is for informational purposes only, should not be taken as legal, business, tax, or investment advice, or be used to evaluate any investment or security, and is not directed at any investors or potential investors in any A16Z fund.
0:01:14 Please note that A16Z and its affiliates may also maintain investments in the companies discussed in this podcast.
0:01:22 For more details, including a link to our investments, please see a16z.com/disclosures.
0:01:29 Balaji, another day, another journo hit piece.
0:01:46 We were sort of talking offline about Mark and Ben’s Evolution of the Media episode, and let me just more broadly reflect, you and I have been friends and collaborators for the last 10 years, and one of the topics we’ve spent a lot of time talking about is the media, sort of the state of the media, what needs to be fixed about the media, and how to do that.
0:01:51 And we, with your leadership, have actually been a part of that trend and that evolution.
0:01:58 And so I wanted to take the opportunity to talk to you, kind of reflect about that evolution, and talk about where we still need to go.
0:01:59 Yeah.
0:01:59 Well, okay.
0:02:02 So there’s so much I can say on this.
0:02:07 I’m going to show one graph that, of course.
0:02:10 It wouldn’t be a Balaji podcast if we didn’t start it with a graph.
0:02:12 Yes, exactly.
0:02:17 This shows that, essentially, newspaper revenue rose to, like, $70 billion in the year 2000.
0:02:23 And then, right after the financial crisis, it just suddenly collapsed over the course of, like, four or five years.
0:02:26 And Google went vertical and Facebook went vertical, right?
0:02:32 And the thing about this is, this was the internet disrupting blue America, okay?
0:02:38 There’s a similar graph for manufacturing that shows China disrupting red America, almost at exactly the same time, right?
0:02:47 So just to focus on this one, though, for a second, once you see the internet disrupting blue America, because media is, like, a core thing for them, this is actually what led to wokeness.
0:02:50 Because, you know, you’ve heard the saying, go woke, go broke, right?
0:02:50 Yeah.
0:02:52 But in their case, it was actually go broke, go woke.
0:02:54 Okay?
0:02:57 Not my original coinage, but applied to this graph, it’s relatively original.
0:03:02 Because wokeness was, what happened was, they just fell off a cliff like this.
0:03:06 And from 2008 to 2012, tech was just part of the Democrat Party.
0:03:08 It was like, you know, Steve Jobs is there.
0:03:14 And there’s actually this article from 2012 in The Atlantic, “When the Nerds Go Marching In,” about tech helping to reelect Obama.
0:03:15 And Facebook was helping Obama.
0:03:17 Yeah, exactly.
0:03:21 All that stuff was basically solidly on the Democrat side up until 2012.
0:03:26 After the 2012 election, right after Obama’s inauguration, you can date it to right after that.
0:03:31 Because even in 2012, like, New York media and so on was saying there’s no such thing as a brogrammer.
0:03:32 Okay?
0:03:33 You can Google that article.
0:03:38 So 2012, tech was part of the coalition, so there’s no reason to attack them.
0:03:46 After the inauguration 2013, and it’s literally that spring and summer, the knives came out, and media started attacking tech.
0:03:47 Okay?
0:03:54 And there are these articles, you know, “would you just look at all these rich people.” There was actually one in Slate, and that was before they got radicalized.
0:03:59 And they’re saying, oh, it’s actually bad to attack people just for the sake of being rich.
0:04:03 It was before all media had updated to actually tech is our enemy now, right?
0:04:04 Yeah.
0:04:11 But unless you understand the economics of it, I don’t think one can understand why the journos suddenly went crazy.
0:04:14 Now, the thing is, we’re now in 2025, right?
0:04:19 It is now 17 years after the collapse in media revenue, right?
0:04:27 So somebody who was born then is 18 years old now. If you play any, like, multiplayer video games, like Quake or whatever, you know, all the new stuff, MOBAs, right?
0:04:31 You can get spawned into the middle of something where everybody’s shooting at each other, right?
0:04:36 That’s what, like, the Gen Z kid is today.
0:04:37 Okay.
0:04:44 So from the perspective of someone who’s 18 years old, the war with the journos has basically been a feature of their entire existence, okay?
0:04:59 But it actually wasn’t like that because in the 90s and the 2000s, the journos were secure enough in their economic position because you could write, like, four or six articles for Time Magazine a year and get paid a nice salary and travel around the world.
0:05:00 Did they kill?
0:05:01 Yes, they killed.
0:05:04 But they didn’t feel the need to kill all the time.
0:05:12 It’s funny because of the way I’m putting it, Nellie Bowles actually did a thing for Bari Weiss a few years ago, and it’s like learning how not to kill, okay?
0:05:14 Well, it’s funny because she’s one of the few converts.
0:05:16 She made the transition.
0:05:17 She was able to get to the other side.
0:05:27 When you look at that, or you look at, there’s another one by Hamilton Nolan at CJR, okay, which is basically like the powerful don’t need the media.
0:05:31 Journalism, particularly at its highest level, is about raw power.
0:05:34 See, they admit it, right?
0:05:35 Go ahead.
0:05:36 I remember you had this old quote.
0:05:40 Was it some journalist who was like, we know our profession is kind of, like, immoral?
0:05:41 That’s an old quote.
0:05:41 Oh, yeah, yeah, yeah.
0:05:42 That’s actually a great one also.
0:05:43 Ready?
0:05:44 That is the journalist.
0:05:46 All right, this is a book report.
0:05:52 Anybody who has read this book cannot look at the journos the same way.
0:06:01 So Janet Malcolm talks about this, and her opening line, this is a great book called The Journalist and the Murderer, and her opening line is this famous, famous thing.
0:06:02 Hold on, let me find this.
0:06:08 Every journalist who is not too stupid or too full of himself to notice what is going on knows that what he does is morally indefensible.
0:06:17 He is a kind of confidence man, preying on people’s vanity, ignorance, or loneliness, gaining their trust and betraying them without remorse, like the credulous widow who wakes up one day to find the charming young man and all her savings gone.
0:06:24 So the consenting subject of a piece of nonfiction writing learns, when the article or book appears, his hard lesson.
0:06:27 Journalists justify their treachery in various ways according to their temperaments.
0:06:30 The more pompous talk about freedom of speech, the public’s right to know.
0:06:34 The least talented talk about art, the seemliest murmur about earning a living.
0:06:34 Okay?
0:06:41 And now this is actually a very important thing because this is, by the way, rated one of the top 100 nonfiction books of the 20th century by the Modern Library.
0:06:43 So this is a great book to read.
0:06:46 There’s another book you should also read, The Gray Lady Winked by Ashley Rindsberg.
0:06:47 Okay.
0:06:49 So those prerequisites, let’s talk about the specifics.
0:06:53 First is, there’s like 10, 15 things I can say.
0:06:54 Let me just go point by point.
0:06:58 The first is, what is the reason for the hostility between media and tech, right?
0:07:03 It’s actually the master framework on the whole thing is it’s state versus network, right?
0:07:05 This is basically from my book, The Network State.
0:07:11 I think it’s a useful frame, which is like, for example, Elon versus mainstream media is network versus state, right?
0:07:14 Social media versus mainstream media is network versus state.
0:07:18 Or, you know, what about this whole article that is attacking Luke Farritor?
0:07:22 It’s like, why did this programmer attack the institutions of the U.S. government?
0:07:24 Network versus state, right?
0:07:26 Like this tech programmer attacking the state institutions, right?
0:07:34 And it’s the people on social media, the tech people, who are mad at the fact that this state-aligned institution is attacking our tech people.
0:07:36 Once you apply that framework, that applies to everything.
0:07:39 For example, SpaceX is network, NASA is state.
0:07:41 Uber is a network.
0:07:43 Taxi medallions are the state.
0:07:44 Bitcoin is a network.
0:07:45 The Fed is the state.
0:07:46 And so on and so forth, right?
0:07:51 And those are two different organizing principles for how you think about the world.
0:07:56 Basically, the state is someone should pass a law, and the network is someone should write some code, right?
0:08:03 The state is everybody who is directly or indirectly paid by essentially either the U.S. government or a government more generally.
0:08:07 And the network is all those people who are directly or indirectly monetized and make their living on the network.
0:08:14 So when someone goes from NYT to Substack, they’re moving from state to network, right?
0:08:24 And now the network is actually taking parts of the state where all these tech guys are getting into government, getting into politics, getting into media, getting into finance, getting into the traditional niches that were for the state.
0:08:32 That’s why they’re so mad because their share of the global pie and the local and the American pie is shrinking and the network’s pie is expanding, right?
0:08:34 They’re like, stay in your lane.
0:08:35 Why are they saying that?
0:08:42 They’re like, you should just be like hitting keys on computers and being a nerd and making like LED light bulbs that flash.
0:08:47 You should not be like rewriting the code base of how the world works, right?
0:08:52 And as I’ll get to, there’s a deep question of legitimacy, right?
0:08:53 The network is new money.
0:08:54 The state is old money, right?
0:08:58 And when I say the state, by the way, there’s like the literal state in the sense of the U.S. government.
0:09:03 And there’s the unelected institutions that surround the U.S. government in a ring that give it instruction.
0:09:06 Example, the newspapers tell the state what to do.
0:09:09 The universities tell the state what to do.
0:09:14 The philanthropies tell the state what to do and so on and so forth, right?
0:09:16 But they’re also in turn funded by the state.
0:09:18 So universities obviously directly get federal funding, right?
0:09:24 Some of them are literally public colleges, but they’re very dependent on tax exemptions and so on and so forth that the state grants.
0:09:25 So they only exist because of that.
0:09:27 The philanthropies also ditto.
0:09:36 NGOs, they have compounding foundations or their foundation endowment compounds because they’ve got favorable tax treatment, which normal companies don’t get, but they’re state affiliated.
0:09:38 And finally, the media, that’s the least obvious.
0:09:40 How are they upstream of the state?
0:09:44 Well, obviously, they hold somebody accountable by publishing a negative article on them.
0:09:48 Of course, they never hold themselves accountable because they’re always like, we’re speaking truth to power.
0:09:50 I’m like, obviously, they’re not.
0:09:50 Why?
0:09:54 No journo is so courageous as to attack their own boss whenever the boss errs.
0:09:56 Okay?
0:10:00 Basically, whenever you’re talking to a journo, you’re not talking to the journo, you’re talking to their boss, right?
0:10:07 Ultimately, for example, Bloomberg, when Bloomberg was running for president, actually, Michael Bloomberg is one of the better of them because he’s actually like a tech entrepreneur.
0:10:10 So I’m not like completely anti-Michael Bloomberg.
0:10:20 But Michael Bloomberg, when he was running for president, Bloomberg News, his pit bulls, they actually posted this amazing thing, which said, we will report on but not investigate Michael Bloomberg.
0:10:25 Amazing, amazing phrase, report on what an amazing phrase.
0:10:30 What it means is, basically, if somebody else says something, we will reprint it in Bloomberg.
0:10:33 So you can’t say we didn’t report on him.
0:10:34 Okay?
0:10:36 But we’re not going to go and dig through his trash.
0:10:38 We’re not going to do it adversarially.
0:10:39 We’re not going to stalk him.
0:10:42 We’re not going to spam his family like they did to our boy, Luke Farritor.
0:10:42 Right?
0:10:43 Massacre our boy.
0:10:44 Well, so that’s the thing.
0:10:46 They didn’t massacre him because he’s got our support.
0:10:46 Right?
0:10:46 Yeah.
0:10:48 But they did attack him.
0:10:49 I’m just referencing the meme.
0:10:49 Yes.
0:10:50 I know.
0:10:50 I know.
0:10:50 I know.
0:10:51 That’s right.
0:10:51 Yeah.
0:10:59 But the thing is, they don’t do that because if you were a Bloomberg journo and you went after Michael Bloomberg, that’s what’s known as a CLM, career-limiting move.
0:11:04 In fact, actually, the entire journo establishment, that’s all nepotism.
0:11:05 That’s all old money.
0:11:06 Right?
0:11:16 They project onto us what their lifestyle is, like Sulzberger, who inherited the New York Times, Murdoch, the guy who inherited Fox News now, and the Newhouses.
0:11:19 Who inherited Wired and Condé Nast.
0:11:20 Basically, Condé Nast was the parent company of that.
0:11:22 Basically, they’re all heirs.
0:11:23 Right?
0:11:24 They’re not self-made.
0:11:26 And the journos don’t have equity.
0:11:29 See, old money treats the journalists much worse.
0:11:32 Tech people, we treat our employees so much better because we give them equity.
0:11:33 Right?
0:11:33 They level up.
0:11:39 Journalism is a completely two-tier system where there’s the publishers, the owners of these papers, and there’s these serfs, the journos.
0:11:40 So how do they compensate them?
0:11:48 They compensate them in status where the journo is made to believe that they’re like some independent, like, freewheeling attack dog.
0:11:52 Of course, they can’t actually attack true power who is their bosses.
0:11:52 Right?
0:11:53 They’ll never even mention them.
0:11:55 That’s the thing is, if I say Zuck, right?
0:11:56 If I say Zuckerberg.
0:12:00 Zuckerberg, whether you like him or not, Zuckerberg deserves our respect because he’s the man.
0:12:01 He really is the man in the arena.
0:12:02 He’s taken the hits for 20 years.
0:12:03 Right?
0:12:10 And he has survived so many things, and crucially, he’s CEO, he’s founder, he’s out there, and you can criticize him by name.
0:12:12 And if I say Zuckerberg, everybody can summon a face to the name.
0:12:14 They can summon his whole bio and so on.
0:12:17 If I say Sulzberger, it’s a blank.
0:12:19 99% of people don’t even know the guy exists.
0:12:20 Right?
0:12:22 He’s like, you know The Usual Suspects?
0:12:22 Yeah.
0:12:25 He’s like Keyser Söze, Keyser Sulzberger.
0:12:25 Okay?
0:12:28 Honestly, I’ve heard his name a million times, but I don’t even know what he looks like.
0:12:29 You don’t even know what he looks like.
0:12:37 But this thing is, basically, Zuckerberg is somebody who, again, for better or worse, he runs a major communications channel, and so he’s covered.
0:12:37 Right?
0:12:39 But Sellsberger doesn’t get good coverage.
0:12:42 He gets no coverage.
0:12:42 Yeah.
0:12:43 Right?
0:12:46 That is actually really interesting.
0:12:46 Right?
0:12:48 Who’s holding him accountable?
0:12:48 Right?
0:12:50 Journalism for thee, privacy for me.
0:12:51 Exactly.
0:12:56 The guy who’s surrounded by thousands of journalists at all times is the only person in the world who has any privacy.
0:12:58 Okay?
0:13:00 Yeah, that’s good.
0:13:01 You’ve never seen this guy’s face.
0:13:01 Right?
0:13:02 Who is this guy?
0:13:03 Yes, he owns the New York Times.
0:13:05 And people say, oh, my God, it’s the New York Times.
0:13:05 You’ve never seen his face.
0:13:11 But the point is, if you did word face association, in terms of a number of impressions, it’s a quantitative thing, right?
0:13:13 If I use AI, I could quantify it.
0:13:16 How many people can summon the face of Zuckerberg with a word?
0:13:18 Like millions, billions probably, right?
0:13:18 Yeah.
0:13:20 How many even know that Sulzberger exists?
0:13:20 Basically nobody.
0:13:21 Why?
0:13:22 Because it’s Zuckerberg’s company.
0:13:25 He is considered a person who is in charge of the company.
0:13:28 And people don’t just say, oh, Facebook has some policy issue.
0:13:29 Facebook has this policy issue.
0:13:30 Meta has this policy issue.
0:13:31 They go after Zuck personally.
0:13:40 But the NYT, they’re granted the enormous shield of calling it the NYT, calling it the institution, as opposed to Sulzberger’s paper.
0:13:41 It is just Sulzberger’s paper.
0:13:42 It’s his blog.
0:13:42 Right?
0:13:44 There’s nothing that is printed there without his approval.
0:13:45 Right?
0:13:48 So he inherited it from his father’s father’s father’s father’s father.
0:13:48 Okay?
0:13:49 It’s like five generations.
0:13:50 Here’s a big one.
0:13:51 Okay?
0:13:52 And then let’s get back to poor old Luke.
0:13:53 Okay?
0:13:54 So Duranty.
0:13:55 Okay, Walter Duranty.
0:13:56 So let me show you.
0:13:59 There’s basically, as I said, journos never hold themselves accountable.
0:14:00 Right?
0:14:04 So here’s a great, great, great tweet by Paul Graham.
0:14:05 And then here’s my reply to it.
0:14:06 So look at Paul.
0:14:09 One of the biggest surprises in my adult life is how unethical reporters are.
0:14:10 In movies, they’re always the good guys.
0:14:12 Everyone in tech has stories like this.
0:14:13 Now, by the way, this is actually a deep point.
0:14:13 Why?
0:14:15 Paul is saying this.
0:14:16 In movies, they’re always the good guys.
0:14:19 So this is a concept I call Jurassic Ballpark.
0:14:21 And Paul has said something like this many times.
0:14:21 Right?
0:14:24 Jurassic Ballpark is like, you know, the movie Jurassic Park.
0:14:29 And in it, they are missing some DNA for the dinosaurs.
0:14:30 They use amphibian DNA.
0:14:34 And of course, that leads to issues because then they can reproduce and so on and so forth.
0:14:34 Right?
0:14:34 Okay.
0:14:41 So in the same way, when you have a missing segment of history or culture, you just ballpark
0:14:42 with what comes out of a movie.
0:14:43 Right?
0:14:44 Jurassic Ballpark.
0:14:44 Right?
0:14:45 You just ballpark it.
0:14:49 But that could be really, really wrong or fake because it’s a movie after all.
0:14:49 Right?
0:14:54 Unless you have personal experience of something, your impression of it is the movie version.
0:14:55 And this is a non-obvious point.
0:14:55 Right?
0:14:59 So like, how else could it be? Visuals are very persuasive.
0:14:59 Right?
0:15:02 And video is a high bandwidth pathway to the human brain.
0:15:02 Right?
0:15:06 And it’s not like your brain was like built.
0:15:08 Maybe in system two versus system one.
0:15:09 You know, system one is like instinctive.
0:15:11 And system two is like logical thinking.
0:15:11 Right?
0:15:15 Like maybe your system two can distinguish between true and false for the video, but your
0:15:16 system one can’t.
0:15:16 Right?
0:15:17 Okay.
0:15:21 So essentially people see all these things like journos are the good guys and so on.
0:15:23 We just saw The Journalist and the Murderer.
0:15:23 Right?
0:15:27 We just saw, once you actually understand the space, they’re more like Keyser Söze, Keyser
0:15:30 Sulzberger, where they are the unreliable narrator.
0:15:30 Right?
0:15:34 They write the story and everybody else dies and they’re the good guys.
0:15:38 And you never actually hear the story of how the story is written, which is actually much
0:15:39 more important than the story.
0:15:42 Like whenever you see the New York Times has obtained, how did they obtain it?
0:15:44 Oh, they got some stolen documents.
0:15:50 Oh, or they told a source, give me this and then I’ll write favorably about you and don’t
0:15:52 give me this and I’m going to name you in the thing and you’re going to lose your job
0:15:53 and get attacked.
0:15:54 They do all that kind of stuff.
0:15:55 And go ahead.
0:16:00 I remember your definition of journalism, invasion of privacy for profit.
0:16:00 Yeah, exactly.
0:16:06 The non-consensual invasion of privacy for profit is what legacy media is.
0:16:07 Let’s take that definition.
0:16:07 Okay.
0:16:09 Non-consensual.
0:16:10 Can you opt out?
0:16:12 Can you say, journo, stop stalking me?
0:16:14 Can you say, journo, stop spamming me?
0:16:17 Can you say, don’t mention my family?
0:16:20 See, the thing is, in normal English, we have words for this.
0:16:24 When journos go through your garbage, like there was somebody who’s like stalking people online
0:16:29 and looking at all their, like Luke Farriter had somebody going and spamming all of his
0:16:30 friends and contacts and whatever.
0:16:34 And they were just like some like drug addict or crazy person.
0:16:37 He could say, okay, that’s stalking, that’s spamming.
0:16:39 You could get a 50 foot restraining order.
0:16:44 I actually want to see, by the way, today’s court system, I want to see people use anti-stalking,
0:16:46 anti-spam kind of things.
0:16:48 Because CAN-SPAM, it’s about unsolicited messages, right?
0:16:52 Like you can try and use that on the journos and maybe there’s a sympathetic court system
0:16:52 now, right?
0:16:55 Like basically they get one warning, go the F away.
0:16:58 And the second is they’ve got money, so go after them on that, right?
0:16:59 Okay, fine.
0:17:03 So now I’m sure there’s some process where basically there’s, you know, New York Times v. Sullivan and there’s
0:17:06 various things where, you know, they’ve had historical precedents that protect them.
0:17:11 But basically once you think of them as spammers, as stalkers, as scammers, because they are
0:17:12 scammers, The Journalist and the Murderer.
0:17:13 What is the definition she used?
0:17:15 A con man, right?
0:17:19 The journalist is a con man because they’ll always write this email to you, which is like
0:17:24 fluffing you up and flattering you and saying how great you are, blah, blah, blah, and pretending
0:17:26 that they come in under flag of parley.
0:17:29 They get their quote if you’re dumb enough to talk to them.
0:17:31 And then they stab you in the article.
0:17:36 That’s why Janet Malcolm said the consenting subject of a piece of nonfiction learns when the article
0:17:41 appears his hard lesson because you talk to this person, they present themselves as a
0:17:44 human being, as like a person you’re having a conversation with, and actually they twisted
0:17:46 every word to try to stab you, right?
0:17:47 Okay, coming back up.
0:17:51 So point being, basically, they’re actually the stalkers, the spammers, the scammers.
0:17:52 That’s what the journos are.
0:17:54 You can’t get them to go away.
0:17:55 So that’s a non-consensual part.
0:17:58 The non-consensual invasion of privacy for profit.
0:18:01 So there’s this saying that sometimes journos use as self-defense saying this is like,
0:18:05 journalism is printing something that someone does not want printed.
0:18:07 Everything else is public relations, right?
0:18:09 Now, what does that mean?
0:18:10 Why does someone not want to print it?
0:18:12 Usually because private information, right?
0:18:17 So the non-consensual invasion of privacy for profit.
0:18:21 Let us not forget that these are multi-billion dollar media corporations, right?
0:18:27 It took a long time in the 2010s for people to finally realize the New York Times, the Journal,
0:18:29 they’re just dot coms, right?
0:18:31 They’re not referees.
0:18:32 They’re not neutrals.
0:18:35 For some reason, people give them the imprimatur of like an institution.
0:18:36 We need to save our institution.
0:18:38 But what’s the difference to New York Times and Facebook?
0:18:39 They’re companies.
0:18:40 Exactly.
0:18:41 They’re a corporation.
0:18:43 Fair game, right?
0:18:50 And in fact, that’s why they got so mad at us because we actually believed in what they
0:18:53 said that we had freedom of speech and that free markets existed.
0:18:58 Actually, until really the early 2010s, there’s no actual practical freedom of speech.
0:18:58 You know why?
0:19:02 Because, you know, it’s saying never argue with a man who buys ink by the barrel, right?
0:19:05 Basically, freedom of the press belonged to those who owned one.
0:19:11 So think about how expensive it was to get a radio license, a TV license, to own a newspaper
0:19:15 and send trucks to people’s houses with all the ink and the printing press.
0:19:17 That was like the super high capital cost, right?
0:19:22 These are guys who basically own like essentially factories that cranked out papers and you at
0:19:24 home could say something to your friend, but you didn’t have distribution, right?
0:19:25 And we understand what distribution is.
0:19:29 Now, you know, Thiel talked about distribution years ago, before it was quantified with social media.
0:19:33 Now, very roughly, distribution is like number of followers, number of people on your email
0:19:34 list, but also quality of them, right?
0:19:37 Not just quantity, but quality and quantity of your follower base is roughly distribution.
0:19:43 So, you didn’t have distribution, and I mentioned this before, but in the early 90s, you know
0:19:44 the Unabomber?
0:19:45 Yeah, of course.
0:19:45 Yeah.
0:19:46 Why’d he kill all those people?
0:19:50 He killed those people so he could get an op-ed in the Washington Post, right?
0:19:51 Now, why?
0:19:57 Because distribution was so scarce back then that he wanted to get his manifesto out, so
0:19:59 he’d literally kill people for the distribution, right?
0:20:01 That was just within our lifetimes, just 30 years ago.
0:20:02 That’s how scarce distribution was.
0:20:05 Today, he’d be a crazy person on the internet, right?
0:20:06 He’d get his message out there.
0:20:10 But if you realize Unabomber is willing to kill to get his message out, you also realize
0:20:14 why there’s some people who are like crazy trolls on X, because they might not be able
0:20:18 to kill, but they’re certainly willing to attack somebody else, attack their character, or reputation
0:20:20 assassination, and now you get to the journos, right?
0:20:25 So, journalism, as the non-consensual invasion of privacy for profit, really captures what it
0:20:26 is that these critters do.
0:20:32 Now, up until about 2020 or so, there was no force that could resist them.
0:20:34 They were just like rampaging, right?
0:20:39 I mean, we could resist them economically, but especially in the 2010s, they were so
0:20:40 mad at us taking their money, right?
0:20:41 And what did that mean, by the way?
0:20:46 It just meant that, like, you refresh NYT.com, you see a Rolex ad or whatever.
0:20:47 You see some car ad.
0:20:48 You see some clothes ad.
0:20:51 And if you refresh meta.com, right?
0:20:53 Facebook.com, Instagram, what do you see?
0:20:56 You see an ad for exactly the same company, right?
0:20:57 Yep.
0:21:03 And so, that means the sales account executives at both of these organizations, right, what
0:21:08 they’re doing is they’re competing for literally the same customer, right?
0:21:15 And obviously, the Facebook, Google, et cetera ad has much more scale.
0:21:16 It has much more analytics.
0:21:18 It’s built internet first and so on and so forth.
0:21:20 So, the NYT just starts bleeding out around you.
0:21:23 But the point being, we’re beating them economically.
0:21:27 They couldn’t code search engines or social networks, but they could write stories and shape
0:21:27 narratives.
0:21:33 I know the difference between a Dropbox product announcement and, you know, NYT story.
0:21:35 Their stories have villains, right?
0:21:39 Our product announcements are all basically really making the world a better place.
0:21:40 It’s like, guess what?
0:21:42 10 gigs more storage or whatever, right?
0:21:43 Guess what?
0:21:48 Now you can, like, do, like, comic book style AI or whatever, right?
0:21:50 All that stuff doesn’t hurt anybody.
0:21:54 That’s just essentially, this is adding cool things to the world.
0:21:55 Oh, here’s a new robot.
0:21:56 Like, that’s what we’re doing.
0:21:58 And what are they doing?
0:22:01 Like, the highest award in journalism, what’s the most prestigious thing you can do?
0:22:02 They want to catch the Theranos.
0:22:04 Well, even higher than that is Watergate.
0:22:05 Right.
0:22:06 Right?
0:22:10 So, basically, whereas for us, it’s like SpaceX, right?
0:22:14 So, for them, the highest, the best thing they can do is to put a man out of work.
0:22:16 And for us, the best thing we can do is we can put a man on the moon.
0:22:18 Okay?
0:22:22 So, literally, the number one thing that they can see, and this, again, state versus network,
0:22:23 right?
0:22:28 We want to build Googles and Facebooks and AI and all this kind of stuff, drones.
0:22:32 And we want to build giant companies, giant cryptocurrencies, and now internet communities.
0:22:36 What they want to do is they want to exert authority over others.
0:22:39 But they want to do so in a deniable way, right?
0:22:42 Because if you go to the Pulitzer website or something like that, and you look at these
0:22:47 prizes, they’ll all say something like, our reporting held, you know, these so-and-so
0:22:50 accountable and led to an FTC investigation, blah, blah, blah, blah.
0:22:58 So, they’re really willing to take credit when their words on the page lead to the state golem
0:23:00 going and animating and smashing somebody, right?
0:23:02 It led to an FTC investigation.
0:23:05 It led to a new FAA regulation, led to this, led to that, right?
0:23:08 Leading to some person getting fired, some program getting set up.
0:23:11 And the ultimate thing, obviously, is to get the president of the United States fired.
0:23:12 That’s why they wanted to get Nixon fired, right?
0:23:13 With Watergate, right?
0:23:18 They were essentially like, think of Wall Street Journal, NYT, and Washington Post as the three
0:23:21 board of director seats on top of the presidency.
0:23:23 The president was like the titular CEO.
0:23:27 But the Journal, the Times, and the Washington Post, if they all put their multi-sig key
0:23:30 in the lock, they did their board of directors vote, they’d get them fired, right?
0:23:32 So, that was actually the state of affairs.
0:23:34 That’s what it meant by holding the government accountable.
0:23:36 Nobody in the government was really in power.
0:23:39 The journalists could write enough negative stories, and they were out of power, right?
0:23:43 So, the thing is that when it comes Pulitzer time, they will admit that their stories led
0:23:44 to something.
0:23:51 But when it comes BLM time, they deny that their stories led to, you know, half America
0:23:52 getting burned down, right?
0:23:57 So, it’s a one-way ratchet where they take all the credit and avoid all the blame.
0:24:02 Because could you actually show the trace of that image coming into somebody’s eye and then
0:24:04 them setting fire to this building?
0:24:06 Once in a while, you might be able to show it, right?
0:24:10 Somebody publishes a manifesto saying, I read X, Y, and Z, and that’s why I burned it down.
0:24:14 But that causal effect, right, the cause and effect, basically demonstrating that
0:24:19 is hard: there’s an impression, a page view that comes in over here through the
0:24:20 eyes and the ears.
0:24:23 And then there is a burn-down-the-building kind of action on the other side.
0:24:26 It’s funny because they’re quick to use causal in the other direction.
0:24:31 They’re quick to say, hey, Facebook is causing people to be depressed or to whatever, turn
0:24:32 right wing or whatever.
0:24:33 Yes, that’s right.
0:24:35 So, tech, there’s a causal effect for everything negative.
0:24:36 Yeah, I agree.
0:24:38 Journos, there’s a causal effect for everything positive.
0:24:39 That’s amazing.
0:24:40 Wow, what an amazing, right?
0:24:42 Everything you do is bad.
0:24:43 Everything they do is good.
0:24:43 Amazing flipping.
0:24:47 Once you see this, though, you can basically be like Neo, you know, in the Matrix and just
0:24:49 like block everything like this, right?
0:24:51 And another big piece of this, they stopped doing this.
0:24:55 But one of the things they were doing years ago is they’re like, oh my God, everybody in
0:24:58 tech is so white and blah, blah, blah, right?
0:25:03 And obviously, it’s so much more international than the journos.
0:25:07 If you go and take that famous photo of Elon in the conference room and you compare the
0:25:10 people there versus the NYT editorial board photo, right?
0:25:15 And you’ll see they’re like well-dressed, essentially, mostly European ancestry people.
0:25:19 And again, I’m not the kind of person who thinks like white is an insult, but they do,
0:25:19 right?
0:25:21 So, it’s all projection, right?
0:25:27 The journos themselves have these tyrannical, evil, meritless nepotists as bosses, right?
0:25:29 The journos themselves can never actually make it anywhere.
0:25:33 And it’s all favoritism and glad-handing, and there’s no merit, and it’s all luck and
0:25:34 connections.
0:25:40 And the journos themselves essentially are these envious people who exist to harm you, to increase
0:25:41 their career prospects.
0:25:43 And they project all that out into everybody else, right?
0:25:48 So, once you see that, like every accusation is a confession or whatever, you realize, oh,
0:25:51 this is how the world works in their stupid Brooklyn side of things.
0:25:55 And they think of us as a rival tribe that acts the same way, right?
0:25:55 Okay.
0:25:58 So, now, that’s like part of the macro.
0:26:03 Now, I have bad news for them, which is to say that I have bad news for us and also bad
0:26:03 news for them.
0:26:05 But let’s start with the bad news for them.
0:26:11 The bad news for them is that they, in the gigantic war between the internet and blue America,
0:26:11 right?
0:26:14 And by the way, we didn’t actually intend to start that war.
0:26:15 We were just building great stuff.
0:26:19 And it became so popular that we took away all of the customers of these guys.
0:26:23 But it’s not like you set up Twitter or Facebook or Google to go and blow up the Times and the
0:26:24 posts and the journal, right?
0:26:28 That was just something like Bezos got the post out of petty cash, right?
0:26:32 Like, it was just something where we built a valuable enough business that it generated so
0:26:35 much wealth that, okay, you could just go and acquire this thing, right?
0:26:38 And, of course, I understand why they got mad.
0:26:42 But what they should have just done is rather than try to beat us, join us or whatever, part
0:26:44 of it is also there’s three or four different things.
0:26:50 There’s a scenario where if Steve Jobs had lived, that when Bezos bought the post, Jobs buys
0:26:54 the Times and Larry Page buys the Wall Street Journal and we’d be on Mars by now.
0:26:54 Yep.
0:26:55 Okay.
0:26:59 So, like, the thing is, at the end of the day, as much as I think
0:27:04 they’re like our evil twin in some ways, there’s a deep sense in which there’s a similarity, which
0:27:08 is we’re all about the collection, dissemination, and presentation of information.
0:27:11 That’s the similarity between tech and media.
0:27:15 The collection of information, like the raw, you know, data, whether it’s a user-generated
0:27:16 content or so on.
0:27:21 Dissemination, which is distribution, the posting, circulation, and presentation, user interface.
0:27:23 They do think about their charts or infographics.
0:27:26 They also obsess about copy and so on and so forth, right?
0:27:30 So, we’re a fork of them in the same way that, like, Yale was a fork of Harvard or America
0:27:31 was a fork of Britain.
0:27:36 The internet’s a fork of the East Coast culture.
0:27:37 And that goes deeper.
0:27:40 People like Paul Graham, Catherine Boyle, Mike Moritz.
0:27:40 Yeah, yeah, yeah.
0:27:41 Exactly.
0:27:42 Exactly.
0:27:42 That’s right.
0:27:43 So, in another life, right?
0:27:43 Yeah.
0:27:46 Mike Moritz was a journalist, became an investor.
0:27:49 Catherine Boyle, also a journalist, became an investor, right?
0:27:53 Peter Thiel, probably, if he’d been born 20 years earlier, would
0:27:55 have been a Supreme Court jurist, right?
0:27:57 He’d probably have been, like, Chief Justice of the Supreme Court or something.
0:27:59 Paul Graham would have been a professor.
0:28:00 Larry Page, a professor.
0:28:01 Sergey Brin, a professor.
0:28:04 Mike Solana in Brooklyn, probably.
0:28:04 For sure.
0:28:07 He’d probably be a book publisher or something like that, right?
0:28:09 I probably would have been a professor and so forth.
0:28:13 And that’s true for Andrew Ng, Daphne Koller, Martin Casado, right?
0:28:13 Dixon.
0:28:18 Because the thing is, with computer science, there is an amazing connection between the word
0:28:19 and the deed.
0:28:21 Actually, AI makes that even closer.
0:28:22 You write it down and it’s done.
0:28:24 It’s this amazing thing, right?
0:28:28 So, ultimately, what we’re doing is we’re also writing all day, right?
0:28:30 We’re writing all day to see an impact in the world, right?
0:28:35 So, the difference is their impact, because they’re doing it through the state, and a failed
0:28:38 state at that, is almost invariably negative, right?
0:28:42 And because we’re doing it through the network, we have feedback loops where, for example, when
0:28:46 we type something in and it’s actually factually wrong, like, the compiler throws up on it
0:28:47 and it just doesn’t compile.
0:28:50 There’s fact-checking, like, on the page when we type something.
0:28:51 They have no such thing.
0:28:55 Their only fact-check is actually, crucially, not by the world, but by their peers.
0:28:59 That is to say, it’s only when they lose status among other journos that they actually ever
0:29:01 course-correct, right?
0:29:01 Ever.
0:29:03 And that’s, like, very rare, right?
0:29:05 So long as within their tribe, they’re not losing status.
0:29:06 There’s nothing.
0:29:06 Okay.
0:29:12 Now, basically what happened is, once you see this kind of model, by the way, going back
0:29:16 to the original point, the internet disrupting blue America, China disrupting red America,
0:29:20 you can understand essentially the last 18 years in the following way.
0:29:25 Blue America was disrupted by the internet, and so they began wokeness to take a piece of
0:29:30 red America’s pie and the tech clash to try and take back control from the internet.
0:29:35 Red America was disrupted by China, so the trade war was against China and Trump was against
0:29:36 blue America, right?
0:29:40 Because they both felt their pies were shrinking, and so they each launched two-front wars:
0:29:45 blue America on red America and the internet, red America on blue America and China.
0:29:45 Okay?
0:29:47 We’ll do the China bit later.
0:29:47 Okay?
0:29:54 But blue America versus red America and the internet, after a massive push, has lost.
0:29:55 Right?
0:29:59 I mean, it was close, but basically, Elon and X day.
0:30:01 It’s, like, literally D-Day.
0:30:02 This feels like eons ago.
0:30:03 It’s only three years ago.
0:30:05 Literally three years ago, X still hadn’t been acquired.
0:30:06 Twitter was still Twitter.
0:30:08 We are in, like, wartime speed of things happening.
0:30:09 It’s insane how quickly things move, right?
0:30:10 Okay.
0:30:14 So, X day was something where, by the way, it wasn’t just Elon.
0:30:16 $44 billion.
0:30:21 Even Elon, as amazing as Elon is, that’s, like, at the right tail of what even Elon was
0:30:23 capable of actually having as a raise.
0:30:24 That’s a big raise for anybody, right?
0:30:25 Yeah.
0:30:29 So, it took the richest man in the world, the wealthiest guy, who’s launching rockets, building
0:30:30 ships.
0:30:34 He has all this other stuff, and he decided to take on this enormous extra thing and somehow
0:30:37 managed it, which is actually crazy to think about, because on everything else, Elon is the
0:30:40 n of 1, because whenever I talk to a founder, I’m like, focus, focus, focus.
0:30:41 Okay.
0:30:44 After you have your first whatever billion-dollar company, and it’s $10 billion, and it’s $100
0:30:46 billion, then you can do your next or whatever, right?
0:30:46 Fine.
0:30:48 But, so, Elon’s an n of 1.
0:30:54 Point is, in 2022, where it looked like the free world was just completely on the ropes by
0:30:57 these, like, racially-obsessed woke psychos just having us pinned like this.
0:30:59 You couldn’t even say whether men and women exist.
0:31:02 Like, we’re genuinely talking about, like, a permanent midnight descending.
0:31:04 Like, really, really bad stuff, right?
0:31:05 How evil they were.
0:31:07 They just burned down half America.
0:31:10 They were just getting psycho, more and more and more psycho, right?
0:31:17 And so, amidst that, basically, all the resources of all the centrist tech and finance guys,
0:31:21 because that’s what the $44 billion was; there’s that famous message of Elon to Ellison that
0:31:22 was made public or whatever.
0:31:22 He’s like, what are you in for?
0:31:23 Like, one or two.
0:31:26 Okay, but, okay, Ellison could put in one or two.
0:31:30 But, like, so many other people, it was like Avengers Assemble, putting in whatever they
0:31:32 could afford: a mil, 10, 20, 50.
0:31:35 I mean, 50 is a big investment for almost anybody, right?
0:31:38 Like, 50 is like, you know, it’s a serious LP meeting.
0:31:45 So, Elon, being Elon, was able to pass the hat and assemble this gigantic coalition of $44
0:31:50 billion, all of our remaining forces, for X day, right?
0:31:54 And so, the landing was extremely contested, right?
0:31:57 They basically wanted whatever the opposite of what he wanted.
0:31:58 Yes, that’s right.
0:32:02 So, he basically, essentially, whether he was forced to complete the acquisition or not, right?
0:32:07 Or, I think, like, Elon is actually playing 4D chess, right?
0:32:09 So, he actually is that smart, right?
0:32:14 So, it is possible that Elon just essentially made people think, just reverse psychology,
0:32:18 got the judge who had negated his pay package or, like, forced him through this whole court
0:32:22 process to get it back to say, okay, same judge, now this time we’ll do reverse psychology.
0:32:25 So, net-net, he got Twitter.
0:32:30 So, anyway, point is, X day was the day.
0:32:35 X day was also the liberation of Meta and liberation of YouTube, right?
0:32:40 All the countries that these racial fanatics at the NYT had occupied, right?
0:32:41 The networks they had occupied.
0:32:46 You know how, like, the Nazi empire was at, like, its peak and they thought they were going
0:32:49 to win and then D-Day and then they just collapsed like this, right?
0:32:55 So, Sulzberger and Soros, they thought they were going to win and then X day, boom, came
0:33:01 in and just there, like, with X flipping, YouTube uncensored, like, Meta uncensored, everything
0:33:05 uncensored, and so on and so forth because X was upstream of the conversation.
0:33:09 Anyway, point is, it took Elon’s personal intervention in June 2023.
0:33:11 See, his first round of firings hadn’t done it.
0:33:12 His second round hadn’t done it.
0:33:17 It was like the third round of chemo to get rid of the wokes that had just infested Twitter,
0:33:21 right, to actually change things.
0:33:25 And by the way, you know, again, Elon’s intuitive more than a philosopher per se, just his philosophy
0:33:26 is his execution, right?
0:33:30 But there’s actually a real logic to why he renamed it X, you know why?
0:33:32 Among other things, this is one of many reasons.
0:33:33 Why?
0:33:37 I’m not saying this is why he did it, but it is an implicit aspect of how he did it.
0:33:42 It’s just like them renaming all the schools and, like, tearing down the statues and so
0:33:43 on, right?
0:33:49 Every journo had put years of effort into building up their profiles, and all their blue
0:33:54 checks were stripped, and all their profiles got renamed, and it was just X, right?
0:33:55 Another example is fake news, right?
0:34:02 Fake news had actually been used in 2016 for a few weeks in the context of all that social
0:34:04 media news is fake, but the New York Times is real news.
0:34:08 And Trump turned it on them, and it’s like, actually, the NYT is fake news, the fake news
0:34:08 media, right?
0:34:09 Which is perfect, right?
0:34:13 So, in the same way, the blue check went from something that was something that journos
0:34:16 valued to something that just Elon just stripped from all of them, right?
0:34:19 He just stripped their status, stripped their distribution.
0:34:23 I don’t know whether the ban on outbound links, again, whether it’s intentional or not.
0:34:24 I have no personal information.
0:34:25 This is just speculation on my part.
0:34:29 Or, again, and even if it wasn’t intentional, this is just, like, part of the effect of it.
0:34:34 The ban on outbound links meant that suddenly X became the place you just go for the information,
0:34:36 and you don’t go to the journos anymore, right?
0:34:37 So, he stripped their status.
0:34:40 He stripped their control over the central stream of media.
0:34:42 He stripped their traffic from links.
0:34:46 He renamed the whole thing to show that he had root control over it.
0:34:48 Because you know what a pain in the ass it is to do a rename, right?
0:34:55 This X.com rename was, in a sense, tech’s revenge for them making every fucking GitHub repo rename from
0:34:57 master to main, okay?
0:34:59 Now, is Yale renaming its master’s degree?
0:35:00 No.
0:35:02 Okay, of course not, right?
0:35:04 It’s master class, whatever.
0:35:07 Like, anybody who’s at the New York Times, are they saying, I’ve got a main degree now,
0:35:09 rather than a master’s in journalism?
0:35:12 I think Columbia still has a master’s in journalism, right?
0:35:15 So, the whole point was, again, these journalists have a double standard.
0:35:17 They would impose that on us, right?
0:35:23 You, one million GitHub repos have to be renamed from master to main, whatever, 100 million
0:35:23 GitHub repos.
0:35:25 Do you know what a pain in the ass that was?
0:35:26 It was a huge pain in the ass.
0:35:30 Every single dev had to go through some stupid exercise on this, which was just a
0:35:32 demonstration of their power over us at that time.
0:35:34 That was what renaming means.
0:35:36 It means you cause a massive inconvenience for everybody to show that you have institutional
0:35:37 power.
0:35:39 So, now, Elon returns a favor.
0:35:41 And does it at even greater scale, right?
0:35:42 Okay.
0:35:45 So, now, what that is, is network over state, where we fought in the domain in which we were
0:35:46 stronger.
0:35:50 And in a sense, also, by the way, taking back X and renaming X, you know, there’s like a
0:35:53 hard-fought city in some countries or some wars.
0:35:56 Gosh, there’s a city, I think, how many times has Seoul
0:35:58 changed hands during the Korean War?
0:36:00 It was like several times, right?
0:36:02 So, yes.
0:36:05 So, Seoul changed hands four times, okay?
0:36:06 It went back and forth, right?
0:36:13 And so, you think of X as being like Seoul during the Korean War, okay?
0:36:14 Because it’s a social war.
0:36:15 It’s a digital war.
0:36:16 We’re a blue and red tribe.
0:36:19 And actually, just to make that really explicit, this is a good visual.
0:36:20 All right.
0:36:22 So, this is from 2017.
0:36:26 And the interesting thing is, it’s the same on both Twitter and Facebook, right?
0:36:28 This is from eight years ago.
0:36:32 You can see, they labeled nodes as blue and red based on whether they’d said, I’m voting
0:36:34 for Clinton or Trump, just parsing the text, right?
0:36:36 And they looked at their connections.
0:36:37 And blue people were connected to blues.
0:36:38 And reds were connected to reds.
0:36:41 And the only large red media outlet was Breitbart.
0:36:44 And blue was over here.
0:36:47 And the Hill was one of the very few outlets that both sides tend to link because it just
0:36:49 gave neutral political news.
0:36:50 Like, so-and-so was running.
0:36:51 So-and-so results.
0:36:53 A neutral, truly neutral outlet, right?
0:37:00 And so, here you could actually see that this was a social war, right?
0:37:05 And, you know, I’ve made this point in the past that in 1861, like when it was the North
0:37:08 versus the South, we take for granted that the ideological and the geographical coincided,
0:37:09 right?
0:37:14 The North was the Union, and the South had the slaves, and slavery is legal here and illegal
0:37:17 here, and the geographical and ideological coincided, right?
0:37:22 But by 2016, the geographical and ideological did not coincide.
0:37:25 You didn’t have a clean red states and blue states.
0:37:25 It’s very fractal.
0:37:30 Red states and blue states do exist, but it’s a preponderance as opposed to something that’s
0:37:30 as clean as this.
0:37:35 Now, to be clear, if you showed ideology here, it also looked more fractal, but it was more
0:37:37 distinct ideologically back then.
0:37:39 And here, it’s much more geographically overlapping.
0:37:40 With me so far?
0:37:40 Yep.
0:37:41 Okay.
0:37:47 But there is a domain where these two factions are completely distinct, and that is the domain
0:37:48 of the cloud, right?
0:37:52 So, on the land, red and blue are higgledy-piggledy right next to each other.
0:37:54 They can’t invade each other’s territory.
0:37:58 Like, what, you’re going to invade, like, the cities or the cornfields or something.
0:38:00 You can’t invade the lands.
0:38:05 So, instead, the social war is fighting over the minds, invading the minds, right?
0:38:10 Now, once you see this, you can actually understand a lot about the last 10 years, right?
0:38:13 We’ve basically been in the middle of this gigantic social war.
0:38:18 And the goal was, for blue, like, how do you win a social war against red?
0:38:20 Their goal, you ever play the game Othello?
0:38:21 No.
0:38:23 I know the game, but I’ve never played it.
0:38:24 It’s like, you flip tiles.
0:38:29 You’ve got black and white tiles, and you surround other tiles, and you flip them from black to
0:38:29 white or white to black.
0:38:30 Okay?
0:38:37 And so, essentially, the goal for blue was to win ideologically and flip every red node
0:38:37 to blue.
0:38:42 One way of thinking about it is, you know how an ant colony, individual ants, they don’t
0:38:45 actually know what they’re doing, but the colony has an intelligence.
0:38:48 So, even if the ants don’t know what they’re doing, the colony has an intelligence, right?
0:38:53 This is the same for a flock of seagulls or a school of fish, right?
0:38:56 There’s colony intelligence, and insects in particular are like this, right?
0:39:01 So, once you start thinking about ideology, it’s just like that, right?
0:39:04 Think of woke as, like, blue, right?
0:39:06 Or, like, radicalized blue.
0:39:08 It’s like laser eyes, right?
0:39:10 Like, basically, go broke, go woke.
0:39:13 They lost all this money as the internet disrupted them.
0:39:18 Blue laser eyes come online, and they start going to their old religion, because they don’t
0:39:19 have the money anymore.
0:39:22 So, go back to the old civil rights rhetoric and so on.
0:39:26 It’s like, basically, when countries got blown up in the Middle East and other places, when
0:39:30 countries go on hard times, that’s when fundamentalism returns, right?
0:39:32 Because they don’t have the economics anymore.
0:39:35 So, their economics went away, laser eyes glowing blue.
0:39:39 So, the eyes of these journos glow blue, and they basically were like, okay, it’s life or
0:39:44 death for us, and they started going and trying to capture as many institutions as possible
0:39:44 in the social war.
0:39:50 So, that’s why they were just going after random seeming nodes and forcing, canceling
0:39:50 them.
0:39:55 Remember how they wanted Amazon to put BLM on its, they did get Amazon to put BLM on its
0:39:55 homepage?
0:39:59 Like, for a long time, you’d load Google and it would have some BLM thing on there, right?
0:40:00 You know what I’m talking about, right?
0:40:01 Yeah, of course.
0:40:05 So, everybody, you know, Brian Armstrong, during the 2020 BLM riots, like, people were, like,
0:40:08 trying to force him to say Black Lives Matter on Twitter.
0:40:09 What was the point of that?
0:40:11 It’s like the Shahada in Islam, right?
0:40:15 The point of that was to show you’re a convert to BLM, right?
0:40:19 To flip a red node to BLM, because it’s a checkmark over your head to show they flipped that
0:40:20 node.
0:40:24 Now that node’s part of the BLM tribe, and now they can turn attention to the next one, right?
0:40:30 And so, all of the cancellation, all of the censorship, all the deplatforming, all of the
0:40:37 debanking, all of the insane ideological fervor you can conceptualize as an attempt from BLM
0:40:45 to reunify red versus blue on BLM terms by turning every internet company into something
0:40:50 that was paying tribute to BLM with worthless DEI jobs, and every red into somebody who was
0:40:54 paying tribute to BLM by essentially not just giving up the presidency, but assuming the position.
0:40:57 Like, for example, why do they want to defund the police?
0:40:58 They want to fund their NGOs.
0:40:59 That’s what it’s all about.
0:41:01 They wanted to redirect the budget.
0:41:04 That’s why there’s like 200 homeless NGOs in San Francisco alone.
0:41:10 The homeless industrial complex shows that, as the NGOs’ budget rises, the homeless
0:41:11 population rises with it.
0:41:14 They’re basically paid to get people addicted to drugs.
0:41:16 It’s the Department of Dependency Department, right?
0:41:23 And so, the point is that basically, all of this stuff with defund the police, fund the
0:41:29 NGOs, all of that can be conceptualized as this broad social war of blue against red to flip
0:41:31 all red nodes blue, right?
0:41:33 And what was their main weapon?
0:41:37 You’re racist, you’re sexist, you’re homophobic, you’re this, you’re that, transphobic, blah,
0:41:37 blah, blah, blah, blah.
0:41:43 And with this language, that same language they could use to force you out of their institution
0:41:47 because they’d fire you for being accused of any of these things, and they’d also say
0:41:52 your institution, they’d bust your borders and they would swarm you with unqualified hires
0:41:54 or else you’d be accused of this.
0:41:59 So, the same language they’d use to strengthen their borders and deport reds, and they’d use
0:42:01 to bust your borders and import blues.
0:42:02 Do you see what I’m saying, right?
0:42:02 Yeah.
0:42:06 Because if you’re a racist, you get fired from a blue organization, and you’re a racist unless
0:42:07 you hire blues.
0:42:08 Yes.
0:42:09 Okay?
0:42:13 Now, of course, today we see they don’t actually care about brown people or black people, only
0:42:14 blue people, right?
0:42:16 So, it’s more clear in 2025; it was less clear several years ago.
0:42:17 Fine.
0:42:17 Okay.
0:42:18 So, now coming back up the stack.
0:42:25 So, once we realized that it was a social war, now we can actually understand why the journos
0:42:26 want to kill you.
0:42:32 Like, basically, you know how, at various times in history, France and Germany have
0:42:35 traded, and France and Germany have fought, and France and Germany have traded, and France
0:42:35 and Germany have fought.
0:42:37 We are in wartime mode with the journos.
0:42:42 So, it is, like, extremely stupid for anybody to, now let’s go to concrete brass tacks.
0:42:45 What should technologists do, and what should we specifically do generally, right?
0:42:48 So, first, just at an individual level, right?
0:42:50 Number one, go direct.
0:42:53 Build your own distribution to avoid distortion.
0:42:54 Okay?
0:42:59 That is to say, any content you have, it should be posted on your feeds.
0:43:01 Why would you go and feed it to some journo?
0:43:02 You’ve got some scoop.
0:43:04 You don’t need them for distribution anymore.
0:43:07 It’s more obvious that they need your content to build up their channel.
0:43:12 And they will distort it in the process because, remember, they get credibility within other
0:43:15 journos by being hostile to tech guys.
0:43:19 If they write a positive story, then it’s like, well, you’re a flack, you’re running a press
0:43:19 release.
0:43:24 Also, by the way, there’s another point, which is conflict is interesting, right?
0:43:29 Like, any movie, if you’re writing a screenplay, and it’s just somebody sitting on the grass
0:43:32 enjoying a fine sunny day, that’s boring, right?
0:43:34 But if a meteor hits, suddenly you’ve got attention, right?
0:43:37 So, what works in a movie setting is not what works in real life, right?
0:43:40 So, the journos want conflict.
0:43:44 And so, our concept of, hey, it’s 10 gigs for Dropbox or whatever, that might be helpful
0:43:47 to the public, but it doesn’t tell a story, and they want a story.
0:43:51 So, they’re always going to take what you do and put it through some distorting lens to
0:43:54 get to the other side, and they will get more page views at the expense of your company that
0:43:55 you worked on so hard, right?
0:43:58 So, number one, go direct.
0:44:01 Number two, build your own distribution to avoid distortion.
0:44:05 Now, the thing about that is, hire creators.
0:44:06 There’s two kinds of creators.
0:44:11 There’s those who are at the storyline level, and you probably only need, like, one of those
0:44:16 per company, because if you have too many very strong-willed personalities in a company, like,
0:44:19 you can only have one Steve Jobs at Apple, basically, right?
0:44:22 However, you can have a lot of people assisting with production, right?
0:44:24 With making content, with scaling that creator, or what have you.
0:44:26 So, usually, you’re going to have a founding creator.
0:44:28 And I’m not saying it’s 100%, by the way.
0:44:32 Sometimes, once you get to a certain scale, it’s good to have, like, some personalities.
0:44:35 For example, actually, Jesse Pollock’s doing a great job at base for Coinbase, right?
0:44:39 He’s got his own distinctive style that’s complementary to Coinbase’s style, and so on.
0:44:43 So, at a certain scale, you can have multiple personalities that are driving certain product
0:44:43 lines, or what have you.
0:44:44 And so, that’s fine, right?
0:44:48 But at least for a startup getting off the ground, you probably only want to have, sort
0:44:49 of, one storyline, one main creator.
0:44:52 And you have a lot of production support behind that, right?
0:44:53 And that can really work.
0:44:55 You can get very far with that.
0:44:57 Video, images, all this kind of stuff, right?
0:44:58 So, A, go direct.
0:45:01 B, build your own distribution to avoid distortion.
0:45:04 And by the way, one way of thinking about that, they call them the media because they mediate your
0:45:05 experience of reality.
0:45:09 When you’re putting anything through a media, it’s like an Instagram filter that makes you
0:45:10 into a villain, okay?
0:45:11 Why would you do that to yourself?
0:45:17 It’s like paying the journo with free content to make yourself look bad and get a permalink
0:45:18 that’s attacking you.
0:45:28 I remember our good friend Flo Crivello was launching his remote office startup during COVID, and he gave TechCrunch the exclusive, and they criticized it.
0:45:29 They basically didn’t make it look good.
0:45:31 And he’s like, why would I give you my launch announcement?
0:45:33 I’m here to advertise my company.
0:45:35 I gave it to you guys.
0:45:38 I gave you the exclusive, and you made me look stupid.
0:45:39 Like, why would I ever do that?
0:45:43 And the fundamental thing is, it’s a business development relationship.
0:45:46 Literally think of it as TechCrunch is a corporation.
0:45:49 Why are you giving them something for free?
0:45:50 Right?
0:45:51 Literally, it’s that.
0:45:56 It’s like, that journo has a spreadsheet, whether it’s them or their manager who’s looking at it.
0:46:03 And there’s a row in the spreadsheet for that URL, and it’s got the number of clicks and the number of conversions and the ad revenue on that article.
0:46:05 And that’s the only thing they care about.
0:46:07 That’s the only thing they care about.
0:46:07 You know what’s not there?
0:46:09 The valuation or health of your company.
0:46:12 Obviously, they don’t care.
0:46:13 They would literally light it on fire.
0:46:15 That’s what they did during BLM.
0:46:18 The journo would light your house on fire and sell tickets to the blaze.
0:46:19 Okay?
0:46:20 That’s their business model.
0:46:21 Right?
0:46:25 And so, obviously, it’s like the dumbest possible deal.
0:46:26 People still do this stuff.
0:46:28 And I’m like, I mean, Elon uncensored Twitter.
0:46:30 You can post whatever you want.
0:46:30 Right?
0:46:31 YouTube’s uncensored.
0:46:32 Like, everything’s uncensored now.
0:46:33 Get good.
0:46:33 Right?
0:46:37 Anybody who’s talking to journos in 2025, hiring public relations, what are you doing?
0:46:38 What are you even doing?
0:46:38 Right?
0:46:39 Okay.
0:46:41 Now, I will say one thing.
0:46:41 This is very important.
0:46:49 Over the last several years, when we’ve done, like, the tech and media kind of ecosystem, there has been something that’s worked, and there’s something that didn’t work.
0:46:50 What worked?
0:46:53 Individual-led projects.
0:46:54 Right?
0:46:55 Mike Solana’s Pirate Wires.
0:46:56 That has a real style to it.
0:46:57 TBPN.
0:46:57 Right?
0:47:03 Which exists to, I think, make Ramp get more conversions, which is very funny.
0:47:03 Right?
0:47:04 It’s very funny.
0:47:06 Of course, Ramp is our main sponsor.
0:47:07 They’ve got them on the hats.
0:47:08 They’ve NASCAR’d it.
0:47:08 It’s funny.
0:47:10 Ramp’s a good product, by the way.
0:47:11 And so, TBPN.
0:47:11 Great.
0:47:12 Right?
0:47:13 Coogan, he stuck with it.
0:47:14 He did a lot, you know.
0:47:16 And, obviously, All In has done very well.
0:47:16 Right?
0:47:18 Obviously, Elon has done very well.
0:47:23 And, you know, I think you did it with Moz, and now, you know, I think that’s right.
0:47:27 But what I think has not done as well is the things that are institutional.
0:47:31 Because if it’s too institutional, you’re playing it safe.
0:47:36 And if you’re playing it safe, there isn’t any conflict, there isn’t any opinion, there
0:47:37 isn’t anything novel.
0:47:38 It’s focus grouped.
0:47:38 Right?
0:47:41 Certain things benefit from averaging.
0:47:41 Right?
0:47:44 For example, the velocity of a plane or something like that.
0:47:45 You don’t want large deviations.
0:47:47 You want it to be within an envelope.
0:47:47 Right?
0:47:50 So, there’s certain kinds of phenomena where you want averaging.
0:47:55 Opinions and theses are usually not like that.
0:47:55 Right?
0:48:01 So, one way of thinking about it is the entire 20th century was the centralized century.
0:48:06 And even the movement from the widescreen to the portrait size, like a phone is like 9×16.
0:48:12 The movement from widescreen to portrait size is visually the movement from institutional to
0:48:13 individual.
0:48:17 Because a portrait kind of thing, a TikTok style thing, doesn’t have room for a panoramic
0:48:18 shot of a huge crowd.
0:48:19 It’s for a person standing there.
0:48:20 Right?
0:48:24 So, it’s amazing that even the screen itself captures that move from institutional to individual.
0:48:24 Right?
0:48:27 And you see this also on X and other platforms.
0:48:30 These journos, I don’t know what they were doing, if they were faking numbers or whatever.
0:48:32 They have like 20 million followers or whatever.
0:48:34 And they have three likes on their tweets now.
0:48:35 Right?
0:48:39 So, something happened there where either it was all fake or there’s just low engagement
0:48:40 or just boring.
0:48:42 But people just don’t trust those institutions anymore.
0:48:42 Right?
0:48:45 So, that’s another really important lesson.
0:48:46 Individual over institutional.
0:48:50 If you’re doing social media, it should be the amplified voice of your
0:48:51 founding creator.
0:48:51 Right?
0:48:51 Yeah.
0:48:55 And the founding creator is as important as the founding engineer.
0:48:58 Because the founding engineer is implementation, but the founding creator is the distribution.
0:49:02 The founding engineer is the how, but the founding creator is the why.
0:49:06 Because the founding creator has a community that they’re tapped into.
0:49:09 And they’re saying, why should this product exist?
0:49:10 Right?
0:49:15 So, you can often start by understanding your community and building a product for them and
0:49:16 then hiring the engineer.
0:49:18 It’s actually like a third kind of person.
0:49:18 Right?
0:49:23 Normally, it’s been like, there’s like the engineering founder and there’s like the business founder.
0:49:24 This is like the content founder.
0:49:25 Right?
0:49:25 Yeah.
0:47:30 And actually, this is where Collison and Altman have observed: where are the Gen Z, where are the younger founders who are not in tech?
0:49:32 They’re in content.
0:49:33 Yeah.
0:49:34 Right?
0:49:36 Because actually, that’s where extreme leverage is.
0:49:37 That’s the Mr. Beast.
0:49:40 That’s the guy who actually looks like you, Adin Ross.
0:49:42 Some of the younger guys are there.
0:49:42 Right?
0:49:43 IShowSpeed is like that.
0:49:43 Right?
0:49:46 And they’re very talented at what they do.
0:49:50 And it’s just not something that we thought of as a thing because, you know, there’s still the startup kind of thing.
0:49:59 I shouldn’t say startups are a game for 30- and 40-somethings, but millennials are good at startups, and we’re still good at startups.
0:50:02 Palmer’s doing a new thing, Altman’s doing a thing, obviously not a new thing, but you know what I mean?
0:50:05 Like, we keep doing stuff, right?
0:50:10 But the 20-somethings are often very good at content and content is actually upstream of
0:50:10 product.
0:50:13 There’s room for a lot of collaboration there, potentially, where they’re doing the marketing.
0:50:17 It’s kind of like Beats by Dre, but you start with Dre rather than Apple, right?
0:50:22 I’m not saying anything people don’t know, but right now those have been, I think, doing
0:50:29 things that are relatively low-tech, like MrBeast’s Feastables or t-shirts or stuff like that.
0:50:34 Bryan Johnson, I think with Blueprint, is starting to get higher tech where you start with the
0:50:38 creator and then ideally you can distribute like quantified self stuff through that, if
0:50:39 that makes any sense, right?
0:50:41 Huberman could also do something like this if he decides to get into that area.
0:50:47 Any biotech company, genomics company, sequencing company could do deals with Huberman, for example,
0:50:47 for distribution, right?
0:50:48 Okay.
0:50:52 So this, by the way, is starting to make the case for why you don’t outsource your creation.
0:50:53 Are you outsourcing your engineering?
0:50:55 Don’t outsource your creating, right?
0:50:57 Don’t outsource your content.
0:50:58 Content’s as key as code.
0:51:00 Content happens in the house, right?
0:51:03 Content, you have to sweat over it.
0:51:07 And actually, so for example, here’s a few things that I’m like half implementing, partially
0:51:10 implementing, or I am implementing, but I want to implement more, right?
0:51:14 It’s the kind of thing where, obviously, come to a16z or come to Network School, come to NS.com, come and work with us.
0:51:20 But for example, GitHub allows a bunch of people to contribute to code at the same time.
0:51:21 We take that for granted.
0:51:24 How do you get a bunch of people to contribute to content at the same time?
0:51:26 Something like Frame.io is pretty good.
0:51:31 You can put all these clips in there, all these images in there, and then you have something
0:51:32 for people to work with.
0:51:35 Or CapCut web interface, right?
0:51:38 You can log into that and just basically load stuff in there, and you have a few accounts
0:51:40 that are shared among team members, right?
0:51:44 So now you actually have something where creation was a single-player app.
0:51:48 You start making it a multiplayer app, and now internet connections are good enough that
0:51:51 you can do versioning on big files and reviews of big files and so on and so forth.
0:51:55 Start thinking about your content base like your code base, okay?
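The “treat your content base like your code base” idea can be sketched as a tiny content-addressed store, the same trick Git uses for its blobs: every version of a clip, image, or draft is stored under the hash of its bytes, so history is never overwritten and collaborators can reference any version unambiguously. This is a hypothetical illustration of the concept, not any particular tool’s API:

```python
import hashlib

# A minimal content-addressed store: every version of a clip/image/draft
# is stored under the SHA-256 of its bytes, so edits never overwrite
# history and collaborators can reference any version unambiguously.
class ContentStore:
    def __init__(self):
        self.objects = {}   # hash -> raw bytes
        self.branches = {}  # branch name -> ordered list of hashes

    def commit(self, branch: str, data: bytes) -> str:
        """Store a new version of some content and record it on a branch."""
        digest = hashlib.sha256(data).hexdigest()
        self.objects[digest] = data
        self.branches.setdefault(branch, []).append(digest)
        return digest

    def checkout(self, digest: str) -> bytes:
        """Retrieve an exact version by its hash."""
        return self.objects[digest]

    def history(self, branch: str) -> list:
        """All versions ever committed to a branch, oldest first."""
        return self.branches.get(branch, [])
```

A real multiplayer setup would put the object store on shared storage and add review/merge workflows on top, but the core move, immutable versions addressed by hash, is the same one that makes GitHub-style collaboration possible for code.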
0:52:01 And obviously, AI is a big part of that, though it’s not the only thing, since I think, you
0:52:05 know, any new tool, people use a tool, and they overuse a tool, and you bring it back, and
0:52:08 you’re like, okay, it’s a percentage of my thing, but it’s not everything, right?
0:52:11 If we did this whole podcast as AI, and we had, like, computer-generated us, it wouldn’t
0:52:13 be as interesting or what have you, right?
0:52:14 Because it’s generic.
0:52:16 AI is necessarily, it’s almost like a search engine.
0:52:17 It pulls, like, the…
0:52:18 It’s funny, I was saying this the other day.
0:52:21 Midwit writing used to be woke.
0:52:23 Now all midwit writing is AI.
0:52:25 Like, it’s not this, it’s that, right?
0:52:29 So it’s like a superintelligence, yet midwit, right?
0:52:32 But that’s because it’s building the average from the whole internet, so it’s useful when
0:52:33 you prompt it.
0:52:36 Anyway, point is, so that’s another piece on media.
0:52:38 So build your own distribution to avoid distortion.
0:52:40 Go direct, if you have something to say.
0:52:42 Hire creators.
0:52:44 You know, no journos, only influencers.
0:52:45 That’s a related point.
0:52:49 Do not bring journalists to your conference.
0:52:52 Do not bring journalists to your event.
0:52:54 Do not bring journalists anywhere.
0:52:58 Just like they would try to unperson you and cancel you on everything.
0:53:03 Like, you need hard borders, no journos, right?
0:53:08 The reason is, because as Janet Malcolm said, they’re like a con man, right?
0:53:11 There’s normal people who’ll talk to you like a normal person.
0:53:15 And there’s a journo who’ll come there with a mic and try to get someone to say something
0:53:16 and then attack them, right?
0:53:19 And the problem is, this is another big piece of it.
0:53:22 Like, a lot of people, you know how the journalists get them, is they get them on ego.
0:53:24 Okay?
0:53:27 So, it’s actually very similar to like the CIA.
0:53:27 Do you know what I’m saying?
0:53:29 Like, yeah, it’s like mission.
0:53:30 Yeah, exactly.
0:53:31 See, most people don’t understand this.
0:53:36 Like, much of what the NYT and what these guys do, and they’re so much weaker than they
0:53:36 used to be.
0:53:38 Like, so, so, so much weaker.
0:53:39 Thank God.
0:53:40 They’ve lost the center, right?
0:53:45 What happened is, they just piled up the subscriptions, and they got all the wine moms and lost the Andreessens.
0:53:48 Amazing trade for us.
0:53:49 Oh, my God.
0:53:50 They lost Glenn Greenwald.
0:53:51 They lost Marc Andreessen.
0:53:52 They lost Nate Silver.
0:53:55 They lost Bari Weiss, right?
0:53:57 Amazing trade for us, right?
0:53:58 Like, for the center.
0:54:03 Because they actually got Tech Envy, and they just optimized the money, and they lost actually
0:54:05 all influence and power over the center.
0:54:06 Thank God, right?
0:54:07 Fine.
0:54:11 And actually, that’s a good trade, by the way, to be clear, I actually want everybody
0:54:11 to have a good life.
0:54:15 It’s possible that in some alternate reality, some of the journos would be good people.
0:54:16 Not all.
0:54:19 Some of them are genuinely, like, evil people, like a stalkerish personality.
0:54:23 But a good chunk of them, like, for example, Derek Thompson is not a hater.
0:54:24 Yeah, he’s good.
0:54:24 Yeah.
0:54:26 But that’s why he left the Atlantic, right?
0:54:31 So, all the ones who are not haters eventually leave.
0:54:34 Because you have to have the soul of a hater to be a journo today.
0:54:38 You have to have, like, a soul, which is, like, a stalker, envious kind of person, Gollum-like
0:54:40 personality or whatever, to be a journo, right?
0:54:42 And also, you have to not get it.
0:54:44 You have to be, like, not numerical enough.
0:54:47 Because, you know, the pay is better in tech in general.
0:54:48 The employees are treated better.
0:54:52 So, you have to be, like, a hater of a certain kind, a very specific kind, right?
0:54:57 Anyway, point is, the journos, another key concept when interacting with them, or not
0:54:59 interacting, you shouldn’t interact with them, but let’s say when dealing with them,
0:55:03 is they’re like a private, a for-profit CIA or FBI.
0:55:03 Should I explain this point?
0:55:05 Actually, especially CIA.
0:55:05 Okay.
0:55:08 So, the thing is, most people think from, again, remember the Paul Graham thing about how you learn from the movies, Jurassic Park?
0:55:15 So, most people think that what the CIA does is, like, assassination.
0:55:19 But a lot of what it does is actually character assassination.
0:55:20 It plants stories.
0:55:26 Isn’t that much cleaner to just have somebody plant a story, and then they’re discredited?
0:55:29 And it’s so much cleaner than bullets and blood and so forth, right?
0:55:31 In fact, they did this in East Germany as well.
0:55:36 By the way, Wikipedia is actually just as bad as the media, because Wikipedia, it’s garbage
0:55:37 in, garbage out.
0:55:40 What happens is, you can only cite articles from legacy media.
0:55:42 You can’t cite social media directly.
0:55:43 So, Wikipedia’s a rehash.
0:55:46 So, in anything that’s contemporary, anything that’s political, they’re really terrible.
0:55:50 Nevertheless, there’s some articles from when they didn’t get corrupted.
0:55:54 So, a psychological warfare technique used by the Ministry of State Security, it served to combat alleged dissidents through covert means.
0:56:00 Basically, Zersetzung was because the communists were also fighting a similar social war.
0:56:01 They had conservatives.
0:56:01 They had libertarians.
0:56:03 They had non-communists on their territory.
0:56:04 They didn’t want to kill them.
0:56:08 They wanted to convert them, just like the blues flipping the reds to convert to blue,
0:56:08 right?
0:56:13 They were targeted to stop activities of political dissent and cultural incorrectness, right?
0:56:15 And what are the kinds of things they did, right?
0:56:20 Like, they do things like go and mess up your sock drawer to make you think you’re insane
0:56:22 or tell people you’re having an affair, right?
0:56:25 So, it’s like subvert and undermine an opponent, right?
0:56:29 So, disrupt the target’s private life so they’re unable to continue their hostile negative
0:56:30 activities toward the state.
0:56:33 This is what they did to Luke Farritor just now.
0:56:35 Exactly this.
0:56:38 The aim was to disrupt the target’s private or family life so they’re unable to continue their
0:56:40 hostile negative activities toward the state.
0:56:41 Do you see that?
0:56:45 What they’re mad about is that they didn’t care when Luke was just analyzing some old, you know, museum pieces or whatever, right?
0:56:53 But once they’re going after the state, when Doge is going after the state, that’s their bread,
0:56:53 right?
0:56:54 That’s their power center.
0:56:55 That’s their golem.
0:56:58 The FTC investigated this after our articles, right?
0:57:01 So, anything that’s upstream of that, they don’t want us to be upstream of that.
0:57:02 They want to be upstream of that.
0:57:03 Make sense?
0:57:09 So, there’s that saying, you know, like the communists and the journalists are the same, but I repeat myself, right?
0:57:14 And the reason, by the way, I say that is, I may have mentioned this, but like John Reed, Walter Duranty, Herbert Matthews, David Halberstam, Edgar Snow, just take all those names, and those are all the journalists who did the PR for the communists.
0:57:27 It’s literally the reason that Castro’s in power.
0:57:31 For example, there’s this book, The Man Who Created Fidel, right?
0:57:36 The Man Who Invented Fidel, Castro, Cuba, and Herbert L. Matthews of the New York Times.
0:57:37 You see that one, right?
0:57:43 Or here’s another one on Amazon, which is Duranty and Ukraine, basically Stalin’s Apologist.
0:57:44 Isn’t that amazing?
0:57:46 Stalin’s apologist worked at the New York Times.
0:57:49 Castro’s apologist worked at the New York Times.
0:57:50 Crazy stuff.
0:57:52 Actually, there’s another one, Perfect Spy, right?
0:57:55 Which is the Vietnam one, right?
0:58:00 The incredible story of the double life of Pham Xuan An, a Time magazine reporter and Vietnamese communist agent.
0:58:02 Isn’t that interesting?
0:58:06 See, when I say journalists and communists, but I repeat myself, I’m like being completely
0:58:07 literal.
0:58:11 We just pulled up three books in 30 seconds that were about journalists communists.
0:58:15 So, Dylan Matthews from Vox, he’s talking about the Luke Farritor argument.
0:58:17 He says, the negative reaction to this is wild.
0:58:22 If you join the government and your primary legacy is helping to kill millions of people
0:58:25 through aid cuts, you can handle some criticism if you can live with yourself.
0:58:29 The Sulzbergers’ primary legacy is killing, not like through aid cuts, the Sulzbergers’ primary legacy is killing millions of Ukrainians.
0:58:33 Where’s their criticism, huh?
0:58:34 Right?
0:58:37 Again, Dylan Matthews is too much of a coward to do that, right?
0:58:38 The logic is just insane.
0:58:43 It’s claiming that by Dylan Matthews not giving all of his money to aid, he’s killing people
0:58:45 or by not making more money, he’s killing people.
0:58:48 I mean, it’s just, you know, he’s accusing this 23-year-old of killing millions of people.
0:58:50 Why are we giving money in the first place?
0:58:53 Every policy, one of the things about it, notice what they’re doing here.
0:58:57 With the Holodomor, there’s actually a very clear-cut case where Duranty wrote 13 articles that basically said Stalin wasn’t liquidating the Ukrainians.
0:59:04 He just meant it metaphorically.
0:59:07 They’re literally covering up like a murder in progress, okay?
0:59:08 So that was like a very clear-cut case.
0:59:13 Here, you’re talking about, okay, this is the Feed the Pigeon Society argument, by the
0:59:14 way, right?
0:59:20 Like essentially, as the number of the population grows as dependent, any cut to budget whatsoever
0:59:22 for the blues is equated with murdering their dependents.
0:59:24 And they probably believe that, right?
0:59:30 So it grows to the sky and everybody gets alms that, of course, the blues get nice paid non-profit
0:59:31 jobs out of, right?
0:59:32 And of course, you have more and more dependents.
0:59:33 And by the way, guess what?
0:59:35 He’s absolutely completely wrong here as well.
0:59:36 You know why?
0:59:37 Go to Easterly and Levine.
0:59:41 Easterly and Levine said, stop the aid, right?
0:59:42 Why?
0:59:46 Because all the aid is used for these warlords in Africa, right?
0:59:51 Actually, there’s something where in Nigeria, there’s a business plan competition that was
0:59:54 the most successful, quote, aid project ever because they’re making businesses.
0:59:57 Like, the thing is, I saw this myself in India.
1:00:03 Guys like these effective altruists or these guys, they don’t want peers.
1:00:05 They want pawns, right?
1:00:09 Brown people in India were starving in the 80s or whatever.
1:00:11 And they were pawns of these NGOs who sent the aid.
1:00:13 Now, India doesn’t need aid.
1:00:15 India is actually number three in unicorns.
1:00:17 It’s landing on the dark side of the moon.
1:00:19 I’m not saying everything is perfect, but it’s rising.
1:00:23 So, the fundamental premise of his point that aid helps is incorrect.
0:59:29 Aid actually hurts because aid, you know, it’s kind of like testosterone supplementation and the biosynthetic pathway.
1:00:35 If somebody takes too much of an exogenous hormone, it cuts off their natural production,
1:00:36 right?
0:59:40 Like, basically, if people take too much in the way of steroids, you know, it can cut off your natural hormonal production.
1:00:42 You have to get it exactly right.
1:00:46 The actual charity is investment.
1:00:47 This is actually a deep point.
1:00:48 Should I explain this point?
1:00:53 So, imagine you’ve got two, quote, rich guys, okay?
1:01:00 And one of them is Soros or USAID is kind of like a rich institution or somebody who’s handing
1:01:01 out aid, okay?
1:01:02 Grants, seeming grants.
1:01:04 And another person is an investor.
1:01:11 So, for the people who are queuing up to write those grants to seek aid, okay, they are making
1:01:14 themselves as sympathetic or as pathetic as possible.
1:01:18 And in The Limit, it’s like the movie Slumdog Millionaire, right?
1:01:23 Where you see it’s dramatized, but the limbs of the kid are cut off to make them more sympathetic.
1:01:29 It’s almost learned or caused helplessness to either become, to pretend to, or become as
1:01:32 helpless and pathetic and sympathetic as possible so you get the maximum amount of money.
1:01:37 You win the competition for being the biggest loser, in a sense, the biggest victim, right?
1:01:38 That’s what wokeness is, right?
1:01:43 By contrast, if you think about our culture in tech and VC, right?
1:01:46 What we respect more than anything else is strength, right?
1:01:49 Essentially, you come to us, you come to us for a check.
1:01:54 And what we respect the most, in a sense, is if we didn’t put a check in you, but you still
1:01:57 win and you raise from someone else or you do it on your own, you bootstrap.
1:02:01 And then a year later, we’re like, I respect you.
1:02:01 I was wrong.
1:02:03 You were strong enough on your own.
1:02:06 And one way of talking about this is fake it till you make it, but another way of putting
1:02:10 it is, rather than the Slumdog Millionaire of people chopping off their limbs and thinking
1:02:13 about how depressed and pathetic they are to compete for grants and aid, instead, imagine
1:02:17 a bunch of people who are all running a race, like a mile or whatever, right?
1:02:22 They’re running a mile and 20 people compete, only one wins, but the other 19 at least got
1:02:23 a workout, right?
1:02:28 So everybody who’s in the process of trying to raise venture or, not that you have to raise
1:02:31 money, obviously, you can just totally bootstrap it yourself now.
1:02:33 It’s a single person startup is much easier than it’s been.
1:02:40 But anybody who’s in that process becomes stronger as a consequence of it, because you
1:02:43 constantly want to keep giving updates to the investor on all the stuff you’re shipping.
1:02:48 And that means, like, sometimes the easiest way to do that is to actually just ship.
1:02:49 I mean, most of the time, these, right?
1:02:55 So in the process of proving yourself to others, you prove yourself to yourself, right?
1:03:00 So that’s why a small amount of capital, when 20 people compete for it, strengthens the
1:03:00 whole ecosystem.
1:03:05 But a small amount of aid, when 20 people compete for it, weakens the entire ecosystem.
1:03:11 So, and another way of putting this also is, take these wokes who purport to believe in
1:03:12 equality, okay?
1:03:13 The Soros types or whatever.
1:03:17 Are they walking down the street and they’re saying, oh, here’s some guy in the street.
1:03:18 I’m going to give them half my fortune.
1:03:19 Now we’ve achieved equality.
1:03:21 Are they going to knock it down?
1:03:22 So let’s say they’ve got a billion dollars.
1:03:29 Are they going to find 100,000 people and each give them $1,000 or $10,000 so they’ve all
1:03:32 got $10,000 and they’re down to $10,000 so they’re all equal?
1:03:33 It’s within their power to do so.
1:03:35 They could literally hit a button to do so.
1:03:39 If they actually believe in equality, they could instantly achieve equality right now.
1:03:40 Okay?
1:03:45 And indeed, I actually do believe in redistribution for every self-proclaimed socialist just for
1:03:50 them to take their fortunes and redistribute them, opt in to socialism, we’ll take all their
1:03:52 money, all the wokes, right?
1:03:52 They’d like an example.
1:03:54 Yeah, exactly.
1:03:54 That’s right.
1:03:56 Look, we basically opt into that, right?
1:03:59 What they actually want, of course, is to take your money and do something with it.
1:04:01 But you take them at their word, they actually believe in equality.
1:04:04 What they mean by equality, by the way, is equality between themselves and the people
1:04:05 they’re looking up at.
1:04:08 They’re not thinking about all the people who they’re wealthier than or whatever, and that’s
1:04:08 it, right?
1:04:09 Okay, fine.
1:04:12 And another way of putting it is, if somebody’s walking on the street and they see somebody
1:04:14 and they’re down their luck, they might give them a dollar or $10.
1:04:16 They’re not going to give them half their salary.
1:04:18 So charity decelerates.
1:04:22 The more somebody rises, the less sympathetic and pathetic they are.
1:04:26 And in fact, people have talked about like how once somebody gets out of the total underclass
1:04:30 into the working poor, they actually sometimes make less money from all the grants and stuff
1:04:33 because all those cutoffs kick in once they’re considered self-sufficient, right?
1:04:37 You actually can earn your way into a local valley before you earn your way out of it.
1:04:39 It’s a disincentive to work.
1:04:39 Okay.
1:04:43 On the other side of things, for example, take Thiel and Zuck.
1:04:44 This is a very famous example.
1:04:45 There’s many more like this.
1:04:49 Zuck started out much, in a sense, poorer than Thiel.
1:04:51 Thiel put in $500K.
1:04:52 Zuck is now much richer than Thiel.
1:04:56 But Thiel also became much richer in consequence, right?
1:05:04 So that’s an example of investment actually achieves redistribution of fortunes or creation of fortunes
1:05:08 or greater equality in a way that charity never would, right?
1:05:10 So capitalism is the ultimate socialism.
1:05:13 In the same way like the phones that got to everybody in the world, the billions of phones,
1:05:14 capitalism did that.
1:05:16 Aid didn’t do that, right?
1:05:20 All this USAID stuff is just aiding blue NGOs.
1:05:22 What he’s actually mad about, go ahead.
1:05:24 I was laughing at the truth of that.
1:05:25 That’s the truth of it, right?
1:05:29 So the fundamental premise of his point is exactly wrong.
1:05:33 And so you’re taking away their pets, you’re taking away their pawns, you’re taking away
1:05:35 their reason for existing.
1:05:37 And of course, they’ll pathologize that, right?
1:05:39 But actually, they’re doing harm to them, right?
1:05:40 They’re not helping them.
1:05:41 Helping is investment.
1:05:46 I mean, it obviously goes to the old saw of like teach a man to fish versus give a man a
1:05:46 fish, right?
1:05:49 But give a man a thousand fish forever, they become completely dependent.
1:05:51 And that’s actually the goal of it.
1:05:54 Really, what’s happening is the cutoff of USAID is rolling up Blue Empire.
1:05:56 So it’s killing the blue business model.
1:05:58 That’s what they’re mad about with Luke.
1:05:59 Okay, keep going.
1:06:02 I think going back to Luke, what’s our advice to him?
1:06:04 Or how do you sort of react or reflect to the situation?
1:06:05 What should he do now?
1:06:06 So I don’t know how they got that photo of him.
1:06:08 Did he sit for that photo?
1:06:08 No, no, no.
1:06:09 I don’t think so.
1:06:09 Yeah.
1:06:13 So point is, I think overall, he didn’t talk to journos, which is good.
1:06:14 Look, Luke will be fine.
1:06:15 Why will Luke be fine?
1:06:17 Because his tribe supports him, right?
1:06:20 And the journos’ ability to impact somebody else.
1:06:25 With that said, the reason they do this stuff now is, and this is the unfortunate part,
1:06:29 they do this stuff, and they post this, take this Dylan thing.
1:06:33 Like, by his logic, oh, then someone would be justified in Luigi type stuff, right?
1:06:37 This is really the very dangerous thing about what these journos are doing.
1:06:40 They’re trying to essentially foment hatred against tech guys.
1:06:44 What have we done besides make things cheaper, faster, better, right?
1:06:48 Wow, I can now communicate with anybody, anywhere, at any time for no money.
1:06:50 I can find all the world’s information at my fingertips.
1:06:53 I can do math and computer science.
1:06:54 I can do simulations.
1:06:55 We can launch rockets.
1:06:56 We’ve got electric cars.
1:06:58 Oh, we’re the bad guys, right?
1:07:02 Versus the people who are just like stalking and spamming everybody all the time, right?
1:07:06 So first thing is just to have incredibly strong moral bedrock frame.
1:07:08 Understand that everything that journos are doing is projection.
1:07:13 I think advice to Luke, the first thing, by the way, is I actually think that had tech
1:07:15 ignored that article, it wouldn’t have gone anywhere, right?
1:07:15 Right.
1:07:16 That’s another piece about it.
1:07:17 Don’t take the bait.
1:07:21 The irony is that this is the opposite of a hit piece for him in that he’s now got
1:07:21 defenders.
1:07:25 We know it’s a puff piece in our circles, like it’s a badge of honor.
1:07:26 Yeah, yeah, that’s right.
1:07:29 So I think what I would say is I think his family and friends should consider going to
1:07:32 Starbase Texas or something like that, right?
1:07:38 Basically, you want to now sort at this point and you want to sort and be amidst communities
1:07:40 of people who share your values, right?
1:07:42 And the sooner you do that, the better.
1:07:46 And the reason is you just don’t want to have crazy blue people around you.
1:07:51 The Tesla terrorists and so on who are blowing things up, the Luigis, so on and so forth,
1:07:51 right?
1:07:53 So that is actually the danger here.
1:07:55 And I don’t say that lightly.
1:07:59 And in fact, if you want, we can just bleep out Luke’s name, you know, and so on, because I don’t want to, you know.
1:08:05 But basically, I think the issue with this is, here’s another kind of recommendation for tech guys.
1:08:08 There’s a decision rule.
1:08:10 Don’t take the bait.
1:08:14 The journos only get traffic for their articles when they get rage views from us.
1:08:14 And guess what?
1:08:19 They got 700K views or whatever for this tweet and they got conversions because they sold
1:08:20 ads, right?
1:08:24 And so that’s a dub for them in a sense, right?
1:08:27 I mean, look, it’s not like a total dub because it’s certainly not getting anybody fired or anything
1:08:28 like that.
1:08:30 It’s much less of a dub, but it’s some kind of dub.
1:08:35 So, okay, a while ago, there was some journo who was like doing some like cover story or
1:08:36 something like that.
1:08:40 And they put people on a tech guy for like 15 months or something.
1:08:43 And he just completely ignored the entire thing.
1:08:46 And it got no clicks and it got no views or anything.
1:08:49 The opposite of love isn’t hate, it’s indifference, right?
1:08:56 The fact that, like, another way of putting it is, so we’ve built up much of the supply chain,
1:08:57 but not all of it, right?
1:09:01 So, the most important thing is we’ve gotten X and we’ve reestablished control over the
1:09:07 platform because they had gone deep into our territory and actually had crazy blue stuff
1:09:08 in some of our citadels, right?
1:09:09 Like our VC firms.
1:09:12 So, one of the things they were doing is they’re trying to target the stuff that’s upstream,
1:09:16 the platforms, the venture capitalists and so on to try to hit the, you know, why they
1:09:17 go after Uber, right?
1:09:20 They didn’t want to know the trillion dollar company, certainly not the libertarian one.
1:09:21 Why they go after VCs?
1:09:23 They’re the ones distributing capital.
1:09:27 If they go after these nexus points, these critical nodes, they could try to hit those
1:09:28 if you’re in a social war.
1:09:30 It’s like taking a capital city or a town.
1:09:32 These are important nexus points, right?
1:09:33 You don’t want to go after a desert.
1:09:34 You go after nexus points, right?
1:09:36 So, they were deep in our territory.
1:09:37 So, now we took back the platform.
1:09:38 That’s good.
1:09:44 And we’re flanking mainstream media with tweets and podcasts, right?
1:09:48 Ultra short form and ultra long form content where they don’t have as much establishment
1:09:49 oomph, right?
1:09:52 We figured out the formula that works, which is individual over institutional.
1:09:59 Now, the next step, and this is the big story for the next five years or so, the ledger of
1:10:00 record, right?
1:10:02 Ultimately, you can’t be a critic.
1:10:04 You have to be a constructive critic, right?
1:10:09 Like, like, Ron Paul said, end the Fed, and Satoshi implemented Bitcoin, right?
1:10:11 So, you have the criticism, then you have the construction.
1:10:13 So, we actually have to build something better.
1:10:15 We have to build internet-first media, right?
1:10:18 So, that is that whole talk I gave on the ledger of record.
1:10:21 And that, by the way, that talk was originally from, like, 2020.
1:10:25 And I actually feel pretty good about that, essentially predicting that GPT-3, the next version
1:10:27 of it, would be able to summarize things.
1:10:30 It happened faster than I thought, but, like, I think the projection was correct, right?
1:10:31 And the fundamental…
1:10:31 The box scores.
1:10:32 The box scores.
1:10:33 Exactly.
1:10:40 The fundamental premise is, if you think about a sports article, and I phrased it slightly differently then, but it’s essentially a wrapper around a box score.
1:10:50 Or if you take a financial article, it’s essentially a wrapper around stock ticker symbols.
1:10:53 And you take a political article, it’s a wrapper around tweets.
1:10:54 That’s a raw feed.
1:10:57 It’s the numbers that underpin the letters, right?
1:11:01 So, now, if you think about what a blockchain is, it’s a cryptographically verifiable feed.
1:11:03 That’s, in a sense, what Bitcoin is, right?
1:11:07 What a blockchain is, it’s a stream of events similar to Twitter or any other event-based
1:11:10 feed, except it’s got much harder cryptographically verifiable guarantees.
1:11:12 Proof of what, when, and where, right?
1:11:14 Or proof of what, when, and who.
1:11:15 What is the hash?
1:11:16 When is the timestamp?
1:11:18 Who is the digital signature?
1:11:21 You can also do, like, proof of location, proof of where, and other kinds of proofs, right?
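As an aside for readers, the what/when/who proofs just described can be sketched in a few lines. This is a toy Python illustration, not any real chain's API: the HMAC tag stands in for a proper public-key signature (e.g. Ed25519), and on a real chain the timestamp would be fixed by block consensus rather than passed in by the poster.

```python
import hashlib
import hmac

def attest(statement: bytes, signing_key: bytes, timestamp: int) -> dict:
    """Toy what/when/who record for a posted statement.
    what: SHA-256 hash of the content (hard to forge a matching document)
    when: timestamp (on a real chain, fixed by consensus, not the poster)
    who:  keyed tag standing in for a public-key digital signature"""
    return {
        "what": hashlib.sha256(statement).hexdigest(),
        "when": timestamp,
        "who": hmac.new(signing_key, statement, "sha256").hexdigest(),
    }

record = attest(b"original photo bytes", b"photographer-key", 1567000000)

# Anyone holding the original bytes can recompute the hash and confirm the
# record refers to exactly this content and existed by `when`:
assert record["what"] == hashlib.sha256(b"original photo bytes").hexdigest()
```

Note that the record proves metadata, the what, when, and who, not that the content itself is true.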
1:11:28 So, that stream of cryptographic proof is, like, a better Twitter in that sense, like, cryptographically
1:11:28 verifiable Twitter.
1:11:32 Then you have AI referencing that to create articles, right?
1:11:36 That’s a high-level concept of the ledger of record that replaces the paper record.
1:11:38 We have to play to win.
1:11:45 And so, we have to essentially realize that that is the center of the whole thing, right?
1:11:46 Truth.
1:11:49 And, actually, we have a better form of truth.
1:11:49 You know what that is?
1:11:51 It’s a form that is native to us.
1:11:52 Crypto.
1:11:53 Yes.
1:11:58 And, specifically, there is a good book, actually, by a reformed journo, or two somewhat
1:11:59 reformed journos.
1:12:01 These are, like, the ones who are not haters, right?
1:12:03 Vigna and Casey, they're okay.
1:12:06 The Truth Machine: The Blockchain and the Future of Everything, this is, like, seven years
1:12:06 ago.
1:12:11 And the thing is, this just basically puts in book form a concept that existed for a long
1:12:11 time.
1:12:13 So, I’ve just got a citation for the concept, right?
1:12:18 So, essentially, the point is that Bitcoin is decentralized cryptographic truth.
1:12:22 Like, essentially, the whole thing about Bitcoin that’s so hard is, how do you get
1:12:26 global consensus on who owns what BTC?
1:12:32 And we have something now where, whether you’re a Democrat or Republican, Japanese or Chinese,
1:12:35 Indian or Pakistani, everybody agrees on the state of the Bitcoin blockchain.
1:12:39 They have global consensus on this thing, which is worth trillions of dollars.
1:12:41 People fight wars over billions of dollars, millions of dollars.
1:12:43 They kill people over thousands of dollars sometimes.
1:12:47 So, to have global consensus on this with no policemen or no military backing it, right?
1:12:51 You know that saying, like, how many divisions has the Pope, which Stalin would say, right?
1:12:54 How many long divisions has the New York Times, right?
1:12:56 They don’t have any, right?
1:12:57 Bitcoin has, right?
1:13:00 So, we actually have truth on our side.
1:13:04 A more powerful form of decentralized cryptographic truth.
1:13:06 It’s not headquartered in downtown Manhattan.
1:13:08 It’s on the internet.
1:13:12 Let me weigh in for a second, because this all sounds compelling, but I also just want to
1:13:16 celebrate right now we have John Coogan and Mike Solana and us and lots of other folks doing
1:13:19 such great work without sort of the crypto elements.
1:13:20 And so, why is that necessary?
1:13:21 Like, what’s missing?
1:13:21 Yes.
1:13:22 Great question.
1:13:28 So, what we’re doing with all the commentary is necessary but not sufficient, right?
1:13:33 Because commentary, it’s humorous, it’s opinion, right?
1:13:34 We need reporting.
1:13:34 We need news.
1:13:35 You need reporting.
1:13:36 Exactly.
1:13:36 That’s right.
1:13:40 So, commentary and summarization, right, is over here.
1:13:42 But news is the update.
1:13:47 Now, the thing is, Twitter is obviously a feed of raw facts that people are putting out there
1:13:47 in decentralized way.
1:13:54 Bitcoin extends that because it actually says, once you can get consensus on who owns
1:13:58 what BTC, you can also get consensus on who owns what stocks, what bonds, what Ethereum,
1:13:59 smart contracts.
1:14:02 And actually, as I wrote in an article the other day, all property becomes cryptography.
1:14:03 We can do that in a different session.
1:14:06 You basically have consensus on who owns what property.
1:14:09 So, all valuable things you can get cryptographic consensus on, right?
1:14:16 And Chainlink and stuff like that, they built essentially armored cars for information, sending
1:14:17 it up and down to the blockchain.
1:14:22 Polymarket, armored cars for information, where information on the internet that’s commercially
1:14:24 valuable can be protected by cryptography and set there.
1:14:27 So, let's take the case of PirateWires and TBPN.
1:14:28 That’s great.
1:14:32 But let’s say there’s some dispute over whether a photo is real or not, right?
1:14:38 Like, a great example is, The Atlantic published this crazy piece calling for invading Brazil
1:14:43 because they saw a photo of the Brazilian fires, okay?
1:14:45 And it’s like, here, it’s like this crazy piece.
1:14:47 And was the photo not real?
1:14:49 Yes, exactly, right?
1:14:51 That’s a very clear example, okay?
1:14:55 The Amazon fires are more dangerous than WMD, okay?
1:14:59 This is a great example of, when I said, they’ll literally kill you for clicks, right?
1:15:03 This is something they’re calling for the invasion of Brazil on this basis, right?
1:15:06 Because they’re like, oh, the Amazon fires are burning.
1:15:12 And it was all on the basis of a fake photo that actually Macron had tweeted out, which
1:15:16 turned out to be a photo that was taken by a photojournalist who died years ago.
1:15:17 So, it’s from like some stock.
1:15:23 So, there was a timestamp that showed that that photo existed many years ago, right?
1:15:26 So, it wasn’t of a current event, okay?
1:15:29 And so, is that amazing, right?
1:15:29 Yeah.
1:15:32 That's something where the etiology is,
1:15:33 in a sense, cryptographic.
1:15:33 Why?
1:15:38 Because you load the website, you see that it’s HTTPS, right?
1:15:42 That means there’s actually a cryptographic authentication that it’s like Getty Images or
1:15:42 wherever it was.
1:15:44 It was basically at some stock photo thing.
1:15:45 So, we could see the old timestamp.
1:15:47 You might hit archive.is, right?
1:15:52 So, you have implicit cryptographic verification of the timestamp of that image that was going
1:15:54 to be used to cause a war, right?
1:16:01 So, that’s a concrete example of why control over truth is so important to them.
1:16:02 Why did they put up the billboards there?
1:16:07 Because once you determine what is true and false, did Russia collude with Trump, right?
1:16:08 No.
1:16:10 It’s all fake, right?
1:16:13 It took a massive court process to adjudicate that.
1:16:17 And fortunately, the court system wasn’t corrupted enough that it went through.
1:16:22 But the New York Times collected all these Pulitzers for this, for basically false information, right?
1:16:25 One way that’s interesting, though, and this actually helps give some insight into it.
1:16:29 There’s a tech person who’s a lib, okay, who I won’t name.
1:16:33 But basically, during the whole Russiagate thing, said, oh, yeah, I know this is just as
1:16:35 good as Game of Thrones or something.
1:16:36 And I realized, oh, wow.
1:16:38 Remember that Paul Graham thing about the movies?
1:16:45 These people were treating this as if it was like an entertainment show with Trump as a villain
1:16:48 with the Times or with any legacy media.
1:16:50 And this is a very obvious thing, but, you know, it was obvious in the past.
1:16:54 You can predict what they’re going to say about somebody before they say it.
1:16:57 They have very low information content on each thing, right?
1:17:00 It’s Trump bad, blue America good.
1:17:05 And it’s like a cast of characters, almost like Seinfeld, where the same cast of characters,
1:17:08 the good guy and bad guy appears on the page, and you can just auto do it, right?
1:17:12 In fact, did I show you the robo journo from three years ago?
1:17:12 I don’t know.
1:17:13 I’m trying to remember.
1:17:14 Oh, yeah.
1:17:15 So this was a bounty.
1:18:21 It was a prize I put up as soon as AI came out, and we can do a lot more with this, by the way,
1:17:21 but I’ll show you this.
1:18:05 So I put out a call to use AI to generate NYT-tier clickbait from tweets.
1:17:27 Remember my thing?
1:17:27 Oh, yeah.
1:17:33 Yeah, I put out the theoretical article was in 2020, where I’m like, you know what?
1:17:38 We could have a feed of data, and all of these journos are just a wrapper around that feed.
1:17:43 And the reason I knew GPT-3 might get there is there’s a company called Narrative Science, actually.
1:17:44 Have you ever seen them?
1:17:45 No.
1:17:47 So Narrative Science, it went bust.
1:17:49 It was a good company, just a little too early, okay?
1:17:54 So this was a few years before the ChatGPT moment, okay?
1:18:02 And Narrative Science, what it did, which at the time was really cool, is it took your financial reports, right?
1:18:06 And it would say, revenue is high in the Northeastern segment, 65K, 40K, you know.
1:18:10 So it would basically generate a narrative from your raw data.
1:18:11 So it was like a readable narrative.
1:18:12 Make sense?
1:18:12 Yep.
1:18:18 So because I saw that, I knew that it was probably possible as this technology advanced to take raw feeds of data
1:18:21 and summarize them in essentially story form, right?
1:18:22 With me so far?
1:18:23 Yep.
1:18:23 Okay.
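The "wrapper around a box score" claim can be made concrete. This is a hypothetical toy generator in Python, just the flavor of what Narrative Science did with raw financial data, not their actual product:

```python
def narrate(box_score: dict) -> str:
    """Turn a raw {team: points} box score into a one-sentence story."""
    winner = max(box_score, key=box_score.get)
    loser = min(box_score, key=box_score.get)
    margin = box_score[winner] - box_score[loser]
    return (f"{winner} beat {loser} {box_score[winner]}-{box_score[loser]}, "
            f"a margin of {margin} points.")

print(narrate({"Lakers": 112, "Celtics": 104}))
# → Lakers beat Celtics 112-104, a margin of 8 points.
```

Everything a templated recap "adds" is style; the information is already in the score feed, which is the wedge AI drives through.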
1:18:26 So that was the kind of theory in 2020.
1:18:32 And then the practice by 2022 was once ChatGPT came out, right?
1:18:36 I put out a call to use AI to generate NYT-tier clickbait from tweets.
1:18:38 One brave engineer answered the call.
1:18:39 A student who learned how to code on Replit built it.
1:18:41 His app takes a tweet, generates an article.
1:18:42 It’s already in the ballpark.
1:18:45 And you can see from this video, the GPT times, right?
1:18:46 You see this, right?
1:18:47 Yep.
1:18:47 Okay.
1:18:49 See, let me just, I’ll rewind this, okay?
1:18:51 So here it takes the Elon thing, right?
1:18:52 Elon tweet.
1:18:54 It goes here.
1:18:55 Paste it in.
1:18:56 It churns a little bit.
1:18:57 Okay.
1:18:58 It calculates.
1:19:00 He’s showing all the other articles.
1:19:01 Look, he generated it with the aesthetics.
1:19:03 It looks like NYT, right?
1:19:04 Yeah, so good.
1:19:04 Oh my God.
1:19:05 All right.
1:19:06 This was three years ago.
1:19:08 We can do so much more with this.
1:19:08 All right.
1:19:12 Now, boom, putting the cocaine back in Coca-Cola.
1:19:16 And look, it looks exactly like NYT-tier clickbait, right?
1:19:19 No journo, only robo.
1:19:22 Okay.
1:19:23 No journo, only crypto.
1:19:27 Because we can also have these be, look, see, there’s a code and so on and so forth.
1:19:30 There’s a saying system administrators have, be careful,
1:19:32 we’ll replace you with a very tiny shell script.
1:19:35 We can just automate, right?
1:19:37 Automate and completely obviate.
1:19:42 And the thing is, actually, all the journos have these unions where they’re against AI.
1:19:44 They’re against AI, they’re against AI.
1:19:48 This is, by the way, similar to, I think, the U.S. imposing tariffs or the Red America
1:19:49 imposing tariffs on China.
1:19:51 It’s like Blue America imposing tariffs on AI.
1:19:52 I don’t think it’s going to work.
1:19:58 But basically, Blue America imposing tariffs on AI is a protectionist late-breaking thing
1:20:01 where they think, okay, we can protect our revenue from this and there won’t be any
1:20:02 AI-based disruptors of us.
1:20:04 But there will be.
1:20:07 And they’re going to be internet first because there’s a lot of English speakers online
1:20:09 and most of them don’t live in the U.S.
1:20:11 And so there’s a lot of talent out online.
1:20:16 And so one piece of this is what I just showed.
1:20:22 And the crucial thing about that is those stories there can have all the backlinks and citations,
1:20:23 right?
1:20:25 So they show the raw tweets that are underpinning it.
1:20:28 And if you click, you can just change the style.
1:20:29 I want this conservative.
1:20:30 I want this liberal.
1:20:36 You essentially now have turned all of the massaging and Russell conjugation into a style setting.
1:20:40 Russell conjugation is: you're doing a bad thing, but I'm doing a good thing, the same act described with opposite spin.
1:20:41 Like Zuckerberg has another great one.
1:20:44 They attack Zuck for having dual-class stock.
1:20:46 Just to show you how just evil these guys are.
1:20:48 You can’t fire Mark Zuckerberg’s kids.
1:20:51 That’s the problem with tech companies using dual-class stock schemes, right?
1:20:54 So it’s all like presidents are not kings, right?
1:20:57 So now this, maybe you’d believe this argument on its own, okay?
1:20:59 But the next day, what do they do?
1:21:03 Or the previous article, it's like how Punch protected the Times, right?
1:21:06 So here, the solution was to give the family exactly that.
1:21:12 So dual-class is good when they do it, and it’s bad when tech does it, right?
1:21:15 Now, the thing is, you have to have a long context window.
1:21:18 Like I have a long context window because I remember this article from 2012,
1:21:20 and I remember this one from 2019, right?
1:21:22 So you have to have a long context window.
1:21:25 And until recently, I didn’t know how to show somebody else
1:21:26 to find all these internal contradictions.
1:21:27 But guess what?
1:21:29 AI can do that.
1:21:30 AI can do that.
1:21:35 AI can find every internal contradiction to NYT ever, okay?
1:21:38 And so you could just have them, NYT versus NYT.
1:21:41 They’re enslaving people, and then they’re pretending they’re unsaid.
1:21:43 There’s just so many things like that, right?
1:21:44 The Ukraine pro and con, right?
1:21:46 Okay, so coming back to your point.
1:21:50 We need to have a stronger form of truth
1:21:55 because if we don’t have that, you’re essentially accepting their premise
1:21:57 that this event happened, right?
1:21:58 Right.
1:22:00 The crypto stuff I buy, but even before that, we haven’t been able,
1:22:02 we’ve been able to build commentary, but we haven’t, to your point,
1:22:04 had enough sort of pro-tech reporters.
1:22:06 So the ecosystem had to be there, right?
1:22:07 The ecosystem had to be there.
1:22:08 Things had to work.
1:22:09 Block space had to get there.
1:22:10 AI had to get there.
1:22:14 Like, we needed the field clear for what we're going to do,
1:22:19 which is decentralized cryptographic truth, right?
1:22:21 Decentralized cryptographic truth, where it’s free,
1:22:25 it’s verifiable on your computer, right?
1:22:27 That’s the thing about the Bitcoin blockchain, you can verify it.
1:22:30 Now, one of the things, I should be more clear about exactly what I mean by true or whatever.
1:22:36 When a statement is posted on chain, what you can verify is the metadata, right?
1:22:40 You can say, it’s very hard to falsify the time at which this was posted.
1:22:44 It's very hard to falsify the hash because of properties of cryptographic hashes.
1:22:50 And it’s very hard to falsify the digital signature of what entity posted it, right?
1:22:53 Each of those three things has certain cryptographic guarantees that I can get into why they’re hard.
1:22:55 But they’re hard to falsify that.
1:22:59 That doesn't mean the content itself is true; it could be an AI image that you posted on chain.
1:23:04 But it would be hard, five years later,
1:23:07 to say that AI image never existed before, when I can see proof of it.
1:23:09 It’s like the Brazilian fires photo is a great example of that, right?
1:23:12 Another example, in a Chinese court, actually,
1:23:18 blockchain evidence was used to show that someone had a patent that was invalid
1:23:21 because somebody had posted something very similar to it many years ago
1:23:23 so they could use the hash to show they had priority.
1:23:24 Does that make sense, right?
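The priority argument in that court case reduces to a hash lookup. A toy sketch, where the chain records and document bytes are invented for illustration:

```python
import hashlib

# Minimal stand-in for an on-chain log: each record is a content hash plus
# the (consensus) year it was posted.
chain = [
    {"hash": hashlib.sha256(b"patent draft v1").hexdigest(), "year": 2015},
    {"hash": hashlib.sha256(b"unrelated post").hexdigest(), "year": 2018},
]

def earliest_proof(document: bytes, chain: list):
    """Return the earliest year the exact document provably existed, else None."""
    h = hashlib.sha256(document).hexdigest()
    years = [r["year"] for r in chain if r["hash"] == h]
    return min(years) if years else None

assert earliest_proof(b"patent draft v1", chain) == 2015  # priority established
assert earliest_proof(b"never posted", chain) is None     # no record, no claim
```

A later patent claim over material whose hash sits in an older record loses priority; the hash says nothing else about the document, which is exactly the limited, hard-to-fake guarantee being described.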
1:23:28 So, there’s enough stuff that we’ve done in crypto
1:23:32 with proof of location, proof of this, proof of that, proof of solvency.
1:23:35 There’s many kinds of attestations and proofs
1:23:37 that you can put on chain that are pretty hard to fake
1:23:39 that is a fundamentally new set of primitives
1:23:42 that journos aren’t equipped to deal with
1:23:44 because we’re talking about math, right?
1:23:46 And they can’t do math.
1:23:47 They’re anti-selective.
1:23:49 If they could do math, they’d be in tech, usually, right?
1:23:51 But math is a universal property of humans.
1:23:53 You don’t need a subscription to the New York Times to do math.
1:23:56 I don't need to pay Sulzberger to do math, right?
1:23:57 Someone in India, someone in the Philippines,
1:23:59 someone in the South, someone in the North, wherever,
1:24:00 you can do math.
1:24:03 You don’t have to subscribe here for the truth, right?
1:24:06 Like, the truth is actually everybody’s thing, right?
1:24:07 Everybody should have access to the truth.
1:24:08 You shouldn't have to pay the Sulzbergers for the truth.
1:24:11 And in fact, I refuse to pay the Sulzbergers for the truth, right?
1:24:14 I don’t allow them to centrally determine what truth is.
1:24:16 That's exactly the same thing as Pravda in the Soviet Union, right?
1:24:18 So it gets a very fundamental thing
1:24:21 where tech guys are sensing there’s something here,
1:24:23 but ultimately the network has to supplant the state
1:24:25 as the form of truth.
1:24:27 That’s what Bitcoin represents, the truth machine.
1:24:30 And it gives a set of primitives, as I mentioned,
1:24:31 the who, the what, the when,
1:24:33 and then with other things, we can send that to the where,
1:24:36 that we can actually have a feed of facts, right?
1:24:38 So once you have the root feed of facts,
1:24:40 and think of it as like Twitter,
1:24:42 but with decentralized cryptographic verification.
1:24:43 That’s one way of thinking about it, right?
1:24:45 Imagine you have a bunch of checks, community notes,
1:24:47 but a bunch of check marks at the bottom,
1:24:49 like a continuous integration with GitHub, right?
1:24:51 Where you have a bunch of checks that’s green or red
1:24:53 if the site is deploying properly.
1:24:54 You have a bunch of assertions on it.
1:24:56 Think of it as Trugle, right?
1:24:58 It’s like Google, but for truths.
1:25:00 And you just run every assert,
1:25:01 and all these models are saying
1:25:03 whether something is true or not, right?
1:25:05 And there’s some computation there,
1:25:07 but if it’s valuable enough,
1:25:09 I should put out a prize just for this, by the way.
1:25:09 You know what?
1:25:12 Actually, at ns.com, I’ll put out a prize.
1:25:13 Go to ns.com for a session to earn.
1:25:15 Actually, we’ll put that up on screen.
1:25:16 I’ll send that link to you right after this.
1:25:19 I’ll put out a prize for decentralized cryptographic truth
1:25:20 and Farcaster, right?
1:25:24 Where essentially, you can maybe pay a little bit of crypto
1:25:26 for model evaluations to just fact check something.
1:25:28 It’s sort of like, at Grok, do this.
1:25:29 But I think a better way of doing it
1:25:30 is have multiple models do it,
1:25:32 give like the premises,
1:25:33 give the backlinks and so on and so forth.
1:25:34 And then eventually,
1:25:36 those things should be on chain where it links to.
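A minimal sketch of that "Trugle" idea, with a hypothetical name and stub evaluators where a real system would plug in independent AI models returning verdicts plus backlinks: each evaluator votes on a claim, and the claim gets a CI-style green check only on supermajority agreement.

```python
from typing import Callable, List

def trugle(claim: str, evaluators: List[Callable[[str], bool]],
           threshold: float = 2 / 3) -> str:
    """Run independent fact-check evaluators over one claim and aggregate,
    like CI checks on a pull request: green only if enough evaluators agree."""
    votes = [evaluate(claim) for evaluate in evaluators]
    share = sum(votes) / len(votes)
    return "green" if share >= threshold else "red"

# Stub evaluators; in practice, different models each returning a verdict.
evaluators = [
    lambda claim: "2019" in claim,   # e.g. checks the cited timestamp
    lambda claim: len(claim) > 10,   # e.g. checks the claim is well-formed
    lambda claim: True,              # e.g. a model that found a backlink
]

print(trugle("Photo first published in 2019", evaluators))  # → green
```

Paying a little crypto per evaluation and anchoring the verdicts on chain, as suggested above, would make each green check auditable after the fact.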
1:25:38 And by the way, you know who agrees with me so much,
1:25:39 I think on this is Solana,
1:25:41 where he's like, we need to do more reporting,
1:25:42 not just commentary and so on.
1:25:43 And a good version of that
1:25:46 is Nick Carter’s work on Operation Chokepoint, right?
1:25:46 That’s great.
1:25:49 So that’s a great example of something which is reporting
1:25:52 and not just summary, right?
1:25:53 Not just commentary.
1:25:55 Another example of this,
1:25:56 and what’s interesting, by the way,
1:25:59 is notice that our first-party testimony,
1:26:01 see, when we give first-party testimony,
1:26:02 in aggregate, that’s actually reporting.
1:26:04 So we’re doing things,
1:26:07 like, you know how someone who has like raw talent
1:26:10 in basketball or football or something
1:26:11 can do things,
1:26:12 and they don’t necessarily have great form,
1:26:14 but they can just somehow get it done
1:26:15 with just raw athletic talent there.
1:26:18 There’s a lot of things we’re doing that are good,
1:26:20 that are done on raw, like intuition.
1:26:21 Because when you have a bunch of people
1:26:24 who are posting on X and not talking to journalists,
1:26:26 then the quotes get pulled.
1:26:27 Because people used to say,
1:26:28 I'm canceling my subscription.
1:26:30 And that was always fake and stupid, right?
1:26:32 Because who cares?
1:26:33 They’ve got a million subscribers.
1:26:35 That doesn't do anything, really, except en masse.
1:26:37 See, they can get another subscriber,
1:26:40 but they can’t get another quoter, right?
1:26:44 They can’t get another supplier of quotes, right?
1:26:46 Because there's only one a16z.
1:26:47 There’s only one Elon.
1:26:49 What happens when you email, like, PR at Tesla or something?
1:26:51 It just replies back with a poop emoji.
1:26:52 What did he reply back to the Washington Post?
1:26:55 He’s like, send my regards to your puppet master, right?
1:26:57 Because he knows, right?
1:26:59 He knows that basically, like,
1:27:01 that they won’t criticize their boss, only yours, right?
1:27:04 So we did this thing intuitively
1:27:07 by freezing them out, of quotes,
1:27:08 not talking to them,
1:27:09 and posting the stuff ourselves.
1:27:11 Now they’re just reduced to bloggers.
1:27:13 Now they’re not sourced.
1:27:14 See, that’s another thing, by the way.
1:27:17 Like, an important concept is,
1:27:19 like, how do the good journalists operate?
1:27:20 You’ll see some of them,
1:27:22 they are almost like a CIA station chief.
1:27:23 They’ll post in their Twitter,
1:27:25 for tips, email, you know,
1:27:27 message me at Signal, this, that, and the other, right?
1:27:28 They’re literally saying,
1:27:29 it’s like a CIA bureau chief
1:27:32 who’s set up their office there in this country,
1:27:34 and, like, some weak country
1:27:35 can’t do anything about that, right?
1:27:38 It’s like a KGB officer who’s there in the country,
1:27:40 and they can’t be deported or whatever
1:27:41 because they’re, like, some embassy rights, right?
1:27:43 So they’re, like, spying on Facebook.
1:27:44 They’re spying on Meta.
1:27:45 They’re trying to solicit leaks.
1:27:48 And why do people leak at these companies
1:27:49 if they leak at these companies?
1:27:50 For the same reasons, you know,
1:27:52 I think it’s, like, M-I-C-E,
1:27:54 you know what that is in the CIA?
1:27:57 Money, ideology, compromise, and ego, right?
1:28:00 So why do people leak to journos?
1:28:02 Why do people talk to journos?
1:28:04 Sometimes it’s money where there’s,
1:28:05 for example, at Uber,
1:28:08 like, the VCs there wanted money,
1:28:10 and Travis didn’t want to sell or IPO,
1:28:12 so that’s why they did it, in part.
1:28:14 Ideology, why?
1:28:16 Because sometimes they’re far left within an organization
1:28:19 and they want to attack that organization.
1:28:22 Compromise, well, that’s interesting.
1:28:25 That’s often, sometimes the journo will have something on somebody
1:28:27 and they’ll say, I won’t print this if you give something else.
1:28:29 That’s not an economic transaction,
1:28:31 but that’s a very dastardly thing.
1:28:33 So it’s like, yeah, don’t talk to journos.
1:28:35 Everybody, what happens is,
1:28:37 the NYT or WaPo or whatever,
1:28:40 they’ll message you and they’ll put on their nicest kind of thing.
1:28:43 They’re taught to flatter and sympathize in the email.
1:28:43 You know what it’s like?
1:28:44 Actually, you know what it’s exactly like?
1:28:50 Our SDRs, our sales development guys, our sales guys, right?
1:28:53 They send out emails that are really crafted,
1:28:55 cold email, blah, blah, blah, things, right?
1:28:55 To make the sale.
1:28:58 And it’s a completely calculated thing, okay?
1:29:01 Go and look at, I don't know, Mark Cranney's stuff on sales.
1:29:04 If you need a filter, an analogy to understand the journos,
1:29:06 the journos are sending you sales emails.
1:29:09 The difference is, they’re scam sales emails.
1:29:12 It’s like a Nigerian, whatever, it’s like a scammer email, right?
1:29:14 So at least when we’re doing enterprise sales,
1:29:16 maybe it’s an aggressive sale at times,
1:29:16 or whatever, someone’s doing it,
1:29:18 but the product has to work.
1:29:19 They can cancel subscription or whatever.
1:29:22 It’s not, ha, you bought the product.
1:29:23 Now we got malware on your property.
1:29:24 We’re going to destroy your company.
1:29:27 That’s actually what the journos sales email is like, okay?
1:29:29 So there’s an analogy, you can only go so far, right?
1:29:33 And the point being that the ego part, M-I-C-E,
1:29:35 just like the CIA, the bureau chief,
1:29:38 people will do it to get their name in the press.
1:29:41 They’ll do it because they think, oh, it’ll work for me.
1:29:42 I’ll be the one.
1:29:43 I can charm them.
1:29:47 Everybody has to, you know, learn this lesson somehow, right?
1:29:49 I do want to call it that there are some, you know,
1:29:51 we named some of them, but there’s some other new media folks
1:29:52 who are sub-stackers, et cetera,
1:29:56 who are doing journalism, but are not the same kind of journo.
1:29:56 Okay, okay, so all right.
1:29:59 So now let me get to a very, very, very, very,
1:30:00 very important point, okay?
1:30:04 Many words have been corrupted in a certain way.
1:30:08 So when I say journalism, I mean blue journalism, okay?
1:30:11 Because if you were to ask some journo,
1:30:14 is Ben Shapiro a journalist?
1:30:16 They’d say, no, of course not, right?
1:30:19 If you ask them, is Nate Silver still a journalist?
1:30:20 Is Glenn Greenwald still a journalist?
1:30:21 Bari Weiss, I don't know.
1:30:22 Yeah, Bari.
1:30:25 Are they still, no, they’re just running a blog, right?
1:30:26 Obviously, Silver Bulletin is Silver's blog,
1:30:29 in the same way the Free Press is Bari Weiss's outlet, right?
1:30:31 Okay, so this is a very important point.
1:30:34 Let’s say that Zuck competes with TikTok, right?
1:30:38 Zuck would never say TikTok’s not doing technology, right?
1:30:41 Yeah, that’s Chinese technology versus American technology,
1:30:42 but they’re still doing technology.
1:30:44 They’re recognizingly playing the same sport.
1:30:45 You might say they’re like,
1:30:48 it’s under the Communist Party surveillance, whatever.
1:30:50 You can make all those points and argue all that,
1:30:51 and Trump has flipped on it, whatever.
1:30:53 Leaving that aside, the point is that
1:30:54 you wouldn’t say they’re not doing technology
1:30:55 just because they’re adversarial.
1:30:56 They are doing technology.
1:30:59 They’re just doing it on the Chinese side, right?
1:31:03 Versus the blue journalist will actually deny
1:31:06 that Substack is journalism, right?
1:31:08 That Ben Shapiro is journalism.
1:31:12 Because even if Ben Shapiro has like millions more followers
1:31:14 than they do in a much larger audience and so on and so forth,
1:31:17 even if he’s smarter than they are in many ways,
1:31:18 and, you know, and like a better comp,
1:31:20 certainly he’s better than like their opinion editors and so on.
1:31:22 And he used the Substack,
1:31:24 which by the way are doing original reporting.
1:31:26 What they say, when they say journalism,
1:31:29 they mean he’s not in the club, right?
1:31:31 So remember the social network thing with the blue and the red?
1:31:34 Once you think about it as a network, right,
1:31:37 where the borders are fuzzy, but no less real for being fuzzy,
1:31:38 a network of blues, right?
1:31:42 So like Glenn Greenwald is on the boundary of that, right?
1:31:44 Seymour Hersh maybe arguably is on the boundary
1:31:46 because he’s on Substack and so on and so forth.
1:31:48 Like Barry Weiss arguably is on the boundary in some ways
1:31:50 because she was formerly in the club and so on and so forth.
1:31:54 So it’s a little bit like being an MD or a JD
1:31:57 where you have a formal state license.
1:32:01 To be a blue journo is to have an informal state license, right?
1:32:02 Why is it informal?
1:32:05 Because if they were formally state licensed,
1:32:08 they could say that it’s a state-controlled press.
1:32:13 So instead, what they get is a White House press pass.
1:32:14 It’s a press-controlled state.
1:32:18 The point about that is that once you see that it’s a network,
1:32:22 it’s a club, right, that’s when you realize,
1:32:28 oh, don’t talk to blue journalists is actually really what I’m saying, right?
1:32:28 Yeah.
1:32:30 And when I say tech journalist, that doesn't count either,
1:32:32 because tech journalists, like TechCrunch, are in it.
1:32:35 The problem is that word has been tortured
1:32:37 to mean the opposite of what it means, right?
1:32:38 Yeah, it's anti-tech journalists.
1:32:39 Like science.
1:32:41 Yeah, it's anti-tech journalists, right?
1:32:41 Exactly.
1:32:45 Like science got tortured to mean masks don't work before they do, right?
1:32:47 So you actually have to have some prefix or something,
1:32:51 which is like science in the form of independent replication,
1:32:52 not procedure citation, right?
1:32:55 We’re trying to coin new media, something new.
1:32:55 That’s right.
1:32:57 And another example of this is democracy.
1:33:01 Like for the Democrats, it means California is a one-party state, right?
1:33:03 Here, let me show you this, just to show you.
1:33:06 So Democrats and communists have both built one-party states, right?
1:33:10 So here is Newsom taking lessons from Xi,
1:33:18 and he's explaining how. This is an amazing, amazing visual, right?
1:33:21 Total Democrat Party control, right?
1:33:24 Democrats and communists have both built one-party states, right?
1:33:26 This is more than just like a one-liner.
1:33:27 It’s a deep point.
1:33:32 Just like when they said science and they turned to the opposite of science, right?
1:33:35 Which was masks don’t work before they do.
1:33:40 Just like they said media or they said journalism and they turned to the opposite of journalism,
1:33:44 which is basically it’s not neutral reporting on anything.
1:33:47 It’s reporting on the enemies of blues and protecting blues, right?
1:33:53 Here, they turned democracy into the opposite of democracy where they destroyed competitive multi-party elections, right?
1:33:58 In California, elections are held, but the party always wins, exactly like China, okay?
1:34:01 Democrats destroyed democracy in California.
1:34:02 Deep point.
1:34:05 This is why things got so bad there.
1:34:12 Because with no Republican check, with no multi-party competition, this is when the $100 billion California train,
1:34:17 this is when the graft really got underway, the homeless-industrial-complex explosion,
1:34:22 because there was no accountability at government level for all the Democrat abuses.
1:34:27 They built a one-party state and started looting it, just like the communists did, but I repeat myself, right?
1:34:31 When a Republican is elected, that’s a threat to democracy.
1:34:37 But when a Democrat surveils or sanctions or deplatforms or unbanks, that’s just democracy, right?
1:34:44 Now, the thing is, to be fair, it is true that many Republicans in response to this have started to build Florida,
1:34:46 especially, into their own one-party state.
1:34:53 So the problem is that you have Democrats, Republicans, and communists that have all created basically one-party states,
1:34:58 where the only democracy then is going to be the right to exit, to vote with your feet, right?
1:34:59 Go ahead.
1:35:03 Yeah, that’s the perfect place to wrap this episode in terms of it gets to the network state.
1:35:03 The network state.
1:35:07 Now you can vote with your feet and go to California or go to Florida or go to wherever.
1:35:08 That’s right.
1:35:12 And we want to combine these because Starbase shows that you can combine all three.
1:35:13 And I’ll end with just two things.
1:35:16 So essentially, this is a really important point.
1:35:18 We reclaimed free speech.
1:35:20 We need to reclaim democracy, right?
1:35:22 We cannot give up on democracy.
1:35:26 First of all, that’s an important interpretation of what I just said.
1:35:31 It was not an abundance but a deficit of democracy that resulted in California’s downfall
1:35:34 because the Democrats built a one-party state, destroyed all multi-party competition,
1:35:37 they gerrymandered it, and that’s how they started all the looting,
1:35:39 the hundreds of billions of dollars in looting, right?
1:35:41 But we can have a rebirth of democracy.
1:35:43 And maybe that can be our next talk.
1:35:45 Democracy is creating startup cities, right?
1:35:45 Why?
1:35:48 They voted, people voted with their feet to move to Starbase.
1:35:50 They voted with their wallet to build up Starbase.
1:35:53 And then finally, they incorporated Starbase by voting with their ballot.
1:35:54 That’s the future of democracy.
1:35:57 Not a two-party system with the illusion of choice,
1:36:00 but a thousand-city system with the reality of choice.
1:36:02 97% for Elon, right?
1:36:05 This is essentially a precursor to what’s coming next,
1:36:09 where you vote with your feet, your wallet, and your ballot at the same time.
1:36:13 And that’s the only way that you can vote against the Democrats or the communists.
1:36:15 The only remaining vote is that vote.
1:36:18 That’s where the true vote is.
1:36:20 And what we need to do is reduce the barrier to exit
1:36:24 to give everybody that practical franchise, right?
1:36:28 Reduce lock-in, make it possible for people to actually have choice
1:36:31 over the government that rules them, right?
1:36:34 And this also means, of course,
1:36:36 that we need to become the largest funders in the world
1:36:39 of media, of democracy, of science.
1:36:42 And we actually mean it in the uncorrupted versions,
1:36:44 because I actually do believe in those things unironically, right?
1:36:47 I do believe in media, in books, in writing, and all this kind of stuff.
1:36:51 As I said, remember, we’re a fork of the East Coast, right?
1:36:52 We’re a fork of that establishment.
1:36:55 So basically, with technology,
1:37:00 we can have a new birth of media, science, democracy, equality on the internet,
1:37:01 because that’s what the internet is: a peer-to-peer network.
1:37:02 We’re all equal on the internet.
1:37:04 And truth is everybody’s property.
1:37:06 It is not Sulzberger’s property.
1:37:07 It’s cryptography.
1:37:10 That’s a great place to wrap.
1:37:11 Balaji, always a pleasure.
1:37:13 Thank you so much for coming on the podcast.
1:37:13 Thanks.
1:37:18 Thanks for listening to the A16Z podcast.
1:37:20 If you enjoyed the episode,
1:37:24 let us know by leaving a review at ratethispodcast.com slash A16Z.
1:37:27 We’ve got more great conversations coming your way.
1:37:28 See you next time.

What really caused the breakdown between tech and media—and what comes next?

Erik Torenberg sits down with Balaji Srinivasan (entrepreneur, investor, and author of The Network State) to explore the long-building conflict between Silicon Valley and legacy journalism. Balaji explains how the collapse of traditional media business models gave rise to political capture, clickbait, and adversarial coverage of the tech industry.

They discuss why “going direct” is no longer optional, how tech became the villain in establishment narratives, and what it would take to build a new truth infrastructure, from decentralized content creation to cryptographic verification.

This episode covers power, distribution, and the future of media, with a signature mix of historical insight, social analysis, and Balaji’s forward-looking frameworks.

Timecodes: 

0:00 Introduction 

1:26 The Media vs. Tech Conflict

2:11 The Collapse of Journalism Revenue

2:39 Rise of Wokeness and Political Realignment

6:50 State vs. Network: A New Framework

9:00 The Power Structure of Media Institutions

19:25 The Role of Distribution and the Internet

29:20 The Social War: Red vs. Blue America

30:05 X Day and the Shift in Social Media Power

42:56 Strategies for Technologists: Go Direct

48:36 The Importance of Individual Creators

1:10:00 Decentralized Truth and the Ledger of Record

1:36:00 The Future of Media, Democracy, and Equality

1:37:08 Conclusion & Final Thoughts

Resources

Find Balaji on X: https://x.com/balajis

Stay Updated: 

Let us know what you think: https://ratethispodcast.com/a16z

Find a16z on Twitter: https://twitter.com/a16z

Find a16z on LinkedIn: https://www.linkedin.com/company/a16z

Subscribe on your favorite podcast app: https://a16z.simplecast.com/

Follow our host: https://x.com/eriktorenberg

Please note that the content here is for informational purposes only; should NOT be taken as legal, business, tax, or investment advice or be used to evaluate any investment or security; and is not directed at any investors or potential investors in any a16z fund. a16z and its affiliates may maintain investments in the companies discussed. For more details please see a16z.com/disclosures.
