Author: a16z Podcast

  • America’s Energy Problem: The Grid That Built America Can’t Power Its Future

    AI transcript
    0:00:07 The energy grid and electrical grid of the future, it’s not just going to be the neat division of generation, transmission, and storage.
    0:00:12 This next generation of the grid is going to look much more decentralized.
    0:00:14 Why are delivery costs such a big problem?
    0:00:20 The grid is aging now and brittle. The workforce has aged out.
    0:00:24 Should we just leapfrog the grid? I need this power now, today.
    0:00:27 The United States needs to get better at megaprojects.
    0:00:29 Things that are a billion dollars, things that are at scale.
    0:00:35 There is no safety, there is no national defense, there is no national security without a reliable electrical grid.
    0:00:40 U.S. energy usage per capita peaked in 1973.
    0:00:42 Since then, it’s been flat.
    0:00:46 Meanwhile, China’s per capita energy use has grown ninefold.
    0:00:54 Today, with AI, EVs, manufacturing, and data centers demanding more power than ever, America’s electrical grid is buckling.
    0:00:57 We haven’t just underbuilt it, we’ve forgotten how to build it.
    0:01:06 In this episode, I’m joined by a16z general partners David Ulevitch and Erin Price-Wright and investing partner Ryan McEntush from the American Dynamism team.
    0:01:11 We talk about how the U.S. energy system broke, why fixing it is about more than megawatts,
    0:01:15 and what it’s going to take from new tech and talent to faster permitting and smarter software.
    0:01:25 As a reminder, the content here is for informational purposes only, should not be taken as legal, business, tax, or investment advice,
    0:01:32 or be used to evaluate any investment or security, and is not directed at any investors or potential investors in any A16Z fund.
    0:01:37 Please note that A16Z and its affiliates may also maintain investments in the companies discussed in this podcast.
    0:01:45 For more details, including a link to our investments, please see A16Z.com forward slash disclosures.
    0:01:53 U.S. energy peaked in 1973 in terms of per capita usage.
    0:01:56 China’s has increased ninefold over that same time period.
    0:02:00 We have some reasons to be optimistic now that things have started to change or will change further.
    0:02:02 Why don’t you give some context there?
    0:02:04 What’s happened and why should we be excited about what’s coming forward?
    0:02:09 The history of the grid in the United States was: build a big power plant, and industry formed around it.
    0:02:12 The grid grew incredibly fast through the 20th century.
    0:02:15 Then around the 80s and 90s, things started slowing down.
    0:02:19 And then through the early 2000s, the grid effectively froze in the United States.
    0:02:25 A big piece of that was a lot of the energy generation, a lot of the manufacturing, a lot of sort of heavy industry moved to Asia.
    0:02:29 And so for the last 20 years, effectively, the grid has ossified.
    0:02:32 We forgot how to build new power plants.
    0:02:38 We forgot how to build new power projects, new loads, sort of large data centers, large factories, large megaprojects.
    0:02:41 And you say we forgot, were we not allowed to, or we actually just lost the know-how?
    0:02:42 We were allowed to, but we lost the skill set.
    0:02:45 And I think you can see it in more extreme examples with nuclear power plants.
    0:02:47 All of that sort of transition happened decades before.
    0:02:54 But basically, the grid itself, the grid operators, forgot how to plan, how to move quickly, how to do it cheaply.
    0:02:57 And so now we’re at this point in time where we are reshoring.
    0:03:00 And we are bringing back manufacturing, we are bringing back data centers.
    0:03:06 And there’s this highly concentrated demand, and it’s now, now, now, at sort of any price, but the grid cannot move fast enough.
    0:03:07 And so that’s what we’re seeing today.
    0:03:10 We talk about data centers and the grid being inflexible to this.
    0:03:12 It’s playing catch up.
    0:03:16 We need to do a lot of the growth that happened in China and bring this here and do it incredibly fast.
    0:03:20 How did this forgetting happen, and how can this relearning happen or this retraining happen?
    0:03:22 It’s a good question.
    0:03:24 I think a lot of it’s like a workforce issue.
    0:03:25 I think a lot of it’s a policy issue.
    0:03:39 I think the United States, historically, was a bunch of regulated utilities, sort of the top-down, big thermal power plant, big transmission lines connecting to substations, then distribution lines going to individual factories, homes, things like that.
    0:03:45 And I think some of the newer technologies don’t necessarily benefit from scale in the same ways that these large thermal plants typically did.
    0:03:50 And so this next generation of the grid is going to look much more decentralized.
    0:03:53 So there’s also an element of relearning of what is the grid actually?
    0:03:57 Is the grid these large sort of power systems and large infrastructure projects?
    0:04:04 Or does it look a lot more decentralized, where we can eliminate a lot of the wires in between, and things like delivery costs, which have increased exponentially?
    0:04:07 And can we do it in a more dynamic and flexible way?
    0:04:10 So things like solar and batteries, they don’t need to be massive.
    0:04:11 You can put them anywhere.
    0:04:12 You can put them next to load.
    0:04:16 And so this is also sort of an element that grid operators are thinking through.
    0:04:22 How do we do that while also managing frequency, voltage, things like that, without causing sort of a grid to go down?
    0:04:23 So there’s a lot of challenges.
    0:04:28 Well, if you think about what our grid is, I mean, it’s a piece of technology that was designed about 100 years ago.
    0:04:33 And very little technology on the grid has changed in those 100 years.
    0:04:36 And you look at why are delivery costs such a big problem.
    0:04:38 Grid is at capacity.
    0:04:42 Getting a new project onto the grid today, you know, you sign up for interconnection, it could take a decade.
    0:04:46 There’s a backlog of 20 plus years to get a new transformer.
    0:04:51 The transformer technology we’re using today is kind of bananas if you actually look at what makes a transformer.
    0:04:58 First of all, there’s one company that makes these, and there’s one plant in the U.S. that produces the right type of steel that you need in order to make these transformers.
    0:05:03 And it’s 100-year-old tech, and it’s like the wait list for transformers is insane.
    0:05:12 So I think we’re getting to the point, with demand starting to rise again and the calcification of grid technology, where it’s like, should we just leapfrog the grid?
    0:05:15 Like do we really need to wait in line and wait for this to catch up?
    0:05:17 I think there’s like two versions of how you do that.
    0:05:23 One is how do you get power gen and power storage essentially as close to demand as possible?
    0:05:34 And that’s a problem for new tech to really help solve because we’re talking about instead of these like mega projects that we’re used to building, well, not used to building anymore, like massive nuclear plants, massive new natural gas plants, etc.
    0:05:40 We’re talking about much smaller and more distributed sources of power bypassing interconnection altogether in many cases.
    0:05:48 We’re seeing that as a pretty big trend with data centers as data centers are just building power directly on site and co-locating power with the data center because they’re like, I can’t wait.
    0:05:52 Microsoft’s like, I can’t afford to wait 10 years to get an interconnection with the grid.
    0:05:54 I need this power now today.
    0:05:59 So how do you get power more tightly coupled with the load that it’s actually going to serve?
    0:06:10 And that’s a really interesting problem for tech also from a software perspective because if you get generation, storage, and usage all co-located very closely together, like that’s a very good problem for AI to solve, like reinforcement learning.
    0:06:14 Stick that in there and suddenly you get massively efficient systems that you couldn’t get at grid scale.
    0:06:20 I think an interesting point to add on to that is that there’s very little visibility into the grid itself.
    0:06:29 So, like, they understand whether power plants are operating or not, but especially at the distribution level, the power lines you might see outside your home, there’s very little understanding of what’s actually going on there.
    0:06:35 And so there’s, like, a reluctance, especially when you have things like net metering, where I’m sending power from a battery at my home back to the grid.
    0:06:37 Things get incredibly complicated.
    0:06:44 And so the grid operators don’t have a very good understanding of when can they allow new projects to go online, how much power, when to actually cut people off.
    0:06:53 And so there’s a lot of these policies, interconnections, sort of the general term, but like states like Texas have much more lenient policy of you can build wherever you want.
    0:06:55 If we need to cut it off from the grid, we’re going to do so.
    0:06:57 And so it’s sort of this connect and manage approach.
    0:07:09 Whereas other states will do these incredibly long feasibility studies, across a variety of scenarios, like the entire grid being at peak capacity, because they want to make sure this specific project can stay online 24-7.
    0:07:13 So that ends up creating these massive delays from trying to study every single possibility.
    0:07:16 So there’s a lot of policy approaches here as well.
    0:07:27 And there’s a bunch of these technologies called grid enhancing technologies, which are effectively like, you know, an average power line might be used at 50% capacity, but it needs to be designed for the peak capacity for the summer when everyone has their AC on.
    0:07:31 And so there’s a lot of sensors or other technology that could be placed there.
    0:07:35 So you have a much more dynamic view of what the infrastructure actually looks like.
    0:07:39 And so when we have these new technologies, then we can much more efficiently use the infrastructure we have.
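The grid-enhancing idea described above can be sketched as a toy calculation: a line’s static rating assumes worst-case weather (hot and windless), while a dynamic rating uses sensed ambient conditions. Everything below, including the coefficients and the simplified heat-balance form, is an illustrative assumption rather than a real thermal model:

```python
import math

def ampacity(ambient_c: float, wind_ms: float,
             conductor_max_c: float = 75.0) -> float:
    """Toy steady-state thermal rating (amps) for a transmission line.

    Loosely inspired by an IEEE-738-style heat balance: allowable
    current grows with the temperature headroom and with convective
    cooling from wind. All coefficients here are made up.
    """
    headroom = max(conductor_max_c - ambient_c, 0.0)  # degrees of thermal room
    cooling = 1.0 + 2.0 * math.sqrt(wind_ms)          # wind boosts convection
    return 100.0 * math.sqrt(headroom * cooling)

# Static rating: assume the worst case (40 C, dead calm) all year.
static = ampacity(ambient_c=40.0, wind_ms=0.0)

# Dynamic rating: sensors report a mild, breezy day.
dynamic = ampacity(ambient_c=20.0, wind_ms=3.0)

print(f"static  rating: {static:5.0f} A")
print(f"dynamic rating: {dynamic:5.0f} A "
      f"(+{100 * (dynamic / static - 1):.0f}% headroom)")
```

Real dynamic line rating follows standards like IEEE 738 and needs actual conductor and weather data; the point of the sketch is only that mild, breezy conditions free up substantial headroom over the static worst case.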
    0:07:42 What are your reactions to this conversation thus far?
    0:07:45 Where are some areas you’re particularly excited about or reasons to be optimistic?
    0:07:57 Well, I think the reason we’re having this conversation is we’re touching on a bunch of these topical themes, which is that we’re in a moment in time where, exactly like Ryan said, the grid is aging now and brittle.
    0:07:59 The workforce has aged out.
    0:08:01 We’ll talk about that more, I think, in a minute.
    0:08:14 But we had to go out and hire and train entire specialized crews, specialized people that work with cement and concrete, specialized people that work with steel, to go build the large Vogtle reactors in Georgia.
    0:08:17 We put them on Vogtle units three and four.
    0:08:18 We turned them on.
    0:08:19 Huge win.
    0:08:22 And then those people went back to building highways or bridges or something else.
    0:08:31 And instead of just putting them on Vogtle five, six, seven, eight, nine, ten and building this massive crescendo of nuclear power, we just put those people back into the general workforce.
    0:08:33 And so we’re not learning our lesson there in the workforce.
    0:08:51 And at the same time, we have this insatiable thirst for energy, whether it’s EVs, whether it’s data center compute for AI, or just generally a shift toward more and more consumption of electricity, or even just like the reshoring and manufacturing, all these things that are just very, very electron heavy.
    0:09:01 At the same time that we’re seeing this, you know, what Erin brings up to me is a piece that we hardly ever talk about, which is resiliency, and not having people be as dependent on the
    0:09:03 interconnectedness of the grid.
    0:09:15 When you deploy solar, you know, we talk about distributed compute for those of us that are in the tech world, and how important it is for networks to be able to survive segmentation.
    0:09:17 But like the grid is very interdependent.
    0:09:22 Even in the US, there’s really only a few major regions that can segment themselves off.
    0:09:34 But when you deploy solar or you deploy batteries, or you deploy an SMR reactor or your own power generation on site for your own data center, you don’t have to worry about how brittle the grid is because you’re fairly resilient from it.
    0:09:44 And I think that’s a component: the energy grid and electrical grid of the future, it’s not just going to be the neat division of generation, transmission, and storage.
    0:09:52 But as Erin brought up, you might do all three of those things in the same place and not have to worry about how robust the grid is or how capable the grid operators are.
    0:09:56 And I think that’s a dimension that was never important to people before, but it’s important today.
    0:10:06 And you can certainly imagine if you’re the military, you certainly care about having reliable access to power at all your forward operating bases and even at home at your home military bases.
    0:10:09 Like you just cannot lose your ability to have electricity.
    0:10:11 And so I think all these things are just coming together at once.
    0:10:14 And it’s really exciting moment in time.
    0:10:33 And I think it’s buoyed by the fact that we’re also at this sort of technology inflection point where AI can help some of these things, not just be a consumption driver, but even be an enabler and facilitating more efficient use of electricity, better monitoring of the grid, better ways to even go through the regulatory and permitting process, which is onerous for many cases.
    0:10:42 Building on that, take Texas. Literally today, we’re recording this during a massive heat wave that’s affecting most of the eastern and southern United States.
    0:10:59 And if you compare the grids of Texas and New York: Texas famously had massive grid failures several years ago, when a big winter storm came through, the grid couldn’t keep up with demand, and people saw massive power outages.
    0:11:00 Everyone was really mad.
    0:11:02 People were like, oh, ERCOT doesn’t work.
    0:11:03 The grid doesn’t work.
    0:11:05 And what has Texas done in the couple of years since that happened?
    0:11:09 They have absolutely flooded the grid with solar capacity.
    0:11:13 Texas has doubled their solar capacity in the last approximately three years.
    0:11:17 And with that, they’ve just deployed thousands of batteries.
    0:11:26 One of our portfolio companies, Base Power, is one of the players here, but there’s many battery power companies deploying all across Texas to provide storage for that solar power.
    0:11:38 And if you look at the performance of the Texas grid versus the performance of the New York and surrounding area grid during this heat wave, I must have seen 10 news articles this morning about how well Texas grid has done.
    0:11:45 What stands out is the elasticity, the ability to react to very quick changes in demand without having to change the baseload power.
    0:11:50 Like, you can’t build a new natural gas plant or a new nuclear reactor overnight.
    0:11:53 But solar is just so insanely cheap.
    0:11:58 Like, it’s basically having a giant, massive, huge nuclear reactor in the sky that will go forever.
    0:12:00 And Texas isn’t a green state.
    0:12:01 This isn’t a political issue.
    0:12:11 But it’s like, why aren’t we deploying the world’s cheapest form of power literally everywhere we possibly can and then just putting batteries everywhere?
    0:12:12 Like, there just should be batteries everywhere.
    0:12:19 It’s bananas to me that batteries as a topic has, like, recently gotten caught in the sort of political crosshairs.
    0:12:23 We really, as a society, need to be good at power storage and batteries.
    0:12:25 Like, this shouldn’t be a controversial topic.
    0:12:27 We invented the lithium-ion battery.
    0:12:42 And yet today, if you want to buy a battery, whether it’s for a drone or for the grid or for your car or for whatever it is, like, you’re either buying a battery made in a lights-out factory in China or you’re buying a battery produced in Vietnam by a Chinese company.
    0:12:45 And, like, there’s no meaningful effort in the U.S. to change that.
    0:12:54 And I think that this is a really critical problem, not just to manage power load on the grid, but for power for all of the things that we need to power the next generation of innovation in the United States.
    0:13:03 I think we’d be hard-pressed on the American dynamism team to think of a company that we’ve met with an interesting technology in the last two years that doesn’t have a battery in it somewhere.
    0:13:08 So, as a country, we need to be investing in battery technology and battery manufacturing.
    0:13:21 By the way, if China decides that whatever your company is doing that’s using batteries doesn’t align with what they like or they want to punish you, being cut off from being able to buy batteries from China is incredibly punitive to a company.
    0:13:23 And we’ve certainly seen that happen with some of our startups.
    0:13:31 And then you find out quickly that the ability to procure and source batteries from places that are not in China is very difficult.
    0:13:39 If you extrapolate that out to what would happen to our whole country if we just were unable to buy batteries from China, it could be catastrophic in a very short period of time.
    0:13:49 Just to add a quick point on the grid side of batteries: if the rest of the country, which is presumably watching what’s going on in ERCOT, which is the grid operator in Texas,
    0:14:02 if Texas can prove that you can deploy these sort of decentralized distributed energy resources and to sort of flatten these peaks, provide more resiliency and ultimately lower price of electricity, then every state should go and do this.
    0:14:07 There’s a very complex web of deregulated and regulated entities when it comes to the grid.
    0:14:13 Of course, there are a lot of different policy and workforce and political reasons why not everywhere is this decentralized world.
    0:14:18 And it’ll probably be more complex than just these deregulated energy-only markets that Texas works with.
    0:14:23 But I think this is going to be very obvious if it isn’t obvious already.
    0:14:29 And I think the United States needs to move incredibly fast to make this happen and hook up batteries, solar panels, make it easier and cheaper to do it.
    0:14:33 Even co-location for large loads is still a very politically fraught issue.
    0:14:35 Utilities are pushing against this.
    0:14:38 It’s still really hard to hook up solar and batteries to your home.
    0:14:43 I think it’s actually cheaper to put residential solar on your home in Germany than in the United States.
    0:14:46 And that’s largely a permitting, largely an installation issue.
    0:14:47 That’s crazy.
    0:14:48 That should not be the case.
    0:14:51 Erin, I believe the quote in your college yearbook was,
    0:14:52 drill, baby, drill.
    0:14:57 How do you think about your sort of love for oil and gas with other sources of energy?
    0:14:59 Oh, my parents will be shaking their heads if they hear this.
    0:15:04 I think, broadly speaking, our approach to energy in the U.S. just needs to be yes and.
    0:15:11 You look at the sort of atrophy of our power build-out over the last 30 or 50 years, whatever time frame you name,
    0:15:14 compared to, let’s say, China.
    0:15:19 And if we want to accomplish the goals that we’ve set out as a society to accomplish over the next decade,
    0:15:20 like, we need more power.
    0:15:22 And it’s a matter of yes.
    0:15:25 And I think solar and batteries, extremely important.
    0:15:29 And D, you should talk more about the exciting things that are happening around nuclear.
    0:15:32 But, like, there is a place for oil and gas.
    0:15:35 Like, I cut my teeth at Palantir working in oil and gas.
    0:15:37 My husband worked in oil and gas.
    0:15:40 Like, the first check I wrote at A16Z isn’t an oil and gas company.
    0:15:43 So, this isn’t me coming with a particular agenda around carbon.
    0:15:47 This is me coming and realizing that, like, we basically need every tool in our toolkit.
    0:15:54 And we should be using technology to deploy whatever makes most sense, wherever it makes most sense at scale.
    0:15:58 If we’re talking, like, energy mix of where we’re at today and where do we think we’re headed,
    0:16:05 if I were to make, like, a personal bet, solar batteries are just the ability to be incredibly cheap and deploy incredibly fast.
    0:16:07 Spin up and spin down.
    0:16:07 Yeah.
    0:16:09 And I think that already is a way, but I think that will continue.
    0:16:12 But I think to be very clear is that you need all different types of energy.
    0:16:15 You’re going to need true sort of baseload, dispatchable power.
    0:16:16 It’s going to be gas.
    0:16:17 It’s going to be nuclear.
    0:16:18 It’s going to be geothermal.
    0:16:20 It’s going to be a lot of hydro as well.
    0:16:25 And I think as you attach more of these renewable resources, these intermittent resources,
    0:16:28 while they’re incredibly cheap and work most of the time,
    0:16:34 the long-tail risk, once you get to, like, 50% to 75% of the grid, is going to become very, very expensive.
    0:16:36 You need a lot more battery backup, things like that.
    0:16:39 And so I think it’s going to be very complex and it’s going to be different for many different regions.
    0:16:43 But certainly it’s not all of any given resource.
    0:16:50 Yeah, when you look at, like, the changing nature of load over the next decade, some of that is going to come from data centers.
    0:16:51 Some fraction.
    0:16:57 I would say it’s probably overstated how much data centers contribute to the growing load in the United States over the next decade.
    0:17:00 Data centers generally are baseload.
    0:17:05 If you’re training a model, you’re largely using a dedicated amount of power for the long term.
    0:17:08 Maybe there are some fluctuations if you’re doing more inference.
    0:17:11 But I would generally say, like, data centers represent baseload.
    0:17:13 But then you also have things like electric vehicles.
    0:17:15 You have things like heat pumps and air conditioners.
    0:17:20 You have industrial autonomy, which may or may not be running 24-7.
    0:17:24 So you’re going to have some increase in the base level of power we as a society need.
    0:17:29 But also a continuing increase in the size of the peaks and troughs of how we use energy on a day-to-day basis.
    0:17:37 And we should be thinking about designing our grid and designing our energy mix and power sources around what those loads look like.
    0:17:41 And not over-solving for either baseload or variable power.
    0:17:50 I think, just to put this in more tangible terms, the peak winter load in places like California might be half of what it is in summer, or something like that.
    0:17:51 And it depends on what climate you’re in.
    0:17:58 And so the concept of baseload is, like, do you build all the plants you’d need for the 100 gigawatts of power you’re going to need in the summertime?
    0:18:01 Or in the winter, you know, half the year, you’re only going to need 50.
    0:18:02 So, like, what would be baseload?
    0:18:08 What you need to do in sort of modern civilization is, every time you turn on the switch, like, the power is working.
    0:18:14 And so how you actually match supply with that very, very fluctuating load, both daily and seasonally, is very complex.
    0:18:21 And so, like, today, you might have to build a natural gas, what’s called, like, a peaker plant, that might only operate, like, a week a year.
    0:18:28 And so that’s an incredibly expensive asset that is going to only be delivering very expensive power, but is only needed when all the other resources are tapped.
    0:18:31 And it’s, like, that last couple megawatts of power.
    0:18:44 The alternative now, what’s called demand response, with batteries on the grid, is to say, okay, instead of running this $10,000-a-megawatt-hour plant, I can just nudge everyone’s thermostat in this area a couple of degrees.
    0:18:49 And in aggregate, that means I don’t need to build that large asset or pay that expensive premium.
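The trade-off described here, a scarcity-priced peaker versus many small thermostat nudges, can be put in rough numbers. Only the $10,000/MWh figure is taken from the conversation; the event length, per-home shed, and per-home payment below are invented for illustration:

```python
# Toy comparison: serve a 50 MW evening peak with a peaker plant
# versus shedding it through aggregated demand response.
PEAK_MW = 50.0
PEAK_HOURS = 3.0

# Option A: a peaker plant bidding at an extreme scarcity price.
peaker_price_per_mwh = 10_000.0            # illustrative scarcity price
peaker_cost = PEAK_MW * PEAK_HOURS * peaker_price_per_mwh

# Option B: demand response. Assume nudging one thermostat a couple of
# degrees sheds ~1 kW, and each participating home is paid $5 per event.
kw_shed_per_home = 1.0
homes_needed = int(PEAK_MW * 1000 / kw_shed_per_home)
payment_per_home = 5.0
dr_cost = homes_needed * payment_per_home

print(f"peaker cost:          ${peaker_cost:,.0f}")
print(f"demand-response cost: ${dr_cost:,.0f} across {homes_needed:,} homes")
```

Under these made-up numbers the aggregated nudges come out far cheaper than dispatching the peaker, which is the economic intuition behind demand-response programs.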
    0:18:51 Okay, pushing back on that.
    0:18:59 Like, I think that the American consumer will flat-out reject that level of dictation over how they use their power.
    0:19:10 I think a more likely outcome is that you can do it on the compute side and just say, look, these three racks of the data center are just going to go offline during the peak summer heat when you’re running your AC.
    0:19:13 This is not a critical job.
    0:19:13 Right.
    0:19:14 It’s a non-critical job.
    0:19:15 It’s not a mission job.
    0:19:17 It’s a back office job.
    0:19:19 And you’re just going to run it at night instead of during the day.
    0:19:22 And you’re going to pay less electricity for that benefit.
    0:19:26 I don’t know if I agree with Erin that AI is not going to suck up all the compute.
    0:19:30 I think that Constellation just turned up a new nuclear reactor or is reactivating a reactor.
    0:19:40 I think Meta immediately sucked up all of the power that they’re going to generate or nine-tenths of it or something from the new Constellation reactor that Meta signed the contract extension for.
    0:19:50 And so I think we actually probably are underestimating the amount of compute that we’re going to soak up with electricity over the next 10, 20, 30, 40, 50 years.
    0:19:52 The amount of data we’re going to start storing.
    0:19:53 Just look at video.
    0:19:57 The amount of video we create per minute has just ballooned way beyond anyone’s expectations.
    0:19:59 I’m sure the same will be true for AI compute.
    0:20:05 And I think once you start getting into like robotics and autonomy, if you think about compute expansively, I totally agree.
    0:20:05 Yeah.
    0:20:12 And so like those things are going to be much more responsive than do I want to go have my room be 74 degrees instead of 71 degrees.
    0:20:17 Well, let me tell you, anyone that’s done business in Tokyo in the summer knows as a nation, by the way, Japan has done this.
    0:20:19 It is absolutely terrible.
    0:20:20 It’s horrible.
    0:20:21 We’re not going to do that in America, please.
    0:20:23 We’re not going to do that.
    0:20:24 We’re not going to do that in America.
    0:20:30 You’re on, like, the 40th floor of a Japanese building wearing a suit, by the way, because you have to wear a suit.
    0:20:31 It’s swelteringly hot.
    0:20:34 Everyone is walking around acting like they’re not miserable, but they are miserable.
    0:20:36 And you look at the window, but the window won’t open.
    0:20:38 It is one of the worst.
    0:20:39 Why doesn’t France work in the summer?
    0:20:40 This is why.
    0:20:41 It’s terrible.
    0:20:41 Exactly.
    0:20:43 So we hate this idea.
    0:20:44 I’m spoiled living in California.
    0:20:49 I will say like one of the biggest proponents of this or current users is like crypto mining.
    0:20:50 It’s like these are flexible.
    0:20:51 Yeah, that’s right.
    0:20:53 But those people are demand responsive, right?
    0:20:56 So those people will just turn off their compute when it’s not cost effective.
    0:20:56 Yeah.
    0:21:01 But what’s important for the grid is that you can build these assets and you have the demand for power there.
    0:21:02 Like they’re going to soak up that demand.
    0:21:06 But if it gets far too expensive, they will also shed that demand.
    0:21:10 In the U.S., my guess is that this already is reflected in the fact that you have peak load pricing.
    0:21:14 Like for me, my power is 10x more expensive between the hours of 4 and 9 p.m.
    0:21:15 So we don’t run the washing machine.
    0:21:16 Yeah, right.
    0:21:20 And so leave it to organizations or individuals to figure out how to manage that.
    0:21:22 But just like charge people for more power.
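That time-of-use price signal can be sketched with toy numbers; only the 10x peak multiplier and the 4-to-9 p.m. window come from the conversation, and the base rate and appliance load are assumptions:

```python
def price_per_kwh(hour: int,
                  off_peak: float = 0.15,
                  peak_multiplier: float = 10.0) -> float:
    """Time-of-use rate: 10x the base rate during the 4-9 p.m. window."""
    return off_peak * peak_multiplier if 16 <= hour < 21 else off_peak

# A 2 kWh washing-machine cycle, run at 5 p.m. versus 10 p.m.
load_kwh = 2.0
peak_cost = load_kwh * price_per_kwh(17)   # inside the peak window
off_cost = load_kwh * price_per_kwh(22)    # shifted to off-peak

print(f"run at 5 p.m.:  ${peak_cost:.2f}")
print(f"run at 10 p.m.: ${off_cost:.2f}")
```

Same load, same machine; simply shifting the cycle out of the peak window cuts its cost by the full 10x multiplier, which is exactly the behavior the rate is designed to elicit.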
    0:21:26 It’s a little bit of a non sequitur, but I like that we keep talking about oil and gas.
    0:21:27 We’re talking about natural gas.
    0:21:28 We’re talking about batteries.
    0:21:29 We’re talking about solar.
    0:21:30 We’re talking about nuclear.
    0:21:34 Ryan even mentioned hydro, which of course is totally viable in some places.
    0:21:39 The thing that nobody ever brings up anymore, except for I think a very fringe group, is wind power.
    0:21:42 And I’m very happy to hear that nobody here is jumping for wind.
    0:21:45 I think wind is incredibly cheap when it’s working.
    0:21:48 You kind of know solar is going to work, on sort of a reliable schedule.
    0:21:49 The sun’s going to be out.
    0:21:50 It’s going to be working.
    0:21:53 Save for some cloudy days, but there’s still always something coming through.
    0:21:55 But the wind might not blow for a week.
    0:21:56 I think it’s worse than that.
    0:22:01 I think I read that globally, one third of all wind turbines are out of service at any given time.
    0:22:07 The other thing is, I think wind is the only power generation mechanism where when you get too much of the input,
    0:22:10 the blades of a wind turbine feather and turn off.
    0:22:13 Whereas there’s still something that’s too much sun for solar.
    0:22:16 Too much water and hydro, like that’s not a problem.
    0:22:18 But like too much wind and the wind generator turns off.
    0:22:21 Well, who wants a system where you get more of the input you want and then it stops working?
    0:22:25 It’s also just extremely hard and dangerous and specific to service.
    0:22:29 Like you see those videos of people climbing the ladder up to the top of the wind turbine.
    0:22:35 Grid operators look at wind and it’s like great when it’s working, but they can’t plan for it.
    0:22:37 They have to build other capacity to back it up
    0:22:38 if it’s not going to be there when they need it.
    0:22:39 So fine, I concede wind.
    0:22:41 We can move past wind.
    0:22:42 Yeah, I agree.
    0:22:42 So no wind.
    0:22:46 But I do think the demand response, this went to the point that Ryan brought up,
    0:22:47 that monitoring the grid is really important.
    0:22:50 Being able to send signaling on the grid is really important.
    0:22:52 And you have to remember, we’re all used to the internet,
    0:22:55 which has bidirectional communication and messaging.
    0:22:57 It has data layer and control layers.
    0:23:00 And there’s like a full control plane and things like that for the internet.
    0:23:02 The electrical grid doesn’t really have that.
    0:23:04 And so to be able to send messaging and things is very, very difficult.
    0:23:07 And now a lot of people just do it out of band using the internet.
    0:23:13 To actually send messaging and do monitoring of the grid itself without an overlay network is very hard.
    0:23:16 And that’s one of the challenges that people are now, I think, starting to address.
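The out-of-band signaling described above can be sketched in Python. This is a toy illustration only: the event fields are invented, not any real utility protocol (real deployments use standards such as OpenADR). It just shows a demand-response signal serialized as JSON for delivery over an ordinary internet overlay, since the grid itself lacks a comparable control plane.

```python
# Toy demand-response event, serialized as JSON for transport over an
# internet overlay network. Field names are illustrative, not a standard.
import json
from dataclasses import dataclass, asdict

@dataclass
class DemandResponseEvent:
    event_id: str
    start_epoch: int          # when devices should begin shedding load
    duration_s: int
    target_reduction_kw: float
    region: str

def encode(event: DemandResponseEvent) -> str:
    """Serialize the event for delivery over an IP network."""
    return json.dumps(asdict(event))

def decode(payload: str) -> DemandResponseEvent:
    """Reconstruct the event on the receiving device."""
    return DemandResponseEvent(**json.loads(payload))

evt = DemandResponseEvent("dr-001", 1_700_000_000, 3600, 250.0, "bay-area")
roundtrip = decode(encode(evt))
print(roundtrip == evt)  # dataclass equality survives the round trip
```

The point is that all of the coordination happens over the internet, out of band: nothing in the message travels over the power lines themselves.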
    0:23:21 Yeah, it’s wild how much of a mystery what’s happening on the grid is at any given time.
    0:23:23 Like we really have very little visibility.
    0:23:31 And it’s very hard for, I think, centralized utilities to deploy meaningful software to understand that.
    0:23:36 So when we think, as VCs, about what types of things we look at and what we get excited about,
    0:23:40 I think companies that kind of are going at this monitoring from the opposite direction.
    0:23:43 Like how do you get software almost insidiously on the grid?
    0:23:49 Like how do you start learning more about demand and generation as close to the source as possible?
    0:23:52 And then try to feed that information back to each other.
    0:24:00 Like the idea that you’re going to go sell a software tool to a PG&E or similar and have a reasonably speedy top-down implementation
    0:24:04 where you actually get good signal and metrics and can actually do interesting things with that data.
    0:24:08 To me, that’s a little bit unbelievable.
    0:24:11 Something very interesting that I learned is a lot of the load forecasting,
    0:24:15 which is basically like the tasking of when plants need to go online.
    0:24:19 So there’s usually a 24-hour ahead sort of market, day-ahead market that’ll basically say,
    0:24:21 you need to run your plant at this time.
    0:24:23 And then they sort of supply and demand match to a price.
    0:24:24 And there’s like a merit order.
    0:24:25 It’s complex, but that’s how it’s done.
    0:24:29 But most of this load forecast is done by just looking at the weather.
    0:24:31 Weather is basically one of the best indexes they have.
    0:24:35 They look at where the homes are, how many people are there, and then what the temperature is going to be.
    0:24:38 That often is sort of the largest factor that goes into this modeling.
    0:24:42 But if we have all these sort of connected resources, we have solar, we have EV chargers,
    0:24:45 all of this stuff is spitting off data, telemetry, and things like that,
    0:24:50 we’re going to get a much better look at how load is actually being forecast in real time,
    0:24:54 which is going to help a lot in understanding where we actually need to build.
    0:24:55 Like what is the actual price of power?
    0:24:57 And then you can start making these markets, I think, a lot more efficient.
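The day-ahead merit-order matching described above can be sketched as a toy model. All plant names, capacities, costs, and the demand figure below are hypothetical; real markets add bids, constraints, and transmission limits on top of this basic mechanism.

```python
# Toy day-ahead market clearing: dispatch plants in merit order
# (cheapest marginal cost first) until forecast demand is met.
# All plants, costs, and the demand figure are invented.

plants = [
    # (name, capacity_mw, marginal_cost_usd_per_mwh)
    ("solar_farm", 400, 0),
    ("wind_farm", 300, 5),
    ("nuclear_1", 1000, 25),
    ("gas_ccgt", 800, 45),
    ("gas_peaker", 300, 120),
]

def clear_market(plants, demand_mw):
    """Return the dispatch schedule and the clearing price."""
    dispatched = []
    remaining = demand_mw
    clearing_price = 0
    for name, cap, cost in sorted(plants, key=lambda p: p[2]):
        if remaining <= 0:
            break
        take = min(cap, remaining)
        dispatched.append((name, take))
        clearing_price = cost  # the marginal unit sets the price
        remaining -= take
    return dispatched, clearing_price

schedule, price = clear_market(plants, demand_mw=2000)
print(schedule)  # which plants run, and at what output
print(price)     # the last plant dispatched sets the clearing price
```

Better real-time load data shifts that `demand_mw` input from a temperature-driven guess toward measured telemetry, which is exactly the efficiency gain being described.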
    0:25:02 Well, when you look at energy desks for the big hedge funds or energy trading companies,
    0:25:06 their weather guy is usually the highest paid person on the desk outside the portfolio manager.
    0:25:10 Those climate and weather PhDs that are working on a trading desk,
    0:25:16 they are just absolutely raking it in; they’re like gods right now because there’s very little other data.
    0:25:19 That’s why you see the stuff that goes on in Texas, like heat waves and things like that:
    0:25:23 if they even get it wrong by a couple of degrees, where they think it’s going to be hot
    0:25:26 but it actually gets really hot, that’s when you get these crises,
    0:25:28 crises that end up causing a lot of strain on the grid,
    0:25:32 and then you have to turn on all these very expensive plants and then you get the headlines.
    0:25:34 That are usually also the ones that are the worst for the environment as well.
    0:25:35 Yep.
    0:25:39 Let us know if you have any reactions to this or otherwise give us the state of nuclear.
    0:25:40 Where are we right now?
    0:25:41 What are the major bottlenecks?
    0:25:42 What are we excited about?
    0:25:48 I think that the biggest thing that’s shifted in the last three or four years in nuclear
    0:25:51 is that everybody now acknowledges that nuclear energy is clean energy.
    0:25:55 I think that’s been one major shift in public sentiment and perception.
    0:26:01 Nonetheless, there’s still major headwinds politically with nuclear that need to be overcome.
    0:26:04 Taiwan, for instance, turned off their last nuclear reactor.
    0:26:05 Insane.
    0:26:06 It’s unbelievable.
    0:26:11 This is an island country that is seven days away from a total energy blackout
    0:26:13 if they get an oil and gas blockade from China
    0:26:17 so that they can’t bring in ships to deliver oil and fuel.
    0:26:20 So at any given time, they’re like seven days away from a total blackout
    0:26:22 and they turned off their last nuclear reactor anyway.
    0:26:22 Why’d they do it?
    0:26:27 Because they caved to political, like very loud vocal minority groups.
    0:26:28 Like environmentalist activist reason?
    0:26:29 Yeah.
    0:26:33 This party ran on a commitment to turn off the reactor before they realized how stupid it was.
    0:26:34 It’s just like colossally stupid.
    0:26:39 By the way, turning off a reactor, like a real full-scale reactor, it’s not like an SMR where
    0:26:42 you can just flip it on like a few days later or a month later.
    0:26:45 With these large reactors, it could take years to turn them back on.
    0:26:46 That was just terrible.
    0:26:51 But broadly, I think the tailwinds for nuclear are just getting stronger, where people
    0:26:52 recognize that it is clean energy.
    0:26:55 I think there’s still messaging work to be done.
    0:26:59 We should stop calling the spent fuel nuclear waste because it’s really not waste.
    0:27:01 Almost all of it can be recycled and reused.
    0:27:04 People do need to recognize that those tailwinds are shifting.
    0:27:05 So that’s happening.
    0:27:08 I think that people understand that it’s baseload power, right?
    0:27:10 So it’s not dependent on it only working during daytime.
    0:27:15 It’s not like hydro where you have to be around an appropriately configured water source.
    0:27:21 And then I think that one of the largest inhibitors to creating new power plants in this country,
    0:27:23 it’s not that we can’t do it.
    0:27:24 We can.
    0:27:29 There’s a huge regulatory and permitting morass, I would say, that has to be swum through
    0:27:36 that is incredibly expensive, requires an army of consultants, many tens of millions of dollars,
    0:27:41 many, many thousands of pages of applications and documentation and process review.
    0:27:45 And again, this has to do with building the power plant, getting the fuel, transporting the
    0:27:46 fuel, storing the fuel.
    0:27:52 Each step along the way is extremely laden with regulation and policy.
    0:27:54 And some of that’s for good reason.
    0:27:59 But finding ways to better navigate that and make it more efficient is really an area
    0:28:02 that a lot of companies and people are focused on now.
    0:28:06 And actually, I think the government now is also focused on how do we streamline the approval
    0:28:08 process for a new reactor?
    0:28:10 How do we start approving new reactor designs?
    0:28:16 And then I think the last thing I guess I would say is that right now, if you’re going to put
    0:28:19 a lot of energy and work into building a nuclear power plant, you want to build a really big one.
    0:28:23 We largely only see really big power plants in this country, like the AP1000s that we turned
    0:28:24 on in Georgia.
    0:28:28 And those again came in, I think they were 10 years late and multiple billions over budget.
    0:28:31 We only do that because if you’re going to put in the effort and time, you want to get the
    0:28:33 most bang for your buck and generate the most power.
    0:28:38 We are now starting to see movement from the government in the DOD, in the Department of
    0:28:43 Energy, and from the national labs to really try to create a more fast track process for
    0:28:50 these small modular reactors or even micro reactors that use a much safer form of fuel, use much
    0:28:57 less nuclear fuel, use a different kind of nuclear fuel that’s not nearly as risk prone as the kind
    0:29:01 of nuclear material you’d use in a weapon, and it’s not enriched to anywhere near the same degree.
    0:29:03 It’s not even the same material.
    0:29:06 And so that process is now picking up a lot of steam.
    0:29:08 We have an investment in a company called Radiant Nuclear.
    0:29:12 They are building a factory that creates what effectively is an SMR.
    0:29:14 They would probably call it a micro reactor.
    0:29:16 It’s a one megawatt reactor.
    0:29:19 It can be put on the back of an 18-wheeler and shipped around.
    0:29:23 You can move it to where you need power if there’s been a natural disaster, like a hurricane,
    0:29:25 and you need to bring in power overnight.
    0:29:30 You could bring in a few trucks with four or eight of these reactors and power up a whole
    0:29:31 city after a disaster.
    0:29:34 And so that kind of flexibility and power is really compelling.
    0:29:36 I think there’s a lot of tailwinds, a lot of good things happening.
    0:29:42 One thing to understand is that the DOD spends an incredible amount of money dealing with the
    0:29:46 cost and, frankly, the risk to human lives, not just the cost, but the real risk to human lives,
    0:29:50 transporting fuel around the world to forward operating bases.
    0:29:55 Anytime we do a military exercise, anytime we’re engaged in a conflict, the movement of
    0:30:00 fuel factors in as a primary concern and consideration of what we deal with.
    0:30:04 And so we’ve read reports that they spend well over $200 a gallon at times, sometimes up to
    0:30:09 $400 a gallon for diesel, effectively, to get diesel into the right place at the right time.
    0:30:14 And so you can just imagine that having a nuclear reactor you can put on the back of a C-130
    0:30:19 and fly around the world to wherever you need power, drop it in the middle of the desert,
    0:30:21 turn it on, you have power for five years.
    0:30:22 It’s just an incredibly compelling value prop.
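The economics behind that value proposition can be checked with a back-of-envelope calculation using the quoted fuel prices. The genset efficiency figure below is an assumption (roughly 13 kWh of electricity per gallon of diesel is a plausible value); the load and time horizon mirror the 1 MW, five-year scenario described above.

```python
# Back-of-envelope: cost of powering a 1 MW forward-base load with
# delivered diesel at the quoted $400/gallon, over five years.
# KWH_PER_GALLON is an assumed genset efficiency, not a sourced figure.
GALLON_PRICE = 400      # USD per gallon delivered (upper figure quoted)
KWH_PER_GALLON = 13     # assumed electrical output per gallon of diesel
LOAD_MW = 1
YEARS = 5

hours = YEARS * 365 * 24
energy_kwh = LOAD_MW * 1000 * hours
gallons = energy_kwh / KWH_PER_GALLON
fuel_cost = gallons * GALLON_PRICE
print(f"{fuel_cost / 1e6:.0f} million USD in delivered fuel alone")
```

Even with generous assumptions, the fuel bill lands north of a billion dollars, before counting the convoy risk to human lives, which is the real comparison a transportable reactor is competing against.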
    0:30:25 There is no question that nuclear needs to be part of the equation.
    0:30:30 Not only is that baseload power, but on the SMR micro reactor side, it gives us this incredible
    0:30:32 flexibility in grid resilience.
    0:30:36 There should not be a single military base in this country that’s not nuclear backed from
    0:30:42 a power standpoint, because if the grid goes down, whether it’s from a cyber attack or just
    0:30:48 instability or demand issues or cascading failures, you want to be able to fail over to nuclear power
    0:30:50 and not worry about the runway lights turning off.
    0:30:50 Yeah.
    0:30:55 And especially as we start to look at the kind of electrification of our weapon systems,
    0:31:02 our military vehicles, our drones, et cetera, like those all need to get charged up somewhere.
    0:31:05 And how better to charge them than a nuclear reactor?
    0:31:09 Another thing I’ll add to your nuclear comment is I think the advantage of nuclear, which
    0:31:13 Radiant has done very well really leaning into, is the power density factor.
    0:31:18 If you want a reactor that is reliable and power dense, you want it to operate at very high
    0:31:19 temperatures.
    0:31:22 You want as highly enriched fuel as you possibly can, where it makes sense commercially.
    0:31:26 So you want HALEU fuel, high-assay low-enriched uranium, and you want to serve customers that will pay the premium for that.
    0:31:29 That’ll be able to buy this reactor that they know is going to work.
    0:31:32 And if you’re doing that, you want to have these economies of scale on the manufacturing
    0:31:33 side.
    0:31:36 You want it done when it goes out the door, so you don’t need to assemble it on site.
    0:31:38 You don’t want to have to like have constant maintenance.
    0:31:42 And I think the other sort of reactors that we see, maybe on the civilian side, if you build
    0:31:46 a reactor in a factory or you build modular components in a factory, but you still need
    0:31:51 to do construction work on site, you’re still a construction company, even if the technology
    0:31:51 is there.
    0:31:54 And I would argue a lot of the existing AP1000 technology is quite good.
    0:31:56 And other countries can do it quite cheaply.
    0:31:58 China is using a very similar design.
    0:32:00 The UAE just built one for incredibly cheap.
    0:32:04 And they have very similar nuclear regulation, like in terms of frameworks.
    0:32:07 And obviously their regulatory bodies might move faster and things like that, but they’re
    0:32:09 not like completely ignorant of some of the concerns.
    0:32:12 Well, and maybe this is a much more broad question, but the United States needs to get better
    0:32:13 at mega projects.
    0:32:16 Things that are a billion dollars, things that are at scale.
    0:32:20 And I would argue the NRC is a big component of why it’s expensive, but I think it’s also
    0:32:24 the same reason that it takes a billion dollars to build a bike lane in San Francisco is why
    0:32:25 we are not able to build cheap power.
    0:32:28 Or why we don’t have a high-speed rail in California.
    0:32:28 Yep.
    0:32:31 We might not have a high-speed rail in California because nobody wants it.
    0:32:33 And nobody wants it where they’re building it.
    0:32:33 I want it.
    0:32:34 I fly to LA all the time.
    0:32:35 Sorry.
    0:32:36 Nobody wants it where they’re building it.
    0:32:37 Sure.
    0:32:37 Yeah.
    0:32:39 Bakersfield is not a prime destination.
    0:32:42 I want a train from San Francisco to LA.
    0:32:44 It takes an hour and a half.
    0:32:49 We debate a lot internally, like, where does it make sense for VCs and VC capital to plug
    0:32:49 in?
    0:32:55 And arguably, like, we’re not going to move the needle on, you know, these multi-billion dollar
    0:32:56 mega projects in the U.S.
    0:33:01 Like, we’re probably not the best people to figure out how to capitalize and build a multi-billion
    0:33:04 dollar project in California to generate the power for the grid.
    0:33:10 But I do think that there is a role for technology at kind of, like, every single layer and every
    0:33:12 single phase of how mega projects get built.
    0:33:17 It’s like, how do you use AI to navigate kind of site selection?
    0:33:22 How do you use tools to, like, move through the various permitting processes faster?
    0:33:27 Like, how do you use AI to help you do extremely complex and interdependent project management
    0:33:28 better and more effectively?
    0:33:33 So say you have a project with 4,000 people working on it and everyone
    0:33:36 engaging with different suppliers and timelines that are dependent on each other.
    0:33:39 Like, how do you get all those things to align better so that you don’t get these 10-year
    0:33:44 delays so that projects actually happen on time and on budget and, as a result, attract
    0:33:46 private capital backers?
    0:33:48 Like, I think that there’s a role of technology here.
    0:33:49 What exactly that looks like,
    0:33:50 TBD.
    0:33:53 We’ve seen a lot of companies that maybe five years ago were primarily trying to sell
    0:33:57 to utilities and grid operators, which is incredibly painful, incredibly difficult.
    0:33:58 Perhaps rightfully so.
    0:34:00 Like, they have poles in the ground that are 50 years old.
    0:34:03 Why would they trust a two-year-old company to sell them software?
    0:34:04 Are they going to be around in 20 years?
    0:34:07 And this is a fair question to ask, especially for something as critical as the grid.
    0:34:11 But now they’re developing this software, and there’s such demand for understanding how grid
    0:34:15 operators might think, and potentially getting there faster or, you know, reaching different conclusions.
    0:34:20 And so now you can go to data centers or people who want to build solar farms or people who
    0:34:21 want to build massive, like, battery farms.
    0:34:23 And you can sell a very similar software.
    0:34:26 Individual people who want to make sure that their power isn’t going to go out and they’re
    0:34:29 going to be caught without energy during an important moment in their lives.
    0:34:29 Yeah.
    0:34:30 And so everyone cares now.
    0:34:35 There’s a lot more money that cares about what the grid is actually going to think and where
    0:34:35 can I build?
    0:34:37 Where is there excess capacity?
    0:34:41 Maybe I’m connected to the grid, but I also need some battery and solar backup or like a
    0:34:45 radiant microreactor or something like that to be used in certain situations.
    0:34:49 It’s a lot more complex, this sort of microgrid setup, but it’s the way we’re headed.
    0:34:50 And software is going to be a big piece of that.
    0:34:55 I want to hear more about our requests for startups or things that we want to exist that
    0:34:55 we haven’t yet discussed.
    0:34:59 I mean, put differently, I’m curious where we think there’s most bang for the buck in terms
    0:35:03 of the issues that we’ve been talking about in terms of if there was like a regulatory intervention
    0:35:05 or some sort of technological unlock.
    0:35:06 What comes to mind?
    0:35:11 One area where there’s probably a venture-scale software company to be built is really around
    0:35:13 grid management and monitoring.
    0:35:19 I think we see this in the IT landscape, we see it in the OT landscape, but we don’t really
    0:35:22 see it in the grid, where the gap is just very, very large.
    0:35:24 There is no Splunk for the electrical grid.
    0:35:27 There’s no Palo Alto Networks for the electrical grid yet.
    0:35:31 There’s a whole bunch of things that mirror the IT and OT landscape, whether it’s around cyber
    0:35:34 and monitoring and logging and analytics.
    0:35:37 There’s no Looker for the electrical grid yet.
    0:35:38 There’s just none of these companies exist.
    0:35:42 I’m not sure if it’s three separate companies, I’m not sure if it’s one company, but there
    0:35:48 is a big company to be built in really managing and monitoring the grid, and helping to orchestrate
    0:35:53 and even deal with some of the things Ryan spoke about: demand response, coordinating
    0:35:56 that, creating those marketplaces, tracking all those incentives.
    0:36:00 So I think when we see a company that we think can really be the breakout company there, we
    0:36:00 would lean into it.
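As one minimal sketch of what that kind of grid monitoring could look like, here is a rolling z-score anomaly detector over frequency telemetry. The data, window size, and threshold are all invented for illustration; a real product would handle many signals, sensors, and failure modes.

```python
# Toy anomaly detector over grid frequency telemetry: flag readings
# more than 3 standard deviations from a rolling baseline.
# Readings, window, and threshold are invented for illustration.
from collections import deque
from statistics import mean, stdev

def detect_anomalies(readings, window=10, z_threshold=3.0):
    """Yield (index, value) for readings far outside the rolling baseline."""
    history = deque(maxlen=window)
    for i, hz in enumerate(readings):
        if len(history) >= 2:
            mu, sigma = mean(history), stdev(history)
            if sigma > 0 and abs(hz - mu) / sigma > z_threshold:
                yield i, hz
        history.append(hz)

# Nominal 60 Hz stream with a sudden sag at index 6.
stream = [60.00, 60.01, 59.99, 60.00, 60.02, 59.98, 59.30, 60.00]
print(list(detect_anomalies(stream)))  # flags the 59.30 Hz sag
```

The interesting product question is less the math than the deployment: getting this close to the edge of the grid, as discussed earlier, rather than waiting for a utility-wide rollout.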
    0:36:06 I also think around sort of project planning and development, how do you make it faster
    0:36:09 and easier to build projects within the current regulatory framework?
    0:36:10 How do you do site selection?
    0:36:12 How do you navigate permitting?
    0:36:14 How do you navigate project management?
    0:36:16 How do you navigate your sort of construction supply chain?
    0:36:20 We’re starting to see companies pick off pieces of that, but I think broadly speaking, there’s
    0:36:24 room for tech and software in that kind of project development space as well.
    0:36:29 I think, you know, in a more general sense, like anything that can bring generation capacity
    0:36:32 or storage capacity closer to load, I think is going to be very compelling.
    0:36:37 And a lot of the time it’s less about the novel technology and more about systems integration
    0:36:38 or an innovative business model.
    0:36:42 Something radicalizing that I experienced, and I implore everybody to go home and check
    0:36:46 their power bill: they’ll now often separate the delivery costs from the actual
    0:36:47 generation costs.
    0:36:51 So what we’ve seen and we’ve mentioned it, but like the cost to generate electricity, the cost
    0:36:55 of like power has dropped immensely, gas, solar, things like that.
    0:36:58 But the cost to actually deliver that electricity has increased a ton.
    0:37:00 And so in net, it’s sort of not changed.
    0:37:01 And I think that’s terrible.
    0:37:03 And I think we all agree that’s bad.
    0:37:06 And so I think there’s a lot of opportunity in bringing that generation capacity closer to the load.
    0:37:09 In some ways, this is sort of like this more liberalizing force.
    0:37:10 It’s like we all should have our own backup.
    0:37:11 We all should have our own technology.
    0:37:14 I think there’s a lot of really interesting ways to do that and scale it.
    0:37:19 And overall, as the grid gets more heterogeneous, all of the seams and intersections between
    0:37:24 things, like there’s just so much more opportunity for technology than when you had a single utility
    0:37:27 managing a single source of power centrally distributed out broadly.
    0:37:30 I’ll throw another one out there as I’m just thinking about this.
    0:37:36 One thing that I’ve been noodling around is this idea that all the regulation and permitting
    0:37:40 and policy frameworks that we have in this country, you could think of those as part of
    0:37:44 the like infrastructure that we all have to like live and work with and interact with.
    0:37:49 So companies that really facilitate, and I would say applying AI to navigating the permitting
    0:37:50 process.
    0:37:51 So nuclear is a good example.
    0:37:58 Again, a nuclear reactor application or a fuel transport license or a fuel manufacturing license.
    0:38:03 These things have thousands and thousands of pages of regulation and documentation that go
    0:38:04 with them.
    0:38:08 You make one small change in your application, it has these reverberation effects.
    0:38:09 You have to update all your documents elsewhere.
    0:38:13 If you’re the regulator trying to go through all these applications, it’s just incredibly
    0:38:18 onerous, borderline impossible to imagine that a regulator can even possibly get it right.
    0:38:20 You could argue that it’s actually not possible.
    0:38:21 They just do a best effort.
    0:38:27 But AI could actually help these things, could help the applicants go through the process of
    0:38:30 filling out their applications and saying, hey, this is where you should drill down.
    0:38:31 This is where you should clarify.
    0:38:36 They can look at all previous published applications and say, this is how you need to tailor it.
    0:38:41 You can probably make it 85% the same, and then based on your design or your location or
    0:38:42 whatever, make some modifications.
    0:38:45 And then the regulator can do the same and say, look, here’s an application that came in,
    0:38:49 highlight all the areas I need to drill down, or show me the things that are different from
    0:38:52 every other nuclear fuel transport application we’ve ever seen.
    0:38:54 Are they using the rail infrastructure?
    0:38:57 Are they using the national highway infrastructure to move the fuel?
    0:39:02 All these things that take armies of consultants months or years to do, AI can bring
    0:39:05 down to minutes or hours.
    0:39:08 I’m not sure how big of a company, but I think potentially there’s a very large company
    0:39:09 to be built there.
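The "show me what’s different from every prior application" idea can be sketched with nothing more than a text diff. The application sections below are invented; a real tool would compare against a corpus of published applications and use an LLM to summarize, but the core operation is surfacing deltas for the reviewer.

```python
# Toy illustration of flagging how a new permit application section
# differs from a previously approved one, via a plain text diff.
# The application text is invented for illustration.
import difflib

prior = [
    "Fuel will be transported by rail in NRC-certified casks.",
    "Route avoids population centers above 50,000.",
    "Escort procedures follow standard protocol.",
]
new = [
    "Fuel will be transported by highway in NRC-certified casks.",
    "Route avoids population centers above 50,000.",
    "Escort procedures follow standard protocol.",
]

# Keep only the changed lines a reviewer needs to drill into,
# dropping the diff's file-header lines.
changes = [
    line for line in difflib.unified_diff(prior, new, lineterm="")
    if line.startswith(("+", "-")) and not line.startswith(("+++", "---"))
]
print(changes)
```

Here the reviewer immediately sees the rail-versus-highway change, which is exactly the "are they using the rail infrastructure or the highway infrastructure" question raised above.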
    0:39:13 If our check sizes were in the billions, not just the millions (we’re Andreessen Horowitz,
    0:39:15 you never know), how would our strategy change?
    0:39:17 I think you need even more than just billions.
    0:39:18 It’s tens of billions, hundreds of billions.
    0:39:20 It’s such a tough question.
    0:39:23 I think there’s tons of policy around this as well.
    0:39:27 I hesitate to say we should look to how China has built up their grid,
    0:39:30 but I think the elephant in the room is in the early 2000s, they were experiencing blackouts.
    0:39:32 This was a very common thing.
    0:39:33 This was horrific.
    0:39:37 But now, they’ve 4x’d their grid in the last couple of decades.
    0:39:41 And so the way they’ve done this is by basically deploying generation capacity,
    0:39:46 building hydro, building massive storage facilities, of course, BYD, CATL, tons of battery production.
    0:39:51 They built HVDC lines, these large high-voltage direct-current transmission lines.
    0:39:52 I would do all of that.
    0:39:53 I mean, I would look to all of it.
    0:39:56 And whether or not it’s a good investment or not, I’d look at a number of factors,
    0:40:00 but it is much more of the infrastructure projects, the glue that connects this stuff together.
    0:40:05 I think our lens today is looking at these technologies that enable a lot of this more flexible grid.
    0:40:08 But I think there’s also going to be these large infrastructure projects, the webbing in between it.
    0:40:12 I think software is a big piece of it that we’re spending a lot of time on looking at.
    0:40:14 But I think, how is ERCOT going to be connected to the rest of the grid?
    0:40:16 Or how are we going to move this power around?
    0:40:19 If it’s really sunny in the Southwest, solar is going to be really cheap.
    0:40:22 Is there an efficient way to move that to New York or something like that?
    0:40:23 China’s done this effectively.
    0:40:26 And I think if you had hundreds of billions of dollars spent or trillions of dollars,
    0:40:28 what does the grid look like?
    0:40:29 Like, it’s going to be a lot more interconnected.
    0:40:33 Maybe another answer to your question or a different answer to your question is,
    0:40:37 I think the energy industry is probably medium to long-term,
    0:40:42 like one of the most prime spots to deploy physical autonomy.
    0:40:50 So when you think about applications of robotics, whether it’s humanoids or more kind of task-specific
    0:40:53 robotics, we’re talking about dangerous jobs often.
    0:40:57 We’re talking about manufacturing jobs to build up, whether it’s small-scale reactors or batteries
    0:40:58 or whatever.
    0:41:04 So I don’t know what the shape of the company is here and how reliant it would be on some of
    0:41:06 the robot learning work that’s happening.
    0:41:10 But I do think that as we scale our energy capacity, there’s going to be a pretty massive
    0:41:13 application of industrial robotics to the energy sector.
    0:41:19 I made this joke, I think, once to Ryan, or maybe I made it at the American Dynamism Summit,
    0:41:24 that we survived the greatest nuclear disaster in U.S. history just recently when we finished
    0:41:28 the Vogtle 3 and 4 reactors and let all those employees go back to other jobs.
    0:41:32 So I think if we were writing a billion-dollar check into power, what we would do is we would
    0:41:36 just give jobs to those people and not let them go back to whatever it was they were doing
    0:41:38 before they were building nuclear reactors.
    0:41:43 And we would just really work to streamline the process to make sure that we go build Vogtle
    0:41:48 5, 6, 7, 8, 9, 10 in all these different states around the country and just put these
    0:41:52 people to work for the next decade-plus building reactors.
    0:41:56 And that, to me, was the greatest miss and probably the greatest opportunity.
    0:42:01 I don’t think it’s particularly our opportunity, but I do think it’s an opportunity for somebody
    0:42:02 to do.
    0:42:07 Labor broadly, like, this is a little tangential, but when Microsoft was building their new data
    0:42:12 center in Georgia last year, at one point, they had on staff at Microsoft or on contract
    0:42:15 more than a third of the electricians in the state of Georgia.
    0:42:17 And they basically maxed out.
    0:42:20 They hired every single electrician that they possibly could.
    0:42:23 So I don’t think it applies to just electricians.
    0:42:29 It’s, to your point, the cement mixers, it’s the mechanical engineers, it’s the nuclear
    0:42:30 engineers.
    0:42:36 Like, how do we actually train the next generation energy workforce that we’re going to need to
    0:42:39 modernize the grid is a big, big challenge.
    0:42:43 These are very high-paying jobs where you don’t have to check your email on your phone at 9 p.m.
    0:42:45 after you go home from work.
    0:42:46 They’re high-paying.
    0:42:48 They’re good exercise.
    0:42:50 And they’re relatively low stress.
    0:42:51 These are good jobs for people.
    0:42:55 I think one more comment on this, and it’s more of an industrial policy question, is we’re
    0:42:59 talking about specific things, but like oftentimes that just moves the bottleneck.
    0:43:01 Like we could solve a lot of sort of the grid connection hookup.
    0:43:05 We could build a lot of like transmission lines, but then we need more transformers.
    0:43:07 And to build more transformers, you need more electric steel.
    0:43:10 You can do the same sort of equation for much of the supply chain.
    0:43:14 Batteries are another good example: okay, cool, we’re building cells, but then we also
    0:43:17 need active materials, and we need to mine, and things like that.
    0:43:21 And so, you know, it’s sort of a whole effort of examining sort of our infrastructure and
    0:43:23 our supply chains, and you need to do all of it.
    0:43:25 And I think that’s a complicated question.
    0:43:26 That’s an expensive question.
    0:43:27 Any last reflections?
    0:43:32 I think the last thing I would say is that people underestimate how critical and important
    0:43:38 a resilient, reliable, dispatchable electrical grid is to our national security.
    0:43:42 You cannot have national defense and national security without reliable electricity.
    0:43:44 It’s just not possible.
    0:43:48 So all of these things we’re talking about are about the upside, about capitalizing on
    0:43:53 AI compute, the switch to electric vehicles and our insatiable thirst for electricity.
    0:43:56 But at a fundamental level, there is no safety.
    0:43:57 There is no national defense.
    0:44:00 There is no national security without a reliable electrical grid.
    0:44:06 To reiterate that: people want reliable, cheap, and clean power, in that order.
    0:44:09 And I think that’s largely how we should think about our energy policy.
    0:44:12 And I think that’s sort of the direction we’re going.
    0:44:14 And I think we need to make sure we stay aligned with that.
    0:44:16 That’s an exciting note to wrap on.
    0:44:17 David, Ryan, Aaron, thanks so much for coming to the podcast.
    0:44:18 Thank you.
    0:44:19 Thank you.
    0:44:23 Thanks for listening to the A16Z podcast.
    0:44:28 If you enjoyed the episode, let us know by leaving a review at ratethispodcast.com slash
    0:44:29 A16Z.
    0:44:32 We’ve got more great conversations coming your way.
    0:44:33 See you next time.

    U.S. per capita energy usage peaked in 1973. Since then? Flat. Meanwhile, China’s per capita energy use has grown 9x.

    Today, AI, EVs, manufacturing, and data centers are driving demand for more electricity than ever—and our grid can’t keep up.

    In this episode, a16z general partners David Ulevitch and Erin Price-Wright, along with investing partner Ryan McEntush from the American Dynamism team, join us to unpack:

    – How America’s grid fell behind

    – Why we “forgot how to build” power infrastructure

    – The role of batteries, solar, nuclear, and software in reshaping the grid

    – How AI is both stressing and helping the system

    – What it’ll take to build a more resilient, decentralized, and dynamic energy future

    Whether you’re a founder, policymaker, or just someone who wants their lights to stay on, this conversation covers what’s broken—and how to fix it.

     

    Resources: 

    Find David on X: https://x.com/davidu

    Find Erin on X: https://x.com/espricewright

    Find Ryan on X: https://x.com/rmcentush

     

    Stay Updated: 

    Let us know what you think: https://ratethispodcast.com/a16z

    Find a16z on Twitter: https://twitter.com/a16z

    Find a16z on LinkedIn: https://www.linkedin.com/company/a16z

    Subscribe on your favorite podcast app: https://a16z.simplecast.com/

    Follow our host: https://x.com/eriktorenberg

    Please note that the content here is for informational purposes only; should NOT be taken as legal, business, tax, or investment advice or be used to evaluate any investment or security; and is not directed at any investors or potential investors in any a16z fund. a16z and its affiliates may maintain investments in the companies discussed. For more details please see a16z.com/disclosures.

  • Aaron Levie on AI’s Enterprise Adoption

    AI transcript
    0:00:02 What is the journey over the next decade?
    0:00:06 It’s about the speed at which humans can change their workflows.
    0:00:08 Why doesn’t the breakthrough that we just saw get released?
    0:00:11 Why doesn’t that permeate every corporation within six months?
    0:00:16 It’s so strange to me how many disruptions are happening all at the same time.
    0:00:17 Your R&D is changing.
    0:00:18 Yeah.
    0:00:20 Every part of the stack is changing.
    0:00:20 Like everything.
    0:00:23 We’re not in like a fear of AI world.
    0:00:28 We’re in a, we know this is going to happen and it needs to happen to us faster than it
    0:00:31 happens to our competitors, which is a totally different dynamic than we saw with cloud.
    0:00:36 What do you think is the best metric for anybody interested in tracking this stuff as far as
    0:00:37 like how fast it’s going?
    0:00:37 Is it GDP?
    0:00:39 Is it margin?
    0:00:40 Is it top line?
    0:00:42 Is it headcount growth?
    0:00:43 Is it all the above?
    0:00:47 It’s basically fully assumed that AI is going to take over the enterprise.
    0:00:50 How does AI actually change the enterprise?
    0:00:54 Not just in theory, but in how software is built, sold, and used?
    0:01:00 In today’s episode, A16Z general partner Martin Casado sits down with Aaron Levie, co-founder
    0:01:05 and CEO of Box, to explore what it means to be an AI-first company from product strategy
    0:01:06 to internal workflows.
    0:01:11 They talk about why incumbents may be better positioned than expected, how startups can still
    0:01:15 break out, the rise of agents and vibe coding, and what happens when the bottleneck isn’t the
    0:01:16 tech, but the org chart.
    0:01:22 Aaron also shares how Box is using AI internally today and why he thinks the next generation of
    0:01:25 employees may spend more time managing agents than writing code.
    0:01:26 Let’s get into it.
    0:01:33 As a reminder, the content here is for informational purposes only, should not be taken as legal
    0:01:37 business, tax, or investment advice, or be used to evaluate any investment or security,
    0:01:42 and is not directed at any investors or potential investors in any A16Z fund.
    0:01:46 Please note that A16Z and its affiliates may also maintain investments in the companies
    0:01:47 discussed in this podcast.
    0:01:54 For more details, including a link to our investments, please see A16Z.com forward slash disclosures.
    0:02:01 Aaron, thank you very much for joining us.
    0:02:01 Thank you.
    0:02:03 Everybody here already knows you.
    0:02:05 However, I still think you should intro yourself, just for completeness.
    0:02:12 Aaron Levie, CEO, co-founder of Box, and at Box, we help enterprises basically take all of their
    0:02:18 unstructured data or enterprise content and turn it into valuable information, and AI is
    0:02:20 absolutely this incredible accelerant for that problem.
    0:02:22 I just learned that we’re investors, did you know?
    0:02:24 Well, many years ago.
    0:02:24 Many years ago.
    0:02:26 So no claims post-IPO.
    0:02:31 But actually, Ben Horowitz had this early kind of blog post on basically, I think it was
    0:02:32 the title of The Fat Startup.
    0:02:32 Yeah.
    0:02:33 Yeah, yeah, yeah.
    0:02:35 In response to enterprises, the lean startup.
    0:02:36 Yeah, that’s right.
    0:02:40 And let’s just say we very much took that to heart, and we basically like deployed every
    0:02:45 single lesson, which was like the name of that game is you get big fast, you scale aggressively,
    0:02:48 and that was a very important period in our company’s journey.
    0:02:52 So the notional topic of this is AI in the enterprise.
    0:02:56 But I think it’s good to be kind of nuanced about this, because it’s less obvious than
    0:03:02 people think, and you’ve been talking a lot about AI on X, but also you’re thinking about
    0:03:03 it in the terms of your business.
    0:03:07 So let me just kind of set up the first question as follows, which is, AI has historically been
    0:03:11 this very B2B enterprise thing, like chatbots or whatever, personalization systems.
    0:03:16 But what’s unique about Gen AI is a lot of the use cases are actually like a consumer or
    0:03:17 prosumer, right?
    0:03:23 Think like creativity or developers, and it actually hasn’t made inroads as much
    0:03:24 into the enterprise yet.
    0:03:25 It’s just starting now.
    0:03:27 So maybe just a couple of questions.
    0:03:30 First off, A, does that match with your experience?
    0:03:34 And then B, how are you thinking about this transition to the enterprise?
    0:03:35 Yeah.
    0:03:41 I think if you were to probably like do the idiosyncrasies of AI and then reverse engineer
    0:03:45 why that was the journey, basically up until, let’s say, the pre-ChatGPT moment,
    0:03:47 AI was extremely hard to use.
    0:03:52 It required in many cases having custom models for basically every problem you tried to solve.
    0:03:57 And so there was almost no way that a consumer ecosystem could flourish based on that.
    0:03:59 It was just not generalizable enough.
    0:04:03 There was really few products other than like maybe Siri, Alexa, et cetera, that you’d interact
    0:04:05 with that would even have some sense of AI.
    0:04:10 And so enterprises were the early adopters of AI systems to bring workflow automation to
    0:04:11 their companies.
    0:04:16 Then boom, ChatGPT happens and all of a sudden it’s the exact right form factor for mass adoption.
    0:04:17 There’s no startup costs.
    0:04:19 It costs two seconds to learn the product.
    0:04:20 It’s a chat interface.
    0:04:24 So it was like perfectly ripe for just taking off in the consumer space.
    0:04:29 And then you have also these incredible conditions set up for mass adoption.
    0:04:31 You have billions of people on the internet.
    0:04:32 It was set up as a free product.
    0:04:36 Again, it kind of solved this sort of latent kind of question mark that everybody had, which
    0:04:40 is like, when are we going to see AI work and touch our lives?
    0:04:45 And so everything was kind of like the perfect conditions to get mass consumer adoption.
    0:04:49 On the enterprise side, you have unfortunately kind of the opposite, right?
    0:04:53 You have lots of workflows that have been kind of ingrained for decades and decades.
    0:04:59 You have lots of legacy IT systems that have data kind of not set up well to be accessed by AI.
    0:05:05 You have a sort of shadow IT problem, which is most corporations don’t want, and users just
    0:05:10 injecting text into prompts that might contain information that the AI models could learn off of.
    0:05:14 So it’s sort of a difficult environment for that same level of virality.
    0:05:20 With the exception of a few of these prosumer categories, I have talked to large corporation
    0:05:24 CIOs that are seeing people just show up with Windsurf and Cursor and Replit.
    0:05:28 And so you’re getting actually this sort of shadow IT version that we saw 15 years ago.
    0:05:30 DevTools has always had that.
    0:05:31 Yeah, 100% fair.
    0:05:31 100%.
    0:05:32 So DevTools have had that.
    0:05:36 But I think that you’re still seeing that now in the ChatGPT kind of leakage into organizations.
    0:05:37 Right.
    0:05:42 I’m sure their prosumer usage inside of a corporation’s firewall is off the charts, even separate
    0:05:43 from the people that pay for it.
    0:05:43 Totally.
    0:05:48 So now the question, though, is what is the journey over the next decade for the real
    0:05:54 change management of deployment of AI systems that drive the more like GDP changing productivity
    0:05:55 gains?
    0:05:58 And that’s something I do think we have to be prepared for.
    0:05:59 This is many years.
    0:06:03 It’s about the speed at which humans can change their workflows as opposed to how kind of quickly
    0:06:06 the technology can just sort of evolve in advance.
    0:06:11 And so we in Silicon Valley and certainly anybody tuning into this sort of imagines like, well,
    0:06:13 why doesn’t the breakthrough that we just saw get released?
    0:06:16 Why doesn’t that permeate every corporation within six months?
    0:06:20 And it’s because like people just have meetings and they have budget processes and they have
    0:06:24 to go through a governance council and they have to get compliance on board and they have
    0:06:28 to figure out like who has the liability when the thing recommends this stock and then the
    0:06:30 financial services provider shares that with a client.
    0:06:33 Like that takes years and there’s going to be case law that needs to happen.
    0:06:36 And we still have lawsuits that are going on about who owns the IP of this stuff.
    0:06:38 So that part is going to take years.
    0:06:42 What’s interesting, and I think you’ll especially appreciate this on the cloud side is I remember
    0:06:47 when we first were scaling up in the enterprise, let’s say 2007, 2008, 2009, let’s say that three
    0:06:53 to five year period, post AWS, post kind of cloud starting its journey, basically to a
    0:06:58 T, every conversation you’d have with a CIO or a group of CIOs was basically like, yeah,
    0:06:59 that’s nice.
    0:07:01 Maybe some little corner of our organization could use this.
    0:07:03 We are never going to go fully to the cloud.
    0:07:05 They had their arms wrapped around their servers.
    0:07:06 I remember.
    0:07:06 Yeah.
    0:07:09 And basically they did not want to give up the infrastructure.
    0:07:11 There was too many questions, too many compliance issues.
    0:07:15 There was just existential job questions of, well, what happens when this, you know, gets
    0:07:16 delivered as a service?
    0:07:17 Here’s a super interesting.
    0:07:21 Let’s say we’re now two and a half years into the ChatGPT moment.
    0:07:24 That same group of CIO conversations, none of that.
    0:07:29 It is basically assumed, it’s basically fully assumed that AI is going to take over the enterprise.
    0:07:36 Like the CEO, the CIO, the CDO, every job, every org leader is basically like, we know this is going to happen.
    0:07:39 This is not like a, oh, we’re trying to kind of push it off.
    0:07:42 It is purely a sequence of events.
    0:07:43 Who do I deploy?
    0:07:44 How do I deploy it?
    0:07:45 How do I drive the change management?
    0:07:46 Is the model ready?
    0:07:55 So what’s really interesting is I think the level of buy-in you have now in the enterprise is like five times greater than we had in the early days of cloud.
    0:07:57 And you can even see it.
    0:08:04 Like to me, the classic litmus test was, if you remember like 15 years ago, I think Jamie Dimon was probably most famous for saying like, we’re never going to go to the cloud.
    0:08:04 Yes.
    0:08:07 So like they basically said, J.P. Morgan will never go to the cloud.
    0:08:07 Yeah.
    0:08:19 Today, for that equivalent commentary, I don’t have a perfect Jamie Dimon quote, but David Solomon at Goldman Sachs has given this anecdote that they can now write an SEC filing or an S-1 for an IPO in like a few minutes.
    0:08:21 That used to take a number of analysts a few days.
    0:08:28 And so the fact that like those are the anecdotes already coming out of the biggest banks means that we’re not in like a fear of AI world.
    0:08:36 We’re in a, we know this is going to happen and it needs to happen to us faster than it happens to our competitors, which is a totally different dynamic than we saw with cloud.
    0:08:42 So do you think this has implications for companies today that are building products that are pre-AI products?
    0:08:49 So for example, with the cloud wave, you basically had a bunch of cloud native companies that ended up taking over.
    0:08:54 So for example, Snowflake is a great example of this, which is like the ones that decided not to go all in and were hybrid.
    0:08:57 Like hybrid kind of became known as like means it won’t work.
    0:08:59 You know, anything called hybrid hasn’t worked.
    0:09:09 So do you think because the buyer and the enterprise is more ready that like companies that are pre-AI have more of an opportunity?
    0:09:13 Or do you think that you’re going to see the same thing with a lot of like AI native companies do well?
    0:09:16 I’m going to basically give you the non-answer of I think both.
    0:09:28 And one benefit that the cloud cohort has, the SaaS cohort, post all of us understanding and agreeing on what SaaS would look like, whether we adhered to this perfectly or not is a question,
    0:09:31 But we basically all tried to build API first platforms.
    0:09:31 Yeah.
    0:09:34 Or at least like API kind of like equal platform.
    0:09:36 So we have the UI and we have the API.
    0:09:42 And if you think about it, like AI and AI agents are like the perfect consumers of an API, right?
    0:09:46 And so they basically become these super users within your system on your APIs.
    0:10:01 So if I had to just say, okay, I want to deploy agents to go and automate my ServiceNow workflows, I think I’m better off just deploying the ServiceNow agent to go do that than do an entire reinvention of my ITSM system to solve that use case.
    0:10:02 And you can just go down the list.
    0:10:11 Like Workday, if I want an AI agent to do some kind of HR-related task, I think I’m better off to just do that within Workday than I am building an entire new system.
    0:10:14 So you have a bunch of different factors versus the pre-cloud days.
    0:10:17 Like pre-cloud to post-cloud was an entire rewriting of your software.
    0:10:19 You had to go from single-tenant to multi-tenant.
    0:10:21 The scaling of the systems were totally different.
    0:10:25 Even the functionality and application logic was different because like it should be real-time.
    0:10:26 It should be collaborative.
    0:10:30 It shouldn’t be as sort of async and batches as the on-prem systems were.
    0:10:35 And so in a cloud world, it is a reinvention of the user experience and what you’re doing in the system.
    0:10:36 And we should definitely get to that.
    0:10:40 Well, I just want to make sure I tease this out because it’s actually a very interesting point.
    0:10:47 So your claim is to go from pre-cloud to post-cloud, like that ripped through the entire stack all the way down to like the infrastructure, for example, like tenancy.
    0:10:49 Like you have to rewrite everything.
    0:10:55 And then what you’re saying about AI is more of a consumption layer thing, which is you just treat the existing systems as they are.
    0:10:56 And then the AI becomes the consumption layer.
    0:11:02 Do you think this is like a 1.5 step and like the 2.0 step kind of rips through the entire stack?
    0:11:04 Okay, so let’s bookmark that one for one second.
    0:11:09 But like if you do pure Clay Christensen sort of approach, you know, sustaining innovation, disruptive innovation.
    0:11:14 Disruptive innovation is this thing that looks like so much harder, so different, so less profitable.
    0:11:17 Sustaining is like, actually, no, I’d like to build that because it’s incremental.
    0:11:19 It’s better for our business overall.
    0:11:22 The on-prem guys had a disruptive innovation.
    0:11:26 Everything about the business model of SaaS looked different, harder, stranger.
    0:11:28 I don’t have the talent.
    0:11:32 I’m running a service delivery operation as opposed to I ship you a CD-ROM with my code.
    0:11:35 Everything about the finances and pricing model.
    0:11:35 Yes, everything.
    0:11:36 Everything, the business model, everything.
    0:11:41 AI, again, with the bookmark being like the really big disruption that you could contemplate,
    0:11:45 right now with AI, everything kind of looks like a sustaining innovation if you’re an incumbent,
    0:11:49 which is like, instead of a user pressing the buttons in the application,
    0:11:53 let’s have an agent run through the API and operate as if they were that user.
    0:11:57 And so all of a sudden, for a lot of SaaS providers, this looks like a TAM expansion
    0:12:02 because now, for the first time ever, I can actually deploy my software for use cases
    0:12:06 where the customer didn’t have users on the other end before to do those things.
    0:12:07 So I think you have a lot of TAM expansion.
    0:12:09 Now, the good news for startups.
    0:12:11 With one caveat, which maybe we’ve bookmarked and we’re going to get to,
    0:12:12 but let me just say the one caveat.
    0:12:17 The one caveat is you now have a component that has a very different COGS model
    0:12:19 if you’re a software provider.
    0:12:19 Yes.
    0:12:22 And so like now, it’s almost like when we went from like on-prem to cloud,
    0:12:24 we went from perpetual to recurring.
    0:12:30 And it feels like with AI, you kind of have to go from recurring to usage-based just because.
    0:12:30 Yeah.
    0:12:33 So business model will shift for some of the use cases
    0:12:37 because even if you look at the cursors, replets, windsrifts of the world,
    0:12:40 there does seem to be this baseline seat price.
    0:12:43 And then your consumption usage thing is sort of this add-on.
    0:12:43 This overage fee.
    0:12:47 And so SaaS providers are kind of well-structured to be able to have that kind of dynamic.
    0:12:48 Yeah.
    0:12:51 If it was 100% usage and the user seat goes away,
    0:12:53 I do agree that you have this, then you have a little bit of a business model crisis.
    0:12:57 Oh, so you think, but right now, it’s not clear that that’s going to go all the way over.
    0:13:00 Well, until the human literally is not a seat on the system,
    0:13:04 I think you don’t remove the end user license as a component.
    0:13:04 Okay.
    0:13:06 But again, that could be like the much bigger disruption.
    0:13:11 Now, just to fully lay out the market dynamics, I think SaaS incumbents,
    0:13:14 especially you have a couple other idiosyncrasies right now versus the on-prem days.
    0:13:17 Another idiosyncrasy is I would say like on the margin,
    0:13:20 you tend to have founders still leading the SaaS companies.
    0:13:20 100%.
    0:13:22 And so we didn’t really have that in the on-prem world.
    0:13:28 And so like Siebel already had three CEOs later and PeopleSoft already had multiple CEOs later.
    0:13:30 So it was a different leadership structure in these organizations.
    0:13:33 A lot of times you still have the founder around, they’re poking around,
    0:13:33 they’re really into AI.
    0:13:37 So there can be a more natural pivot of the company from the leadership standpoint.
    0:13:39 So a bunch of different factors.
    0:13:43 Now, to the benefit of startups, which is why I can hold both of these in my head,
    0:13:48 which is I’m very bullish on the SaaS incumbent being the natural place for that AI agent
    0:13:49 relative to that category.
    0:13:53 I just think we have this incredible expansion of categories for the first time
    0:13:55 that we haven’t seen in probably 15 years.
    0:14:02 So the SaaS 1.0 wave actually expanded the software universe where we had these new categories of
    0:14:04 software that we didn’t expect before.
    0:14:08 Like nobody would have predicted the Confluents and the Snowflakes in the pre-on-prem days.
    0:14:11 We didn’t have all of these different cuts of how do you work with data?
    0:14:12 How do you do this workflow?
    0:14:16 Like lines of business didn’t have 15 different applications they got to use.
    0:14:17 Post-SaaS, they did.
    0:14:23 So for startups in the AI world, the equivalent of that is I think there’s a lot of categories now
    0:14:29 where there is no actually software incumbent in that category where AI agents all of a sudden
    0:14:31 let you go build software for that category.
    0:14:33 Legal, healthcare, education, and so on.
    0:14:36 So that’s definitely true on the consumer side, right?
    0:14:39 If you look at the top use cases of open AI, it’s almost like the top of the pyramid of needs,
    0:14:40 right?
    0:14:42 It’s like creativity and fulfillment, et cetera.
    0:14:46 And I think like number five is like professional coding, but everything above that is one of these.
    0:14:48 So on the consumer, that’s very clear.
    0:14:50 Is that clear on the enterprise side?
    0:14:50 I absolutely think so.
    0:14:57 I think if we did a snapshot 10 years ago of the size of the contract management market or
    0:15:00 the legal document market, it’s like sub 2 billion.
    0:15:01 I’m making up the numbers.
    0:15:02 It could be plus or minus a billion.
    0:15:08 Would you agree that in five years from now, the AI agent related spend on legal services
    0:15:12 should be in the many, many billions to double digit billions?
    0:15:13 Absolutely.
    0:15:13 Okay.
    0:15:13 No question.
    0:15:18 So all of a sudden there’s like not these natural incumbents that were like, oh, we captured all
    0:15:19 that market.
    0:15:23 AI agents all of a sudden expands the size of the software related spend in that space.
    0:15:28 So I can underwrite that for healthcare, legal, consulting services.
    0:15:31 I think there’s entire areas of financial services.
    0:15:33 Like we always think, oh, finance has been wired up for so many years.
    0:15:36 No, banking, consumer banking has been wired up.
    0:15:37 Trading has been wired up.
    0:15:39 Investment banking never went digital.
    0:15:41 Wealth management never went digital.
    0:15:46 Like these were not categories where you ever had like major software platforms to help these
    0:15:47 entire categories of the economy.
    0:15:51 And the reason it was because the work was unstructured.
    0:15:56 It’s very ad hoc, very dynamic, lots of unstructured data as opposed to stuff that goes into databases.
    0:15:58 All of that is now ripe for AI.
    0:16:02 And that will then largely be ripe for many startups because there won’t be a natural incumbent
    0:16:03 in those spaces.
    0:16:08 I mean, it’s so strange to me how many disruptions are happening all at the same time with AI,
    0:16:08 right?
    0:16:11 I mean, if you think about like everything you said, which is basically vertical SaaS or vertical
    0:16:13 use cases, which a lot of that is actually human budget, right?
    0:16:13 Yep.
    0:16:14 That’s being disrupted.
    0:16:17 There’s a bunch of new use cases that we never really thought about before, which is
    0:16:21 like the creativity and I mean, who would have thought that 2D image would be some massive
    0:16:21 market?
    0:16:22 Yes.
    0:16:23 But it’s a massive market, right?
    0:16:27 It turns out, you know, I’ve been a programmer for 30 years, right?
    0:16:30 And in that time, like software would disrupt other things.
    0:16:30 Yeah.
    0:16:33 Like we disrupt all of these things, but we never got disrupted.
    0:16:34 We’re safe.
    0:16:35 We’re screwing you guys.
    0:16:38 But clearly now software is being disrupted, right?
    0:16:40 For the first time like I’ve ever seen in 30 years.
    0:16:46 And so do you think this level of disruption is something that existing companies will not?
    0:16:48 Like maybe a more fine point.
    0:16:49 You are a business leader right now.
    0:16:50 You have to think about product.
    0:16:52 You have to think about your organization.
    0:16:53 Does it require you to have to think about too much?
    0:16:55 Like how do you structure your company as well?
    0:16:59 How do you structure your product or do you think this is actually all pretty manageable?
    0:17:01 I think it’s your R&D product.
    0:17:04 Like literally like I’m putting myself in your shoes, right?
    0:17:04 Yeah, yeah, yeah.
    0:17:05 Which is like your CEO.
    0:17:06 Like your R&D is changing.
    0:17:07 Yeah.
    0:17:08 You’re like-
    0:17:09 Every part of the stack is changing.
    0:17:09 Like everything.
    0:17:10 Yeah.
    0:17:23 I think, frankly, I’m probably so distracted by what we’re building that I don’t have enough time to stress out about the actual organizational side, because I’m stressed out enough about just literally the actual pure delivery of the product.
    0:17:27 I think if I had a little bit more time, I’d get more stressed out about all the other change.
    0:17:30 We are very much leaning into the idea of being AI first.
    0:17:31 We have a twofer on this.
    0:17:36 Like one, by being as AI first as possible, we’ll see the use cases that our product should go solve for customers.
    0:17:37 So like check that box.
    0:17:41 And then second is I’m just a believer of the efficiency productivity gains.
    0:17:41 Yeah, yeah, sure.
    0:17:44 And I do think it does change basically everything about work.
    0:17:46 And there’s lots of these interesting examples of what it means.
    0:17:51 So in the future, does the individual contributor basically become a manager of agents?
    0:17:52 Yeah, yeah, yeah.
    0:17:53 So that’s a totally different job.
    0:17:54 Right.
    0:17:54 Right?
    0:18:03 Like my recent kind of go-to is just thinking about it as a lot of the productivity of your organization was rate limited by literally like how fast can somebody use a computer to do something?
    0:18:03 Yeah.
    0:18:06 To type an email, to write code, to generate a marketing asset.
    0:18:06 Yeah.
    0:18:10 When that’s no longer a limiter, how do these jobs begin to change?
    0:18:18 And it’s like, okay, your job is now orchestration, integration of work, planning, task management, reviewing, auditing, and that will radically change work.
    0:18:33 Interestingly, this probably behooves us to not over-rotate on transforming yet internally for any given company simply because the technology is changing so fast that like you probably wouldn’t want to snap the line right now, run your whole business on this technology.
    0:18:35 Because in two years from now, it’s going to happen.
    0:18:37 Because in two years from now, it’s going to be so much better.
    0:18:45 And so I think progressively figuring out which workflows have high impact upside, getting it rolled out in a decentralized way so people can experiment.
    0:18:47 Like I think you want to do a few of those kind of things first.
    0:18:50 I mean, I can’t imagine a listener not knowing what Box does.
    0:18:57 But just for completing this, maybe can you just talk to us very quickly about what Box does and how you’re thinking about how that dovetails with AI?
    0:19:02 Yeah, so we started the company with a really simple premise, make it easy to access and share your files from anywhere.
    0:19:06 And we pivoted about two years into the journey to focus on the enterprise market.
    0:19:10 And the whole idea was enterprises are awash with all this unstructured data.
    0:19:17 So corporate documents, research files, marketing assets, M&A documents, contracts, invoices, all of this.
    0:19:22 And as companies move to the cloud and as they move to mobile, they need a way to access that information.
    0:19:25 They need a way to collaborate securely on it.
    0:19:28 They want to be able to integrate that data across different systems.
    0:19:30 So we built a platform to help companies do that.
    0:19:34 We have about 120,000 customers, about 65 or so percent of the Fortune 500.
    0:19:40 And so what’s incredible right now is we’ve had this ongoing problem since the creation of the company,
    0:19:46 which is with structured data, the stuff that goes into your database, you can query it, you can synthesize it,
    0:19:49 you can calculate it, you can analyze it, your unstructured data, the stuff that we manage,
    0:19:54 you create it, you share it, you look at it, and then it basically kind of gets forgotten about.
    0:19:57 Like it goes into some folder and you almost never see it again.
    0:20:01 And maybe you kind of find it once every five years for some task you’re doing, but that’s about it.
    0:20:06 And so most companies are sitting on most of their data being unstructured
    0:20:11 and getting the least amount of value from it relative to their other structured data.
    0:20:13 AI is basically the unlock.
    0:20:17 So AI lets you finally say, okay, we can ask this data questions.
    0:20:22 We can structure it so we can look at a contract, pull out the 10 most important fields.
    0:20:25 Once we have all that data, we can analyze that information.
    0:20:26 We can get insights from it.
    0:20:31 And then you can start to do things like workflow automation that was never possible with your unstructured data.
    0:20:36 So if I want to move a contract through an automatic process, I can’t do it if I don’t know what’s in the contract.
    0:20:40 And the computer previously was not able to know what’s in the contract.
    0:20:46 So for us, there’s just a huge unlock of now what you can finally do with your information and your content.
    0:20:50 So we’re building an AI platform to handle all of the kind of plumbing user experience
    0:20:53 to make then your content AI ready effectively.
    0:20:57 I don’t want to be like too bullshitty and provocative, but I have to ask this.
    0:20:58 Please.
    0:21:00 I’ve been in enterprise software for a very long time.
    0:21:05 A lot of the business model is predicated on the fact that building software is hard and takes a long time.
    0:21:05 Yeah.
    0:21:08 To what extent do you worry about that not being the truth going forward?
    0:21:13 Do you think we enter like this time of bespoke software being upon us?
    0:21:17 I’m bearish on the extreme version of the essence of that.
    0:21:22 So the extreme version of that, if you imagine the polls of this, like the extreme on one poll,
    0:21:24 basically all software is prepackaged.
    0:21:26 It’s the Ford Model T.
    0:21:28 It’s going to work only in one way.
    0:21:29 Everybody uses the same thing.
    0:21:30 Okay.
    0:21:30 That’s not going to happen.
    0:21:31 We get that.
    0:21:34 The other extreme is like everything is just like homebrew.
    0:21:37 You wake up in the morning, you utter something, you get your software for the day.
    0:21:38 You get your software for that thing.
    0:21:40 And then the next day you do it again and you change it.
    0:21:40 Okay.
    0:21:45 The downsides of that model of why basically I think it doesn’t work is I think if you
    0:21:49 ask like the world population, you probably find that 90 plus percent just don’t care enough.
    0:21:54 They just don’t care about the tabs on their software and the modules on their dashboard.
    0:21:57 Like it’s like they want someone else to just be like, this is what you should look at in
    0:21:58 the morning.
    0:21:58 Yeah.
    0:22:01 They don’t want to have to even prompt the AI to tell them what to look at.
    0:22:01 Yeah.
    0:22:05 So given that that's basically guaranteed to be where 90% of the world lands, no matter how you
    0:22:06 cut it.
    0:22:06 Yeah, that’s a great point.
    0:22:11 That means that basically 90% of our software should largely be like, okay, you log into the
    0:22:13 HR system and it just looks like an HR system.
    0:22:19 And in fact, there’s another interesting dynamic, which is like over many years, our software
    0:22:24 and our actual way that we operate companies, there’s this flywheel relationship between them.
    0:22:29 And so the way we run our HR department is like not so different than the way Workday wants
    0:22:31 us to run our HR department.
    0:22:31 Yeah.
    0:22:35 And it’s fine because that’s not the area that we’re going to have a lot of upside innovating
    0:22:35 on.
    0:22:39 And like the way that we do our ticket management from customer tickets is like the way that
    0:22:41 Zendesk decided to do ticket management.
    0:22:44 And that’s fine because that’s not the core IP of the company.
    0:22:47 In a way, it solves an operational problem for you.
    0:22:47 Yes.
    0:22:48 You don’t have to figure it out.
    0:22:48 Right.
    0:22:50 And people miss that about software.
    0:22:54 I don’t want to have to think about the workflow of an HR payroll process.
    0:22:56 I just want the software to do that.
    0:22:59 And so that’s what people are buying.
    0:23:01 And so nobody wants to customize those things.
    0:23:05 Now, again, given that we’re going to be in this world of many different outcomes playing
    0:23:11 out, the reason I’m still bullish on Replit and Vibe coding is for a different category,
    0:23:15 which is like I’m the IT person and I just have this crazy queue of tasks.
    0:23:17 And then someone’s like, can you build a website for this thing?
    0:23:21 Can you like code up some inventory random plugin for this product?
    0:23:24 It’s like that now becomes 10 times easier.
    0:23:27 So the new prototyping, scripting, the long tail of stuff that we want to do.
    0:23:31 And that long tail is so long and people never get to any of those things in that long tail.
    0:23:36 And so I could underwrite a 10x growth of the amount of custom software that gets written
    0:23:41 and the fact that these core systems don’t go away because there’s just actually going
    0:23:43 to be way more software in the world that gets created.
    0:23:44 Let me pressure test this.
    0:23:47 So like, okay, so I can imagine why it would be hard to rebuild Box because what you do
    0:23:48 is actually hard.
    0:23:49 This is core infrastructure.
    0:23:50 You store data like that’s really important.
    0:23:52 And so I don’t think you just Vibe code that away.
    0:23:56 But from my perspective, a lot of SaaS apps just look like CRUD.
    0:24:00 To me, CRUD is, I don’t know what the acronym stands for, but it’s basically you’re reading
    0:24:02 and writing data from like a backend.
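For the record, CRUD stands for Create, Read, Update, Delete. A minimal in-memory sketch of the pattern being described (the ticket data is hypothetical, not any particular vendor's app):

```python
# CRUD = Create, Read, Update, Delete -- the "read and write data from a
# backend" pattern that many SaaS apps reduce to under the hood.
class CrudStore:
    def __init__(self):
        self._rows = {}
        self._next_id = 1

    def create(self, data: dict) -> int:
        row_id = self._next_id
        self._next_id += 1
        self._rows[row_id] = dict(data)
        return row_id

    def read(self, row_id: int) -> dict:
        return self._rows[row_id]

    def update(self, row_id: int, data: dict) -> None:
        self._rows[row_id].update(data)

    def delete(self, row_id: int) -> None:
        del self._rows[row_id]

store = CrudStore()
ticket = store.create({"title": "Login broken", "status": "open"})
store.update(ticket, {"status": "closed"})
```

The technology here is trivial, which is exactly the point of the question: the defensibility has to come from somewhere other than the code.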
    0:24:08 And so do you think that there is a world where the consumption layer evolves to just using AI
    0:24:09 and this class of companies go away?
    0:24:13 Or do you actually think, if I heard what you just said, that the durability of these companies
    0:24:16 is that it basically teaches you what the workflow is?
    0:24:17 Well, I’m still going to say the latter.
    0:24:20 Now, I don’t know if you need to bleep it out, but if you want to share a couple examples
    0:24:25 of who you put in the not hard CRUD layer, then we can parse that.
    0:24:26 But up to you.
    0:24:27 The not hard CRUD layer?
    0:24:28 Yeah.
    0:24:34 I mean, I would say most vertical SaaS companies I see, the technology is trivial.
    0:24:35 Yeah.
    0:24:36 But the understanding of the domain is not.
    0:24:36 No, no, no.
    0:24:37 This is what you said before.
    0:24:38 This is what I want to present.
    0:24:38 That’s the thing.
    0:24:40 That’s actually a great insight.
    0:24:44 I’ve always underestimated vertical SaaS relative to the outcome.
    0:24:44 Yeah.
    0:24:47 And 20 years into doing enterprise software, I’m just like, no longer going to underestimate
    0:24:48 vertical SaaS.
    0:24:49 It’s not about the technology.
    0:24:52 It’s the fact that somebody else has figured out the business model that works.
    0:24:57 Like they have 10 people from the pharma industry that is like sitting next to the engineer
    0:25:00 being like, this is how you should do the clinical trial workflow.
    0:25:02 And that becomes so much of the IP.
    0:25:07 Now, that translates fine to agents, but I still would then bet on that vertical player
    0:25:13 doing that as opposed to somebody prompting their way into ChatGPT to build an FDA compliance
    0:25:13 agent.
    0:25:19 I would still largely bet on complianceagent.ai to do that over the pure horizontal system
    0:25:21 that has no particular domain kind of expertise for that.
    0:25:26 And then I think the other thing, I still think that there’s a relationship between some
    0:25:31 amount of GUI and the agent and the APIs, because again, like you don’t want to every day
    0:25:34 of your life, go to a blank empty screen and say, what’s our revenue today?
    0:25:37 You just want a dashboard at some point and it just shows you the revenue.
    0:25:37 That’s right, of course.
    0:25:39 It’s almost like cached queries in a way.
    0:25:40 Like somebody has made the decision.
    0:25:43 Yes, this is like a known way to solve this problem in the enterprise.
    0:25:48 And so I think that’s why the theory of the full abstraction away from the interface and
    0:25:49 it’s all an API call.
    0:25:50 I don’t think that happens.
    0:25:55 And so ironically, probably what will happen is in a couple of years from now, we will see
    0:25:59 agents like rebuild entire webpages and dashboards.
    0:26:01 And then we’re going to find ourselves like, wait, why are we having an agent do this?
    0:26:06 Why do I have to spend tokens to create a thing that is a config on a dashboard?
    0:26:10 And we’ll just be back to where we started for some amount of software, which will mean
    0:26:13 that basically like these things are going to live together.
    0:26:14 Cool.
    0:26:16 Let’s move from software to decision process.
    0:26:21 So I won’t say the name of the company, but I just spoke with a very, very legit company,
    0:26:22 household name.
    0:26:23 It’s a private company, though.
    0:26:24 It’s not a public company.
    0:26:30 Where, at the board level, for every decision, they ask the AI for basically more information
    0:26:31 for the decision.
    0:26:32 Okay.
    0:26:36 And this has actually been great from like discussion fodder to be provocative.
    0:26:42 And it also shows how like fundamentally unoriginal the board members are.
    0:26:45 Like this founder was telling me, it’s literally better than half of my board members.
    0:26:46 Right.
    0:26:51 And so like, how much have you thought about bringing AIs in to like help with decision
    0:26:52 process?
    0:26:52 Yeah.
    0:26:57 And by the way, I think the board is like low hanging fruit because boards tend to not have
    0:26:58 a lot of context to the business.
    0:27:00 And so the incidents are probably less anyways.
    0:27:02 But is this something that you’ve thought about?
    0:27:04 The board one is an interesting one.
    0:27:05 So maybe we can unravel that one.
    0:27:10 But like I already use it for, let’s say, our earnings calls where we’ll do a draft of the
    0:27:11 initial earnings script.
    0:27:16 And then, I mean, again, because BoxAI deals with unstructured data, I just load up the
    0:27:21 earnings script and I’ll use a better model and say, give me 10 points that analysts are
    0:27:22 going to ask about this.
    0:27:23 And like, how would I improve the script?
    0:27:25 And it just spits out a bunch of things.
    0:27:25 And it’s…
    0:27:26 And how good is it at predicting?
    0:27:27 Oh, extremely good.
    0:27:28 Oh, 100%.
    0:27:29 But the thing is, that’s not surprising.
    0:27:33 Like it has access to every public earnings call in history.
    0:27:34 Yeah, yeah.
    0:27:38 And like at the end of the day, analysts can only ask you like, tailwinds, headwinds, who’s
    0:27:38 buying what?
    0:27:41 It’s not because the analysts are smart or not smart.
    0:27:44 It’s just like, those are the things that like you would try and deduce from an earnings
    0:27:45 call, buying a stock.
    0:27:47 And you wouldn’t have thought of these questions beforehand?
    0:27:48 Or is it just like…
    0:27:49 I think you’re doing…
    0:27:50 On the margin, on the margin.
    0:27:51 No, no, sorry.
    0:27:56 So what I’m using it for is finding the specific parts of the document that are missing the answers
    0:27:57 to those questions.
    0:27:59 So I can actually inject the answers into that.
    0:28:03 Because like you’re typing out a thing and like, I forgot to give two case studies in
    0:28:04 this section or whatever.
    0:28:07 It’s a quick way to just do some analysis on something.
    0:28:12 But yeah, I mean, so Bezos famously had this memo-oriented, essay-oriented kind of meeting
    0:28:12 structure.
    0:28:13 And we never did that.
    0:28:16 But I was always fascinated by the companies that could do it.
    0:28:19 And actually, we’re entering a world where probably you could just pull that off, right?
    0:28:23 So imagine if, whether it’s a board meeting or product meeting, you just do a quick, deep
    0:28:24 research essay on the topic.
    0:28:29 Like, obviously, every meeting, every strategy meeting in history would be better off if
    0:28:32 you probably had that as a starting asset to get everybody informed.
    0:28:36 I think the argument against that would be, the reason Bezos said it is because it forced
    0:28:39 people to think clearly about what they’re doing and writing it down.
    0:28:42 So the exercise meant that people walking in the meeting had more context.
    0:28:46 This would almost argue that they would have less context because something else did the
    0:28:46 thinking.
    0:28:47 Well, two things.
    0:28:51 It was to make sure that the person doing the thing had the clarity to write it, for sure.
    0:28:54 But it was also still to inform everybody else that didn’t do that work.
    0:28:57 And so it certainly would have helped everybody else in the room.
    0:28:59 And I’m not 100%.
    0:29:03 I mean, we should do a full longitudinal analysis of like the people that wrote the essay.
    0:29:04 Did they actually have the better products?
    0:29:06 Or like, I mean, there’s some Amazon products I don’t like.
    0:29:09 And so they obviously wrote an essay also for those.
    0:29:12 So I don’t know the hit rate ultimately on the essay specifically as much as the idea of
    0:29:14 like write down a strategy, think it through.
    0:29:18 And so why not have an agent do 90% of the heavy lifting?
    0:29:24 So a lot of my workflows are like, if I have a topic where like maybe the direct change
    0:29:29 of my workflow on this front is the kind of thing that three years ago, I might sort
    0:29:32 of lob over to the chief of staff and say, hey, can you like go research like the pricing
    0:29:35 strategy of this ecosystem or something?
    0:29:37 That’s just a deep research query now.
    0:27:39 And then I’ll wake up and look at this thing.
    0:29:44 But what that does is because now I’m not having to calculate that person’s time, their
    0:29:45 tasks, their trade-offs.
    0:29:51 I just do it for the most random things, which means like I’m expanding and exploring
    0:29:54 way more spaces mentally than I would have before.
    0:29:55 And these are the kind of parts.
    0:29:59 And this is equally why I’m like actually more optimistic on the jobs front, because what we
    0:30:04 do too many times with an AI is we like look at today’s way of working and we’re just like
    0:30:06 AI will come in and take 30% of that.
    0:30:06 And it’s like, no, no, no.
    0:30:08 We’ll just do totally different things with AI.
    0:30:12 I wouldn’t have researched that thing before when it was people required to research it
    0:30:15 because that would have been an inane task to send to somebody.
    0:30:15 Yeah, yeah, yeah.
    0:30:17 One thing.
    0:30:22 So when we run the numbers and by run the numbers, I mean look through how AI companies are doing,
    0:30:23 where does the value accrue?
    0:30:26 There’s basically one takeaway.
    0:30:30 And that is like these markets are very large and growing very fast.
    0:30:32 And value is kind of accruing at every layer.
    0:30:35 Everything from like literally chips up to apps.
    0:30:40 And so like the only real sin is zero-sum thinking to be like, oh, like the models are
    0:30:43 not going to be defensible or whatever your zero-sum thinking is that just hasn’t proven
    0:30:44 out.
    0:30:47 Now, this has still largely been a consumer phenomenon.
    0:30:48 So what I’ve been thinking about, and I don’t have an answer.
    0:30:53 I’d love to hear your thought is, is when it comes to enterprise budgets, like you can’t
    0:30:55 just create budget out of thin air.
    0:30:57 So like you actually do have a limited resource.
    0:31:02 And so as budgets get reallocated, to what extent do you think this is like zero-sum,
    0:31:06 like the old budget gets robbed versus like budget accretive?
    0:31:07 Or like, how do you think about that?
    0:31:10 Because again, like where we’ve come from, that has not been an issue.
    0:31:12 I think in the enterprise, it probably will be.
    0:31:14 So it does have to come from somewhere.
    0:31:14 It’s fully logical.
    0:31:15 A couple of things.
    0:31:16 Yeah.
    0:31:20 A large number for startups can also be a very small number for a large corporation.
    0:31:21 Yeah.
    0:31:23 So you have that dynamic playing out.
    0:31:28 I’ll make up random stats, but you could probably take a meaningful engineering team and
    0:31:32 probably for the price of five of those engineers or 10 of those engineers, you could
    0:31:34 probably pay for cursor licenses for the entire engineering team.
    0:31:38 But this would argue that it’s actually coming out of headcount.
    0:31:41 So here’s where the asterisk is.
    0:31:43 There’s an infinite set of ways.
    0:31:46 This is why like you can never take a point in time snapshot on these kinds of things.
    0:31:46 Yeah, totally.
    0:31:48 There’s an infinite set of ways that this actually plays out.
    0:31:49 Yeah.
    0:31:56 Next year’s planning process, maybe in a perfectly like parallel universe, the salary increase
    0:31:59 that year would have been 3.5% for employees.
    0:32:02 And this coming year, it’s 3% because we’re going to take 0.5% and we’re going to deploy
    0:32:04 AI for the company.
    0:32:09 Or maybe next year, we would have added 50 engineers, but we’re going to add 25 and then pay for AI.
    0:32:10 But guess what?
    0:32:14 The year after, we’re going to have engineering productivity gains.
    0:32:16 So it increases because it’s still a competitive environment.
    0:32:21 We then now add engineers the year later because we’re getting higher productivity gains.
    0:32:21 Yeah.
    0:32:25 I think that most companies of any reasonable scale post 100 employees, let’s say, have
    0:32:31 enough sort of dynamism in the financial model within a one to two year period where it doesn’t
    0:32:33 look like what the economists would think it looks like.
    0:32:34 Can I just spit this back?
    0:32:37 Because I think this is actually a very good point that’s buried in there.
    0:32:42 I just want to make sure I’m following along, which is the software license cost to a startup
    0:32:46 relative to like a large people organization is relatively small.
    0:32:50 It’s just a couple of headcount, which if you just look like normal performance management,
    0:32:55 normal attrition, normal like variability, and even like hiring timelines is kind of in
    0:32:56 the noise.
    0:32:59 And so you already have an annual budgeting cycle to fix that up.
    0:33:03 And so like basically within the noise, even of just like headcount planning, all of this
    0:33:05 could work out without some massive disruption.
    0:33:06 And I think that’s such a cool point.
    0:33:10 And there could be an upper limit of this point, but let’s say the going rate in Silicon Valley
    0:33:15 of a new engineer coming out of college, let’s just say it’s somewhere between 125 and 200.
    0:33:16 Okay.
    0:33:17 I’m just making up.
    0:33:17 Okay.
    0:33:17 Yeah.
    0:33:22 Let’s say your most aggressive cursor usage or something is like a thousand bucks a year,
    0:33:22 2000 bucks a year.
    0:33:25 So you’re at like 1% salary maybe.
    0:33:26 Here’s the question.
    0:33:28 Again, do this like crazy apples to apples thing.
    0:33:33 If you went and recruited from Stanford right now and you said, okay, you Stanford grad have a
    0:33:33 choice.
    0:33:41 You can work at this company and get paid 125K with no AI, or you can get paid 123K with
    0:33:42 full access to AI.
    0:33:43 Which one are you going to do?
    0:33:45 They would do the 123 all day long.
    0:33:48 But even that, yeah, I mean like your argument is, which makes a lot of sense to me.
    0:33:50 It’s kind of on the margin when it comes to like 2K.
    0:33:55 But just as a way of exploring like why these things are not the high order bit of the cost
    0:33:55 increase on budgets.
    0:33:56 Oh, I love that.
    0:33:57 That’s great.
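The back-of-envelope version of that comparison, using the midpoints of the figures quoted above (which are the speakers' made-up illustrative numbers, not market data):

```python
# Midpoints of the quoted ranges: $125K-$200K new-grad salary,
# $1K-$2K/year for the most aggressive AI tool usage.
salary = (125_000 + 200_000) / 2      # 162,500
license_cost = (1_000 + 2_000) / 2    # 1,500

# Tool spend as a fraction of one salary -- roughly 1%,
# well inside normal headcount-planning noise.
share = license_cost / salary
```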
    0:34:02 And I did one kind of late night sort of like modeling once, but I’m afraid to say all the
    0:34:03 numbers here because I think they’re just going to be so wrong.
    0:34:07 But I think it’s something on the order like five or six trillion in knowledge worker headcount
    0:34:08 spend in the U.S.
    0:34:08 Yeah.
    0:34:11 Estimates vary for developers, let’s say 40 million.
    0:34:12 Let’s just say it’s 30 million.
    0:34:12 Yeah.
    0:34:14 Let’s say that the average is 100K, so you’re at three trillion.
    0:34:16 Man, these are just massive, massive numbers.
    0:34:17 So it’s many trillion.
    0:34:17 Yeah.
    0:34:19 So you have many trillions of dollars.
    0:34:23 If you take a couple percent of that or five percent of that, you’re already doubling the
    0:34:24 entire sort of U.S.
    0:34:25 enterprise software spend.
    0:34:26 Yeah.
    0:34:28 So you can just make it work within.
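Running those rough numbers (again, the speakers' illustrative figures, hedged as such in the conversation itself):

```python
# Back-of-envelope developer spend, using the numbers from the conversation.
developers = 30_000_000                # "let's just say it's 30 million"
avg_fully_loaded_cost = 100_000        # "let's say that the average is 100K"
developer_spend = developers * avg_fully_loaded_cost   # 3 trillion dollars

# "A couple percent ... or five percent of that" reallocated to AI tooling.
reallocated = developer_spend * 5 // 100               # 150 billion dollars
```

Even a single-digit percentage of a multi-trillion headcount line is on the order of large enterprise software budgets, which is the flexibility being described.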
    0:34:31 And this is why I don’t think that people will not make cuts because they have to pay for
    0:34:31 AI.
    0:34:33 They might make cuts for other reasons.
    0:34:33 Sure.
    0:34:37 But even in those cases, I think you’ll often have it be for myopic reasons temporarily.
    0:34:38 Yeah.
    0:34:42 And there’s enough flexibility to basically consume this and then actually recoup
    0:34:43 the productivity gains.
    0:34:43 Yeah.
    0:34:43 I think that’s great.
    0:34:49 I try and parse everything you say through the lens of like, where are you landing on AI
    0:34:50 coding?
    0:34:53 And you seem to have a very pragmatic view of where things actually are at.
    0:34:54 Yeah.
    0:34:55 Where are you landing right now?
    0:34:57 Well, it’s been evolving.
    0:35:02 So I would say in the entire AI thing, the biggest surprise to me is how effective it is
    0:35:02 at code.
    0:35:08 And so my sense is, so I’m just going to say a couple of, I think, facts, and then we can
    0:35:10 kind of back out what this means in aggregate.
    0:35:16 Because I think one fact is, the reality is, I do think that AI helps better developers more
    0:35:17 than not better developers.
    0:35:22 And the reason is you just have to deal with and be able to like know what to ask for and
    0:35:22 know how to deal with the outcome.
    0:35:23 So I think that’s one.
    0:35:27 Someone said it, I thought, beautifully, which was, this is a very good developer.
    0:35:28 And this was on X.
    0:35:29 I forgot who it was, but I thought it encapsulated it.
    0:35:35 He’s like, you know, 90% of what I know, the value of it has gone to zero, but the other 10% has
    0:35:37 gone up 10x or whatever it is.
    0:35:38 Maybe it’s gone up 100x.
    0:35:40 And I think that’s exactly right.
    0:35:46 I do think that for a lot of rote use cases, the AI can do it and it doesn’t need to be
    0:35:47 double checked, right?
    0:35:50 So there’s a lot, to your point, like things like prototyping, things like scripting.
    0:35:55 And so I do think if you look at usage of like open AI, if you actually look at code
    0:35:59 usage, it’s like the primary use is actually professional developers, which means it’s part
    0:36:00 of a developer workflow.
    0:36:07 And then probably the most controversial stance I have is, and this is probably like sunk cost
    0:36:10 fallacy because I’ve been a programmer for, I mean, like my PhD is in computer science,
    0:36:12 like, you know, so maybe this is sunk cost fallacy.
    0:36:17 But I just don’t see a world where you get rid of formal programming languages just because
    0:36:20 they arose out of natural languages for a reason.
    0:36:24 Like we started with English and then we made programming languages so that we could formally
    0:36:25 describe stuff.
    0:36:27 And so it’d be kind of a regression to go back.
    0:36:30 So I still think we’ll use languages and maybe they’ll change, maybe more like a scripting
    0:36:35 language, but I think like the existing tool set will evolve, but it’ll still be a professional
    0:36:35 developer.
    0:36:38 Like I think we’ll still have developers, still have developer tools.
    0:36:38 So that’s kind of where I am.
    0:36:39 I would love to hear where you’re at.
    0:36:42 If I’m fully, I’m on the exact same page.
    0:36:46 The fun thing to me is how coding is just at the tip of the kind of iceberg.
    0:36:51 It’s the best thing to first sort of experience agentic automation, but I think you’ll see
    0:36:52 this in basically every other space.
    0:36:59 But what’s so fun is just in a one-year shift, let’s say, of like the nature of the relationship
    0:37:00 with the AI.
    0:36:04 So if you think about it, the GitHub Copilot moment was like, oh, this thing is incredible.
    0:37:07 It’s going to type ahead and predict what I’m typing.
    0:37:12 And then you’re basically using it to work 20 or 30% faster and which parts of it do you
    0:37:13 take on or not.
    0:37:19 And now the relationship is like totally different within, again, a year or two period where you’re
    0:37:23 using Cursor, Windsurf, or whatever, and the agent is generating this chunk of an output
    0:37:25 and then you’re just reviewing it.
    0:37:30 But what’s incredible is like none of your expertise is any less valuable in that review.
    0:37:34 In fact, it’s probably even more important than ever before because in some cases, like
    0:37:39 it’s just going to be like wrong 3% of the time and then you review it, but then you’re
    0:37:41 literally doing 3x the amount of output.
    0:37:47 And the nature of how that changes both programming, but just like, why not have that for basically
    0:37:51 everything is sort of this new way that both software should work and then actually we will
    0:37:56 work is like, you know, the big joke a year after ChatGPT is like, okay, this thing generates
    0:37:59 a legal case and it’s like wrong 10% of the time.
    0:38:00 And it’s like, well, actually, hold on.
    0:38:04 If you think about what the new paradigm of work looks like, and it’s like such a weird inversion
    0:38:07 of it used to be the AI was fixing your errors.
    0:38:09 That’s what we thought the AI was going to be.
    0:38:10 And it’s just like a total flip.
    0:38:12 It’s like the human’s job is to fix the AI errors.
    0:38:14 And that’s the new way that we are going to work.
    0:38:15 Right.
    0:38:16 So this begs a very obvious question.
    0:38:17 I’m going to work up to the question.
    0:38:22 So there is a great paper in NSDI from an MIT team, which basically says you can optimize
    0:38:24 a running system with agents.
    0:38:29 And the way they did it is they basically have a teacher agent and then like more junior
    0:38:32 agents and then the more junior agents would go try a bunch of stuff.
    0:38:37 And of course, they had much more knowledge of the literature than any single human being.
    0:38:39 So they try all of different things to try it.
    0:38:41 And then the one at the senior agent would say, oh, this is good.
    0:38:42 This isn’t good.
    0:38:44 And then once it optimized the system, they would do it.
    0:38:48 And then, you know, like the human being is then kind of helping the teacher agent decide
    0:38:52 what are the parameters, what is good, what is not good and provide high level direction.
    0:38:52 Right.
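A toy version of the hierarchy just described: junior agents propose candidate tunings, a senior agent judges them against a human-chosen objective, and the best candidate wins. This is a generic sketch of the idea, not the MIT paper's actual method; the stand-in functions below are where real LLM calls and real system measurements would go:

```python
import random

random.seed(0)  # deterministic for the demo

def junior_agent_propose(current: dict) -> dict:
    """Stand-in junior agent: perturbs one parameter of the running system."""
    candidate = dict(current)
    candidate["batch_size"] = max(1, candidate["batch_size"] + random.choice([-8, 8]))
    return candidate

def senior_agent_score(config: dict) -> float:
    """Stand-in senior agent: judges a config against the human-set objective.
    Here a made-up objective peaking at batch_size=64 plays the role of
    measured system performance."""
    return -abs(config["batch_size"] - 64)

def optimize(config: dict, rounds: int = 50) -> dict:
    """Hill-climb: juniors try things, the senior keeps whatever scores better."""
    for _ in range(rounds):
        candidate = junior_agent_propose(config)
        if senior_agent_score(candidate) > senior_agent_score(config):
            config = candidate  # senior agent: "oh, this is good"
    return config

best = optimize({"batch_size": 16})
```

The human's role, as in the passage above, is deciding what `senior_agent_score` should reward, not trying the candidates by hand.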
    0:38:58 And so you’re already starting to see cases where human beings are running multiple agents
    0:39:02 and even that already is starting to have some kind of bifurcation, which one way to think
    0:39:08 about it is in any R&D organization, of course, people start as like ICs, but then they very
    0:39:11 quickly get interns and go into management.
    0:39:12 And so maybe we’re just skipping that step.
    0:39:17 So the obvious question is what happens to entry level engineers?
    0:39:22 Like does this change how people get introduced to computer science, for example?
    0:39:26 The cool thing is probably more people will even now get introduced to computer science
    0:39:27 because you’ll be able to…
    0:39:28 Anybody can learn.
    0:39:29 Anybody can learn it.
    0:39:33 And, you know, it’s been 25 years for me, but like in the early days of programming basic
    0:39:37 applications or putting other websites, it was just extremely frustrating that you would
    0:39:40 spend days and days being like, why does that thing not work?
    0:39:40 Yeah, yeah, yeah.
    0:39:43 And like I have very few resources of figuring out why the thing didn’t work.
    0:39:47 It would have been a hundred times easier if I could have had an agent write the thing.
    0:39:48 I would have learned 10 times faster.
    0:39:49 Yeah, yeah.
    0:39:53 Honestly, what you did, well, not 25 years ago, but 10 years ago, is you’d go
    0:39:53 to Stack Overflow.
    0:39:55 And so it was like the slow version.
    0:40:00 Yeah, but so think about how many people missed the window pre-Stack Overflow that got sort
    0:40:03 of pushed out of the ecosystem because they’re just like, this is too frustrating.
    0:40:07 And so you’re going to have a way bigger funnel at the top of people now learning programming
    0:40:08 and computer science.
    0:40:11 I think a similar percentage of people will fall out.
    0:40:14 So it’s not like, again, you’re going to get a 10x increase in programmers because you
    0:40:17 still have to enjoy it and you have to like solving problems and whatnot.
    0:40:21 It’s going to change the nature of the incoming class of engineers that you hire.
    0:40:25 They will literally not be able to code without AI assisting them.
    0:40:30 And it’s not 100% obvious that’s a bad thing because assuming you have internet and the site
    0:40:32 stays up, like we should have access to the agents.
    0:40:37 So I think it’s mostly just like we have to adapt how we think about the role of an engineer
    0:40:40 and what these tools are giving us in terms of the productivity gains.
    0:40:45 Actually, I meet with a lot of larger, not tech-oriented companies as customers.
    0:40:50 And generally, the thing I’m recommending is hire a bunch of these people because they’re
    0:40:55 going to flip your company on its head of how much faster the organization can run.
    0:40:56 So I do understand.
    0:40:59 I want to be sympathetic to the job market for anybody coming out of college because I don’t
    0:41:00 think it’s easy right now.
    0:41:02 And it probably hasn’t been easy in a number of years.
    0:41:10 But if you are graduating, the thing I would be selling any corporation some way or another
    0:41:15 is that if you are AI native right now coming out of college, the amount you can teach a company
    0:41:16 is unbelievable.
    0:41:20 And then conversely, if you’re a company, you should be actually like prioritizing this talent
    0:41:26 that is just like, why does it take you guys two weeks to research a market to enter?
    0:41:29 I can do that in deep research and get an answer to you in 30 minutes.
    0:41:32 They will be able to show companies way faster ways of working.
    0:41:37 Do you think there’s any stumbling into problems this way, which is like you kind of adopt too quickly.
    0:41:42 It’s like you get into a morass you can’t get out of or you think at this point it’s pretty clear
    0:41:44 this stuff can be practically consumed.
    0:41:46 What would the morass be that you’d get into?
    0:41:50 You hire a bunch of vibe coders and then they create something that nobody can maintain
    0:41:51 and it’s really slow.
    0:41:51 Oh, yeah, yeah, totally.
    0:41:54 Which, by the way, I will say I have seen this.
    0:41:54 Yeah, yeah, yeah.
    0:41:56 Okay, you could easily overdo this whole thing.
    0:42:01 So I think as with anything, like deploying these strategies in moderation
    0:42:05 while we’re all collectively still getting the technology to work better and better
    0:42:08 is super important and understanding the consequences of these systems.
    0:42:12 So, yes, this is not like a moment to just have your whole company vibe code.
    0:42:15 I will say one of my favorite things that I’m witnessing in the whole coding thing,
    0:42:18 I don’t know, the point of this talk is kind of the AI and the enterprise generally,
    0:42:20 but like the coding thing is just so salient,
    0:42:23 is that a lot of these OG programmers that I’ve known for a long time
    0:42:26 that are off-creating companies or CEOs of public companies like yourself
    0:42:27 are all back to programming.
    0:42:27 Yeah.
    0:42:30 And then you talk to them, you know, many of them, like I code, you know,
    0:42:33 most nights with Cursor just because it’s really enjoyable.
    0:42:37 And the reason I didn’t code before is because I just couldn’t keep up with the fucking frameworks.
    0:42:39 I’m like, dude, I don’t know how to install the fucking thing.
    0:42:41 And what is this Python environment stuff?
    0:42:45 And like, you’re literally learning bad design choices that somebody else just made up.
    0:42:47 Like, they’re not fundamental to the laws of the universe.
    0:42:49 They don’t make you any smarter.
    0:42:51 It’s just like waste of brain space.
    0:42:55 And so in some way, the AI just gets rid of this kind of crufty stuff
    0:42:57 that you probably shouldn’t be wasting brain space on anyway.
    0:43:01 The amount of frustration I have when I look through, let’s say, our product roadmap.
    0:43:01 Yeah.
    0:43:04 Let’s say pre-AI, although this still obviously happens
    0:43:05 because we haven’t fully transformed everything about how we work.
    0:43:08 But pre-AI, when you would see things like,
    0:43:12 we have to upgrade the Python library in this particular product.
    0:43:15 And it’s like three engineer, two quarters.
    0:43:16 No, exactly.
    0:43:20 And like, at the end of that project, zero customers will notice that we did something.
    0:43:24 We resolved some fringe vulnerability that is not going to even happen.
    0:43:26 But you have to do it because there’s some compliance thing
    0:43:30 where you have to make sure you’re on the latest version, which is super important.
    0:43:32 But like, the thing is never going to happen.
    0:43:37 And all of a sudden, like, you are wasting hundreds of thousands of dollars of engineering time.
    0:43:41 And the fact that like, that’s now like a Codex task is just unbelievable.
    0:43:45 And the amount of just now things that you can actually relieve your team to go and work on is incredible.
    0:43:51 And the other big, like, boon for the economy, and this is again where the economists just totally miss this stuff,
    0:43:56 is think about every small business on the planet, of which there’s millions, tens of millions, whatever,
    0:44:01 that for the first time ever in history, they have access to resources that are somewhat approximate
    0:44:03 to the resources of a large company.
    0:44:06 Like, they can do any marketing campaign.
    0:44:08 Did you see the NBA finals video from Kalshi?
    0:44:09 No.
    0:44:10 The Veo 3 video?
    0:44:11 Oh, yeah, yeah, yeah, yeah, yeah.
    0:44:15 Like, you can now put together an otherwise million dollar marketing video.
    0:44:18 For a couple hundred bucks of tokens.
    0:44:21 And that being applied to every domain in every service area,
    0:44:24 I can run a campaign that translates into every language.
    0:44:29 I can have this long tail of bugs that I never got around to automatically get solved.
    0:44:34 I can have the analysis of a top tier consulting firm done for my particular business.
    0:44:43 So for the people or companies that are resourceful and are creative and imaginative, the access to resources right now is just truly unprecedented.
    0:44:48 What do you think is the best metric for anybody interested in tracking this stuff as far as, like, how fast it’s going?
    0:44:49 Is it GDP?
    0:44:51 Is it margin?
    0:44:52 Is it top line?
    0:44:53 Is it headcount growth?
    0:44:54 Is it all the above?
    0:44:55 Like, how do you measure it?
    0:44:56 Yeah.
    0:45:02 I mean, for us, so internally first, and then maybe we’ll spitball some macro solutions to this.
    0:45:08 Internally, we’ve actually explicitly taken the stance that we want to use AI to increase the capacity and capability of the company.
    0:45:10 So just do more.
    0:45:12 For anything that you track, just make sure it happens fast.
    0:45:13 Just do more.
    0:45:14 Just, like, do more or do faster.
    0:45:16 In a given time period, yeah.
    0:45:21 And so that somewhat relieves the pressure from people that, like, this is about cost cutting.
    0:45:21 Yeah, yeah, yeah.
    0:45:23 It’s just like, no, no, just, like, do more right now.
    0:45:24 Let’s figure out what works.
    0:45:25 Some things won’t work.
    0:45:26 We want experimentation.
    0:45:28 So just use AI to do more.
    0:45:29 Okay, so that’s us.
    0:45:35 The way you should measure that then in a couple years from now is either the growth rate of the company should be faster.
    0:45:36 Yeah.
    0:45:45 Or the amount of things that we’re collectively doing should be more, and the only reason that wouldn’t show up in growth rate is that every other company also does more, and so that gets competed away.
    0:45:45 Yeah.
    0:45:50 Which is, like, also a very viable outcome, is this is just the new standard of running a business.
    0:45:51 But there’s no shift in the equilibrium.
    0:45:52 Right.
    0:45:52 There’s no shift in the equilibrium.
    0:45:54 You just have to do it.
    0:46:00 And then the ultimate product of all of that is some other kind of metric of satisfaction of, like, our products get better.
    0:46:02 It could be, like, consumer price index or something.
    0:46:04 Yeah, but, like, did the iPhone show up in GDP?
    0:46:07 I don’t know, but my life is better with the iPhone than without the iPhone.
    0:46:08 I’m pretty sure it did.
    0:46:09 Okay, yeah, fine.
    0:46:15 So, but, like, it would ultimately then show up in, like, new cures to diseases, better health care.
    0:46:20 I don’t know that the dollars would move around all that differently as much as just, like, life expectancy should go up.
    0:46:22 Like, cost of housing should go down.
    0:46:30 Like, weird metrics that productivity gains will then drive that the economists wouldn’t naturally associate to, like, enterprise software and AI.
    0:46:36 By the way, this is where I am, which is, like, clearly there’s a disruption because marginal costs are going down on a bunch of stuff.
    0:46:36 Yeah.
    0:46:38 Like, writing code and language reasoning and whatever.
    0:46:43 And, like, some companies will take advantage of that, but I don’t think, like, the fundamental equilibrium changes.
    0:46:46 I think, to your point, I think we just do more tech, products get better faster.
    0:46:47 Yeah.
    0:46:51 We solve problems that we haven’t before, but, like, it’s not asymmetric to what we’ve seen in other companies.
    0:46:59 The way I kind of think about it is, you know, if we go back to, like, 1985 and we just looked at how everybody works, I think we would just be totally stunned by how slow everything is.
    0:47:05 And just, like, how long did it take you to research a thing or analyze a market or create a campaign or whatever?
    0:47:05 Yeah.
    0:47:10 But, like, it just has now been baked into our human productivity that we just do all those things really fast now.
    0:47:10 Yeah.
    0:47:16 And so, in 10 years from now, when we all have AI agents running around, we will just look back to today and be, like, how did we function?
    0:47:22 Like, you spent two weeks to decide, like, the message for the marketing campaign?
    0:47:24 Like, how is that possible?
    0:47:27 Like, what we do now is we run 50 experiments with AI agents.
    0:47:28 They all come back with versions.
    0:47:32 We look at them all together, and then we make a decision in an hour, and we move on.
    0:47:34 That’s obviously, like, how work works.
    0:47:36 And, like, that’s what we will be saying 10 years from now.
    0:47:38 Do you think we’ll ever saturate the consumer?
    0:47:44 I caveat this by saying this comes up every one of these inflection points, and so I just wanted to ask it again for the umpteenth time.
    0:47:46 I think I’ll say yes, just because at some point, maybe.
    0:47:50 But, like, my list of purely consumer demands has not gone down.
    0:47:55 Like, healthcare is, like, a totally unmet need that I have.
    0:48:01 I do not like to go to doctors or dentists or anybody because of just how hard it is to get scheduled.
    0:48:03 I mean, buying a fucking car, man.
    0:48:05 Like, there are so many things that, like, just need to be sold.
    0:48:06 The cost of housing.
    0:48:07 We clearly don’t have enough houses.
    0:48:10 Like, where will AI, you know, drive that?
    0:48:12 Okay, so, you know, maybe robotics would be then the play there.
    0:48:17 But, like, I don’t think we’re anywhere close to consumer satisfaction or satisfying all needs of consumers.
    0:48:25 Well, actually, I meant more, will things change so fast that, like, it saturates the ability to adopt new things?
    0:48:27 I do think that that is certainly possible.
    0:48:34 I think I track sort of, let’s say, my parents as a decent kind of proxy or even just, like, college friends that aren’t particularly in tech.
    0:48:34 Yeah.
    0:48:38 And they’re still, like, in their ChatGPT phase of adoption.
    0:48:39 And they haven’t moved on from that.
    0:48:40 They haven’t made a Veo video yet.
    0:48:45 They’re just, like, using ChatGPT to ask questions about life experiences they have.
    0:48:50 And so, maybe, ironically, like, one of the problems was ChatGPT was so good.
    0:48:58 If you, like, you know, imagine what people thought AI should be able to do for them, it already met, like, 80% of, like, where they would have sort of projected it.
    0:48:58 Yeah, yeah.
    0:49:01 But when we know, actually, no, it can probably still do 10 to 20x more.
    0:49:02 Yeah.
    0:49:05 But their needs are going to be satisfied for some time on those core use cases.
    0:49:06 Yeah.
    0:49:10 So, I think for, like, the most basic consumer query type things.
    0:49:19 But then this is the opportunity for startups, which is, like, now AI will show up in sort of ways that maybe the person isn’t even, like, in the market for an AI thing.
    0:49:21 They just want a better version of that category.
    0:49:24 I was going to say, this could just, like, simply be another market constraint.
    0:49:27 As soon as it saturates, you just make the product better given, like, the existing.
    0:49:35 Then, if I could just get better healthcare, but I don’t need to think about that as an AI problem or not an AI problem, but AI will be behind the scenes delivering that.
    0:49:35 Yeah.
    0:49:37 Then I don’t think you’re saturated anytime soon.
    0:49:39 Yeah, it’s just the consumption capacity just becomes another market constraint.
    0:49:41 But there’s a ton of other ways that you can improve things.
    0:49:42 100%.
    0:49:42 That’s great.
    0:49:42 Good.
    0:49:44 I love that you’re so optimistic about it.
    0:49:47 I am, I think, 98th percentile optimistic.
    0:49:47 Same.
    0:49:48 Good.
    0:49:48 All right.
    0:49:54 So, I think we’ve had a fairly pragmatic conversation about the current impacts and the near-term impacts.
    0:50:00 If you do a longer view, can you dare to guess what things look like in five to ten years?
    0:50:04 So, Sam Altman and Jack Altman had a podcast recently.
    0:50:05 Yeah, yeah, yeah.
    0:50:12 And I’m going to paraphrase probably in some wrong way, but they were going back and forth about how, like, we just got what we would have predicted as AGI five years ago.
    0:50:14 And it’s just like, we use it.
    0:50:16 And it’s, like, it’s now just built in.
    0:50:17 The most anticlimactic.
    0:50:18 Yeah, it’s anticlimactic.
    0:50:19 Anti-anticlimactic.
    0:50:24 And I think that’s my instinct for a lot of this is five years, ten years, whatever your number is.
    0:50:34 And this is why I’m so optimistic on just society and jobs and all this stuff is I don’t think it’s the Terminator kind of crazy outcome scenario of we automate away everything.
    0:50:48 I think the human capacity for wanting to solve new problems, for creating new products, for serving customers in new ways, for delivering better healthcare, to try and do scientific discovery, like, all of this stuff is just built in us.
    0:50:48 Yeah.
    0:50:49 And it will continue.
    0:50:55 And AI is this kind of up-leveling of the tools that we use to do all those things.
    0:50:59 And so I think the way we work will be totally different in five years or ten years.
    0:50:59 Totally.
    0:51:05 But you’re already seeing enough of probably what it will look like that I think it’s an extrapolation of that.
    0:51:15 It’s when you want the marketing campaign done, you have a set of agents that go and create the assets and choose the markets and figure out the ad plan.
    0:51:19 And then you have a few people review it and debate it and say, okay, let’s go in this direction instead.
    0:51:21 And then you deploy it and you’re on to the next thing.
    0:51:25 And so each company, their units of output grow.
    0:51:29 As a result of that growth, we’re all still in competitive spaces.
    0:51:33 So some of it gets competed out and others will keep growing faster than they would have before.
    0:51:35 So they’ll hire more people and you’ll have new types of jobs.
    0:51:38 Like we’ll have jobs for people just to manage agents.
    0:51:40 And like you’ll have operations teams.
    0:51:43 You know, Adam D’Angelo had this cool role that just kind of got announced.
    0:51:44 Yeah, that was really cool.
    0:51:49 Yeah, the role is to work with Adam at Quora and figure out which workflows can be automated with AI.
    0:51:51 I think you’ll have a lot of those kind of functions.
    0:51:56 But I think one of the exciting things about at least being in Silicon Valley or anybody kind of tuning in, being in this ecosystem,
    0:52:00 is like we’re seeing the change happen faster here.
    0:52:05 And it’s going to be five or ten years of this rolling out to the rest of the economy.
    0:52:11 And so I think we will spend the next five years making the technology actually deliver on the things that we’re all collectively talking about
    0:52:18 to make it more and more robust and the accuracy goes up and the costs go down and the workflows it can tie into are better.
    0:52:20 And we will be working on that for quite some time.
    0:52:29 And you think ultimately this leads to the biggest peace dividend of better products for users, better user experience?
    0:52:31 Yeah, I think the software gets better.
    0:52:33 Our healthcare gets better.
    0:52:34 The life sciences discoveries increase.
    0:52:36 I think it’s all a society net positive.
    0:52:37 I love it.
    0:52:42 Thanks for listening to the A16Z podcast.
    0:52:48 If you enjoyed the episode, let us know by leaving a review at ratethispodcast.com slash A16Z.
    0:52:50 We’ve got more great conversations coming your way.
    0:52:51 See you next time.

    In this episode, a16z General Partner Martin Casado sits down with Box cofounder and CEO Aaron Levie to talk about how AI is changing not just software, but the structure and speed of work itself.

    They unpack how enterprise adoption of AI is different from the consumer wave, why incumbents may be better positioned than people think, and how the role of the individual contributor is already shifting from executor to orchestrator. From vibe coding and agent UX to why startups should still go vertical, this is a candid, strategic conversation about what it actually looks like to build and operate in an AI-native enterprise.

    Aaron also shares how Box is using AI internally today, and what might happen when agents outnumber employees.

     

    Resources: 

    Find Aaron on X: https://x.com/levie

    Find Martin on X: https://x.com/martin_casado

     

    Stay Updated: 

    Let us know what you think: https://ratethispodcast.com/a16z

    Find a16z on Twitter: https://twitter.com/a16z

    Find a16z on LinkedIn: https://www.linkedin.com/company/a16z

    Subscribe on your favorite podcast app: https://a16z.simplecast.com/

    Follow our host: https://x.com/eriktorenberg

    Please note that the content here is for informational purposes only; should NOT be taken as legal, business, tax, or investment advice or be used to evaluate any investment or security; and is not directed at any investors or potential investors in any a16z fund. a16z and its affiliates may maintain investments in the companies discussed. For more details please see a16z.com/disclosures.

  • Ben Horowitz: What Founders Must Know About AI and Crypto

    AI transcript
    0:00:03 Today, we’re doing something a little different.
    0:00:05 We’re dropping an episode from Impact Theory,
    0:00:07 a show hosted by Tom Bilyeu,
    0:00:09 featuring a conversation with Ben Horowitz,
    0:00:11 co-founder of Andreessen Horowitz.
    0:00:13 Ben rarely does interviews like this,
    0:00:16 and in this one, he goes deep on AI, on power,
    0:00:18 on the future of work, and what it really means
    0:00:20 to be a human in a world of intelligent machines.
    0:00:24 He breaks down why AI is not life, not consciousness,
    0:00:25 but something else entirely.
    0:00:28 Why blockchain is critical to preserving trust and truth
    0:00:29 in the age of deepfakes.
    0:00:32 Why distribution may matter more than code,
    0:00:34 and why history tells us this isn’t the end of jobs,
    0:00:36 but the beginning of something new.
    0:00:37 Let’s get into it.
    0:00:42 This information is for educational purposes only
    0:00:45 and is not a recommendation to buy, hold, or sell
    0:00:47 any investment or financial product.
    0:00:49 This podcast has been produced by a third party
    0:00:51 and may include paid promotional advertisements,
    0:00:53 other company references, and individuals
    0:00:54 unaffiliated with A16Z.
    0:00:57 Such advertisements, companies, and individuals
    0:01:00 are not endorsed by AH Capital Management LLC,
    0:01:02 A16Z, or any of its affiliates.
    0:01:04 Information is from sources deemed reliable
    0:01:05 on the date of publication,
    0:01:08 but A16Z does not guarantee its accuracy.
    0:01:14 Revolutions don’t always come with banners and protests.
    0:01:18 Sometimes, the only shots fired are snippets of code.
    0:01:20 This is one of those moments.
    0:01:24 AI is the most disruptive force in history,
    0:01:26 and it’s no longer a distant possibility.
    0:01:28 It is here right now,
    0:01:31 and it’s already changing the foundations of power
    0:01:32 and the economy.
    0:01:34 Few people have been as influential
    0:01:36 in shaping the direction of AI
    0:01:38 than mega-investor Ben Horowitz.
    0:01:40 A pioneer in Silicon Valley,
    0:01:42 he spent decades at the center
    0:01:44 of every major technological disruption,
    0:01:47 including standing up to the Biden administration’s
    0:01:49 attempts to limit and control AI.
    0:01:50 In today’s episode,
    0:01:53 he lays out where AI is really taking us,
    0:01:55 the forces that will define the next decade,
    0:01:57 and how to position yourself
    0:01:59 before it’s too late.
    0:02:04 You are in an area making investments,
    0:02:05 thinking about some of the things
    0:02:07 that I think are the most consequential
    0:02:08 in the world today
    0:02:09 as it relates to innovation.
    0:02:10 But along those lines,
    0:02:13 you and Marc Andreessen are all in on AI,
    0:02:15 but how do we make sure
    0:02:16 that it benefits everyone
    0:02:18 instead of making humans obsolete?
    0:02:19 To begin with,
    0:02:21 we have to just realize,
    0:02:22 you know, what AI is
    0:02:23 because I think that
    0:02:24 because we called it
    0:02:25 artificial intelligence,
    0:02:26 you know,
    0:02:29 our whole industry of technology
    0:02:30 has a naming problem
    0:02:31 in that we started,
    0:02:32 you know,
    0:02:34 by calling computer science,
    0:02:35 computer science,
    0:02:37 which everybody thought,
    0:02:39 oh, that’s just like computers.
    0:02:42 It’s like the science of a machine
    0:02:44 as opposed to information theory
    0:02:45 and what it really was.
    0:02:47 And then in Web3 world,
    0:02:48 which you’re familiar with,
    0:02:49 we call it cryptocurrency,
    0:02:51 which to normal people
    0:02:52 means secret money.
    0:02:54 But that’s not what it does.
    0:02:55 That’s a good point.
    0:02:56 And then I think
    0:02:57 with artificial intelligence,
    0:03:01 I think that’s also like a bad name
    0:03:02 in a lot of ways
    0:03:03 and that, you know,
    0:03:04 look, the people
    0:03:05 who work on in the field
    0:03:08 call what they’re building models.
    0:03:09 And I think that’s
    0:03:11 a really accurate description
    0:03:12 in the sense that
    0:03:13 what we’re doing
    0:03:15 is we’re kind of modeling
    0:03:17 something we’ve always done,
    0:03:18 which is we’re trying
    0:03:19 to model the world
    0:03:22 in a way that enables us
    0:03:23 to predict it.
    0:03:24 And then, you know,
    0:03:25 we’ve built much more
    0:03:26 sophisticated models
    0:03:27 with this technology
    0:03:29 than we could.
    0:03:30 In the old days,
    0:03:31 we had E equals MC squared,
    0:03:33 which was like amazing,
    0:03:35 but a relatively simple model.
    0:03:38 Now we have models
    0:03:39 with what,
    0:03:41 like 600 billion variables
    0:03:42 and this kind of thing.
    0:03:43 And we can model like,
    0:03:45 what’s the next word
    0:03:46 that I should say
    0:03:48 and that kind of thing.
    0:03:51 So that’s amazing and powerful.
    0:03:52 But I would say like,
    0:03:53 we need to distinguish
    0:03:54 the fact that
    0:03:56 it’s a model
    0:03:59 that is directed by us
    0:04:00 to tell us things
    0:04:01 about the world
    0:04:02 and do things
    0:04:03 on our behalf.
    0:04:05 But it’s not a,
    0:04:06 it’s not life.
    0:04:08 It doesn’t have a free will
    0:04:09 and these kinds of things.
    0:04:11 So I think default,
    0:04:13 you know,
    0:04:13 we are the master
    0:04:16 and it is the servant
    0:04:18 as opposed to vice versa.
    0:04:20 The question though
    0:04:21 that you’re getting at,
    0:04:21 which is, okay,
    0:04:24 how do we not get obsoleted?
    0:04:26 Like, why do we need us
    0:04:27 if we’ve got these things
    0:04:28 that can do all the jobs
    0:04:29 that we currently do?
    0:04:31 And I think, you know,
    0:04:33 we’ve gone through that
    0:04:34 in the past
    0:04:36 and it’s been interesting, right?
    0:04:39 So in I think 1750,
    0:04:41 over 90% of the jobs
    0:04:42 in the country
    0:04:43 were agricultural.
    0:04:45 And, you know,
    0:04:46 there was a huge fight
    0:04:47 and a group called
    0:04:47 the Luddites
    0:04:48 that fought the plow
    0:04:49 and, you know,
    0:04:50 some of these
    0:04:53 newfangled inventions
    0:04:54 that eventually,
    0:04:55 by the way,
    0:04:57 eliminated 97%
    0:04:58 of the jobs
    0:04:59 that were there.
    0:05:01 But I think that
    0:05:03 most people would say,
    0:05:03 gee,
    0:05:05 the life I have now
    0:05:06 is better than
    0:05:07 the life that I would have
    0:05:08 had on the farm
    0:05:10 where all I did
    0:05:11 was farm
    0:05:12 and nothing else
    0:05:13 in life.
    0:05:15 But, you know,
    0:05:15 like,
    0:05:16 and if you want to farm
    0:05:16 still,
    0:05:17 you can.
    0:05:18 That is an option.
    0:05:20 But most people
    0:05:21 don’t take that option.
    0:05:23 So the jobs
    0:05:24 that we have now
    0:05:25 will,
    0:05:25 you know,
    0:05:26 a lot of them
    0:05:27 will go away
    0:05:28 and will have,
    0:05:29 but will likely
    0:05:30 have new jobs.
    0:05:30 I mean,
    0:05:31 humans are pretty good
    0:05:33 at figuring out
    0:05:34 new things to do
    0:05:36 and new things
    0:05:36 to pursue
    0:05:37 and so forth.
    0:05:40 And, you know,
    0:05:40 like including,
    0:05:41 like,
    0:05:42 going to Mars
    0:05:42 and that kind of thing,
    0:05:44 which obviously
    0:05:45 isn’t a thing today
    0:05:46 but could very well
    0:05:47 be a thing tomorrow.
    0:05:50 So I think that,
    0:05:50 you know,
    0:05:51 we have to stay creative
    0:05:53 and keep dreaming
    0:05:54 about, like,
    0:05:55 a better future
    0:05:56 and how to kind of
    0:05:57 improve things for people.
    0:05:57 But I think that,
    0:05:58 you know,
    0:05:58 particularly
    0:06:01 for kind of the people
    0:06:01 in the world
    0:06:03 that are on the struggle bus
    0:06:04 who are living
    0:06:05 on a dollar a day
    0:06:07 or, you know,
    0:06:07 kind of subject
    0:06:08 to all kinds of diseases
    0:06:09 and so forth,
    0:06:10 life is going to get,
    0:06:10 like,
    0:06:11 radically better for them.
    0:06:13 When I look at the plow example
    0:06:13 and the Luddites
    0:06:14 fighting against it,
    0:06:15 I think you’ll see
    0:06:16 the same thing with AI.
    0:06:17 You’re going to get people
    0:06:19 that just completely reject it,
    0:06:19 refuse to engage
    0:06:21 in anything for sure.
    0:06:24 But when I look at AI,
    0:06:26 what I worry about
    0:06:26 is that there will be
    0:06:28 no refuge to go to,
    0:06:30 meaning if you realize,
    0:06:30 oh,
    0:06:31 I can’t plow as well
    0:06:33 as a plow
    0:06:34 or a tractor
    0:06:35 or now a combine,
    0:06:36 there are still
    0:06:37 a lot of other things
    0:06:38 that technology
    0:06:39 can’t do better than me.
    0:06:40 Do you think
    0:06:41 there’s an upper bound
    0:06:42 to artificial intelligence,
    0:06:43 bad name or not,
    0:06:45 or do you think
    0:06:46 that it keeps going
    0:06:47 and it literally
    0:06:49 becomes better than us
    0:06:50 once it’s embodied
    0:06:50 in robotics
    0:06:52 at everything?
    0:06:54 We are kind of limited
    0:06:55 by the new ideas
    0:06:56 that we have.
    0:06:58 And artificial intelligence
    0:06:58 is really,
    0:06:59 by the way,
    0:07:00 artificial human intelligence,
    0:07:01 meaning,
    0:07:03 right,
    0:07:05 humans looked at the world,
    0:07:06 humans figured out
    0:07:07 what it was,
    0:07:08 you know,
    0:07:09 described it,
    0:07:09 came up with these
    0:07:11 concepts like trees
    0:07:11 and,
    0:07:12 you know,
    0:07:14 air and all this stuff
    0:07:16 that it’s not necessarily
    0:07:16 real,
    0:07:19 just how we decided
    0:07:20 to structure the world.
    0:07:21 And AI has learned
    0:07:22 our structure.
    0:07:23 Like,
    0:07:24 they’ve learned language,
    0:07:25 human language,
    0:07:26 which is a version
    0:07:27 of the universe
    0:07:27 that is not
    0:07:29 an accurate version
    0:07:29 of the universe.
    0:07:30 It’s just our version
    0:07:31 of the universe.
    0:07:32 But you’re going to
    0:07:33 have to go deep on that.
    0:07:34 I know my audience
    0:07:35 is going to be like,
    0:07:36 air seems pretty real
    0:07:37 when you’re underwater.
    0:07:38 What do you mean
    0:07:39 that trees and air
    0:07:40 are not necessarily real?
    0:07:41 It’s,
    0:07:41 look,
    0:07:42 it’s a construction
    0:07:43 that we made.
    0:07:43 You know,
    0:07:44 we decided
    0:07:46 it is literally
    0:07:48 the way humans
    0:07:48 have interpreted
    0:07:49 the world
    0:07:50 in order for humans
    0:07:51 to navigate it.
    0:07:52 And,
    0:07:53 you know,
    0:07:55 as is language,
    0:07:55 right?
    0:07:56 Language isn’t
    0:07:57 the universe
    0:07:58 as it is.
    0:07:59 Like,
    0:08:00 if completely objective,
    0:08:02 if you had an objective
    0:08:02 look at,
    0:08:03 you know,
    0:08:05 the atoms and so forth
    0:08:06 and how they were arranged
    0:08:07 and whatnot,
    0:08:09 you probably,
    0:08:10 you know,
    0:08:11 those descriptions
    0:08:12 are lacking
    0:08:13 in a lot of ways.
    0:08:13 They’re not
    0:08:14 completely accurate.
    0:08:16 They don’t,
    0:08:17 you know,
    0:08:17 they certainly don’t
    0:08:18 kind of predict
    0:08:19 everything about
    0:08:19 how the world works.
    0:08:21 And so,
    0:08:23 what machines have learned
    0:08:23 or like what
    0:08:25 artificial intelligence is
    0:08:26 is an understanding
    0:08:27 of our knowledge,
    0:08:29 of the human knowledge.
    0:08:31 So it’s taken
    0:08:31 in our knowledge
    0:08:32 of the universe
    0:08:34 and then it is
    0:08:36 kind of,
    0:08:36 can refine that
    0:08:37 and it can work on that
    0:08:39 and it can derive things
    0:08:40 from our knowledge,
    0:08:42 our axiom set.
    0:08:44 But it isn’t actually
    0:08:45 observing the world
    0:08:46 at this point
    0:08:47 and figuring out
    0:08:47 new stuff.
    0:08:48 So,
    0:08:48 you know,
    0:08:49 at the very least,
    0:08:50 you know,
    0:08:52 humans still have to
    0:08:55 discover new principles
    0:08:56 of the universe
    0:08:57 or kind of
    0:08:57 interpret it
    0:08:58 in a different way
    0:08:59 or the machines
    0:09:00 have to somehow
    0:09:01 observe directly
    0:09:02 the world
    0:09:02 which they’re not
    0:09:03 yet doing.
    0:09:06 And so,
    0:09:06 you know,
    0:09:07 that’s a pretty big role
    0:09:08 I would say
    0:09:09 but then,
    0:09:09 you know,
    0:09:10 in addition,
    0:09:11 you know,
    0:09:11 we direct the world.
    0:09:12 I think like Star Trek
    0:09:13 is actually a pretty
    0:09:14 good metaphor for that.
    0:09:15 Like the Star Trek
    0:09:16 computer was pretty
    0:09:16 badass
    0:09:18 but,
    0:09:19 you know,
    0:09:20 the people in Star Trek
    0:09:21 were still like
    0:09:22 flying around the universe
    0:09:23 discovering new things
    0:09:23 about it.
    0:09:24 You know,
    0:09:24 there was still
    0:09:25 much to do
    0:09:27 and I think that
    0:09:27 it’s always
    0:09:29 a little
    0:09:31 kind of difficult
    0:09:32 to figure out
    0:09:33 what the new jobs
    0:09:34 that get created
    0:09:34 are.
    0:09:35 And we’ve had
    0:09:37 intelligence for a while,
    0:09:37 right?
    0:09:38 Like we’ve had
    0:09:39 machines that could
    0:09:40 do math way better
    0:09:40 than us
    0:09:42 and,
    0:09:43 you know,
    0:09:43 I mean,
    0:09:44 I can remember
    0:09:46 when I was in
    0:09:47 junior high school
    0:09:49 our junior high
    0:09:49 put on a play
    0:09:50 about like
    0:09:51 how bad it was
    0:09:51 that there were
    0:09:52 calculators
    0:09:52 because nobody
    0:09:53 would know
    0:09:53 how to do
    0:09:54 arithmetic
    0:09:54 and then
    0:09:56 all the calculators
    0:09:56 would break
    0:09:57 and then we’d
    0:09:57 be stuck.
    0:09:58 We’d be trying
    0:09:59 to like fly around
    0:09:59 the universe
    0:10:00 and rockets
    0:10:01 but we wouldn’t
    0:10:02 be able to do math
    0:10:03 and the calculators
    0:10:03 would be broken
    0:10:04 and we’d be screwed.
    0:10:06 So there is always
    0:10:07 that fear
    0:10:09 and,
    0:10:09 you know,
    0:10:10 we’ve had computers
    0:10:11 that can play games
    0:10:12 better than us.
    0:10:12 We currently have
    0:10:13 computers that can
    0:10:14 drive better than us
    0:10:15 and so forth.
    0:10:16 So we have a lot
    0:10:16 of intelligence
    0:10:17 out there
    0:10:19 but it hasn’t,
    0:10:21 you know,
    0:10:22 created like
    0:10:23 this super dystopia,
    0:10:24 you know,
    0:10:25 in any degree.
    0:10:25 It’s actually
    0:10:26 made things better,
    0:10:27 you know,
    0:10:28 everywhere it’s appeared.
    0:10:29 So I would expect
    0:10:30 that to continue.
    0:10:31 What do you think
    0:10:32 though is the limiting
    0:10:32 function?
    0:10:34 So when I look
    0:10:34 at AI,
    0:10:36 I always say
    0:10:37 unless we run
    0:10:38 into an upper bound
    0:10:39 where the computation
    0:10:40 just can’t allow
    0:10:41 the intelligence
    0:10:42 to keep progressing,
    0:10:45 it seems like
    0:10:45 it will become
    0:10:46 not only
    0:10:47 generalized human
    0:10:47 intelligence
    0:10:48 and thusly be able
    0:10:48 to do everything
    0:10:49 that we can do,
    0:10:50 it will become
    0:10:52 embodied as robotics
    0:10:53 and if,
    0:10:54 I ran the math
    0:10:54 on this once,
    0:10:56 Einstein is roughly
    0:10:58 2.4 times smarter
    0:10:58 than someone
    0:11:00 who is definitionally
    0:11:00 a moron
    0:11:02 and the gap
    0:11:03 just between
    0:11:04 those two
    0:11:06 is so dramatic
    0:11:07 the army won’t
    0:11:08 even draft
    0:11:09 somebody
    0:11:10 that is a moron
    0:11:11 at, you know,
    0:11:11 whatever,
    0:11:12 81 IQ
    0:11:13 or whatever it is.
    0:11:14 Because
    0:11:14 they create
    0:11:15 more problems
    0:11:16 than they solve
    0:11:17 even just by being,
    0:11:18 you know,
    0:11:19 bullet fodder.
    0:11:21 So do you think
    0:11:22 there is something
    0:11:23 that’s going to cause
    0:11:25 that upper bound
    0:11:27 or you have a belief
    0:11:29 about the nature
    0:11:30 of intelligence
    0:11:31 that will keep
    0:11:33 AI subservient
    0:11:34 to us?
    0:11:35 The smartest people
    0:11:36 don’t rule the world,
    0:11:37 you know,
    0:11:38 Einstein wasn’t in charge
    0:11:42 and, you know,
    0:11:42 many of us
    0:11:43 are like ruled
    0:11:45 by our cats
    0:11:47 and so,
    0:11:48 like,
    0:11:49 power and intelligence
    0:11:50 don’t necessarily
    0:11:51 go together,
    0:11:51 particularly when
    0:11:52 the intelligence
    0:11:53 has no free will
    0:11:54 or has no
    0:11:55 desire to do
    0:11:56 free will.
    0:11:56 It doesn’t have will.
    0:11:58 You know,
    0:11:59 it is kind of a model
    0:12:00 that’s computing things.
    0:12:02 I think also,
    0:12:02 you know,
    0:12:03 the whole general
    0:12:04 intelligence thing
    0:12:06 is interesting
    0:12:07 in that,
    0:12:10 Waymo’s got a super
    0:12:10 smart AI
    0:12:11 that can drive a car
    0:12:14 but that AI
    0:12:14 that drives a car
    0:12:16 doesn’t know English
    0:12:18 and, you know,
    0:12:19 isn’t, you know,
    0:12:20 particularly good
    0:12:21 at other tasks,
    0:12:22 you know,
    0:12:23 currently
    0:13:24 and then
    0:13:26 ChatGPT
    0:13:27 can’t drive a car.
    0:12:30 And so that’s,
    0:12:31 you know,
    0:12:33 a question of how much
    0:12:35 things generalize, particularly,
    0:12:36 and if you look
    0:12:36 at, well,
    0:12:37 why is that?
    0:12:38 A lot of it
    0:12:39 actually has to do
    0:12:40 with the long tail
    0:12:41 of human behavior
    0:12:43 where humans,
    0:12:45 you know,
    0:12:46 the distribution
    0:12:47 of human behavior
    0:12:47 is,
    0:12:49 it’s fractal,
    0:12:50 it’s Mandelbrotian
    0:12:51 or whatever,
    0:12:53 it’s not evenly
    0:12:54 distributed at all
    0:12:56 and so,
    0:12:58 you know,
    0:12:58 an AI
    0:13:00 that kind of
    0:13:01 captures all that
    0:13:02 turns out to be hard.
    0:13:05 We’re not so much
    0:13:06 on that track,
    0:13:07 we’re more on the track
    0:13:08 for kind of
    0:13:10 really great reasoning
    0:13:12 over kind of
    0:13:12 a set of axioms
    0:13:14 that we came up with,
    0:13:15 you know,
    0:13:15 in say math
    0:13:16 or physics
    0:13:18 but not so much
    0:13:21 you know,
    0:13:21 kind of general
    0:13:23 human intelligence
    0:13:24 which is,
    0:13:25 you know,
    0:13:26 being able to
    0:13:28 navigate other humans
    0:13:30 and the world
    0:13:30 in a way
    0:13:32 that is productive
    0:13:33 for us
    0:13:33 is kind of a,
    0:13:34 it’s a little bit
    0:13:35 of a different
    0:13:36 dimension of things.
    0:13:37 You know,
    0:13:37 yeah,
    0:13:38 you can compare
    0:13:40 the math capabilities
    0:13:41 or the go
    0:13:43 playing capabilities
    0:13:43 or the driving
    0:13:44 capabilities
    0:13:46 or the IQ test
    0:13:47 capabilities
    0:13:48 of a computer
    0:13:49 but that,
    0:13:51 that’s not really
    0:13:52 a human.
    0:13:53 I think a human
    0:13:54 is kind of different
    0:13:55 in a fairly
    0:13:56 fundamental way.
    0:13:58 So what we end up
    0:13:58 doing,
    0:13:59 I think is going
    0:14:00 to be different
    0:14:00 than what we’re
    0:14:01 doing today
    0:14:02 just like what
    0:14:02 we’re doing today
    0:14:03 is very different
    0:14:04 than what we did
    0:14:04 a hundred years ago.
    0:14:06 But,
    0:14:07 you know,
    0:14:08 the not having
    0:14:09 a need for us,
    0:14:10 I think that,
    0:14:10 you know,
    0:14:12 these AIs
    0:14:13 are tools for us
    0:14:15 to basically
    0:14:16 navigate the world
    0:14:17 and help us
    0:14:18 solve problems
    0:14:19 and do things
    0:14:20 like,
    0:14:21 you know,
    0:14:22 everything from
    0:14:24 prevent pandemics
    0:14:25 to deal with
    0:14:27 climate change
    0:14:28 to that sort of thing,
    0:14:29 to not kill each other
    0:14:30 driving cars,
    0:14:32 which we do a lot of.
    0:14:34 You know,
    0:14:35 hopefully it doesn’t,
    0:14:36 you know,
    0:14:38 create more wars,
    0:14:38 hopefully it creates
    0:14:39 fewer wars,
    0:14:40 but we’ll see.
    0:14:41 What I know
    0:14:43 about the human brain
    0:14:44 may be tricking me
    0:14:46 into painting a vision
    0:14:46 of the future
    0:14:47 that isn’t going
    0:14:47 to come true,
    0:14:49 let me put words
    0:14:49 in your mouth
    0:14:50 and you tell me
    0:14:52 if they fit appropriately.
    0:14:53 What I hear you saying
    0:14:54 is something akin
    0:14:55 to the way
    0:14:56 that we’re approaching
    0:14:57 artificial intelligence
    0:14:57 right now,
    0:14:58 let’s round it
    0:15:00 to large language models,
    0:15:02 that is going
    0:15:04 to hit an upper bound
    0:15:05 where it’s not able
    0:15:07 to have insights
    0:15:07 that a human
    0:15:08 won’t already have,
    0:15:10 that they are trapped
    0:15:11 inside of the box
    0:15:11 that we have created,
    0:15:12 what you’re calling
    0:15:13 the axioms
    0:15:14 by which we navigate
    0:15:14 the world.
    0:15:15 They get trapped
    0:15:16 inside that box
    0:15:18 and thusly will never
    0:15:19 be able to look
    0:15:19 at the world
    0:15:20 and go,
    0:15:21 I’m not going
    0:15:21 to predict
    0:15:22 the next frame,
    0:15:22 I’m going to render
    0:15:23 the next frame
    0:15:24 based on what I know
    0:15:25 about physics.
    0:15:27 And so water reacts
    0:15:28 this way
    0:15:29 in an earthbound
    0:15:29 gravity system
    0:15:30 and so it’s going
    0:15:31 to splash like this
    0:15:32 and it understands
    0:15:32 liquid dynamics,
    0:15:33 et cetera,
    0:15:33 et cetera.
    0:15:36 So is that accurate?
    0:15:37 Are you saying
    0:15:37 that it is trapped
    0:15:38 inside of our box
    0:15:39 and will never have that capability?
    0:15:40 It hasn’t demonstrated
    0:15:41 that capability yet
    0:15:43 so it hasn’t
    0:15:44 walked up to a rock
    0:15:45 and said this is a rock,
    0:15:46 right?
    0:15:47 We labeled it a rock
    0:15:49 because that’s our structure.
    0:15:50 But “rock” isn’t
    0:15:51 necessarily right; probably
    0:15:54 a more intelligent being
    0:15:55 would have called it
    0:15:55 something else,
    0:15:57 or maybe the rock
    0:15:58 is irrelevant
    0:16:00 to how you actually
    0:16:01 can navigate
    0:16:02 the world safely.
    0:16:05 And kind of figuring
    0:16:06 those things out
    0:16:07 or kind of adapting
    0:16:08 to them
    0:16:08 is just not something
    0:16:11 that, you know,
    0:16:12 it’s trained on our
    0:16:14 rendition of the universe
    0:16:16 in our kind of
    0:16:19 literally like the way
    0:16:20 we have described it
    0:16:22 using language
    0:16:22 that we invented.
    0:16:26 And so it is
    0:16:29 constrained a bit
    0:16:30 to that in nature
    0:16:31 currently.
    0:16:33 You know,
    0:16:34 that doesn’t mean
    0:16:34 it’s not like
    0:16:36 a massively useful tool
    0:16:37 and can do things
    0:16:38 and by the way
    0:16:39 can derive new rules
    0:16:40 from the old rules
    0:16:41 that we’ve given it
    0:16:42 for sure.
    0:16:46 But,
    0:16:46 you know,
    0:16:49 I think
    0:16:49 it’s a bit of a jump
    0:16:50 to go,
    0:16:51 you know,
    0:16:52 it’s going to replace
    0:16:53 us entirely
    0:16:56 when the whole
    0:16:57 discovery process
    0:16:58 is something that we do
    0:16:59 that it doesn’t do yet.
    0:17:01 Okay.
    0:17:02 The way that the human
    0:17:03 mind is architected
    0:17:04 is you have
    0:17:05 competing regions
    0:17:06 of the brain
    0:17:06 like if you cut
    0:17:07 the corpus callosum
    0:17:08 the part that connects
    0:17:09 the left and the right
    0:17:09 hemisphere
    0:17:10 you can get
    0:17:12 two distinct personalities
    0:17:13 one that is atheist
    0:17:14 for instance
    0:17:15 and one that
    0:17:17 believes deeply in God
    0:17:17 and they’ll argue
    0:17:18 back and forth.
    0:17:18 I mean,
    0:17:19 this is in the same
    0:17:20 human brain.
    0:17:22 So that tells me
    0:17:23 that what you have
    0:17:24 is basically
    0:17:25 regions of the brain
    0:17:26 that get good
    0:17:26 at a thing
    0:17:28 and then they end up
    0:17:28 coming together
    0:17:29 to collaborate
    0:17:30 and that is
    0:17:31 sort of human intelligence
    0:17:31 and I’ve heard you
    0:17:32 talk about
    0:17:32 there’s something
    0:17:34 like 200 computers
    0:17:36 inside of a single car
    0:17:38 so if we already know
    0:17:38 that you can
    0:17:39 daisy chain
    0:17:40 all of these
    0:17:40 like it’s
    0:17:41 it’s a very deep
    0:17:42 knowledge about one thing
    0:17:44 but as you daisy chain
    0:17:45 then the intelligence
    0:17:47 gets what I’ll call
    0:17:48 more generalized
    0:17:49 you don’t see that
    0:17:50 as a flywheel
    0:17:51 that is going to
    0:17:52 keep going.
    0:17:53 You know,
    0:17:54 what we can compute
    0:17:55 will get
    0:17:57 better and better
    0:17:57 and better
    0:18:00 but having said that
    0:18:00 you know,
    0:18:01 that doesn’t
    0:18:02 say that
    0:18:05 like humans
    0:18:06 one,
    0:18:07 you know,
    0:18:08 humans
    0:18:09 built the machines
    0:18:10 plug them in
    0:18:11 give them the batteries
    0:18:12 all these kinds
    0:18:13 of things
    0:18:15 and
    0:18:17 you know,
    0:18:18 and they’ve been
    0:18:19 created to
    0:18:20 fulfill our purposes
    0:18:21 so
    0:18:22 you know,
    0:18:23 what it means
    0:18:23 to be a human
    0:18:24 will probably
    0:18:25 change
    0:18:26 like it has been
    0:18:26 changing
    0:18:28 and kind of
    0:18:29 how humans
    0:18:29 live their life
    0:18:30 will change
    0:18:31 but humans
    0:18:32 still find things
    0:18:32 to do
    0:18:32 I mean,
    0:18:33 like it’s kind
    0:18:33 of like
    0:18:34 you know,
    0:18:35 like a cheetah
    0:18:35 has been able
    0:18:36 to run faster
    0:18:36 than a human
    0:18:37 forever
    0:18:38 but we
    0:18:38 never watch
    0:18:39 cheetahs race
    0:18:40 we only watch
    0:18:40 humans race
    0:18:41 each other
    0:18:43 you know,
    0:18:44 computers have
    0:18:44 played chess
    0:18:44 better than
    0:18:45 humans for a
    0:18:45 long time
    0:18:46 but nobody
    0:18:47 watches computers
    0:18:47 play chess
    0:18:48 anymore
    0:18:48 they watch
    0:18:49 humans play
    0:18:49 humans
    0:18:49 and chess
    0:18:50 is more
    0:18:50 popular than
    0:18:51 it’s ever
    0:18:51 been
    0:18:52 and so I
    0:18:53 think we
    0:18:53 have like
    0:18:53 a keen
    0:18:54 interest
    0:18:54 in each
    0:18:54 other
    0:18:55 and
    0:18:57 how that’s
    0:18:57 going to
    0:18:57 work
    0:18:57 and these
    0:18:58 will be
    0:18:58 kind of
    0:18:58 tools
    0:18:59 to enhance
    0:19:00 that whole
    0:19:00 experience
    0:19:01 for us
    0:19:01 but I
    0:19:01 think
    0:19:02 it’s
    0:19:03 you know
    0:19:06 like a
    0:19:07 world of
    0:19:07 just
    0:19:08 machines
    0:19:09 seems
    0:19:10 like
    0:19:10 that
    0:19:10 seems
    0:19:11 like
    0:19:11 really
    0:19:11 unlikely
    0:19:12 so
    0:19:13 you’ve
    0:19:13 got
    0:19:13 people
    0:19:14 like
    0:19:15 Elon Musk
    0:19:16 Sam
    0:19:16 Altman
    0:19:17 who have
    0:19:17 both
    0:19:18 expressed
    0:19:18 deep
    0:19:19 concerns
    0:19:19 about
    0:19:21 how
    0:19:22 AI
    0:19:24 may
    0:19:24 in fact
    0:19:25 make us
    0:19:25 obsolete
    0:19:25 Elon
    0:19:26 has likened
    0:19:27 he’s
    0:19:27 certainly
    0:19:27 become
    0:19:28 fatalistic
    0:19:29 but he
    0:19:30 gave a
    0:19:31 rant
    0:19:31 that I
    0:19:31 absolutely
    0:19:32 love
    0:19:32 that is
    0:19:33 AI
    0:19:33 is a
    0:19:33 demon
    0:19:34 summoning
    0:19:34 circle
    0:19:35 and
    0:19:35 you’re
    0:19:35 calling
    0:19:36 forward
    0:19:36 this demon
    0:19:36 that you
    0:19:37 were just
    0:19:37 convinced
    0:19:38 you’re going
    0:19:38 to be
    0:19:38 able to
    0:19:38 control
    0:19:39 and he
    0:19:39 certainly
    0:19:40 is not
    0:19:40 so sure
    0:19:40 and at
    0:19:41 one point
    0:19:41 and again
    0:19:42 I’m fully
    0:19:42 aware
    0:19:43 that he’s
    0:19:43 on his
    0:19:44 fatalist
    0:19:44 arc
    0:19:44 and he’s
    0:19:45 just
    0:19:45 moving
    0:19:45 forward
    0:19:45 and he’s
    0:19:46 building
    0:19:46 as fast
    0:19:46 as he
    0:19:46 can
    0:19:48 it’s
    0:19:48 interesting
    0:19:49 that both
    0:19:50 of them
    0:19:50 despite
    0:19:51 saying
    0:19:51 these
    0:19:52 things
    0:19:52 are
    0:19:52 building
    0:19:53 AI
    0:19:53 as
    0:19:54 fast
    0:19:54 as they can,
    0:19:54 like
    0:19:55 they’re
    0:19:55 literally
    0:19:56 in a race
    0:19:56 with each
    0:19:56 other
    0:19:57 to see
    0:19:57 who can
    0:19:57 build it
    0:19:58 faster
    0:20:01 the very thing they’re
    0:20:01 warning
    0:20:02 about.
    0:20:03 what do
    0:20:03 you take
    0:20:04 away
    0:20:04 from that
    0:20:04 is it
    0:20:04 just
    0:20:05 regulatory
    0:20:05 capture
    0:20:06 on both
    0:20:06 of their
    0:20:07 parts
    0:20:07 is it
    0:20:08 is Elon
    0:20:09 being
    0:20:09 sincere
    0:20:10 not that I
    0:20:10 need you
    0:20:10 to mind
    0:20:10 read him
    0:20:11 but like
    0:20:11 what do
    0:20:12 you take
    0:20:12 away
    0:20:13 from
    0:20:14 the fact
    0:20:14 that
    0:20:14 they’ve
    0:20:15 both
    0:20:15 warned
    0:20:15 against
    0:20:16 it
    0:20:16 and
    0:20:16 they’re
    0:20:16 both
    0:20:17 deploying
    0:20:18 it
    0:20:18 as
    0:20:18 fast
    0:20:18 as
    0:20:18 they
    0:20:18 can
    0:20:19 yeah
    0:20:20 it seems
    0:20:20 fairly
    0:20:21 contradictory
    0:20:23 look I
    0:20:24 think there
    0:20:24 is
    0:20:25 like I
    0:20:26 won’t
    0:20:26 question
    0:20:27 either of
    0:20:27 their
    0:20:28 sincerity
    0:20:28 to some
    0:20:28 degree
    0:20:29 but I
    0:20:29 do
    0:20:29 think
    0:20:29 there
    0:20:30 are
    0:20:32 many
    0:20:32 reasons
    0:20:33 to warn
    0:20:33 about
    0:20:33 it
    0:20:33 but like
    0:20:34 I also
    0:20:34 think
    0:20:35 that
    0:20:36 you know
    0:20:37 any
    0:20:37 kind
    0:20:38 of new
    0:20:38 super
    0:20:38 powerful
    0:20:39 technology
    0:20:40 you know
    0:20:41 in a way
    0:20:41 they’re right
    0:20:42 to kind
    0:20:42 of warn
    0:20:42 about
    0:20:42 like
    0:20:43 okay
    0:20:44 this
    0:20:45 thing
    0:20:46 if we
    0:20:48 you know
    0:20:48 if we
    0:20:48 don’t
    0:20:49 think
    0:20:49 about
    0:20:49 some
    0:20:49 of
    0:20:49 the
    0:20:50 implications
    0:20:50 of
    0:20:50 it
    0:20:51 could
    0:20:51 get
    0:20:51 dangerous
    0:20:52 and I
    0:20:52 think
    0:20:52 that’s
    0:20:52 a good
    0:20:53 thing
    0:20:53 like
    0:20:53 every
    0:20:53 technology
    0:20:54 we’ve
    0:20:54 ever
    0:20:54 had
    0:20:54 from
    0:20:55 fire
    0:20:59 to
    0:20:59 automobiles
    0:21:00 to nuclear
    0:21:01 to the internet
    0:21:02 to AI
    0:21:05 has
    0:21:05 got
    0:21:05 downsides
    0:21:06 to
    0:21:06 it
    0:21:06 they
    0:21:06 all
    0:21:06 have
    0:21:07 downsides
    0:21:08 and
    0:21:08 the
    0:21:09 more
    0:21:09 powerful
    0:21:09 the
    0:21:10 more
    0:21:10 kind
    0:21:10 of
    0:21:12 you know
    0:21:12 kind
    0:21:12 of
    0:21:12 intriguing
    0:21:13 the
    0:21:13 downside
    0:21:13 and
    0:21:13 you know
    0:21:14 maybe
    0:21:14 like
    0:21:15 you know
    0:21:15 without
    0:21:16 the
    0:21:16 internet
    0:21:16 we probably
    0:21:17 would have
    0:21:17 never gotten
    0:21:18 to AI
    0:21:20 and so
    0:21:22 maybe that was the downside of the internet
    0:21:24 that it led to AI or something like that you could argue
    0:21:26 but I think
    0:21:27 generally
    0:21:33 we would take every technology we’ve invented and keep it because you know net net they’ve
    0:21:38 been positive for humanity and for the world, and that’s generally
    0:21:40 why I think they’re building it so fast because I think they know that
    0:21:47 all right so anybody with a 17 year old right now is thinking oh my how where do I point
    0:21:53 my kid what do I tell them to go study that’s future proof what can we learn about the way you guys are investing at
    0:22:01 Andreessen Horowitz that would give somebody an inclination of what you think a 17 year old should be focused on now
    0:22:10 yeah you know it’s really interesting I think one of the things we’re seeing in the kind of smartest young people that come out is
    0:22:18 they spend a lot of time with AI learning everything they possibly can so I think you want to get very good at like high
    0:22:19 curiosity
    0:22:21 and
    0:22:23 and then learning you know you have
    0:22:26 available to you all of human knowledge
    0:22:28 in something that will talk to you
    0:22:33 and that’s you know that’s an incredible opportunity and I think that
    0:22:36 anything you want to do in the world to make the world better
    0:22:40 you now have the tools as an individual to do that in a way that
    0:22:44 you know if you look at kind of what Thomas Edison had to do
    0:22:47 in creating GE and like what that took and
    0:22:51 and so forth you know it was a way higher bar to have an impact
    0:22:53 whereas now I think
    0:22:55 you know you can
    0:22:57 very quickly
    0:23:02 you know build something or do something that you know just pick a problem
    0:23:07 it’s not like you know sometimes in this AI conversation the thing that we ignore is like
    0:23:10 well what are the problems that we have in the world
    0:23:13 well we still have
    0:23:14 cancer and diabetes
    0:23:16 and sickle cell
    0:23:17 and
    0:23:18 and every disease
    0:23:19 and we still have
    0:23:20 the threat of pandemics
    0:23:21 and we still have
    0:23:22 climate change
    0:23:23 and we still have
    0:23:25 you know lots of people who are
    0:23:26 starving to death
    0:23:27 and we still have malaria
    0:23:27 and
    0:23:30 and so like pick a problem you want to solve
    0:23:32 and now
    0:23:34 you know
    0:23:36 you have a huge helping hand in doing that
    0:23:38 that nobody in the history of
    0:23:40 the planet has ever had before
    0:23:40 so
    0:23:42 I think there’s
    0:23:44 really great opportunities along those lines
    0:23:47 so that
    0:23:47 that would be my
    0:23:48 you know
    0:23:50 my
    0:23:51 best advice
    0:23:53 I think is to
    0:23:55 to get really good with that
    0:23:55 and
    0:23:58 and look I think a lot of the things that we’ve learned
    0:24:01 or that have been valuable skills traditionally
    0:24:02 are going to change
    0:24:03 so you really
    0:24:04 you know again
    0:24:06 want to be able to learn how to do anything
    0:24:08 and
    0:24:09 and I think that’s probably
    0:24:11 going to be key
    0:24:13 when I look at
    0:24:13 the
    0:24:14 things you were just talking about
    0:24:15 that feels right
    0:24:17 for people that have
    0:24:17 the
    0:24:18 the inclination
    0:24:19 that have the
    0:24:20 cognitive horsepower
    0:24:22 to go and say
    0:24:23 okay I’m going to leverage AI
    0:24:25 to extend my capabilities
    0:24:26 to tackle the biggest problems in the world
    0:24:28 certainly right now in this moment
    0:24:28 that
    0:24:29 that is the
    0:24:30 thrilling reality
    0:24:31 that people should focus on
    0:24:32 but then
    0:24:34 I contrast that with
    0:24:36 the deaths of despair
    0:24:37 a young
    0:24:38 among
    0:24:39 largely
    0:24:40 young men
    0:24:43 we have this problem
    0:24:43 in
    0:24:44 call it
    0:24:45 middle America
    0:24:46 where
    0:24:47 manufacturing jobs
    0:24:48 have gone away
    0:24:48 so for that
    0:24:49 normal
    0:24:50 just sort of everyday person
    0:24:51 I want to have a trade
    0:24:52 I want to go out into the world
    0:24:53 and get something done
    0:24:55 is AI going to be
    0:24:56 useful to them
    0:24:58 or are they going to
    0:24:59 get replaced by
    0:24:59 robotics
    0:25:01 the truth of it is
    0:25:02 is there’s only one
    0:25:03 robot supply chain in the world
    0:25:04 and that’s in China
    0:25:05 and
    0:25:07 you know
    0:25:07 so
    0:25:07 like
    0:25:08 we all need a robot
    0:25:09 supply chain
    0:25:11 we need to
    0:25:11 manufacture that
    0:25:13 so I think there’s going to be
    0:25:13 like a real
    0:25:16 manufacturing opportunity
    0:25:18 coming up
    0:25:19 and it’ll be
    0:25:20 a different kind of
    0:25:20 manufacturing
    0:25:22 certainly more will be
    0:25:23 automated and so forth
    0:25:24 but there will be a lot of
    0:25:26 things to learn
    0:25:26 in that field
    0:25:28 that I think will be
    0:25:29 you know
    0:25:30 super interesting
    0:25:31 and
    0:25:33 and you know
    0:25:34 likely very very good job
    0:25:35 so I
    0:25:36 ironically
    0:25:37 I would say
    0:25:38 like going into
    0:25:39 manufacturing now
    0:25:40 as a young man
    0:25:41 and trying to
    0:25:42 you know
    0:25:43 kind of figure out
    0:25:44 what that is
    0:25:45 and get engaged in it
    0:25:46 will probably lead to
    0:25:47 you know
    0:25:48 quite a
    0:25:49 good career
    0:25:50 you know
    0:25:50 maybe in
    0:25:52 creating
    0:25:53 factories,
    0:25:53 which have become
    0:25:55 like insanely valuable
    0:25:56 and kind of
    0:25:57 strategic
    0:25:58 to the national
    0:25:59 interest as well
    0:26:00 that makes sense
    0:26:01 so again
    0:26:02 at the level of
    0:26:03 the guy smart enough
    0:26:04 to build the facility
    0:26:04 yes
    0:26:05 and I recently saw
    0:26:06 a video of the
    0:26:07 grocery store
    0:26:08 of the future
    0:26:08 where it is
    0:26:10 a huge grid
    0:26:11 inside of a
    0:26:12 giant facility
    0:26:13 and there’s just
    0:26:14 like these
    0:26:15 bots that
    0:26:15 look kind of like
    0:26:16 small shopping carts
    0:26:17 and they’re just
    0:26:18 grid patterning
    0:26:19 across all the items
    0:26:20 snatching up
    0:26:20 whatever you order
    0:26:22 so you order online
    0:26:22 these things
    0:26:23 grab all that stuff
    0:26:24 and then they send
    0:26:25 it off to you
    0:26:26 so for the person
    0:26:27 that’s savvy enough
    0:26:28 to build that facility
    0:26:29 yes
    0:26:30 tremendous
    0:26:31 but
    0:26:33 what I think
    0:26:33 I hear you saying
    0:26:34 and correct me
    0:26:34 if I’m wrong
    0:26:35 is that
    0:26:36 okay there are
    0:26:37 two opportunities here
    0:26:37 the opportunity
    0:26:38 one is
    0:26:39 if you’re the kind
    0:26:39 of person
    0:26:40 that can leverage
    0:26:40 AI to build
    0:26:41 that facility
    0:26:42 massive opportunity
    0:26:44 if you’re the kind
    0:26:44 of person
    0:26:44 that would traditionally
    0:26:45 work at that
    0:26:45 factory
    0:26:46 something new
    0:26:47 is coming
    0:26:47 we know that
    0:26:48 because looking
    0:26:48 back at history
    0:26:49 all these technologies
    0:26:50 unleash things
    0:26:51 we can’t yet see
    0:26:52 and so I have faith
    0:26:53 in the
    0:26:54 we can’t yet see it
    0:26:54 but it is coming
    0:26:55 yeah
    0:26:56 no for sure
    0:26:56 look I mean
    0:26:58 you know what
    0:26:58 like the biggest
    0:27:00 in-demand job
    0:27:00 in the world
    0:27:01 is right now
    0:27:02 data labelers
    0:27:04 and
    0:27:05 like data labeling
    0:27:06 wasn’t a job
    0:27:08 not long ago
    0:27:09 but if you talk
    0:27:10 I’ve never even heard of this.
    0:27:11 What is data labeling?
    0:27:12 yeah
    0:27:13 so it’s what
    0:27:13 Alex said
    0:27:14 it’s what Scale
    0:27:15 AI does
    0:27:16 you know
    0:27:16 they pay
    0:27:18 armies and armies
    0:27:19 and armies of people
    0:27:20 to label data
    0:27:21 so say hey
    0:27:23 this is a plant
    0:27:24 or this is
    0:27:24 you know
    0:27:24 a fig
    0:27:25 or whatever it is
    0:27:26 for the AI
    0:27:28 to then understand it
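[Editor's note: the labeling being described boils down to producing (input, label) pairs for supervised learning. A minimal sketch in Python; the example data and the toy word-overlap classifier are illustrative assumptions, not anything Scale AI actually uses.]

```python
# Illustrative sketch of what "data labeling" produces: (input, label) pairs
# that a supervised learner can train on. The data and the toy classifier
# below are invented for illustration.

# Step 1: human annotators attach labels to raw inputs
# ("this is a plant", "this is a fig", ...).
labeled_data = [
    ("green leafy stem in a pot", "plant"),
    ("small purple fruit with seeds", "fig"),
    ("green stem with broad leaves", "plant"),
]

# Step 2: a model learns from those pairs. Here, a toy nearest-neighbor
# stand-in: predict the label of the labeled example sharing the most words.
def predict(description: str) -> str:
    words = set(description.split())
    best = max(labeled_data, key=lambda pair: len(words & set(pair[0].split())))
    return best[1]

print(predict("leafy green plant in a pot"))  # prints "plant"
```

The point is simply that the human-supplied labels are what give the learner any signal at all; with more labeled pairs, the same idea scales up to real models.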
    0:27:29 and then you know
    0:27:30 now with the
    0:27:32 you know
    0:27:33 with the kind of
    0:27:34 reinforcement learning
    0:27:35 coming back
    0:27:36 into play
    0:27:37 you know
    0:27:38 labeling
    0:27:38 you know
    0:27:39 that kind of
    0:27:40 supervised learning
    0:27:41 is still like
    0:27:42 very very very important
    0:27:44 and I think that
    0:27:46 you know
    0:27:47 right now
    0:27:48 like he’s got
    0:27:50 unlimited hiring demand
    0:27:51 which is
    0:27:51 you know
    0:27:52 ironic
    0:27:53 for scale AI
    0:27:54 to have
    0:27:55 unlimited need
    0:27:56 for humans
    0:27:57 and I think
    0:27:58 you know
    0:27:59 in manufacturing
    0:28:00 there are going to be
    0:28:01 jobs like that
    0:28:02 and there will be
    0:28:02 the kind of physical ones.
    0:28:05 Well, when you go
    0:28:07 into these robot companies,
    0:28:08 the software
    0:28:09 companies that are
    0:28:10 doing robotics,
    0:28:12 they have people
    0:28:14 managing the robots
    0:28:15 right
    0:28:15 like they’re training
    0:28:16 the robots
    0:28:17 humans train robots
    0:28:19 to do all kinds
    0:28:20 of things
    0:28:21 and it turns out
    0:28:21 that like
    0:28:23 folding clothes
    0:28:25 doesn’t necessarily
    0:28:27 generalize
    0:28:28 to making eggs
    0:28:30 they’re like
    0:28:31 super different
    0:28:32 for robots
    0:28:34 and so you need
    0:28:35 you know
    0:28:36 these robots
    0:28:36 trained in all
    0:28:37 these kinds
    0:28:37 of fields
    0:28:38 and so forth
    0:28:38 so I think
    0:28:38 there’s
    0:28:39 you know
    0:28:40 there’s a whole
    0:28:40 new class
    0:28:41 of jobs
    0:28:42 that are
    0:28:42 a little bit
    0:28:43 hard to
    0:28:44 anticipate
    0:28:46 you know
    0:28:47 in advance
    0:28:48 but I think
    0:28:49 at least
    0:28:49 for the next
    0:28:50 10 years
    0:28:50 I think
    0:28:51 the number
    0:28:52 of new
    0:28:52 jobs
    0:28:53 related
    0:28:54 to making
    0:28:54 these machines
    0:28:55 smarter
    0:28:56 is going
    0:28:57 to increase
    0:28:57 a lot
    0:28:59 and then
    0:29:00 after that
    0:29:00 you know
    0:29:00 like
    0:29:02 I think
    0:29:02 there will
    0:29:03 be
    0:29:04 there just
    0:29:05 tend to be
    0:29:05 like
    0:29:06 throughout
    0:29:07 history
    0:29:08 so many
    0:29:09 needs
    0:29:09 for new
    0:29:09 things
    0:29:10 that we
    0:29:10 never
    0:29:11 anticipated
    0:29:11 like well
    0:29:12 I mean
    0:29:12 you know
    0:29:12 one of my
    0:29:13 favorite
    0:29:13 examples
    0:29:13 is
    0:29:14 okay
    0:29:15 computers
    0:29:17 are going
    0:29:18 to kill
    0:29:20 the typesetting
    0:29:20 business
    0:29:21 and they did
    0:29:22 everybody knew
    0:29:22 that
    0:29:23 like that
    0:29:23 was coming
    0:29:24 nobody
    0:29:26 nobody
    0:29:26 said oh
    0:29:26 and then
    0:29:27 there’s
    0:29:27 going to be
    0:29:28 5 million
    0:29:28 graphic design
    0:29:29 jobs
    0:29:29 that come
    0:29:30 out of
    0:29:30 the PC
    0:29:31 like
    0:29:32 nobody
    0:29:33 not a
    0:29:33 person
    0:29:34 predicted
    0:29:34 that
    0:29:35 so
    0:29:35 it’s
    0:29:35 really
    0:29:36 easy
    0:29:36 to figure
    0:29:36 out
    0:29:36 which
    0:29:37 jobs
    0:29:37 are
    0:29:37 going
    0:29:37 to go
    0:29:37 away
    0:29:38 it’s
    0:29:38 much
    0:29:39 more
    0:29:39 difficult
    0:29:39 to
    0:29:39 kind
    0:29:40 of
    0:29:40 figure
    0:29:40 out
    0:29:40 which
    0:29:41 jobs
    0:29:41 are
    0:29:41 going
    0:29:41 to
    0:29:41 come
    0:29:42 but
    0:29:43 like
    0:29:43 if
    0:29:43 you
    0:29:43 look
    0:29:43 at
    0:29:43 the
    0:29:44 history
    0:29:44 of
    0:29:44 automation
    0:29:46 which
    0:29:46 is
    0:29:46 kind
    0:29:47 of
    0:29:47 automated
    0:29:47 away
    0:29:48 everything
    0:29:48 we
    0:29:48 did
    0:29:48 100
    0:29:49 years
    0:29:49 ago
    0:29:54 and so
    0:29:54 you go
    0:29:56 okay
    0:29:57 and then
    0:29:57 you know
    0:29:57 like
    0:29:58 some of
    0:29:58 the
    0:29:58 employment
    0:29:59 will be
    0:30:00 much
    0:30:00 more
    0:30:01 I think
    0:30:01 enjoyable
    0:30:02 than the
    0:30:02 old
    0:30:02 employment
    0:30:03 as well
    0:30:04 as it
    0:30:04 has been
    0:30:05 you know
    0:30:05 over time
    0:30:06 and you
    0:30:07 always talk
    0:30:07 about
    0:30:07 manufacturing
    0:30:08 jobs
    0:30:08 going
    0:30:08 away
    0:30:08 but
    0:30:09 the
    0:30:09 manufacturing
    0:30:09 jobs
    0:30:10 that
    0:30:10 have
    0:30:10 gone
    0:30:10 away
    0:30:10 have
    0:30:11 been
    0:30:11 the
    0:30:11 most
    0:30:11 mind
    0:30:12 numbing
    0:30:12 so
    0:30:13 I
    0:30:13 think
    0:30:13 things
    0:30:14 evolve
    0:30:14 in
    0:30:14 very
    0:30:15 very
    0:30:15 unpredictable
    0:30:16 ways
    0:30:17 And I think the hope is that the world just gets much better, but I’m not so worried about anticipating all the horror that’s going to come. I mean, I think the main reason we’re making these things is the ways that they’re making life better.
    0:30:36 And just like we finally figured out a way for everybody… we already have it in our hands: everybody can get a great education. That whole inequality of access to education is literally gone right now, which is pretty amazing.
    0:30:57 I mean, it’s certainly huge. Yeah, nothing that I ever thought I’d see. So hopefully things go well.
    0:31:05 The great irony, it’s so crazy. I don’t think anybody saw that coming. It was always going to be: it’s going to go for the drivers, it’s going to go for all those hard, difficult, repetitive tasks.
    0:31:14 Yeah, it’s been very fascinating to see what actually is in danger. Super creative jobs: very much in danger.
    0:31:23 But yeah, as you get down… I mean, look, I believe way more strongly than you do that robots are just going to get better and better and better and better, but that could be because I’m not as close to the problem as you are.
    0:31:35 Speaking of which, the insights that you’ve had into AI: how are they informing the investments that you guys make?
    0:31:42 The theory is there’s this one super-intelligent big brain that’s going to do everything. The reality on the ground is that even the state-of-the-art models are all kind of good at slightly different things, right? Anthropic is really good at code, Grok is really good at real-time data because they’ve got the Twitter stuff, and then OpenAI has gotten very, very good at reasoning.
    0:32:07 So all of them are doing AGI, but they’re all good at different stuff, which, from an investing standpoint, is very good to know.
    0:32:19 Because if something like that’s not winner-take-all, that becomes super interesting. It’s also interesting for what it means at the application layer, because the infrastructure products aren’t winner-take-all.
    0:32:34 And then the other thing about the infrastructure products that’s interesting is that they’re not particularly sticky, in the way that Microsoft Windows was very sticky. Windows was sticky because you build an application on Windows and it doesn’t run on other stuff; you’ve got to do a lot of work to move it to something else, so you get this network effect with developers.
    0:32:57 Then you go, okay, how does that work with state-of-the-art models? Well, people build applications on these things, but guess what: to move your application to DeepSeek, you didn’t have to do that work. They just literally took the open Python API, and it runs on DeepSeek now.
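The swap described here works because DeepSeek exposes an OpenAI-compatible HTTP API, so moving an application is mostly a configuration change rather than a rewrite. A minimal sketch of that idea; the endpoint URLs and model names below are illustrative assumptions, not authoritative values:

```python
# Why OpenAI-compatible APIs make model providers swappable: the
# application keeps one client shape and only the config differs.
# Provider URLs and model names here are illustrative.

PROVIDERS = {
    "openai":   {"base_url": "https://api.openai.com/v1",  "model": "gpt-4o"},
    "deepseek": {"base_url": "https://api.deepseek.com/v1", "model": "deepseek-chat"},
}

def client_config(provider: str, api_key: str) -> dict:
    """Build the kwargs you would hand to an OpenAI-compatible client.

    With the official `openai` Python SDK this would look like:
        client = OpenAI(base_url=cfg["base_url"], api_key=api_key)
    The prompt-building and response-parsing code stays identical
    across providers; only this config changes.
    """
    cfg = PROVIDERS[provider]
    return {"base_url": cfg["base_url"], "model": cfg["model"], "api_key": api_key}

# Same application, two backends: identical shape, different endpoint.
a = client_config("openai", "sk-test")
b = client_config("deepseek", "sk-test")
```

The point of the sketch is the low switching cost: nothing in the application layer depends on which provider is behind the config.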
    0:33:16 So that kind of thing really impacts how you think about investing: what is the value of having a lead in an application, and where is the moat going to come from?
    0:33:28 And of course, the one thing AI is getting amazingly good at is writing code. So then, how much of a lead do you have in the code itself versus the other traditional things?
    0:33:43 You know, when I started in the industry, the salespeople were in charge. They were kind of the big… there’s a great TV show called Halt and Catch Fire, and if you watch it, the thing that’s really stunning, if you’re coming from the 2010s-2020s world, is: why are the salespeople so powerful?
    0:34:08 But they were the most powerful in those days, and it was because distribution was the most difficult thing. And I think distribution is going to get very, very important again.
    0:34:23 Because maintaining a technological lead is a lot harder when the machine is writing the code, and writing it very fast. Although it’s not all the way to where it can build super complex systems, there’s a bunch of things out now. Replit’s got a great product for it, and there’s a company in Sweden called Lovable that’s got one out that just builds you an app: if you need an app for something, just say “build me this app,” and there it is.
    0:34:52 Yeah, and another thing in Cursor is that you can select which model you want to use for whatever thing you’re about to generate. So the ability to go, oh, I want ten of these things, and I’m going to use this one for this kind of code and this one for that kind of code; it’s really fascinating.
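The per-task model choice Cursor exposes manually can be sketched as a tiny router. The task categories and model labels below are illustrative assumptions reflecting the strengths mentioned earlier in the conversation, not Cursor’s actual internals:

```python
# Minimal per-task model router: pick a backend by task category,
# falling back to a general-purpose default. Names are illustrative.

ROUTES = {
    "code":      "claude-sonnet",  # strong at code
    "realtime":  "grok",           # strong at fresh, real-time data
    "reasoning": "o-series",       # strong at multi-step reasoning
}

def pick_model(task: str, default: str = "gpt-4o") -> str:
    """Return the model for a task category; unclassified tasks
    fall through to the default."""
    return ROUTES.get(task, default)
```

The design choice here is the fallback: because no single model wins everywhere, routing plus a safe default captures each model’s edge without hard-failing on unanticipated task types.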
    0:35:08 Now, the Biden administration was super hostile towards tech. When you look at what’s going on now with the changes in regulation, what do you think about the race between us and China? Were we headed down a dark path if that administration had stayed on that course: we’re going to have one or two companies, we’re going to control them, and that’s going to be that? Is it possible we could have lost that race? Is that race a figment of my imagination, or is it real?
    0:35:38 I think there are multiple layers to the AI race with China. And the Biden administration was kind of hostile in many ways, but all for kind of a central reason, I think.
    0:35:55 So in AI in particular, when we met, and I should be very specific: we did meet with Jake Sullivan, and he was very good about it; we met with Gina Raimondo, and she was very good about it. But we met with the White House itself, and their position was, I would say, super ill-informed.
    0:36:20 So they basically walked in with this idea that we’ve got a three-year lead on China, we have to protect that lead, and therefore we need to shut down open source. And that doesn’t matter to you guys and startups, because startups can’t participate in AI anyway: they don’t have enough money.
    0:36:50 And the only companies that are going to do AI are, kind of ironically, the two startups that are out, Anthropic and OpenAI, and then the big companies, Google and Microsoft and so forth. And so we can put a huge regulatory barrier on them, because they have the money and the people to deal with it. In their minds, I think they actually believed that that would be how we would win.
    0:37:18 But of course, in retrospect, that makes no sense, and it kind of damages us. If you look at China and what China’s great at, this goes to the next thing: there’s how good is your AI, and then how well is it integrated into your military, the way the government works, and so forth.
    0:37:36 And I think that China, being a top-down society, their strength is that whatever AI they have, they’re going to integrate it; all the companies are already highly integrated into the government. So they’re going to be able to deploy that, and we’re going to see it in action with their military very fast.
    0:37:57 I think that the advantage of the US is that we’re not a top-down society; we’re a wild, messy society. But it means that all of our smart people can participate in the field.
    0:38:10 And look, there’s more to AI than just the big models. As you said, how important is Cursor? It’s really important if you’re building stuff. So you want to go build the next whatever thing that the CIA needs, or the NSA needs, or this and that: you’re building that with Cursor, and you’re using a state-of-the-art model. But if the Biden White House had gotten their way, they’d have eliminated things like Cursor; they’d have eliminated startups being able to do anything in AI.
    0:38:43 And so the advantage we have is that we don’t just have a model; we’ve got all this other stuff that goes with it, and then we have new ideas on models, with new algorithms and this and that. And that’s what the U.S. is great at.
    0:39:01 As for what China is great at: by the way, their people are very good at math, and AI is math, so their models are good. They also have a data advantage on us: they have access to the Chinese internet, and they have access to copyrighted material, which they don’t treat with the same deference that we do in the U.S.
    0:39:23 And so they’re able to get to… if you use DeepSeek, you go, wow, DeepSeek really is a great writer compared to a lot of the U.S. models. Why is that? Well, they train on a bigger data set than we do. And that’s amazing.
    0:39:34 So really, I think what we want is to have world-class, first-class AI in the U.S., and I think of it less as: is it ahead of China, is it slightly ahead of China. The model leads we’ve seen with our own state-of-the-art models are very shallow, and I think that’ll continue as long as we’re able and allowed to build AI.
    0:40:00 And then economically, what you’d like is to have a vibrant AI ecosystem coming out of the U.S., so other countries who aren’t state-of-the-art with this stuff adopt our technology, and we continue to be strong economically, as opposed to everything going to China. And that was a big, big risk with the Biden administration, I think.
    0:40:24 What they were doing on AI was tough, I would say; what they were doing on fintech and crypto was even tougher, in that they were just trying to get rid of the industry in its entirety.
    0:40:38 With AI, they were trying to… I would say they were extremely arrogant in what they thought their ability was to predict the future. You know, Mark and I were in there, and our job is to predict it; this is our job, to invest in the future, to predict the future. And they were saying things that were so arrogant that we would never even think to say them, even if we thought them, because we know that we don’t know the future like that. It’s just unknowable; there are too many moving parts, and these things are really complicated.
    0:41:15 All right, well, speaking of the future, fully accepting that it is very opaque and very difficult to see: what would you say is the most controversial view that you hold about the future?
    0:41:26 If we don’t get to world class in crypto, we’re going to be… you know, AI really has the potential to wreck society. And what I mean by that is, if you think about what is obviously, clearly going to happen in an AI world: one, we’re not going to be able to tell the difference between a human and a robot. Two, we’re not going to know what’s real or fake. Three, the level of security attacks on big central data repositories is going to get so good that everybody’s data is going to be out there, and there is no safe haven for a consumer.
    0:42:09 And then finally, for these agents and these bots to actually be useful, they actually need to be able to use money, to pay for stuff and get paid for stuff. And if you think about all those problems, those are problems that are by far best solved by kind of blockchain technology.
    0:42:30 So one, we absolutely need a public key infrastructure, such that every citizen has their own wallet with their own data, their own information. And if you need to get credit, or prove you’re a citizen, or whatever, you can do that with a zero-knowledge proof. You don’t have to hand over your social security number, your bank account information, all this kind of thing, because the AI will get it. So you really need your own keys and your own data, and there can’t be these gigantic, massive honeypots of information that people can go after.
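The “prove it without handing it over” idea can be illustrated with a toy Schnorr-style proof of knowledge: the prover convinces a verifier that they hold the secret x behind a public value y = g^x mod p, without ever revealing x. This is a teaching-sized sketch with a deliberately tiny group; real systems use large elliptic-curve groups and audited libraries:

```python
# Toy Schnorr-style zero-knowledge proof of knowledge of a secret x
# with public key y = g^x mod p. Tiny parameters for illustration only.
import hashlib
import secrets

p, q, g = 23, 11, 2  # g = 2 generates a subgroup of prime order q = 11 mod p = 23

def keygen():
    x = secrets.randbelow(q)      # secret (stands in for private identity data)
    return x, pow(g, x, p)        # (secret, public)

def prove(x, y):
    r = secrets.randbelow(q)
    t = pow(g, r, p)              # commitment
    # Fiat-Shamir: derive the challenge by hashing the public transcript
    c = int(hashlib.sha256(f"{g},{y},{t}".encode()).hexdigest(), 16) % q
    s = (r + c * x) % q           # response; r blinds x, so s alone leaks nothing
    return t, s

def verify(y, t, s):
    c = int(hashlib.sha256(f"{g},{y},{t}".encode()).hexdigest(), 16) % q
    # g^s == g^(r + c*x) == t * y^c  iff the prover knew x
    return pow(g, s, p) == (t * pow(y, c, p)) % p

x, y = keygen()
t, s = prove(x, y)
assert verify(y, t, s)            # verifier learns that x is known, not x itself
```

The verifier checks one modular equation and never sees the secret, which is the shape of the credit or citizenship check described above: disclose a proof, not the underlying data.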
    0:43:10 I think that with deepfakes, if you think about it: okay, we’re going to have to be able to whitelist things; we’re going to have to be able to say what’s real. But who keeps track of what’s true, then? Is it the government? You know, please, Jesus. Everybody trust Trump now? You know, everybody trusted Biden? Is it going to be Google, because we trust those guys? Or is it going to be the game-theoretic, mathematical properties of the blockchain that can hold that?
    0:43:39 and so
    0:43:39 I think
    0:43:39 that
    0:43:40 you know
    0:43:40 it’s
    0:43:41 essential
    0:43:42 that we
    0:43:43 regenerate
    0:43:43 our
    0:43:44 kind of
    0:43:45 blockchain
    0:43:45 crypto
    0:43:46 development
    0:43:46 in the
    0:43:46 US
    0:43:47 and we
    0:43:47 get
    0:43:47 very
    0:43:48 serious
    0:43:48 about
    0:43:48 it
    0:43:49 and
    0:43:49 you know
    0:43:49 like
    0:43:50 if the
    0:43:50 government
    0:43:50 were to
    0:43:51 do
    0:43:51 something
    0:43:51 I think
    0:43:51 it should
    0:43:52 be to
    0:43:52 start to
    0:43:53 require
    0:43:54 these
    0:43:55 information
    0:43:56 distribution
    0:43:56 networks
    0:43:57 these social
    0:43:57 networks
    0:43:58 to have
    0:43:59 a way
    0:43:59 to
    0:44:00 you know
    0:44:01 verifiably
    0:44:01 prove
    0:44:02 you’re human
    0:44:02 you know
    0:44:03 prove where
    0:44:04 a piece
    0:44:04 of data
    0:44:05 came from
    0:44:05 and so
    0:44:05 forth
    0:44:06 and I
    0:44:07 think
    0:44:07 that
    0:44:08 you know
    0:44:09 we have
    0:44:09 to
    0:44:09 you know
    0:44:10 have
    0:44:10 banks
    0:44:12 start
    0:44:12 accepting
    0:44:13 zero
    0:44:13 knowledge
    0:44:14 proofs
    0:44:15 and you
    0:44:15 know
    0:44:16 and that
    0:44:16 be just
    0:44:17 the way
    0:44:17 the world
    0:44:17 works
    0:44:17 we need
    0:44:19 a network
    0:44:20 architecture
    0:44:20 that is
    0:44:21 up to
    0:44:22 the challenge
    0:44:22 of you
    0:44:23 know
    0:44:23 these
    0:44:23 super
    0:44:24 intelligent
    0:44:24 agents
    0:44:25 that are
    0:44:25 running
    0:44:25 around
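The "prove where a piece of data came from" idea can be sketched very simply. Everything below is illustrative and hypothetical, not any specific standard (real efforts such as C2PA use cryptographic signatures rather than a lookup table): a publisher commits to content by registering its hash in an append-only log, and anyone can later check a piece of data against that log.

```python
import hashlib

# Hypothetical append-only provenance log: content hash -> claimed origin.
# In a real system this would be a blockchain or a signed transparency log,
# not a Python dict; this only sketches the verification idea.
provenance_log = {}

def fingerprint(data: bytes) -> str:
    """Content-addressed identifier for a piece of data."""
    return hashlib.sha256(data).hexdigest()

def publish(data: bytes, origin: str) -> str:
    """Origin registers content at publication time."""
    h = fingerprint(data)
    provenance_log.setdefault(h, origin)  # append-only: first writer wins
    return h

def verify(data: bytes, claimed_origin: str) -> bool:
    """Anyone can check whether data matches a registered origin."""
    return provenance_log.get(fingerprint(data)) == claimed_origin

photo = b"...raw image bytes..."
publish(photo, "reuters.com")
assert verify(photo, "reuters.com")
assert not verify(b"deepfaked bytes", "reuters.com")  # altered data fails
```

The point of the sketch is the asymmetry he describes: the checker needs no relationship with, or trust in, the publisher beyond the shared log.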
0:44:26 We were talking before we started rolling that you guys have an office in DC, and part of what you do is advise on this. What do the infrastructure changes need to look like? What are a small handful of things that you guys are really pushing to see the government adopt, to modernize the way the whole bureaucracy works?
0:44:46 Yeah, so there's a few things. One of them is that because blockchain technology involves money, we do need regulation; it's not like we don't need any regulation. And there are very specific things that we're working with the administration on, to make sure they're done in a way that creates a great environment for everybody.
0:45:11 What are you guys hoping will get blocked out, for instance? Is that what you're about to cover?
0:45:16 Yeah, I mean, one of the first things you need: we do need electronic money in the form of stablecoins, so actual currency. And we need those not to collapse. It's very bad if one of them collapses, because then the whole trust in the system breaks down, and so forth.
0:45:39 Well, why do we need this kind of money, this kind of internet-native money?
0:45:44 Well, I'll give you an example. We have a company called Daylight Energy, and what they do is... So, we're going to run into a big energy problem with AI, which I think most people listening to this know about, where AI consumes a massive amount of energy. Much more than Bitcoin ever did, by the way, which everybody was all up in arms about. So much so that you can't really even get it out of the power grid. And I think Trump has been smart about this, saying, hey, you probably need to build power next to your data center, because we can't be giving it to you from the central tank.
0:46:22 But beyond that, individuals now have Tesla solar panels and Powerwalls and these kinds of things. And when you have one of those, you sometimes have more energy than you need and sometimes have less. Wouldn't it be great if there was a nice system that figured out who needed energy and who had energy, and you could just trade? And there was some kind of contract that said, okay, this is what you pay during peak, this is what you pay at different periods. That contract is probably best done in the form of a smart contract. But a Powerwall is not a human, so it doesn't have a credit card. It can't get a credit card. It doesn't have a bank account, it doesn't have a Social Security number. But it can trade crypto; it can trade stablecoins. So we need that kind of currency to facilitate all these automated agreements and automated transfers of wealth between entities, in order to solve these big problems that we have, like energy.
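The peak/off-peak contract he describes is simple to express. Here is a minimal sketch in plain Python rather than an actual on-chain smart contract; the agent names, rates, and peak window are all made up for illustration:

```python
from dataclasses import dataclass

# Hypothetical time-of-day tariff, in stablecoin cents per kWh.
PEAK_HOURS = range(17, 21)          # 5pm-9pm, illustrative only
RATE = {"peak": 42, "offpeak": 11}  # made-up prices

@dataclass
class Agent:
    name: str
    balance: int  # stablecoin balance in cents; a machine can hold this

def settle(buyer: Agent, seller: Agent, kwh: int, hour: int) -> int:
    """Transfer payment for energy at the contracted time-of-day rate.

    On-chain, this function body would be the smart contract: one
    deterministic rule applied to both parties, with no bank in the loop.
    """
    period = "peak" if hour in PEAK_HOURS else "offpeak"
    cost = kwh * RATE[period]
    if buyer.balance < cost:
        raise ValueError("insufficient balance")
    buyer.balance -= cost
    seller.balance += cost
    return cost

house = Agent("powerwall-A", balance=10_000)    # has surplus solar
charger = Agent("ev-charger-B", balance=5_000)  # needs energy at 6pm

paid = settle(buyer=charger, seller=house, kwh=10, hour=18)
print(paid)  # 10 kWh * 42c peak rate = 420
```

The two "parties" here are devices, not people, which is exactly why he argues they need a currency they can hold directly.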
0:47:31 And so we need a stablecoin bill that says, okay, look, we need these currencies to be backed one-for-one with US dollars, or whatever it is, so that we can have a system that works and is trusted.
0:47:47 Now, there's this really interesting side benefit to that, which is that if you look at Treasury auctions lately, the demand for dollars is not good. A lot of that is that the two biggest lenders to the US have been China and Japan, and China has backed off a lot and Japan has backed off somewhat, so the demand for dollars has gone down. We've also done things to dampen demand. When we sanctioned Russia and seized the assets of the Russian central bank, there were other countries, other entities, that had money there, and their money got frozen and they couldn't access it. That makes people more wary of holding everything in dollars. So we've done a lot to dampen demand, which of course has fueled inflation: in the same way that increasing supply fuels inflation, killing demand fuels inflation. So here we would have this new major source of demand for dollars, and the dollars would be much more useful, because you can use them online as well, and machines can use them, and so forth.
0:48:54 So we really... Sorry, really fast, for people that are trying to track that: the reason that would increase the demand for dollars is that the stablecoin would be backed one-for-one with debt. Is that the idea?
0:49:07 Well, yeah. You would basically have to have a dollar, right, for every...
0:49:16 Hold Treasuries per stablecoin?
0:49:17 Yeah, you basically hold Treasuries, so that if somebody wanted to redeem their stablecoins, they could. That way it's kind of the equivalent of the gold standard in the old days, when dollars were trying to get credible. We would need dollars to be the gold standard for the stablecoin. And probably we should never back off of that. Maybe we should never have backed off of gold, but it's easier when it's dollars, because we did kind of start to run out of gold a bit.
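The one-for-one backing he describes is essentially an invariant: coins outstanding can never exceed reserves held. A toy sketch, assuming a pure full-reserve design (real issuers hold a mix of Treasuries and cash, and the actual bill mechanics are more involved):

```python
class ReserveBackedCoin:
    """Toy full-reserve stablecoin: every coin is matched by $1 of reserves."""

    def __init__(self):
        self.reserves = 0  # dollars of Treasuries/cash held, in cents
        self.supply = 0    # coins outstanding, in cents

    def mint(self, cents: int):
        # Coins are only created against newly deposited reserves,
        # so supply == reserves always holds.
        self.reserves += cents
        self.supply += cents

    def redeem(self, cents: int):
        if cents > self.supply:
            raise ValueError("cannot redeem more than outstanding supply")
        self.supply -= cents
        self.reserves -= cents  # dollars go back out; backing stays 1:1

coin = ReserveBackedCoin()
coin.mint(1_000_000)
coin.redeem(250_000)
assert coin.supply == coin.reserves == 750_000  # invariant preserved
```

The gold-standard analogy in the conversation is exactly this invariant: redemption is always possible because issuance is never allowed to outrun reserves.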
0:49:49 So that's one thing. Then, secondly, there's a bill that went through the House known as the market structure bill; it was technically called FIT21. That's very important, whether it's exactly that or some form of that. Because when you talk about tokens, this kind of instrument that's very, very important in the blockchain world, it's the way this amazing network of computers gets paid. Who pays the people for running the computers? Well, that's paid in the form of these tokens.
0:50:31 But these tokens, which can be created on a blockchain, can be many things. You can create a token that's a collectible. You can create a token that is a digital property right, one that links to some piece of real estate or a piece of art and so forth. A token can be a Pokémon card. A token could be a coupon. A token could be a security that represents a stock. It could be a dollar. So which one is it? That's a very important set of rules that doesn't exist. And this is one of the most insidious things the Biden administration did: they basically said, well, everything is a security, everything's a stock, or something with asymmetric information, which basically undermines the whole power of the technology. So it was basically a scheme for them to get rid of the industry, but it was a very, very dark, cynical way of legislating things, and they would make these fake claims about scams and so forth. But the market structure bill is very, very important in that way.
0:51:54 And by the way, another thing that was in the original market structure bill, which is important: look, there are also scams. We call it the casino. I can create some coin, like the Hawk Tuah girl did, right? She creates a coin, she kind of lies about her holdings and says she's going to hold them, but then sells them after people buy it, in a short time period, and so forth. Part of the problem is there are no rules around that. But the bill that passed the House said you can create a token, but if you hold it, you can't trade it for four years. That takes a lot of the ability to scam out of it, and it forces people to do things that have real utility. Or if it's a collectible, if it is the Hawk Tuah collectible, it's got to be a real collectible, where you don't just rug the users of it right away.
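The four-year rule is easy to state as code. A toy sketch of the incentive it creates, not the bill's actual legal mechanics; the class name and grant numbers are made up:

```python
from datetime import date, timedelta

LOCKUP = timedelta(days=4 * 365)  # four-year holding period for issuers

class LockedAllocation:
    """Toy model of an insider token grant under a lockup rule."""

    def __init__(self, amount: int, issued: date):
        self.amount = amount
        self.issued = issued

    def transferable(self, today: date) -> bool:
        # An issuer can't dump on buyers right after launch: the grant
        # only unlocks once the lockup window has fully elapsed.
        return today >= self.issued + LOCKUP

grant = LockedAllocation(amount=1_000_000, issued=date(2025, 1, 1))
assert not grant.transferable(date(2025, 6, 1))  # still locked
assert grant.transferable(date(2029, 1, 1))      # unlocked after 4 years
```

The design point is the one made in the conversation: if the creator cannot sell into the launch spike, the quick rug-pull stops being profitable.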
0:53:00 Right away, so it's okay to do it later?
0:53:05 Well, you know, in four years it is what it is, right?
0:53:09 Yeah, no, no, I'm just giving you a hard time. I just know how that's going to sound to people.
0:53:14 Yeah. Thank you.
0:53:15 But these kinds of things, I think, are going to be really important to making the whole industry work. And so we're working on that, trying to make it safe for everybody. But as I said, it's just such a critical technology.
0:53:34 Yes, otherwise it's just going to be very problematic... it's going to be cyberpunk.
0:53:47 A high-technology, difficult society.
0:53:51 Yeah, yeah.
0:53:52 It was shocking to me, the level of backlash that the blockchain, web3 community got. What do you think drives that? Is it just the perception that it was only scams and there's nothing real? What was that all about?
0:54:09 So there were multiple factors. The first one is the one that hits all new technology: oh, it's a toy, it doesn't do anything new, the old way of doing things is better. We saw that with social networking. We actually saw that with the internet; I think Paul Krugman famously said it would never have more economic impact than a fax machine, and so forth. So that's just a normal thing that happens with new technologies as they start out.
0:54:57 And so if you look at even the iPhone: it was a bad phone. It had a horrible keyboard. If you compared it to anything, it wasn't very powerful. It had a little itty-bitty screen. But it had a feature that was pretty awesome, which is that you could put it in your pocket, and it had a GPS in it and a camera in it. So now you could build Instagram, you could build Uber, which you could not build with the PC, and you still can't build with the PC. And so that was enough, and then eventually it started to add the other features, and it's an awfully powerful computer these days.
0:55:36 If you look at blockchain, it's slower, it's more complicated to program; there are a lot of issues with it. But it's got a... that code, you know, says there's only 21 million bitcoin. You can absolutely count on that, in a way that you can't trust Google, you can't trust Facebook to say, oh, these are our privacy rules. You can't trust that at all. You can't trust the US government to say they're not going to print any more money; that's for sure.
0:56:15 And so, you know... you can count on it, and you don't have to trust a company, you don't have to trust a lawyer. You just have to trust the game-theoretic mathematical properties of the blockchain, and that's amazing. So now you can program property rights and money and law and all these kinds of things that you could never do before. And so I think that's hard for normal people to understand, who aren't deep...
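The 21 million figure he's pointing to isn't a promise, it falls out of Bitcoin's issuance rule: a block subsidy that starts at 50 BTC and halves every 210,000 blocks, computed in integer satoshis. A sketch of that schedule (not Bitcoin's actual C++ source) reproduces the cap:

```python
# Bitcoin's issuance schedule: the block subsidy starts at 50 BTC and
# halves every 210,000 blocks. Amounts are in satoshis (1e8 per BTC)
# because the consensus code uses integer arithmetic, which is why the
# total comes out slightly under 21 million rather than exactly at it.
SATOSHI = 100_000_000
HALVING_INTERVAL = 210_000

total = 0
subsidy = 50 * SATOSHI
while subsidy > 0:
    total += subsidy * HALVING_INTERVAL
    subsidy //= 2  # integer halving; eventually the subsidy reaches zero

print(total / SATOSHI)  # just under 21,000,000
```

Because the rule is deterministic and anyone can rerun it, the supply cap is something you verify rather than something you take on trust, which is his whole point.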
0:56:52 And then I think the next wave was... look, it was a very odd thing with the Biden administration, because he wasn't really, and I think it's come out now, he wasn't really the president. He wasn't really making any decisions. You couldn't even get a meeting with him if you were in his cabinet. In terms of domestic policy, that was run by Elizabeth Warren. And then the second confusing thing is that Elizabeth Warren is always calling people fascist, yet her whole push with fintech and crypto was to make sure that she could kick people out of the banking system who were political enemies. In order to do that, you have to outlaw new forms of financial technology, because those would be back doors or side doors or parallels to the G-SIBs and the banking system, which she could comprehensively (and I think this is coming out now) kick people out of. And when it's a full top-down hierarchy and you can use private companies to enforce your will, that is the way fascism works.
0:58:01 And then the way she does it is she sells this fake story about it funding terror (and it turns out the USAID was funding the terrorist groups, but that's a different story) and doing all these nefarious things, which was just a very unfair portrayal. And so then the whole industry got this reputation as scammy and this and that and the other.
0:58:24 And then, of course, we had Sam Bankman-Fried, who didn't do us any favors. And this is another issue with what Elizabeth Warren did: she blocked all legislation. So the criminals were running free, and the people doing things that should have been legal were getting terrorized by the government. They should have been looking at FTX; instead they were looking at Coinbase, which was a totally compliant public company, begging for feedback: tell us what you want us to do.
0:58:57 Exactly.
0:58:59 Yeah, that whole thing was crazy. So given that AI is putting us on a collision course with "I don't know who's real, I don't know what's fake," do you think that blockchain is about to have its day, like in the next 12 to 24 months? Or is this still something so embedded deep in the infrastructure that it's going to take a long time to really have its "I told you so" moment?
    0:59:21 You know, I think it’s within 24 months, for sure.
    0:59:25 I mean, I think that there’s enough... You know, in kind of the last wave of blockchain, there were real technological limitations that slowed it down from getting broader adoption.
    0:59:41 So, very obvious usability challenges: the fees were really high, the blockchains were slow. So there were just a lot of use cases that you just couldn’t do on them.
    0:59:53 I think that’s changing very, very fast. The chains are much faster, and the layer-two stuff makes them very fast and cheap.
    1:00:05 People are doing a lot on usability, for wallets and these kinds of things. So I think we’re getting pretty close.
    1:00:14 And then I... Worldcoin, you know. To me, the difference between that thing being very broadly adopted and where it is now... I think half the people in Buenos Aires use it daily, so it’s widely adopted where it’s been legal.
    1:00:35 I think that if they are able to get integrated into some of the big social platforms, then everybody needs proof of human.
    1:00:49 And it would make the experience online so much better if you knew who was human and who was not, and right now you can’t tell at all.
    1:00:59 And that problem is going to get worse, and the solution is really here.
    1:01:04 So I think it’s going to start to take off, and you only need one or two big use cases to start getting the whole infrastructure deployed.
    1:01:14 And once the infrastructure is deployed, I think we’ll certainly rely on it.
    1:01:19 And if you look at the curve of people who have active wallets and the curve of internet adoption, they’re pretty similar.
    1:01:32 I think blockchain is growing a little faster than the internet did initially, and so I think we’ll get to a place where certainly everybody in the US will be on it.
    1:01:44 Which, by the way, could be great from a government standpoint. You know, Elon has talked about putting all the government payments on the blockchain, which I think would be really good for transparency.
    1:01:55 We’d never get into this weird situation we have now, where half the country wants to tear down all the government services and half of them wants to keep them, because nobody knows what the spending is. But that would be great.
    1:02:08 But beyond that, if you think about, well, why is there so much waste and fraud?
    1:02:14 Well, part of it is, you know: I get taxed, I give my money to the IRS, the IRS gives it to Congress, they do whatever they do with it, and so forth.
    1:02:27 And, well, how does it get to the people who need it? That’s a very lossy process, and we don’t even know who they are. And it’s very, you know...
    1:02:35 One of the things we found out during COVID: the government’s not very good at sending people money. It’s good at taking money; it’s not good at sending it, right?
    1:02:44 We lost like 400 billion dollars trying to give people stimulus.
    1:02:45 Ridiculous.
    1:02:46 Yeah, crazy.
    1:02:47 Ridiculous.
    1:02:48 You know, if everybody in the US had an address on the blockchain, you could just tell me: okay, here’s 10,000 people who need money, please send them, you know, $5,000 each.
    1:03:02 Well, probably that’s too much money for me, but something like that: whatever my tax bill is, or whatever that portion of wealth redistribution is. That would be 100%, zero loss.
    1:03:13 By the way, I’d feel a lot better about it, because I’d know I’d be helping people. And look, maybe somebody would even go, hey, this is great, thank you. And maybe we wouldn’t have this crazy class warfare, because everybody would go, hey, we’re all integrated: I’m helping you, you’re helping me.
    1:03:30 And then if you had that, then you’d fix the whole kind of democracy-integrity problem, because everybody could vote off that address.
    1:03:39 And by the way, everybody would have an address, because everybody would want the money. So what bigger incentive to register to vote than: in order to get the money, you have to have an address, which registers you to vote?
    1:03:52 And so that kind of thing, I think, could get us to just a much higher trust in our own institutions.
    1:04:01 Hmm. Yeah. So, wow, speaking of that: I wanted to absolutely scream into the abyss when I heard that people couldn’t retire from the government faster than the elevator would lower their records down into a mine. I was like, what is happening?
    1:04:19 What do you take away from... why does Elon want to do this? Why is he sleeping in hallways? Why is he doing this? Is it just to get government contracts, and it’s nefarious in the way that so many people think it is, or is there something positive there? What’s the game?
    1:04:36 I think there’s a couple of different things. So one, the strong thing, is he truly believes that America’s the best country in the world. You know, he is an immigrant. And that it’s not guaranteed to stay that way, and we have been in danger of losing it.
    1:05:00 And so the most important thing for him to do, in order for his companies to be relevant, in order for going to Mars to be relevant, in order for anything he wants to do in life to be relevant, is we’ve got to stabilize the U.S. government. I think that’s the main thing driving him.
    1:05:12 So then you say, well, how did he get to that conclusion, that the whole country is in jeopardy? And it was a pretty interesting thing to watch, because right in 2021, I think he was a Democrat, and he was certainly pretty apolitical.
    1:05:32 And I was actually in a chat group with him when he got the idea, or posed the question: should he buy Twitter?
    1:05:44 And a lot of it stemmed from... you know, it started with the U.S. government just harassing him, which was a very odd thing, right?
    1:05:54 I mean, I think you could very well argue he was our most productive citizen. He was our entire space program. He advanced the state of electric cars by 20 years; he’s still something like 95% of the electric cars sold in the U.S.
    1:06:12 You know, he’s done the things with Neuralink to help people who have been paralyzed use their arms and legs, and this kind of thing. So he’s a really kind of remarkable person to want to pick on.
    1:06:29 But what happened was, because he got this PR for being very wealthy, the Biden administration targeted him. And again, they’re fascists, so it’s really a power struggle, always, with the fascists and anybody who looks like they’re becoming powerful.
    1:06:48 And some of the things they did... one of the ones that’s talked about a lot was, they sued him. The Biden Department of Justice sued him for discriminating against refugees, but he had a contract with the U.S. Department of Defense that required him to only hire U.S. citizens, so he was breaking the law either way.
    1:07:11 And they never dropped the lawsuit, even after it was pointed out, even after it was pointed out by Congress. So it was clear harassment.
    1:07:20 And I think that his conclusion from that was, ironically, that we’re losing the democracy.
    1:07:32 We’re going into this very, very strange world where the incentives are all upside down. And the way Elon thinks is, it’s up to him to save it.
    1:07:45 And so he got extremely involved, and then I think the more involved he got, the more he both realized that a lot of the things really were dangerous, and then, secondly, that he personally would be somebody who would know how to fix it.
    1:08:02 And you go, like, well, why the hell would Elon Musk know how to fix the government and all this? And this is the thing that everybody’s saying now.
    1:08:11 And it’s funny, because I told this to Andreessen years ago, because I’m a big fan of Isaac Newton. We always talked about, like, who is Elon like? What entrepreneur comes to mind? And it really wasn’t... maybe Thomas Edison, but not really.
    1:08:31 But Isaac Newton, whoa, was really the one that I always thought he was most like. Because it’s like, okay, who can build rockets and cars and this and that and the other?
    1:08:43 But the reason I thought he was like Isaac Newton was the end of Isaac Newton’s life. I think he was in his late 60s, maybe like 67, 68.
    1:08:53 And this is, for those of you who don’t know Isaac Newton: he figured out how the entire world works and wrote it down in a book called the Principia Mathematica, which is probably the most amazing work in the history of science.
    1:09:07 And he did it entirely by himself. He didn’t even talk to anybody at the time he wrote it. I think he was trying to figure out what God was, or something like that. But, as you do.
    1:09:20 But so he gets to be, you know, in his late 60s, and the Bank of England has a crisis, which is causing a huge crisis for the whole country: there’s a giant counterfeiting problem, so the currency is going to be undermined, and England’s going to basically go bankrupt.
    1:09:39 And they had no idea what to do about it, so they call Isaac Newton, because he’s the smartest man in the world. Of course you’re going to call him.
    1:09:46 So Isaac Newton, a 67-year-old, like, hermit physicist, goes in and he says, okay, I can help with the problem: make me CEO of the Mint. So they make him CEO of the Mint. You know, kind of head of DOGE, whatever.
    1:10:02 And he reorganizes the Mint in like a week, and then fixes the technology in a month, and completely makes it impossible to counterfeit.
    1:10:11 Then he becomes a private eye and goes into all the pubs where the counterfeiters are, and arrests all of them. Then he learns the law, becomes the prosecutor, prosecutes all the counterfeiters, and has a 100% conviction record.
    1:10:28 And that, by the way, that’s Elon. So if you...
    1:10:33 I didn’t know that part of his story.
    1:10:35 Oh yeah, yeah. So it’s an amazing thing.
    1:10:39 And if you look at Elon and DOGE, to me the most remarkable thing about DOGE is how he’s done it.
    1:10:47 So if you or I were to say, okay, let’s go in and kind of get the waste and fraud out of the government, we would, like, audit the departments, or this and that and the other, and so forth.
    1:11:02 No, no, no. Like, that’s how he...
    1:11:06 Like, how do the checks go out? Like, how is the system designed? When does the money leave the building?
    1:11:15 And then: oh, it all comes out of one system. Let me have access to that system and I’ll look at all the payments. I’m not asking anybody what they’re spending; I’m looking at what they’re spending. Like, I’m getting to ground truth, and then I’m going to work my way backwards from there.
    1:11:30 And he’s probably... you know, so not only is he not unqualified, he’s maybe the only person qualified to figure out, like, how...
    1:11:46 You know, he’s just a very unique individual.
    1:11:51 He’s also a troll. He also likes upsetting people. I get all that. But what he brings to the table is pretty interesting, I would say. I would say very extraordinary.
    1:12:03 You have also written about another extraordinary historical figure, from the Haitian revolution, a guy named Toussaint.
    1:12:16 Tell us about him, because there’s something about this moment, about being a master strategist, about using what you have, being creative, that feels like it’s very apropos to this moment. What made his story special?
    1:12:33 Yeah, so Toussaint was another one of these characters in history. There are certain... I call them, like, once-in-every-400-years type people, where you just don’t see them that often.
    1:12:44 But so it turns out, in the history of humanity, there’s been one kind of successful slave revolt that ended in an independent state.
    1:12:57 Which, if you think about the history of slavery, which goes back thousands of years, really kind of from the beginning of written history, we’ve had slavery. So it’s like a pretty old construct, and there’s a lot of motivation to have a revolt if you’re a slave.
    1:13:17 But why only one successful one? And it turns out it’s really hard
    1:13:23 you know
    1:13:24 for
    1:13:25 to
    1:13:25 generate
    1:13:26 an
    1:13:26 effective
    1:13:27 revolt
    1:13:27 if
    1:13:27 you’re
    1:13:28 slaves
    1:13:28 because
    1:13:30 slave
    1:13:30 culture
    1:13:31 is
    1:13:32 difficult
    1:13:32 because
    1:13:32 you
    1:13:32 don’t
    1:13:32 own
    1:13:33 right
    1:13:33 if
    1:13:33 you
    1:13:34 don’t
    1:13:34 have
    1:13:36 any
    1:13:37 sense
    1:13:37 of
    1:13:38 you know
    1:13:38 owning
    1:13:38 anything
    1:13:39 you don’t
    1:13:39 own
    1:13:39 your
    1:13:39 own
    1:13:40 will
    1:13:40 right
    1:13:40 like
    1:13:40 you
    1:13:41 are
    1:13:42 at
    1:13:42 the
    1:13:42 kind
    1:13:43 of
    1:13:43 pleasure
    1:13:44 of
    1:13:44 who’s
    1:13:45 ever
    1:13:45 running
    1:13:45 things
    1:13:46 so
    1:13:47 long
    1:13:47 term
    1:13:48 thinking
    1:13:48 doesn’t
    1:13:48 make
    1:13:49 sense
    1:13:50 and
    1:13:51 what
    1:13:51 it
    1:13:52 because
    1:13:52 like
    1:13:52 why
    1:13:53 plan
    1:13:53 for
    1:13:53 next
    1:13:53 week
    1:13:54 it
    1:13:54 doesn’t
    1:13:55 matter
    1:13:55 what
    1:13:55 you
    1:13:55 plan
    1:13:56 like
    1:13:56 it’s
    1:13:56 not
    1:13:57 yours
    1:13:58 so
    1:13:58 everything’s
    1:13:58 going to be
    1:13:59 very short
    1:13:59 term
    1:14:00 and short
    1:14:00 termism
    1:14:01 is
    1:14:02 difficult
    1:14:02 in a
    1:14:02 military
    1:14:03 context
    1:14:04 because
    1:14:06 in order
    1:14:06 to have
    1:14:06 an effective
    1:14:07 military
    1:14:07 there needs
    1:14:08 to be
    1:14:09 a
    1:14:09 trust
    1:14:10 right
    1:14:10 like
    1:14:10 a
    1:14:10 trust
    1:14:12 you know
    1:14:12 you have
    1:14:12 to be
    1:14:12 able
    1:14:12 to
    1:14:13 trust
    1:14:13 people
    1:14:13 to
    1:14:14 execute
    1:14:14 the
    1:14:14 order
    1:14:14 like
    1:14:15 I give
    1:14:15 an
    1:14:15 order
    1:14:16 it’s
    1:14:16 kind
    1:14:16 of
    1:14:16 like
    1:14:17 the
    1:14:17 Byzantine
    1:14:17 generals
    1:14:18 problem
    1:14:19 to go
    1:14:19 back
    1:14:19 to
    1:14:19 crypto
    1:14:21 where
    1:14:22 like
    1:14:23 I have
    1:14:23 to
    1:14:23 trust
    1:14:23 that
    1:14:24 you’re
    1:14:24 going
    1:14:24 to
    1:14:24 do
    1:14:24 the
    1:14:24 order
    1:14:25 you
    1:14:25 have
    1:14:25 to
    1:14:25 trust
    1:14:25 that
    1:14:26 I’m
    1:14:26 giving
    1:14:26 the
    1:14:26 correct
    1:14:27 order
    1:14:28 but
    1:14:28 trust
    1:14:28 is
    1:14:28 a
    1:14:29 long
    1:14:29 term
    1:14:29 idea
    1:14:29 because
    1:14:30 it
    1:14:30 comes
    1:14:30 from
    1:14:30 okay
    1:14:31 I’m
    1:14:31 going to
    1:14:31 do
    1:14:32 something
    1:14:32 for
    1:14:32 you
    1:14:32 today
    1:14:33 because
    1:14:33 I
    1:14:33 trust
    1:14:33 that
    1:14:33 down
    1:14:33 the
    1:14:34 line
    1:14:34 you’ll
    1:14:34 do
    1:14:34 something
    1:14:34 for
    1:14:35 me
    1:14:35 that
    1:14:36 doesn’t
    1:14:36 really
    1:14:36 exist
    1:14:36 in
    1:14:47 to
    1:14:47 running
    1:14:47 a
    1:14:48 successful
    1:14:48 revolution
    1:14:49 and
    1:14:49 then
    1:14:49 if
    1:14:49 you
    1:14:49 look
    1:14:49 at
    1:14:50 Haiti
    1:14:50 at
    1:14:50 the
    1:14:51 time
    1:14:53 you
    1:14:53 had
    1:14:53 the
    1:14:54 French
    1:14:54 army
    1:14:54 the
    1:14:55 British
    1:14:55 army
    1:14:57 and
    1:14:57 the
    1:14:57 Spanish
    1:14:58 army
    1:14:58 all
    1:14:58 in
    1:14:58 there
    1:14:59 fighting
    1:14:59 for it
    1:15:00 so
    1:15:00 really
    1:15:01 well
    1:15:01 developed
    1:15:03 the strongest
    1:15:04 militaries
    1:15:04 of the
    1:15:04 era
    1:15:05 all
    1:15:06 in
    1:15:06 that
    1:15:06 region
    1:15:07 all
    1:15:07 very
    1:15:08 interested
    1:15:08 in
    1:15:09 the
    1:15:09 sugar
    1:15:11 which
    1:15:11 was
    1:15:12 quite
    1:15:12 valuable
    1:15:12 at
    1:15:12 the
    1:15:13 time
    1:15:13 so
    1:15:14 how
    1:15:14 in
    1:15:14 the
    1:15:14 world
    1:15:14 would
    1:15:15 you ever
    1:15:15 get out
    1:15:15 of
    1:15:15 that
    1:15:17 and
    1:15:17 it
    1:15:17 turned
    1:15:17 out
    1:15:18 he
    1:15:19 was
    1:15:20 probably
    1:15:20 the
    1:15:21 great
    1:15:21 cultural
    1:15:22 genius
    1:15:22 of
    1:15:22 the
    1:15:22 last
    1:15:23 you know
    1:15:25 maybe
    1:15:25 in
    1:15:25 history
    1:15:26 but
    1:15:26 certainly
    1:15:26 the
    1:15:26 last
    1:15:27 several
    1:15:27 hundred
    1:15:28 years
    1:15:30 and
    1:15:31 he
    1:15:32 was
    1:15:32 able
    1:15:33 because
    1:15:34 he
    1:15:34 was
    1:15:34 a
    1:15:35 person
    1:15:35 who
    1:15:35 although
    1:15:35 he
    1:15:36 was
    1:15:36 born
    1:15:36 a
    1:15:36 slave
    1:15:37 was
    1:15:37 very
    1:15:38 very
    1:15:38 integrated
    1:15:38 into
    1:15:39 European
    1:15:39 culture
    1:15:40 because
    1:15:40 he
    1:15:40 was
    1:15:40 so
    1:15:41 smart
    1:15:42 and
    1:15:43 so
    1:15:44 the
    1:15:44 person
    1:15:44 who
    1:15:45 ran
    1:15:45 the
    1:15:45 plantation
    1:15:47 kind
    1:15:47 of
    1:15:48 took
    1:15:48 him
    1:15:48 to
    1:15:48 all
    1:15:48 the
    1:15:49 diplomatic
    1:15:49 meetings
    1:15:50 around
    1:15:50 and
    1:15:50 so
    1:15:51 forth
    1:15:51 and
    1:15:51 he
    1:15:52 got
    1:15:52 very
    1:15:52 involved
    1:15:53 and
    1:15:53 kind
    1:15:53 of
    1:15:53 mastered
    1:15:55 European
    1:15:55 culture
    1:15:57 so to
    1:15:57 speak
    1:15:57 and
    1:15:57 the
    1:15:57 different
    1:15:58 subtleties
    1:15:58 around
    1:15:59 it
    1:16:00 and
    1:16:00 he
    1:16:01 started
    1:16:01 adopting
    1:16:01 those
    1:16:02 things
    1:16:02 and
    1:16:02 applying
    1:16:03 them
    1:16:03 to
    1:16:03 his
    1:16:04 leadership
    1:16:04 and
    1:16:05 then
    1:16:06 furthermore
    1:16:07 incorporated
    1:16:08 Europeans
    1:16:09 into
    1:16:09 the slave
    1:16:10 army
    1:16:10 so he
    1:16:10 would
    1:16:11 capture
    1:16:12 you know
    1:16:13 he
    1:16:13 would
    1:16:14 defeat
    1:16:14 the
    1:16:14 Spanish
    1:16:15 capture
    1:16:16 some
    1:16:16 guys
    1:16:16 rather
    1:16:17 than
    1:16:17 kill
    1:16:17 them
    1:16:17 he
    1:16:18 incorporated
    1:16:19 the best
    1:16:19 leaders
    1:16:19 into
    1:16:20 his
    1:16:20 army
    1:16:21 and
    1:16:21 he
    1:16:21 built
    1:16:21 this
    1:16:22 very
    1:16:23 advanced
    1:16:23 hybrid
    1:16:24 fighting
    1:16:24 system
    1:16:24 where
    1:16:25 they
    1:16:25 used
    1:16:25 a lot
    1:16:25 of
    1:16:25 the
    1:16:26 guerrilla
    1:16:27 techniques
    1:16:28 that
    1:16:28 he
    1:16:29 had
    1:16:29 brought
    1:16:29 over
    1:16:29 from
    1:16:30 Africa
    1:16:31 and
    1:16:32 then
    1:16:32 he
    1:16:33 had
    1:16:34 combined
    1:16:34 that
    1:16:35 with
    1:16:35 some
    1:16:36 of
    1:16:36 the
    1:16:37 more
    1:16:38 regimented
    1:16:40 discipline
    1:16:41 strategies
    1:16:42 of
    1:16:42 the
    1:16:43 Europeans
    1:16:44 and
    1:16:44 in
    1:16:44 building
    1:16:44 all
    1:16:44 that
    1:16:45 he
    1:16:46 ended
    1:16:46 up
    1:16:46 building
    1:16:46 this
    1:16:47 massive
    1:16:47 army
    1:16:48 and
    1:16:49 defeated
    1:16:49 Napoleon
    1:16:50 and
    1:16:51 everyone
    1:16:51 else
    1:16:51 and
    1:16:52 it
    1:16:52 was
    1:16:52 just
    1:16:53 quite
    1:16:53 a
    1:16:54 remarkable
    1:16:54 story
    1:16:55 about
    1:16:55 how
    1:16:55 he
    1:16:56 figured
    1:16:56 everything
    1:16:57 out
    1:16:57 from
    1:16:57 his
    1:16:58 principles
    1:16:58 and
    1:16:58 in
    1:16:58 a
    1:16:59 way
    1:16:59 that
    1:16:59 was
    1:17:01 very
    1:17:02 much
    1:17:02 like
    1:17:03 Elon
    1:17:03 in
    1:17:04 that
    1:17:04 sense
    1:17:05 one
    1:17:05 of
    1:17:06 the
    1:17:06 things
    1:17:06 I heard
    1:17:06 you talk
    1:17:07 about
    1:17:10 that
    1:17:10 I
    1:17:10 thought
    1:17:11 was
    1:17:11 so
    1:17:11 ingenious
    1:17:12 was
    1:17:12 he
    1:17:12 would
    1:17:12 basically
    1:17:13 use
    1:17:13 song
    1:17:14 and
    1:17:15 sound
    1:17:16 as
    1:17:16 like
    1:17:16 encrypted
    1:17:17 language
    1:17:18 it’s
    1:17:19 really
    1:17:19 yeah
    1:17:19 so
    1:17:20 that
    1:17:20 was
    1:17:21 like
    1:17:21 a
    1:17:21 very
    1:17:21 cool
    1:17:22 thing
    1:17:22 so
    1:17:23 right
    1:17:24 remember
    1:17:24 that
    1:17:24 this
    1:17:24 is
    1:17:24 in
    1:17:25 the
    1:17:25 days
    1:17:25 before
    1:17:26 telephony
    1:17:26 or the
    1:17:27 internet
    1:17:27 or any
    1:17:28 of
    1:17:28 these
    1:17:28 things
    1:17:28 you know
    1:17:28 it’s
    1:17:29 pre
    1:17:29 Alexander
    1:17:30 Graham Bell
    1:17:30 and all
    1:17:30 that
    1:17:30 kind
    1:17:30 of
    1:17:31 thing
    1:17:32 and
    1:17:33 so
    1:17:34 you know
    1:17:34 they were
    1:17:34 literally
    1:17:35 you know
    1:17:35 the Europeans
    1:17:36 were on
    1:17:36 like
    1:17:37 you know
    1:17:37 notes
    1:17:38 carrier pigeons
    1:17:40 guys running
    1:17:41 you know
    1:17:41 back and
    1:17:42 forth
    1:17:42 and so
    1:17:42 forth
    1:17:43 and so
    1:17:43 as a
    1:17:43 result
    1:17:45 you know
    1:17:45 you kind
    1:17:45 of needed
    1:17:46 the army
    1:17:47 together
    1:17:47 in one
    1:17:48 place
    1:17:48 just
    1:17:48 so you
    1:17:49 could
    1:17:49 communicate
    1:17:49 the
    1:17:50 order
    1:17:52 Toussaint
    1:17:53 basically
    1:17:54 you know
    1:17:55 had these
    1:17:57 drummers
    1:17:57 and these
    1:17:58 songs
    1:17:59 which
    1:17:59 he could
    1:18:00 put on
    1:18:00 top of
    1:18:00 like
    1:18:02 the hill
    1:18:02 who could
    1:18:03 be very
    1:18:03 very loud
    1:18:04 and then
    1:18:04 he would
    1:18:04 separate his
    1:18:05 army
    1:18:05 you know
    1:18:06 into like
    1:18:06 six or
    1:18:07 seven groups
    1:18:09 but in
    1:18:10 the song
    1:18:11 would be
    1:18:11 embedded
    1:18:13 the order
    1:18:13 of when
    1:18:14 to attack
    1:18:14 and you
    1:18:15 know
    1:18:15 when to
    1:18:16 retreat
    1:18:16 and all
    1:18:17 these kinds
    1:18:17 of things
    1:18:17 so he
    1:18:18 had this
    1:18:18 like
    1:18:18 super
    1:18:19 advanced
    1:18:20 you know
    1:18:20 wide area
    1:18:21 communication
    1:18:22 system
    1:18:23 that nobody
    1:18:24 else had
    1:18:24 and that
    1:18:24 you know
    1:18:25 that was a
    1:18:26 big advantage
    1:18:27 for him
    1:18:28 yeah that to
    1:18:29 me the reason
    1:18:29 that that
    1:18:30 comes up
    1:18:30 for me
    1:18:30 now is
    1:18:31 we have
    1:18:31 all these
    1:18:32 new
    1:18:32 technologies
    1:18:33 that are
    1:18:33 coming
    1:18:33 online
    1:18:33 and the
    1:18:34 person
    1:18:34 that’s
    1:18:34 going to
    1:18:34 be able
    1:18:34 to get
    1:18:35 outside
    1:18:35 that box
    1:18:37 and see
    1:18:37 something
    1:18:38 new
    1:18:38 and fresh
    1:18:39 is going
    1:18:39 to be able
    1:18:39 to use
    1:18:40 this in
    1:18:40 totally
    1:18:41 different
    1:18:41 ways
    1:18:41 and while
    1:18:42 in the
    1:18:43 final analysis
    1:18:43 I think
    1:18:44 you and I
    1:18:44 see it
    1:18:44 very differently
    1:18:45 in terms
    1:18:46 of AI’s
    1:18:46 ability
    1:18:47 to ultimately
    1:18:48 gobble up
    1:18:49 what humans
    1:18:49 can do
    1:18:50 but right
    1:18:51 now
    1:18:52 AI is this
    1:18:53 incredible
    1:18:54 tool that
    1:18:54 as an
    1:18:54 entrepreneur
    1:18:55 for me
    1:18:56 it has
    1:18:56 been
    1:18:58 ridiculously
    1:18:59 exciting
    1:19:00 to one
    1:19:01 see how
    1:19:01 much farther
    1:19:02 each of my
    1:19:02 employees
    1:19:03 can push
    1:19:03 their own
    1:19:04 abilities
    1:19:04 by using
    1:19:04 AI
    1:19:05 and then
    1:19:05 it does
    1:19:06 not take
    1:19:06 much to
    1:19:07 prognosticate
    1:19:08 out you
    1:19:08 know 12
    1:19:09 18 months
    1:19:10 to understand
    1:19:11 where the
    1:19:11 tools are
    1:19:12 going to be
    1:19:12 and how
    1:19:12 much more
    1:19:13 they’re going
    1:19:13 to let
    1:19:13 you do
    1:19:13 because
    1:19:13 we’re
    1:19:14 largely
    1:19:14 an
    1:19:15 entertainment
    1:19:15 company
    1:19:16 so for
    1:19:17 us to
    1:19:17 look at
    1:19:17 that
    1:19:18 and just
    1:19:19 the
    1:19:20 revolutionary
    1:19:21 changes
    1:19:21 but you
    1:19:22 can’t be
    1:19:22 trapped
    1:19:23 inside the
    1:19:23 old way
    1:19:23 of thinking
    1:19:24 you’ve got
    1:19:25 to like
    1:19:25 you said
    1:19:25 build up
    1:19:26 from first
    1:19:26 principles
    1:19:27 yeah it’s
    1:19:27 a new
    1:19:28 creative
    1:19:28 canvas
    1:19:29 I think
    1:19:29 that’s
    1:19:29 like a
    1:19:30 really great
    1:19:30 way of
    1:19:30 thinking
    1:19:31 about it
    1:19:31 in that
    1:19:33 it’s
    1:19:33 like
    1:19:34 well is
    1:19:34 your
    1:19:35 creativity
    1:19:36 going to
    1:19:37 be used
    1:19:37 on
    1:19:39 you know
    1:19:40 kind of
    1:19:41 frame
    1:19:42 by frame
    1:19:42 editing
    1:19:43 of like
    1:19:44 a video
    1:19:45 or will
    1:19:45 it be
    1:19:46 thinking
    1:19:46 of like
    1:19:47 incredible
    1:19:47 new
    1:19:47 things
    1:19:48 you can
    1:19:48 do
    1:19:48 in a
    1:19:48 video
    1:19:49 ad
    1:19:50 that
    1:19:50 you
    1:19:50 could
    1:19:50 never
    1:19:50 do
    1:19:51 before
    1:19:51 and
    1:19:51 have
    1:19:52 the
    1:19:52 AI
    1:19:52 do
    1:19:52 that
    1:19:52 for
    1:19:53 you
    1:19:53 you know
    1:19:53 like
    1:19:54 and so
    1:19:54 it’s
    1:19:54 a little
    1:19:54 bit
    1:19:55 of a
    1:19:55 readjustment
    1:19:56 of
    1:19:57 where
    1:19:58 you put
    1:19:58 your
    1:19:59 creative
    1:20:00 energy
    1:20:00 into
    1:20:01 and
    1:20:01 the
    1:20:01 things
    1:20:01 that
    1:20:01 are
    1:20:02 possible
    1:20:02 and
    1:20:02 so
    1:20:02 forth
    1:20:03 and
    1:20:03 I
    1:20:03 think
    1:20:03 that’s
    1:20:04 you know
    1:20:04 we’re
    1:20:04 really
    1:20:05 seeing
    1:20:05 that
    1:20:05 across
    1:20:05 the
    1:20:05 board
    1:20:10 is
    1:20:10 this
    1:20:11 going
    1:20:11 to
    1:20:11 mean
    1:20:12 you
    1:20:12 know
    1:20:12 like
    1:20:13 you
    1:20:13 don’t
    1:20:13 have
    1:20:13 human
    1:20:14 investors
    1:20:14 anymore
    1:20:15 and
    1:20:15 it’s
    1:20:15 actually
    1:20:15 been
    1:20:16 like
    1:20:16 totally
    1:20:16 the
    1:20:17 opposite
    1:20:17 like
    1:20:18 instead
    1:20:19 of
    1:20:19 this
    1:20:20 like
    1:20:22 painstakingly
    1:20:23 collecting
    1:20:24 you know
    1:20:24 all
    1:20:24 all
    1:20:25 the
    1:20:25 data
    1:20:25 needed
    1:20:25 to
    1:20:26 put
    1:20:26 the
    1:20:26 investment
    1:20:27 memo
    1:20:27 together
    1:20:28 like
    1:20:28 yeah
    1:20:28 it
    1:20:28 just
    1:20:28 does
    1:20:29 that
    1:20:29 for
    1:20:29 you
    1:20:29 and
    1:20:30 then
    1:20:30 you’re
    1:20:30 just
    1:20:30 thinking
    1:20:30 about
    1:20:31 like
    1:20:31 okay
    1:20:31 what
    1:20:31 are
    1:20:32 the
    1:20:32 like
    1:20:32 the
    1:20:33 really
    1:20:33 compelling
    1:20:33 things
    1:20:34 about
    1:20:34 this
    1:20:35 or
    1:20:35 rather
    1:20:36 than
    1:20:36 you know
    1:20:37 trying
    1:20:37 to
    1:20:37 track
    1:20:38 every
    1:20:39 entrepreneur
    1:20:40 and
    1:20:41 like
    1:20:41 great
    1:20:41 engineer
    1:20:42 in our
    1:20:42 database
    1:20:43 the AI
    1:20:43 is
    1:20:43 just
    1:20:44 tracking
    1:20:44 all
    1:20:44 those
    1:20:44 people
    1:20:45 and
    1:20:45 letting
    1:20:45 you
    1:20:45 know
    1:20:45 hey
    1:20:46 that
    1:20:46 guy
    1:20:46 just
    1:20:46 updated
    1:20:47 his
    1:20:47 LinkedIn
    1:20:47 profile
    1:20:48 or
    1:20:54 like
    1:20:54 a
    1:20:54 much
    1:20:55 more
    1:20:55 kind
    1:20:56 of
    1:20:56 fun
    1:20:56 part
    1:20:56 of
    1:20:56 the
    1:20:57 game
    1:20:58 and
    1:20:59 and
    1:21:00 so
    1:21:00 you know
    1:21:00 look
    1:21:01 I
    1:21:01 would
    1:21:02 say
    1:21:02 the
    1:21:03 best
    1:21:03 predictor
    1:21:04 of
    1:21:04 kind
    1:21:04 of
    1:21:05 how
    1:21:05 things
    1:21:05 are
    1:21:05 going
    1:21:05 to
    1:21:06 go
    1:21:06 are
    1:21:07 more
    1:21:07 like
    1:21:07 what’s
    1:21:08 happening
    1:21:08 now
    1:21:08 than
    1:21:10 like
    1:21:10 the
    1:21:10 most
    1:21:11 dystopian
    1:21:11 view
    1:21:12 of it
    1:21:12 that
    1:21:13 we
    1:21:13 can
    1:21:14 possibly
    1:21:14 think
    1:21:14 of
    1:21:15 which
    1:21:15 I
    1:21:15 think
    1:21:16 is
    1:21:16 where
    1:21:16 a lot
    1:21:16 of
    1:21:16 people
    1:21:16 go
    1:21:17 to
    1:21:17 and
    1:21:17 like
    1:21:17 I
    1:21:17 said
    1:21:17 I
    1:21:17 think
    1:21:17 some
    1:21:18 of
    1:21:18 that’s
    1:21:18 the
    1:21:18 name
    1:21:19 you
    1:21:19 know
    1:21:20 artificial
    1:21:21 intelligence
    1:21:21 we
    1:21:22 hate
    1:21:23 everything
    1:21:23 artificial
    1:21:24 so
    1:21:25 why
    1:21:25 do
    1:21:25 we
    1:21:26 name
    1:21:26 it
    1:21:26 artificial
    1:21:28 that’s
    1:21:28 too
    1:21:28 true
    1:21:29 Ben
    1:21:29 I’ve
    1:21:30 enjoyed
    1:21:30 every
    1:21:30 minute
    1:21:30 of
    1:21:31 this
    1:21:31 where
    1:21:31 can
    1:21:31 people
    1:21:31 keep
    1:21:32 up
    1:21:32 with
    1:21:32 you
    1:21:33 yeah
    1:21:33 well
    1:21:33 I
    1:21:34 am
    1:21:35 B
    1:21:35 Horowitz
    1:21:35 on
    1:21:36 AX
    1:21:37 and
    1:21:38 you
    1:21:39 know
    1:21:40 that’s
    1:21:40 probably
    1:21:40 the
    1:21:40 best
    1:21:41 thing
    1:21:41 we’re
    1:21:42 a16z.com
    1:21:43 and
    1:21:45 hope
    1:21:45 you
    1:21:45 enjoyed
    1:21:45 it
    1:21:46 and
    1:21:46 that
    1:21:46 was
    1:21:47 great
    1:21:47 fun
    1:21:47 good
    1:21:48 fun
    1:21:48 catching
    1:21:48 it
    1:21:49 was
    1:21:49 indeed
    1:21:49 and
    1:21:49 then
    1:21:49 you
    1:21:49 also
    1:21:50 have
    1:21:51 multiple
    1:21:51 books
    1:21:52 that
    1:21:52 people
    1:21:52 can
    1:21:52 read
    1:21:52 that
    1:21:53 are
    1:21:53 extraordinarily
    1:21:54 well
    1:21:55 respected
    1:21:55 in the
    1:21:55 field
    1:21:56 so
    1:21:57 also
    1:21:57 thank
    1:21:57 you
    1:21:57 for
    1:21:57 those
    1:21:58 absolutely
    1:22:00 awesome
    1:22:00 thanks
    1:22:00 all right
    1:22:01 well thank
    1:22:01 you brother
    1:22:02 I appreciate
    1:22:02 it
    1:22:02 all right
    1:22:02 everybody
    1:22:03 if you
    1:22:03 have
    1:22:03 not
    1:22:04 already
    1:22:04 be
    1:22:04 sure
    1:22:04 to
    1:22:04 subscribe
    1:22:05 and
    1:22:05 until
    1:22:05 next
    1:22:05 time
    1:22:06 my
    1:22:06 friends
    1:22:06 be
    1:22:07 legendary
    1:22:07 take
    1:22:07 care
    1:22:08 peace
    1:22:20 we’ve
    1:22:20 got
    1:22:20 more
    1:22:20 great
    1:22:21 conversations
    1:22:21 coming
    1:22:21 your
    1:22:21 way
    1:22:22 see
    1:22:22 you
    1:22:22 next
    1:22:22 time

    This week on the a16z Podcast, we’re sharing a feed drop from Impact Theory with Tom Bilyeu, featuring a wide-ranging conversation with a16z cofounder Ben Horowitz.

    Artificial intelligence isn’t just a tool — it’s a tectonic shift. In this episode, Ben joins Tom to break down what AI really is (and isn’t), where it’s taking us, and why it matters. They dive into the historical parallels, the looming policy battles, and how innovation cycles have always created — not destroyed — opportunity.

    From the future of work and education to the global AI race and the role of blockchain in preserving trust, Ben shares hard-won insights from decades at the forefront of technological disruption. It’s a masterclass in long-term thinking for anyone building, investing, or navigating what’s coming next.

    Resources: 

    Listen to more episodes of Impact Theory with Tom Bilyeu: https://link.chtbl.com/impacttheory

    Watch full conversations on YouTube: youtube.com/tombilyeu
    Follow Tom on Instagram: @tombilyeu

    Learn more about Impact Theory: impacttheory.com

    Timecodes: 

    00:00 Introduction to Impact Theory with Ben Horowitz

    01:12 The Disruptive Power of AI

    02:01 Understanding AI and Its Implications

    04:19 The Future of Jobs in an AI-Driven World

    06:52 Human Intelligence vs. Artificial Intelligence

    10:31 The Role of AI in Society

    21:41 AI and the Future of Work

    35:07 The AI Race: US vs. China

    41:25 The Importance of Blockchain in an AI World

    44:26 Government Regulation and Blockchain

    45:16 The Need for Stablecoins

    45:45 Energy Challenges and AI

    49:53 Market Structure Bill and Token Regulation

    53:51 Blockchain’s Trust and Adoption

    01:04:17 Elon Musk’s Government Involvement

    01:12:03 Historical Figures and Modern Parallels

    01:18:41 AI and Creativity in Business

    01:21:29 Conclusion and Final Thoughts

    Stay Updated: 

    Let us know what you think: https://ratethispodcast.com/a16z

    Find a16z on Twitter: https://twitter.com/a16z

    Find a16z on LinkedIn: https://www.linkedin.com/company/a16z

    Subscribe on your favorite podcast app: https://a16z.simplecast.com/

    Follow our host: https://x.com/eriktorenberg

    Please note that the content here is for informational purposes only; should NOT be taken as legal, business, tax, or investment advice or be used to evaluate any investment or security; and is not directed at any investors or potential investors in any a16z fund. a16z and its affiliates may maintain investments in the companies discussed. For more details please see a16z.com/disclosures

  • Stablecoins & the Future Financial System

    AI transcript
    0:00:03 – Crypto can help decentralize the power structures
    0:00:05 that are emerging in AI.
    0:00:08 – Chris always talks about, do you wanna be the indie band
    0:00:11 or do you wanna play like the Super Bowl or the mega stadium?
    0:00:14 And I think like stable coins really have the ability
    0:00:16 to appeal to like a much broader audience.
    0:00:18 – Stable coins are beginning to really gain traction.
    0:00:21 So there’s something like $16 trillion in volume
    0:00:23 on stable coins per year.
    0:00:25 – I actually think it’s a great time
    0:00:28 for folks to be building token networks.
    0:00:32 – Crypto is like a fundamentally radical set of technologies
    0:00:34 that is very, very hard for incumbent players
    0:00:36 to adopt and run with,
    0:00:38 precisely because it is so fundamentally disruptive
    0:00:40 to the way that they do things.
    0:00:43 – What’s actually working in crypto right now?
    0:00:46 For a long time, the space has been defined by bold visions
    0:00:48 and has elicited strong skepticism.
    0:00:50 So today we’re getting clear on what’s real,
    0:00:53 what’s being used at scale and what’s coming next.
    0:00:57 Joining me are two of my fellow general partners here at A16Z,
    0:01:00 Ali Yahya, who leads investment across crypto infrastructure
    0:01:01 and developer tools.
    0:01:04 And Ariana Simpson, who focuses on early stage crypto networks
    0:01:07 and founders building at the edge.
    0:01:09 We get into why stable coins may finally be crypto’s
    0:01:13 breakout product, how AI agents are creating new demand
    0:01:16 for crypto rails, and what’s changing in policy that could unlock
    0:01:18 the next wave of token networks.
    0:01:22 We also talk about the enduring vision for decentralized social,
    0:01:24 the evolving smart contract landscape,
    0:01:27 and why Ethereum is still widely misunderstood.
    0:01:29 Let’s get into it.
    0:01:35 As a reminder, the content here is for informational purposes only,
    0:01:38 should not be taken as legal business, tax, or investment advice,
    0:01:41 or be used to evaluate any investment or security,
    0:01:45 and is not directed at any investors or potential investors in any A16Z fund.
    0:01:50 Please note that A16Z and its affiliates may also maintain investments in the companies discussed in this podcast.
    0:01:57 For more details, including a link to our investments, please see A16Z.com forward slash disclosures.
    0:02:06 So I’m excited to do a deep dive with you on where we’re at today in this space.
    0:02:11 So crypto is a space where people have long been excited about the vision and the potential,
    0:02:17 and people have long also been skeptical about where the use case is, what’s happening, what’s actually working.
    0:02:20 So here we are in May 2025.
    0:02:24 Why don’t you give some context on what’s actually worked so far, what’s working right now?
    0:02:30 It’s quite interesting because if you go back all the way to 2009 when the original Bitcoin white paper was published,
    0:02:36 one of the first few lines of the paper describes Bitcoin as a peer-to-peer electronic payment system,
    0:02:36 one of the first few lines of the paper describes Bitcoin as a peer-to-peer electronic payment system.
    0:02:39 which was kind of the original vision behind what a blockchain could do.
    0:02:48 And it’s really taken us like 15, 16 years to get to a point where the technology is mature enough to actually make that a reality.
    0:02:50 And this is now manifesting with stable coins.
    0:02:57 So some of the big issues that Bitcoin had that made it impossible for Bitcoin to become that peer-to-peer electronic payment system
    0:03:02 is that one, it was extremely inefficient and very slow, and it still is.
    0:03:06 And therefore, it’s become more of a store of value type of system as opposed to stable coins.
    0:03:09 And two, Bitcoin is not a stable unit of account.
    0:03:11 And so it’s very hard to use it for payments.
    0:03:16 And so since then, one of the big things that has happened is that the infrastructure has matured tremendously
    0:03:21 to the point at which now we are at a level where a transaction of any amount of money
    0:03:25 can be done for less than a penny in cost.
    0:03:27 And it can be done in under a second, roughly.
    0:03:32 Like those numbers are approximate, which finally makes something like a peer-to-peer transaction
    0:03:35 of a few dollars viable on the blockchain.
    0:03:41 And that combined with the regulatory clarity that we’re having now as of the new administration
    0:03:44 makes stable coins something that are really beginning to happen.
    0:03:48 So that’s maybe like the biggest thing that’s going on in the crypto world at the moment
    0:03:51 is that stable coins are beginning to really gain traction.
    0:03:55 So there’s something like $16 trillion in volume on stable coins per year.
    0:04:01 And there are many traditional financial institutions that are beginning to use stable coins
    0:04:05 to rip out a lot of the back end of their financial systems.
    0:04:06 And these are like fintech companies.
    0:04:07 Think Stripe.
    0:04:08 Think Revolut.
    0:04:08 Think Robinhood.
    0:04:13 Some of the companies in the traditional financial system that rely heavily on the trad financial system
    0:04:16 are now realizing that stable coins are a much better way to do things.
    0:04:19 So that’s kind of like the biggest thing that’s going on.
    0:04:23 And we believe that will likely lead to this cascading trend of adoption.
    0:04:29 Because once stable coins become a more kind of mainstay of the way that the financial system works,
    0:04:35 that opens the door for a lot of the other more advanced and futuristic ideas that crypto has introduced,
    0:04:38 like DeFi, to begin to also gain adoption.
    0:04:42 And I think that as a result will lead to all of the other things that we believe crypto can offer
    0:04:44 to really start to ramp up.
    0:04:47 One interesting thing, though, is that stable coins,
    0:04:51 at least us who are in the industry full time, have been thinking about for like years and years.
    0:04:54 Because I remember talking about them in 2017, 2018.
    0:04:58 And I think there was always a narrative about them being useful for remittances
    0:05:01 or in countries that have had hyperinflation.
    0:05:05 And for those countries, Bitcoin is a better store of value than their native currencies
    0:05:07 because sometimes it goes up, unlike those which only go down.
    0:05:12 But it’s not ideal because, again, as Ali mentioned, it’s not a stable unit of account.
    0:05:17 And so it’s interesting to see that even though this has been talked about for years,
    0:05:18 now it’s really having its moment.
    0:05:22 And I think, to Ali’s point, a big part of why that’s happening is because the infrastructure
    0:05:26 has evolved to a point where you can now efficiently move money
    0:05:31 and you’re not having to spend a huge amount of money to move the money, among other things.
    0:05:34 So I think that’s why we’re starting to really see it shine now.
    0:05:38 I would also add that it’s intersecting in interesting ways.
    0:05:42 We’re still super early in this, but there’s obviously a lot of talk about AI and agents.
    0:05:46 And if you want to dispatch your agent to go transact on your behalf,
    0:05:50 you can’t really give them your bank account or your credit card,
    0:05:52 but instead you can give them your crypto wallet.
    0:05:56 And so this interplay of agents buying or spending money on behalf of their users
    0:06:00 with stable coins is a really interesting theme that we’re starting to explore.
    0:06:04 Actually, to that point, it’s kind of ridiculous to think about the way that
    0:06:11 the financial system works today, where even a normal local and domestic financial transaction
    0:06:14 where you go to a coffee shop and you buy a coffee with a credit card,
    0:06:20 that transaction involves like the point of sale, the payment processor, the issuing bank,
    0:06:22 the acquiring bank, the credit card network.
    0:06:26 And each of these intermediaries takes some cut, some fee on the transaction
    0:06:29 to add up to something like multiple percentage points on the transaction.
    0:06:32 And that is the case in a domestic transaction.
    0:06:34 And if the transaction happens to be international,
    0:06:38 then that entire stack of participants and intermediaries gets duplicated
    0:06:42 and mirrored on the other side to the point that any kind of financial transaction
    0:06:45 across borders is insane in terms of its inefficiency.
    0:06:49 It can take up to like three to seven days to move money from one country to another,
    0:06:52 and it can cost like up to 10% of the transaction to do it.
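The fee stacking described here can be sketched with a toy calculation. All of the fee rates below are hypothetical, purely for illustration; real intermediary fees vary widely by market and card network.

```python
# Toy illustration of fee stacking across payment intermediaries.
# All fee rates below are hypothetical, for illustration only.
def total_fee_rate(fee_rates):
    """Cumulative fraction of a payment lost when each
    intermediary takes its cut of what passes through."""
    remaining = 1.0
    for rate in fee_rates:
        remaining *= (1.0 - rate)
    return 1.0 - remaining

# A domestic card payment: processor, issuing bank,
# acquiring bank, card network (hypothetical rates).
domestic = [0.010, 0.008, 0.005, 0.004]
# Cross-border: the intermediary stack is mirrored on the other side.
international = domestic + domestic

print(round(total_fee_rate(domestic) * 100, 2))       # 2.67 (percent)
print(round(total_fee_rate(international) * 100, 2))  # 5.28 (percent)
```

Even small per-hop fees compound into the "multiple percentage points" mentioned above, and doubling the stack for a cross-border payment roughly doubles the loss.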
    0:06:56 So when you have a technology that can now, again, move an arbitrary amount of money
    0:07:01 from anywhere in the world to any other place in the world for under a penny and in under a second,
    0:07:03 that truly is very transformative.
    0:07:06 And it’ll be very disruptive to the way that the financial system works.
    0:07:13 So to Ariana’s point about AI agents, it’s kind of inconceivable that an AI agent that a human may want
    0:07:17 to dispatch into the financial system would have to go through all of that inefficiency
    0:07:22 and deal with all of these arcane human intermediaries, some of which are not even really automated.
    0:07:23 That’s inconceivable.
    0:07:31 And the only real way to bring online millions or potentially billions of AI agents into the financial system
    0:07:38 is through a technology that’s fully based on software and as efficient as like the crypto rails that are now available
    0:07:39 and now can be used.
    0:07:44 Yeah. So say more about some of the use cases that stablecoins are currently enabling.
    0:07:47 Is it mostly on an institutional level? Is it on a consumer level?
    0:07:48 Or what are the common interactions people are having?
    0:07:51 I think it’s both. It depends what markets you’re talking about.
    0:07:56 There’s a company in our current accelerator batch called Zarpay, which is operating in Pakistan.
    0:08:01 And they’re basically creating a network of small, you know, the little shops if you’ve been to Africa
    0:08:06 or wherever they have these little sort of mobile kiosks where you can put money on your phone and that sort of stuff.
    0:08:15 And so they’re basically using that network in order to create a way for people to come in and deposit their local currency and get stablecoins.
    0:08:20 And then they’re building a whole suite of like financial services around this as the atomic unit.
    0:08:26 And I think a lot of countries that have unstable currencies or other financial issues for which holding dollars
    0:08:33 or the equivalent of dollars in stablecoins is very appealing immediately understand the value of this and are very attracted to using it.
    0:08:38 So I think it goes from that all the way through to banks and financial institutions.
    0:08:44 I think in many cases there’s been an interest in crypto and some of the banks and financial institutions have wanted to get involved,
    0:08:49 but it’s been very unclear how they could do it, largely as a result of the lack of regulatory clarity,
    0:08:52 but also because crypto can be a little scary or whatever.
    0:08:56 And so it hasn’t always been obvious for them to see a path.
    0:08:57 How do we get involved?
    0:08:59 What’s the way that we can bring this to our consumers?
    0:09:03 And so stablecoins, I think, are kind of like a baby step in,
    0:09:08 in the sense that it’s a lot more clear what the value proposition is.
    0:09:11 It’s a non-speculative use case.
    0:09:15 And so I think it’s just a good entry point for some of these larger institutions.
    0:09:16 Yeah.
    0:09:22 Help us understand better the stablecoin landscape around like what big companies or types of companies have emerged or will emerge as a result of it,
    0:09:25 or how it impacts the crypto startup ecosystem more broadly.
    0:09:30 So right now at the center of all of the action are the stablecoin issuers.
    0:09:38 So we’ve got two of the major ones right now are USDC, which is created by this consortium between Coinbase and Circle.
    0:09:39 And then there’s Tether.
    0:09:42 And so both of these are like the kind of the biggest two issuers of stablecoins today.
    0:09:47 Then, of course, both of these stablecoins operate on top of blockchains.
    0:09:53 So another important piece of the stack is the infrastructure on top of which these stablecoins operate.
    0:10:02 And then you have this kind of collection of companies at the periphery that are generally just companies that help connect the crypto world to the external world.
    0:10:04 And that would include wallets.
    0:10:08 It would include some of the fintech companies that are using blockchain technology as the back end,
    0:10:15 but have a front end that looks more like a Web2 type of front end and doesn’t expose the crypto aspects to the end user as much.
    0:10:18 And all of those players will be a part of the story as well.
    0:10:24 So one of the things that we talk a ton about is what this stack will look like end to end
    0:10:25 as the space evolves.
    0:10:38 And one of the exciting things that we are hoping will happen soon is that we will get legislation that sets the rules of the road for stablecoins and for what is required for an issuer to create a stablecoin,
    0:10:41 what kind of collateral is needed for the stablecoin to be compliant.
    0:10:46 And what that likely will do is if that works, and we strongly believe it will likely happen this year,
    0:10:56 it will to some extent commoditize the issuance layer because it’ll be easier for new issuers to emerge and create their own stablecoins that are also USD denominated
    0:11:02 to the point that those new stablecoins are somewhat fungible and interchangeable with USDC and with Tether.
    0:11:10 Because if all of them are compliant, then you can trust that all of them are likely to be ultimately redeemable for a dollar and they’re equally trustworthy.
    0:11:16 Which means that then the issuers may no longer be the ones that capture all the value the way that they do now.
    0:11:19 And instead, a lot of the value might be captured by some of the other layers.
    0:11:24 Like, for example, the infrastructure is likely to capture a lot of the value because a lot of the activity,
    0:11:29 a lot of these sort of transactions, stablecoin transactions that are happening,
    0:11:35 happen on blockchains like Solana, Ethereum, Sui, a number of other kind of important layer one blockchains.
    0:11:40 And all of those require payment of gas for all of those transactions.
    0:11:44 So those blockchains are likely to be important players in the way that this unfolds.
    0:11:45 That’s one end of the stack.
    0:11:50 And then the other end will be kind of the endpoint, the interface that connects this whole crypto world to the end users.
    0:11:53 So wallets will be likely important.
    0:11:59 One of our portfolio companies, Phantom, will likely be well positioned as a gateway or an interface for people to
    0:12:04 interact with stablecoins and get exposure to US dollars, regardless of where they may be.
    0:12:07 So that’s maybe a bit of a layout for what the ecosystem looks like at the moment.
    0:12:14 Yeah. And it seems like for years there’s been this question of, hey, what’s going to make it so that there’s hundreds of millions of users?
    0:14:17 Or a billion users; I’m not sure what the number is at the moment across all of crypto.
    0:14:19 People have asked before, what’s the iPhone moment?
    0:12:21 What’s the product that everyone’s going to be using?
    0:12:22 That’s also a platform for everything.
    0:12:24 Is it stablecoins or is it something else?
    0:12:24 Or how do we think about that?
    0:12:28 I think the odds are good that stablecoins are that thing.
    0:12:31 I also don’t think that necessarily there needs to be one thing.
    0:12:33 I think we mentioned AI.
    0:12:35 Ali has made some investments in that category.
    0:12:37 We’ve done some as a team.
    0:12:40 I think there’s going to be different waves that bring in different users.
    0:12:44 A while ago, Web3 Games was a big entry point.
    0:12:46 Now it’s AI and stablecoins.
    0:12:48 So I think the users do come in waves.
    0:12:53 I think there’s a lot of it that sort of tracks the cycles that we see every couple of years in crypto.
    0:12:58 Chris always talks about, do you want to be the indie band or do you want to play like the Super Bowl or the mega stadium?
    0:13:06 And I think like stablecoins really have the ability to appeal to like a much broader audience because, like we said, it’s just a use case that makes sense.
    0:13:09 It’s pretty clear what the value proposition is.
    0:13:11 And so it appeals to a broader audience.
    0:13:11 Yeah.
    0:13:12 Yeah.
    0:13:15 And in part also because it addresses a very real pain point.
    0:13:24 Whether it be for people in third world countries that want exposure to the dollar because their local currency may not be as reliable.
    0:13:29 Or people who want to move money between borders, and we talked about how that can be extremely inefficient.
    0:13:34 Or even companies that want to move money across borders, they still have to deal with all the inefficiency.
    0:13:45 Apparently, companies like SpaceX are already using stablecoins for treasury management to move money from one country to another in a way that’s much more efficient than using the traditional financial rails.
    0:13:47 Yeah, I believe they were using Bridge, which Stripe now acquired.
    0:13:48 Yeah, I mean, it’s interesting.
    0:13:52 Stripe Sessions, their big conference was like all about stablecoins.
    0:13:55 And so many of the talk tracks last week were about that.
    0:14:00 And so I think it’s really indicative of the fact that this is permeating not just crypto companies, but out more broadly.
    0:14:10 And I think, obviously, the other necessary element in addition to the infrastructure improvements and all of that is just the fact that now we have a more friendly regulatory regime, which is interested in seeing these kinds of things flourish.
    0:14:10 Yeah.
    0:14:12 Just one meta point.
    0:14:20 It’s one thing I’ve always appreciated about crypto investing is you guys, as domain experts, don’t just need to understand the technology, which is complex enough in itself.
    0:14:29 But you also need to understand the policy regime, law, monetary policy, economics, foreign policy, and how all these things are intersecting with crypto startups.
    0:14:29 Yeah.
    0:14:36 I mean, I’m certainly not the domain expert on some of the policy stuff, but we’ve really assembled a super strong team who’s been very involved in D.C.
    0:14:38 and trying to push the ball forward for the whole industry.
    0:14:39 Yeah.
    0:14:42 You mentioned Stripe getting deeply involved in crypto.
    0:14:46 It’s interesting because people often contrast it with AI and say, hey, AI is mostly sustaining innovation.
    0:14:55 And that, of course, there’s massive companies that have been formed, but a lot of the gains have gone to the biggest companies, whereas crypto has mostly been a startup story, though some bigger companies are getting involved, too.
    0:14:58 It’s funny, like maybe Facebook was just a few years too early.
    0:15:01 If they launched Libra in a more friendly regime, might that have worked?
    0:15:06 How do you think about the startup versus incumbent distinction in this space?
    0:15:20 Yeah, crypto is like a fundamentally radical set of technologies that is very, very hard for incumbent players to adopt and run with, precisely because it is so fundamentally disruptive to the way that they do things.
    0:15:22 I was at Google a while back.
    0:15:26 I was at Google X working on a robotics project, but I was already very interested in crypto.
    0:15:33 And Google X is supposed to be like the moonshot factory and it’s supposed to be super innovative and open to new ideas, open to start new companies.
    0:15:34 But they don’t want new ideas.
    0:15:36 I tried to get Google X to touch crypto.
    0:15:37 I tried at Facebook, by the way.
    0:15:38 Same thing.
    0:15:45 And it’s like Google would not touch crypto with a thousand-foot pole unless it was something very, very vanilla, like we will run some node or whatever.
    0:15:47 Were they like, this makes no sense or were they like, it’s evil?
    0:15:49 I think, yeah, they kind of fundamentally didn’t get it.
    0:15:51 They were afraid about the optics.
    0:15:56 They were afraid about the regulatory association with it, the reputational consequences.
    0:16:08 Also, like the whole Web3 vision, the vision of decentralizing web services, which is, I think, kind of the most futuristic vision for crypto, is fundamentally disruptive to the way that these companies work.
    0:16:15 These are centralized companies that make money and have power by virtue of being so centralized.
    0:16:37 And if you build something like a social network that is fully decentralized and has no core central company, like no monopolistic tech giant that’s worth $44 billion that controls what recommendation algorithm is used and who gets to follow whom and all of the data and the social graph itself, then that company no longer has a business model.
    0:16:45 It’s a very, very different business model to build a social network that’s decentralized in the way that a company like Farcaster currently is.
    0:16:55 And so for a company like, say, Facebook or Google in its own way to decentralize itself and to truly embrace crypto with arms wide open, it would have to cannibalize its own business model.
    0:16:58 And I think that’s actually becoming true for AI as well.
    0:17:07 Like, I think that it was very, very true that AI was a very sustaining innovation before, but it’s gotten so powerful that there are many elements of it that are disruptive.
    0:17:11 If Google wanted to fully embrace AI,
    0:17:15 it would have to replace search with an LLM instead of its current model.
    0:17:21 And of course, that’s a hard thing for it to do, given the kind of the insanely profitable business model that they currently have.
    0:17:23 Yeah, that’s fascinating.
    0:17:27 Is that still the vision that we’ll have decentralized social networks, decentralized marketplace?
    0:17:28 Or where are we on that vision?
    0:17:36 And what are the bottlenecks to that vision being realized of networks at scale that are truly decentralized and competing with some of the centralized ones?
    0:17:39 Is it technological or is it that people just don’t really care about this in the same way?
    0:17:40 Or like, why hasn’t it happened yet?
    0:17:44 I think it’s mostly a consumer preference issue.
    0:17:48 I do think like now some of the products have gotten really good.
    0:17:51 Like Farcaster, for example, the product experience is very good.
    0:17:58 But it’s just challenging to get people to switch because the reason you’re on the social network is for the graph.
    0:18:01 And so it’s difficult to export an entire graph.
    0:18:06 I think like users are accustomed to being the product.
    0:18:08 If you’re not paying for the product, you are the product.
    0:18:11 And I think in many cases, consumers are used to that experience.
    0:18:14 And, you know, ads are annoying, but they’re not necessarily that bad.
    0:18:18 And so people just accept it and don’t think too much about it.
    0:18:25 And by the way, this is interesting because if you look at like all of the big social networks, none of them have been started in the last decade.
    0:18:27 And that’s not true just of crypto.
    0:18:28 It’s true in general.
    0:18:37 And so it’s just very difficult, I think, to get over this hurdle of reaching a critical mass whereby people actually say, oh, I’m in the network and I’m going to stay in the network.
    0:18:39 So it’s not just a crypto thing.
    0:18:40 I think it’s just difficult nowadays.
    0:18:42 People only have so much attention.
    0:18:47 And with the networks that there are, most of the attention span has already been captured.
    0:18:54 So I think we may need to see some of the existing ones falter before there’s like enough room for some of the new ones to really take hold.
    0:18:55 We’ll see.
    0:19:00 Yeah, we used to believe that these ideas and these companies would be the first to gain adoption.
    0:19:07 And that was largely because all of the financial use cases, like the DeFi use cases, even like the stablecoin use cases were illegal.
    0:19:08 Yeah.
    0:19:10 This was the case in the previous administration.
    0:19:17 And so it felt to us like the more kind of innocuous seeming social network use cases, gaming use cases would more likely gain adoption.
    0:19:25 And then that will be the gateway for other things to eventually become legitimate and get acceptance from a regulatory standpoint.
    0:19:31 But now the regulatory landscape has shifted so much that it’s a much friendlier landscape.
    0:19:39 And as a result, we have all these traditional financial institutions getting involved and stablecoins are really having a moment combined with the infrastructure clicking into place.
    0:19:44 It’s now much more clear that the more financial use cases are likely to happen first.
    0:19:47 Those will act as a legitimizing force for the rest of the space.
    0:19:50 And then the consumer use cases, which we still believe in, will take longer.
    0:19:53 And I think as Ariana is saying, it’s very, very hard to get those things right.
    0:20:00 Like the bar that a consumer has on the quality of a consumer facing application is extremely high.
    0:20:09 And crypto has not yet figured out all of the UX challenges; when it comes to seamlessness and usability, it’s still a nascent technology
    0:20:10 on that front.
    0:20:13 So it’ll take longer for all of those things to get resolved.
    0:20:22 But in the meantime, we have all these other financial use cases, which I think will solidify the technology, will legitimize the space for a broader group of people, will get more entrepreneurs to come into the space.
    0:20:22 Right.
    0:20:35 Yeah, I think on the point of the attention span or lack thereof of consumers, it’s interesting when you see a new network created around an area that doesn’t already have somebody in the non-Web3 world occupying it.
    0:20:41 So, for example, I think a good example of this is Blackbird, which is kind of a network for restaurant lovers.
    0:20:45 And you can think about it as Amex points for restaurants.
    0:20:54 And they’re occupying a space that nobody really owns right now; the credit card companies sort of do, but it’s still a so-so experience at best.
    0:21:15 And when you have a great entrepreneur who is really deep in restaurant tech, like Ben Leventhal, who’s the founder, tackling a problem like that, bringing a consumer Web 2 experience, but using Web 3 as the ownership, therefore allowing the restaurants and the consumers to actually have ownership in this network, which wouldn’t be possible in a Web 2 context.
    0:21:19 Then it’s pretty interesting, because that’s ownership you couldn’t really give any other way.
    0:21:25 And if you look at some of the platforms like Uber Eats or DoorDash, the restaurants have to work with them because their margins are so slim.
    0:21:28 And so they need as much volume as they can, basically.
    0:21:35 But it’s not great because the platforms are, in many cases, quite extractive and dig deeper into the margins of the restaurants.
    0:21:48 And so if you are using stablecoin payments to bring down transaction costs and you’re also giving restaurants actual ownership in the network, therefore helping their bottom line in that way, too, it’s really interesting.
    0:22:00 Yeah, in order for it to match the consumer expectations that we’re talking about, it has to be a new interaction model or new value that’s unlocked that goes straight to the consumer as opposed to something that’s like abstract, like the same product, but just decentralized.
    0:22:05 Farcaster, with Frames and some of its other experiments, has built net new things that only it, and not a Web 2 product, could do.
    0:22:07 And Blackbird is another example of that.
    0:22:08 Yeah, exactly.
    0:22:12 Ali, let’s go a bit deeper on AI and the intersection between AI and crypto.
    0:22:13 What’s working there?
    0:22:14 Where are you most excited?
    0:22:24 Peter Thiel actually had this tongue-in-cheek line back in 2018, which I think rings true, which is that AI is communist and crypto is libertarian.
    0:22:30 I think the meta point is that these two technologies are very different from one another.
    0:22:33 And in many ways, they’re actually counterweights for each other.
    0:22:37 So there are many ways in which they are intersecting and we can talk through a few of them.
    0:22:55 I think one of the most important ways is that AI is creating a kind of overabundance of media and of human-looking entities, agents that can pretend to be human, or deep fakes of video or audio that seems very human.
    0:23:00 And it’s hard to know whether you’re looking at something that’s real or something that’s purely generated.
    0:23:07 And crypto happens to be a really good technology to help authenticate media or help authenticate data in general.
    0:23:15 One of the ways in which these two worlds will collide is that there are crypto projects that are working on, among many other things, proof of humanity,
    0:23:20 which would allow someone, anyone, a user on the internet to prove that they actually are human,
    0:23:26 so that anyone on the other end can know that they’re interacting with a human and not with an AI bot or an AI agent.
    0:23:32 WorldCoin is one of these companies, is one of our portfolio companies, and they built an orb that uses biometric information
    0:23:35 and also uses zero-knowledge proofs to keep all of the biometric data private.
    0:23:44 In fact, the data itself never leaves the orb, and only a code or a cryptographic object that is derived from the biometric data ever leaves the orb.
    0:23:50 And from that cryptographic object, it is not possible to infer anything about the biometric data itself.
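The one-way derivation described here can be illustrated with a toy sketch. This is not Worldcoin's actual scheme, which uses iris codes and zero-knowledge proofs; it just shows the general idea of a derived code that reveals nothing about the underlying data.

```python
# Toy sketch of a one-way "derived code": only a digest derived from
# the raw input ever leaves the device, never the input itself.
# NOT Worldcoin's actual protocol; the names here are illustrative.
import hashlib

def derive_code(biometric_bytes: bytes, salt: bytes) -> str:
    # SHA-256 is preimage-resistant: the biometric bytes cannot
    # feasibly be recovered from the output code.
    return hashlib.sha256(salt + biometric_bytes).hexdigest()

scan = b"example-biometric-template"   # hypothetical raw data
code = derive_code(scan, salt=b"device-local-salt")
print(code)  # only this derived code would ever leave the device
```

The real system layers zero-knowledge proofs on top, so a user can prove they hold a valid code without even revealing the code itself.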
    0:23:54 It’s a technology that allows anyone to prove their humanity on the internet.
    0:23:58 There was that famous line in the 90s that, on the internet, nobody knows you’re a dog.
    0:24:03 And that is very, very true now in 2025, where, like, on the internet, nobody knows you’re a human.
    0:24:04 Like, you could be anything.
    0:24:05 You could be…
    0:24:06 A dog or an ape.
    0:24:07 A dog or anything else, yeah.
    0:24:17 So that’s one way I think that cryptography blockchains will help deal with the immensity and abundance of signal and noise that AI will generate.
    0:24:24 Another big one is that crypto can help decentralize the power structures that are emerging in AI.
    0:24:31 At the moment, it seems like there will be a small number of very, very powerful players in the AI world.
    0:24:35 Even though it’s unclear as to whether there are things like network effects that drive defensibility,
    0:24:41 there are just a handful of really powerful players in the space, at least at the kind of the model layer,
    0:24:45 like OpenAI and the other kind of big companies that build foundation models.
    0:24:50 Crypto offers an alternative for creating AI systems that’s more decentralized.
    0:24:58 So an example of this is a company called Gensyn, which is also in our portfolio and builds a kind of marketplace for compute.
    0:25:04 So someone on one side of the marketplace can provide their idle GPU capacity to the network.
    0:25:11 And then someone on the other side who might want to use the GPU compute for training a model or for doing inference on a model
    0:25:14 can, through the network, make use of all of that compute.
    0:25:21 And then the network manages all of these heterogeneous computational resources to create something that feels like a unified cloud
    0:25:26 on which you can run all of these machine learning AI workloads in a way that’s fully decentralized,
    0:25:28 in a way that is not controlled by a single company.
    0:25:37 And in a way that could actually be more efficient than a cloud by virtue of using capacity that otherwise would just go idle and unused.
    0:25:44 It’s just capacity that’s locked away in all of these like pockets that are far removed and not in one particular data center.
    0:25:50 There are many hard technical challenges to get there, but there are a lot of smart people working toward figuring that out.
    0:25:52 And we’re very optimistic that’ll also happen.
    0:25:58 It’ll also allow for machine learning workloads to run in a way that is also verifiable.
    0:26:05 So this way you don’t have to trust a centralized company, say, Facebook or like one of the other social media companies,
    0:26:14 that the machine learning model or the AI model that they’re running, say, the recommendation algorithm, is unbiased or has particular properties.
    0:26:21 You can actually, with cryptography, verify that those things are the case and that these things are executed in a way that’s correct.
    0:26:23 You can use some of these decentralized systems for that as well.
    0:26:33 And then maybe the final one, which I think is the most futuristic and the most challenging, is having crypto help AI figure out the new business models for the internet.
    0:26:54 So one of the issues that AI will create with the current business models of the internet is that right now, the way that the internet works is that you have an aggregator, like a search engine, driving traffic to creators of media, like say someone who has written a blog post or someone who has like a page that has content.
    0:26:59 And there’s ads as like the business model that mediates that whole interaction.
    0:27:06 And that entire business model goes away if you just have an AI that just gives you the answer that you’re looking for.
    0:27:16 So instead of doing search on Google, getting exposed to a bunch of ads and then clicking through to a website and having all of those parties be happy because a business model includes all of them.
    0:27:25 Now, instead of that, you just interact with an LLM, you get the answer immediately and you never click through to the final page and you never get exposed to an ad.
    0:27:27 That kind of completely changes the way that the internet works.
    0:27:31 And we’re going to need new business models for the internet if that’s the case.
    0:27:46 So one idea: there are a lot of research efforts to try to figure out attribution in the training of a machine learning model, for what pieces of data contributed to a particular output.
    0:27:59 So you’re asking an LLM a question, you kind of want to know what pieces of data that were used to train that LLM contributed to the answer that the LLM ultimately gives you when you ask it a question.
    0:28:06 And if you could know that, then you could come up with a business model that rewards the people who contributed that data originally.
    0:28:07 And crypto could be part of that.
    0:28:09 So there are open problems on both sides.
    0:28:14 Like you have to figure out this attribution challenge in the AI world and there are people working on that problem.
    0:28:16 And then there’s a challenge on the crypto side.
    0:28:26 Like how do you build a network that can, using that information, compensate all of the parties involved in having the AI actually give you what you ultimately want?
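The compensation side of this idea can be sketched very simply. The attribution scores below are entirely hypothetical; as noted here, actually computing them is the open research problem.

```python
# Toy sketch of attribution-based compensation: given hypothetical
# attribution scores for which training documents influenced an
# answer, split a payment for that answer proportionally.
# Computing real attribution scores is an open research problem.
def split_payment(amount, attribution_scores):
    total = sum(attribution_scores.values())
    return {doc: amount * score / total
            for doc, score in attribution_scores.items()}

scores = {"blog_post_a": 0.6, "docs_page_b": 0.3, "forum_c": 0.1}
payouts = split_payment(1.00, scores)   # $1.00 paid for one answer
print(payouts)  # blog_post_a receives ~$0.60, and so on
```

In the vision described above, a crypto network would handle the payout leg, since micropayments this small are impractical on traditional rails.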
    0:28:27 Fascinating.
    0:28:30 Are the big labs interested in crypto?
    0:28:32 Do they need to be interested in crypto for this to happen?
    0:28:33 Or is this largely coming from startups?
    0:28:36 It’s funny, Sam Altman, of course, OpenAI, but also WorldCoin.
    0:28:37 So it has some familiarity.
    0:28:38 But what can you say about this?
    0:28:40 For the most part, I don’t think so.
    0:28:43 I think for the most part, the AI labs, they’re just running with AI.
    0:28:46 And there’s so much that’s exciting in that world that I think crypto doesn’t really factor in at all.
    0:28:55 But there are crypto companies that are very interested in AI and are thinking about the ways in which crypto will ultimately make a difference in that world.
    0:29:00 So like this company I mentioned, Gensyn, actually the founders are very, very deep in AI.
    0:29:08 They have an AI background, but they also happen to have a deep commitment to building things as open source networks that are ultimately decentralized.
    0:29:12 And so they are of the few people who really do straddle both worlds.
    0:29:18 Zooming out a little bit, what are some of the biggest misconceptions you think people have about the space right now?
    0:29:30 Well, I mean, I think for the last few years, it’s been really challenging to launch a token network in the United States in particular, because there was a lack of clear legislation.
    0:29:40 And then there were very aggressive folks in several agencies working on basically not allowing entrepreneurs to launch networks.
    0:29:46 And that applied to entrepreneurs who were very well-meaning and very much wanted to do things by the books.
    0:29:58 And so I think one of the challenges was that people obviously didn’t want to end up in legal trouble and therefore, in many cases, pulled back their plans on that front, which really impeded their progress on the product side as well.
    0:30:06 So they couldn’t really build their vision, because I think tokens are part and parcel of what’s valuable and interesting about crypto.
    0:30:08 And so if you remove that piece, it doesn’t make any sense.
    0:30:13 So perhaps the misconception is that it’s still that way, when in fact the situation is very different now.
    0:30:16 We have a much friendlier administration in place.
    0:30:21 We have a very different situation in terms of the leadership of these agencies now.
    0:30:25 And so I actually think it’s a great time for folks to be building token networks.
    0:30:30 And I think that message hasn’t necessarily fully made it out there.
    0:30:38 So I’m hopeful that more entrepreneurs realize that the situation is, again, very different from what it was just a few months ago and start to come back in force.
    0:30:40 I completely agree with that.
    0:30:53 I think another big one is outside of like our immediate circles, like outside of the world of tech, it’s shocking to me that people continue to think of crypto as just like a thing that’s supposed to be money only.
    0:30:56 Or they think of blockchain as a kind of ledger for money.
    0:31:04 And I think that that misconception comes from Bitcoin, from Bitcoin trying to be money and only money and not really trying to be anything else.
    0:31:18 Part of that misconception is the idea that Ethereum is like Bitcoin, that Ethereum is just the silver to Bitcoin’s gold, and that all crypto really is, is just another kind of attempt at doing what Bitcoin did.
    0:31:24 The fact that Ethereum actually is a fundamentally different thing than Bitcoin is still not widely understood.
    0:31:34 The fact that Ethereum is actually a kind of computer where you can build all sorts of different applications, where the software that runs on top of the computer has unique properties
    0:31:40 that no other software has ever had, is, I think, not widely understood.
    0:31:47 And these are programs that, like the programs that run on a blockchain like Ethereum are programs that have a life of their own.
    0:31:54 They are programs that can make commitments that no one has to trust anyone to believe in.
    0:32:00 It’s a program that is essentially free from interference from anyone, including the people who originally wrote the program.
    0:32:03 And so that’s a very unique property that no other kind of software has.
    0:32:17 It’s a kind of technology that inverts the power relationship between the software and the hardware, whereas historically, the hardware has always had power over the software because whomever controls the hardware can turn off the software or change it in some way.
    0:32:29 Whereas in crypto, with blockchains, the hardware is commoditized. The people who run it, like the miners, for example, or validators in the blockchain context, don’t have any power over the software that runs on top.
    0:32:31 And that’s what makes a blockchain unique.
    0:32:35 And that’s what makes it capable of doing so much more than just money.
    0:32:37 You can kind of build far more sophisticated primitives.
    0:32:45 So stable coins are the first thing, but the things that come after, things like DeFi, where you can build much more sophisticated financial primitives on chain,
    0:32:54 or some of these other more futuristic ideas where you can do AI, you can do DePIN, you can do some of these consumer-facing applications like decentralized social networks.
    0:32:58 All of that relies on the properties of a blockchain computer.
    0:32:59 That’s not just a ledger.
    0:33:02 It’s a full on computer on which you can build applications.
    0:33:05 That I think is not something that most people really get.
    0:33:06 Yeah.
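    The "programs with a life of their own" idea above can be sketched without any real blockchain. What follows is a toy Python illustration, not actual smart-contract code: the `VestingContract` name and all the numbers are invented for the example. It shows the commitment property being described, where once a program is deployed, its rules evaluate the same way for everyone, and not even the author can rewrite them.

    ```python
    from dataclasses import dataclass, FrozenInstanceError

    @dataclass(frozen=True)
    class VestingContract:
        """Toy stand-in for an on-chain program: once 'deployed',
        its rules are fixed -- not even the author can alter them."""
        beneficiary: str
        unlock_time: int  # e.g. a block height or timestamp
        amount: int

        def claimable(self, now: int) -> int:
            # The rule is evaluated the same way by every node, every time.
            return self.amount if now >= self.unlock_time else 0

    # "Deploy" the contract with fixed terms.
    contract = VestingContract(beneficiary="alice", unlock_time=100, amount=50)

    print(contract.claimable(now=99))   # 0: too early, the rule holds
    print(contract.claimable(now=100))  # 50: condition met

    # Even the original author cannot rewrite the terms after deployment.
    try:
        contract.amount = 1_000_000
    except FrozenInstanceError:
        print("immutable")
    ```

    On a real chain the immutability is enforced by consensus among validators rather than by a language feature, which is the "inversion" point made above: no single hardware operator can change or stop the program.
    
    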
    0:33:15 Maybe gearing towards closing here, Ali, can you give a bit of an update on kind of the smart contract platform wars as an outsider or someone who’s paid attention at certain times and not at certain times?
    0:33:21 What I’ve heard or gleaned is Bitcoin, as you mentioned, has tried to be money, but there’s a little bit of a nascent Bitcoin builder movement.
    0:33:24 I’m not sure if that’s led to something particularly meaningful in the space.
    0:33:38 And then my understanding is that Ethereum has tried to optimize across multiple dimensions, both trying to be money, but also trying to be the base layer for sort of decentralized internet and committed to decentralization in a way that some people think is at the sacrifice of usability.
    0:33:43 And whereas Solana has not had the same commitments to decentralization, has really optimized for usability.
    0:33:45 One, is that a fair characterization?
    0:33:46 How would you edit the characterization?
    0:33:49 And two, how has this all played out, or where are we right now on that level?
    0:33:50 That’s actually a really good characterization.
    0:34:01 The way that I would break things down is that there is a very large and multidimensional trade-off space, and it’s very hard for any one system to cover the entire space.
    0:34:09 So it makes sense that you’d end up with different systems specialized for different things, and then as a result, having different use cases and different value propositions.
    0:34:14 So Bitcoin, I think, has been extremely successful at becoming like a kind of digital gold.
    0:34:26 It’s been extremely volatile, but I think that there is this belief, there’s this memetic value, that Bitcoin is long-term a pretty good store of value that will be around for a very, very long time.
    0:34:27 It’s not going anywhere.
    0:34:34 And will have properties that are desirable that are not provided by other things like fiat or gold itself or anything.
    0:34:37 It’s funny, it’s only been around for less than 20 years, but in my head, I treat it as gold.
    0:34:38 It’s going to be there forever.
    0:34:40 Exactly, exactly.
    0:34:48 So it’s really succeeded at that, and I think some of the things that have helped it succeed are the fact that it is so hard to change and the fact that it is so simple and you can’t do that much with it.
    0:34:54 Those things are disadvantages in some contexts, but they’re real advantages when trying to solve for that particular thing.
    0:35:00 Then there are like all of the other smart contract platforms that are trying to do much more and are trying to be computers.
    0:35:08 And Ethereum lands in some part of the trade-off space here where they do really optimize for decentralization, and they are fully decentralized.
    0:35:15 And so it’s hard for Ethereum to change quickly because there are a lot of stakeholders and a lot of people who want to be able to influence its direction.
    0:35:25 And so the choices that it has made have made it a pretty good platform for some of the higher stakes like DeFi applications or for the issuance, for example, issuance of new assets on Ethereum.
    0:35:32 That might be the default simply because it’s been around the longest and its high degree of decentralization makes it very suitable for that.
    0:35:36 And then there are blockchains like, say, Solana and Sui, which are extremely high performance.
    0:35:42 They are very well suited for transactions and for payments and for things that do require that level of performance.
    0:35:47 If you wanted to build something like the Nasdaq exchange on-chain, there’s no way you’re doing that on Ethereum L1.
    0:35:56 You probably need a blockchain that has the kind of the level of performance that a Solana or a Sui or some of the other kind of more modern or more recent blockchains have.
    0:36:01 So I think I expect that each of these ecosystems will likely find their niche.
    0:36:09 It’s obviously very uncertain and there’s all this talk about how maybe Solana will eat Ethereum’s lunch and that’s a possibility.
    0:36:11 But it’s still wide open is basically what you’re saying.
    0:36:14 Yeah, it’s wide open and there’s like a lot of ways in which it could play out.
    0:36:20 Closing out, Ariana, I want to double click on your point about the misconception in terms of how the policy regime has changed.
    0:36:25 I mean, I think if you look at the Novi Libra, you know, it had seven different names.
    0:36:36 Whichever one you want to use, that was something that could have been incredibly interesting because you have Facebook now Meta with such an enormous distribution network already has all the users.
    0:36:41 Integrating payments into that via crypto made all the sense in the world.
    0:36:46 But obviously, they were told in no uncertain terms that that was not something that they could proceed with.
    0:36:49 And then, unfortunately, the whole project died.
    0:36:55 I will say it went on to flourish in other forms because we’re investors in Mysten and Sui.
    0:36:55 Yeah, they spun out of there.
    0:36:58 So there have been actually a number of great teams who came from there.
    0:37:03 So I think the diaspora of talent has continued to fight the good fight and build.
    0:37:09 But in general, that’s another project that I think, as it was initially conceived, had to die on the vine because of that.
    0:37:16 So I think as investors, it’s not necessarily our job to envision, like, what is possible, but rather to recognize it when we see it.
    0:37:24 And so I’m personally very excited to see what entrepreneurs come up with in the next couple of years now that we have a new opportunity space.
    0:37:25 That’s a perfect place to wrap.
    0:37:27 Ali, Ariana, thanks so much for coming to the podcast.
    0:37:28 Thanks, Eric.
    0:37:29 Appreciate it.
    0:37:33 Thanks for listening to the A16Z podcast.
    0:37:39 If you enjoyed the episode, let us know by leaving a review at ratethispodcast.com slash A16Z.
    0:37:41 We’ve got more great conversations coming your way.
    0:37:43 See you next time.

    a16z Crypto General Partners Ali Yahya, Arianna Simpson, and Erik Torenberg break down what’s actually working in crypto today – starting with the rise of stablecoins as a real-world payments layer. They discuss how stablecoins are being adopted by companies like Stripe and SpaceX, why regulatory shifts are opening new doors for crypto startups, and how AI and crypto are beginning to intersect.

    They also cover:

    • The future of decentralized social networks
    • Where Ethereum, Solana, and others stand today
    • Misconceptions still holding the space back

    A grounded conversation on what’s real, what’s hype, and where crypto’s finally finding traction.

    Timecodes:

    00:00 Introduction to Crypto and AI

    00:16 The Rise of Stablecoins

    00:40 Current State of Crypto

    02:02 Deep Dive into Stablecoins

    07:39 Institutional and Consumer Adoption

    22:09 The Future of Crypto and AI

    29:13 Misconceptions and Policy Changes

    33:06 Smart Contract Platforms

    36:14 Closing Thoughts

    Resources: 

    Find Ali on X: https://x.com/alive_eth

    Find Arianna on X: https://x.com/AriannaSimpson

    Stay Updated: 

    Let us know what you think: https://ratethispodcast.com/a16z

    Find a16z on Twitter: https://twitter.com/a16z

    Find a16z on LinkedIn: https://www.linkedin.com/company/a16z

    Subscribe on your favorite podcast app: https://a16z.simplecast.com/

    Follow our host: https://x.com/eriktorenberg

    Please note that the content here is for informational purposes only; should NOT be taken as legal, business, tax, or investment advice or be used to evaluate any investment or security; and is not directed at any investors or potential investors in any a16z fund. a16z and its affiliates may maintain investments in the companies discussed. For more details please see a16z.com/disclosures

  • How Andreessen Horowitz Disrupted VC & What’s Coming Next

    AI transcript
    0:00:03 We met with one firm, Ben, you might remember one of the partners said,
    0:00:05 venture capital is like being at the sushi boat restaurant.
    0:00:07 Thousand startups come through and you just meet with them.
    0:00:11 And then every once in a while, you kind of reach down and you just pluck a startup out of the sushi boat and you invest in it.
    0:00:13 And I was like, oh my God.
    0:00:16 By the way, the sushi there is typically not great.
    0:00:20 What does it take to build a venture firm from scratch?
    0:00:23 And how should it evolve as the world changes around it?
    0:00:28 I recently sat down with Mark and Ben for a wide ranging conversation on the origins of A16Z,
    0:00:30 the evolution of the venture capital industry,
    0:00:34 and the structural choices that have shaped the firm from platform to federation and beyond.
    0:00:41 We talk about everything from how A16Z got started in 2009 to how we think about platform, media, and governance
    0:00:44 and why venture is more barbell shaped than ever.
    0:00:47 As it happens, this conversation took place during my first week.
    0:00:51 So it was the perfect moment to reflect on where the firm has been and where it’s headed.
    0:00:54 This episode originally aired on the Ben and Mark show,
    0:00:57 which you can follow for more candid conversations from inside the firm.
    0:00:59 Let’s get into it.
    0:01:06 The content here is for informational purposes only,
    0:01:10 should not be taken as legal, business, tax, or investment advice,
    0:01:13 or be used to evaluate any investment or security,
    0:01:18 and is not directed at any investor or potential investors in any A16Z fund.
    0:01:24 Please note that A16Z and its affiliates may maintain investments in the companies discussed in this podcast.
    0:01:30 For more details, including a link to our investments, please see A16Z.com slash disclosures.
    0:01:35 Hey everybody, welcome to another episode of the Ben and Mark show.
    0:01:37 I’m Eric Torenberg.
    0:01:41 I’m Andreessen Horowitz’s newest general partner, and this is my first week.
    0:01:47 Lots of exciting things planned for the future of the firm, but we thought that this would be a great opportunity to talk about a bit of the history,
    0:01:52 about the conversations that led you guys to start Andreessen Horowitz.
    0:01:55 When did you know that, hey, you guys had to do this?
    0:02:01 I think it was a conversation on AOL Instant Messenger, if I recall it correctly.
    0:02:08 We had sold Opsware to HP and had been out of it for a little while.
    0:02:12 We had started doing some angel investing and that kind of stuff.
    0:02:15 We had an angel fund called Horowitz Andreessen.
    0:02:17 That’s not even a joke, too.
    0:02:24 So we were doing that, and we were talking about what might each one of us do next.
    0:02:36 And as I recall it, and Mark might recall it differently, he said, you know, venture capital is so underwhelming in that it’s a great product for investors, for LPs,
    0:02:42 but it’s a kind of very mediocre product for entrepreneurs because, you know, you get almost nothing.
    0:02:51 You get like some money and then a smart person, but, you know, that smart person, they see you once a quarter, they don’t know much about what you’re doing.
    0:02:54 So their value kind of diminishes to zero in about three or four months.
    0:02:58 And we thought, you know, it’s so hard to build a company.
    0:03:00 Somebody ought to be able to do something better than that.
    0:03:09 And I said to Mark, I was like, we could start a firm and we could call it Ben Mark, you know, which was a pun on benchmark.
    0:03:12 I don’t think I ever got that until just now.
    0:03:12 Yeah.
    0:03:15 So that was the beginning of the conversation.
    0:03:18 And I was surprised because Mark was into the idea.
    0:03:21 I think he had the idea separately.
    0:03:23 So it was just one of those things.
    0:03:27 I’m just like absolutely amazed and flabbergasted that venture capital even exists.
    0:03:29 My first 22 years of life, I had no idea.
    0:03:30 I had never heard about it growing up.
    0:03:32 I never heard about it even in college at Illinois.
    0:03:36 And I came to Silicon Valley and my first business partner, Jim Clark, was like, yeah, we start a company.
    0:03:38 We go raise money from these venture capitalists.
    0:03:39 And I was like, what’s that?
    0:03:44 And it just like completely blew my mind that there were these people.
    0:03:47 They were literally scouting for basically crazy startup entrepreneurs to start these things.
    0:03:49 And they would give you money when you had nothing.
    0:03:51 And I was like, wow, that’s amazing.
    0:03:53 And so one is just the fact that the field exists is amazing.
    0:03:56 VC in its modern form started in the 1960s.
    0:04:01 And I just look back at the history of these guys, Don Valentine and Tom Perkins and Pitch Johnson and Bill Draper, Arthur Rock.
    0:04:09 These guys for me are like legends because the fact that they could go out and source a Bob Noyce, you know, to start Intel or Steve Jobs to start Apple or these things is just amazing.
    0:04:11 I had quite good experiences.
    0:04:17 Ben mentioned the issues with the field, but Ben and I had the chance to work with two VCs at the very top of their game when their firms were on top of the world.
    0:04:21 And that was John Doerr when he was at Kleiner Perkins in the 90s and was kind of the top VC.
    0:04:25 And then later, Andy Rockleff, who was a founding partner of Benchmark, when they were kind of on top of the world.
    0:04:30 And, you know, in general, we got a lot of value from those guys and considered them partners and had very good relationships.
    0:04:32 And they helped us build, I think, good businesses.
    0:04:35 But basically, over the years, what happened was we kind of learned from those experiences.
    0:04:41 But also what happened over the years was Ben and I had become active angel investors and sort of advisors and mentors to a new generation of founders.
    0:04:44 And this is through the 2000s.
    0:04:49 And just so people understand the setting for this, after the dot-com crash in 2000, there had been an angel and venture boom in the late 90s.
    0:04:55 And then after the dot-com crash in 2000, like almost all angel investing went away and a large amount of VC went away.
    0:04:58 And, you know, it was like a full-on depression for early stage tech.
    0:05:02 So about like 2004, the crash sort of unspooled over five years.
    0:05:08 And by 2004, when Ben and I kind of ramped up our angel investing, I think, I don’t know, maybe the whole industry was down to like a half dozen angel investors or something.
    0:05:09 I mean, it was just a tiny.
    0:05:09 Yeah, it was small.
    0:05:10 We knew all of them.
    0:05:13 Angel investors had so much power that there was this scandal.
    0:05:17 Angel Gate, remember, of like angels meeting together and fixing prices.
    0:05:18 The Angel Gate, yeah.
    0:05:19 Well, we all talked to each other.
    0:05:21 So there was something to that.
    0:05:25 So then basically tech started to take off again in the late 2000s.
    0:05:31 And then TechCrunch at the time, Michael Arrington had built up a TechCrunch to be kind of the main online news source for startups and venture investing.
    0:05:38 And Michael, one of his like most remarkable things was he was at some dinner and he somehow cracked the code that there was a back room at the dinner where like all of the, actually Ben and I were not there.
    0:05:44 But like most of the prominent angel investors of that era were basically sitting around this round table and Michael Arrington kind of walks in and he’s a journalist.
    0:05:47 And at least he described it as like these massively guilty looks on everybody’s faces.
    0:05:49 Ben and I weren’t there, so I don’t know what happened.
    0:05:52 But the accusation was that they were colluding, right?
    0:05:57 They were sort of, you know, all teaming up to try to see if they could keep valuations low, which is a no-no in business.
    0:06:02 But a significant thing about that maybe is that that actually meant that there were enough angels to actually fill a table, right?
    0:06:05 Because that was like when there were like a dozen of them as opposed to just a half dozen.
    0:06:09 Yeah, anyway, so Ben and I just started working with what turned out to be dozens of founders.
    0:06:12 And, you know, we ended up being very involved in because we had raised venture before and run companies.
    0:06:15 We ended up helping a lot of startups in that era raise venture.
    0:06:19 And so we helped them meet the venture firms and understand how to negotiate the deals.
    0:06:23 And then the other thing that happened was is we would get called in when they would get sideways with their VCs.
    0:06:25 It’s happened kind of a lot.
    0:06:25 A lot.
    0:06:30 So this became, it turns out, this is one of the main things people were calling us on was like, all right, I’m in some big fight with my VC.
    0:06:34 He wants the money back or this or that, or he freaked out at the board meeting and what’s going on.
    0:06:37 And I hear rumors this firm is shutting down and he’s going to screw us on my stock.
    0:06:43 Or, by the way, or the VCs would call us up and they’d be like, this founder is nuts and could you please talk to him and try to get him to like do the right thing.
    0:06:48 So we ended up in this sort of, I don’t know, coach, judge, arbitrator mode helping patch these things up.
    0:06:57 And I think, Ben, because you and I were angel investing at that point, a big part of it was, well, hell, like if we’re going to end up doing that anyway, if we just showed up with the checkbook, we could short circuit the process.
    0:06:58 Yeah, yeah, yeah, yeah, right.
    0:07:00 We wouldn’t have to fix all these problems.
    0:07:00 I mean, yeah.
    0:07:14 And that was like a little bit of it as well, which is there were just very few people at that point in venture capital world who had built any kind of company that had gone to any kind of complexity or was worth anything.
    0:07:17 It was just a different kind of background to people.
    0:07:24 So the ability to relate to founders, like really relate to founders, was a little bit missing.
    0:07:28 Okay, so you decided to start Andreessen Horowitz in 2009.
    0:07:30 You come out with a $300 million fund.
    0:07:34 Talk about what the strategy was going into it or how you were planning to differentiate from the outside in.
    0:07:36 You know, you guys were very loud.
    0:07:37 You had this platform approach.
    0:07:40 Talk about behind the scenes, how you thought about differentiating.
    0:07:46 By the way, like the big VCs at that time seemed just overwhelmingly invincible.
    0:07:50 They were giant, like long-lasting businesses.
    0:07:54 I mean, the whole industry had been around for 50 years.
    0:08:04 Some of these guys had invested in like every good, I mean, if you look at some of the things Sequoia had invested in along the way, it’s like quite a spectacular set of companies.
    0:08:08 And so we’re trying to figure out how to challenge the status quo.
    0:08:17 And one of the ideas we had was we would do like a lot of angel investments in addition to venture investments, which was unheard of at the time.
    0:08:20 And we would start out in this complementary way.
    0:08:29 And then like eventually we’d build enough reputation where we could start doing kind of the Series A’s ourselves was how we pitched it.
    0:08:31 And we actually took it around.
    0:08:39 We visit a lot of our VC friends and many of them said it was a really dumb ass idea and we should definitely not pursue it.
    0:08:42 And it’s been tried before and it didn’t work and so forth.
    0:08:53 And then the other idea that we had was what I alluded to earlier, which is, gee, like what if the venture capital firm and we got a lot of this from our friend Michael Ovitz from CAA.
    0:08:58 So what if the firm wasn’t just a collection of partners?
    0:08:59 What if it was more than that?
    0:09:05 What if rather than paying the partners a lot of money or in our case, we didn’t pay ourselves any money?
    0:09:07 What if we took all that money?
    0:09:08 Because it was a lot of money.
    0:09:11 You know, we’re like even on a $300 million fund, it was a really a lot of money.
    0:09:33 What if we took that money and we built a platform and the purpose of that platform was to give an entrepreneur basically the confidence and power of a big time CEO like a Bob Iger or a Jamie Dimon or somebody like that who could literally pick up the phone and call anybody like at any time.
    0:09:37 Why should I be CEO of this little company, even though I’ve never managed anything myself?
    0:09:39 All I did was invent the product.
    0:09:48 Well, because like Bob Iger, I can call anybody from the president of the United States to the CEO of FedEx or whatever, and I can get them on the phone.
    0:09:51 And so we’re like, what if we built that capability for our companies?
    0:09:56 And people said, I think that criticism was it’s been tried.
    0:09:57 It’ll never work.
    0:09:58 You guys are stupid.
    0:10:01 That was like the polite version of it.
    0:10:05 You know, those two things were really the original idea that we pitched.
    0:10:09 In fact, if you look at our original deck, that’s how we pitched LPs.
    0:10:12 And the first fund works, right?
    0:10:16 I believe you put $50 million into Skype and there’s a big markup there in that acquisition.
    0:10:17 There’s Instagram.
    0:10:18 It’s actually $65 million.
    0:10:24 But $15 million of it was generously given to us by Silver Lake for participating in the deal, yeah.
    0:10:26 But we invested $65 million, yeah.
    0:10:30 Yeah, so Instagram, Tiny Speck, which turned into Slack.
    0:10:31 So the first fund is a winner.
    0:10:32 Yeah, Okta.
    0:10:33 Okta.
    0:10:37 Did you know, fund one, what your long-term vision was?
    0:10:41 Would you say, hey, we’re going to start 300 and then we’re going to scale and be one of the biggest firms in venture?
    0:10:43 Or what were you thinking was the future at the time?
    0:11:01 The thing that Mark and I knew from our experience in starting companies was it is just as hard to start a small boutique thing that means nothing in the world and build it as it is to build the world-dominating monster.
    0:11:05 Like, it’s no more amount of work to do the latter.
    0:11:08 So we were never interested in anything but the latter.
    0:11:16 We had zero interest at all in building a little venture capital firm that was like a beta to the big boys.
    0:11:18 We were never wanting to do that.
    0:11:24 We always thought, like, this was the start of doing something much bigger and much more important.
    0:11:32 And, you know, how we got there, we certainly didn’t have all mapped out from the beginning, but the ambition was always there.
    0:11:36 Yeah, and a lot of this comes from the fact that we had been running companies.
    0:11:43 And so if you’re running a company, like it’s a product company and it’s in, like, full competitive battle with other companies and you’re going through the wars that companies go through,
    0:11:54 like, you naturally think in terms of strategy, ambition, industry structure, the economics of the business, the competitive position, evolution over time, marketing strategy, unique selling proposition, differentiation.
    0:11:57 It’s all these things that any business operator thinks about.
    0:12:03 And actually, to their massive credit, a lot of the original VCs who built the firms originally back in the 50s, 60s, 70s were actually operators, right?
    0:12:05 So Tom Perkins and Gene Kleiner had been operators.
    0:12:07 Don Valentine, Pierre Lamont had been operators.
    0:12:09 The founders of Greylock had been operators.
    0:12:13 And so it was very natural for the sort of first generation to think in those terms.
    0:12:17 So by the time we entered the field, their successors, for the most part, had not run businesses.
    0:12:19 They had sort of grown up as professional investors.
    0:12:21 And they were, you know, inheriting businesses that other people had built.
    0:12:25 And so there was just this sort of fundamental difference in mindset.
    0:12:27 And by the way, the formula is working very well for them.
    0:12:30 They were very happy to, you know, the cliche goes kind of sit on Sand Hill Road and the deals.
    0:12:35 I mean, I’ve got countless stories in this, but we met with one firm, Ben, you might remember, a very prominent firm.
    0:12:38 And one of the partners said, it’s like, oh, venture capital is like so much fun as a business.
    0:12:42 He’s like, venture capital is like being at the sushi boat restaurant, right?
    0:12:44 And so these are like the sushi restaurants where there’s like this sort of track.
    0:12:49 By the way, the sushi there is typically not great.
    0:12:50 Yeah.
    0:12:53 That was the first thing that jumped out at me is, yeah, that’s not where the good sushi is.
    0:12:55 I wasn’t sure if we went to the same sushi restaurants.
    0:12:59 But, you know, you say, if you haven’t met a sushi boat restaurant, you sit there and like literally these little sushi boats,
    0:13:02 these little trays go by on the conveyor belt and you just pick up pieces of sushi.
    0:13:04 And he said, that’s just what it’s like.
    0:13:07 And you just sit here on Santa Road and a thousand startups come through and you just meet with them.
    0:13:08 And every once in a while, he did it with his hand.
    0:13:12 You kind of reach down and you just pluck a startup out of the sushi boat and you invest in it.
    0:13:15 And I was like, oh, my God.
    0:13:18 You know, like, you know, complacency, right?
    0:13:19 Like entitlement.
    0:13:23 Immediately, it was just, you know, the hair on the back of my neck went up and I was like, all right.
    0:13:24 So, you know, basically a soft target.
    0:13:25 Yeah.
    0:13:30 We were just so oriented, you know, with a startup, all you do is work and you’re focused on the work.
    0:13:35 And if you aren’t doing enough work, you think of other work that you could do that might improve things.
    0:13:41 So it was such a foreign idea that you would be oriented around doing no work.
    0:13:47 Like literally sitting in a sushi boat restaurant and having a great life and playing golf or whatever they did.
    0:13:51 And so that was inspiring that like, okay, we can do it better.
    0:13:54 And by the way, like building a company was so hard.
    0:13:58 And so any additional help would be so appreciated.
    0:13:59 Yeah.
    0:14:00 It was how we always thought about it.
    0:14:01 Yeah.
    0:14:03 By the way, one more story.
    0:14:04 So around the same time, we met with another VC.
    0:14:06 And like I said, we’d been running companies.
    0:14:07 We’d been dealing with investors.
    0:14:09 And then we’d been running public companies.
    0:14:10 Ben was the CEO of a public company.
    0:14:14 And so when you’re running a public company, you’ve got investors, you’ve got your investor relations team.
    0:14:16 But like you’ve got hedge funds invested in your company.
    0:14:18 And like when you meet with them, you don’t even know if they’re long or short your stock.
    0:14:20 Often short.
    0:14:21 Often short.
    0:14:25 So like half the time, you’re giving them like ammunition that they’re going to use to try to go out and smear you and drive your stock price down.
    0:14:29 So it’s just like absolute bedlam when you’re running a company with shareholders.
    0:14:32 And so in venture, it’s different, or at least we thought it was different.
    0:14:34 Because your investors and the funds are called limited partners.
    0:14:38 And these are these institutions like endowments and foundations and sovereigns and so forth that invest.
    0:14:40 And then they invest on these lockups.
    0:14:42 They invest on like these 10 or 15-year lockups.
    0:14:45 So they put the capital in and they really sit patiently and let you do your thing.
    0:14:51 So we were just like, wow, dealing with an investor who is locked in with your company for 15 years sounds like the best thing in the world.
    0:14:52 Like this sounds amazing.
    0:14:53 These are like super smart people.
    0:14:57 Who were running these endowments, people like David Swensen and others and Anne Martin and all these people.
    0:14:58 And so we were like, wow, this is going to be great.
    0:15:01 And then we met with a very famous, prominent, longtime VC.
    0:15:05 He said, boys, he said, the part of the job you’re going to hate the most is dealing with the LPs.
    0:15:07 Because he said, these people are like not smart.
    0:15:08 They’re not, you know, whatever.
    0:15:12 And he’s like, the way that you do it is you treat your LPs like they’re mushrooms.
    0:15:14 You put them in a cardboard box.
    0:15:18 You put the lid on the cardboard box and you put the box under the bed and you don’t take it out for two years.
    0:15:28 He literally said that, which was like, it was such an insane thing to hear because like we treated hedge funds that were shorting us better than that.
    0:15:43 But then, you know, like when we went out, this was actually the thing that kind of made me know that we had made a good choice by starting the firm was when we went out to visit the potential LPs, you know, and we’re pitching them and so forth.
    0:15:46 Now, look, they had very interesting things to say.
    0:15:48 They knew a lot about the industry.
    0:15:51 And they knew a lot about investing in general.
    0:15:56 And David Swensen, who Mark mentioned, wrote the definitive book on, like, how endowments invest and so on.
    0:16:08 And then when they invested, they were so interested in us, which, you know, it’s just a nice thing in life when anybody takes any interest in you.
    0:16:14 And our LPs did 30 or 35 reference calls on both me and Mark.
    0:16:16 Every single one of them did.
    0:16:18 And they learned a lot about us.
    0:16:20 I mean, they really got deep on it.
    0:16:26 And funny, actually, one of the funny things about the firm is I think we’re the only firm in Silicon Valley who has this.
    0:16:35 We have a two-person key man thing. So normally, as long as one person is intact, there’s no vote on whether the fund continues.
    0:16:41 But in fund one, both of us had to be there because they’re like, you guys are both flawed.
    0:16:43 But when you’re together, the flaws go away.
    0:16:45 Like they had gotten that deep on us.
    0:16:46 And so it was cool.
    0:16:48 Say more about that.
    0:16:50 How do you guys complement each other?
    0:16:51 Say more about what they were getting at.
    0:16:54 So like there was Netscape and LoudCloud.
    0:16:58 And I think that with Netscape, Mark started that company.
    0:17:01 He’s 22 years old or 21 to 22 years old.
    0:17:02 And so he’s like literally a kid.
    0:17:05 So he had some things in his reputation.
    0:17:07 For one, he was like actually a little kid.
    0:17:08 Like you grow up.
    0:17:11 The shit that I couldn’t do when I was 22, I can do now and so forth.
    0:17:12 So there was some of that.
    0:17:14 And then there was some of the same thing on me.
    0:17:17 Like, you know, LoudCloud got into absolutely horrible trouble.
    0:17:20 We burned through a stupid amount of cash and this and that and the other.
    0:17:23 And so there were definitely negative things.
    0:17:26 But it was interesting because both companies had very good outcomes.
    0:17:34 And so I think how the legend went was somehow, you know, between us, we could figure it out.
    0:17:36 Now, I don’t know if the criticisms were right.
    0:17:37 Maybe they were.
    0:17:47 And I don’t know if the solution was correct, but it was just kind of a fun thing that they had got so deep into our backgrounds that they would, like, insist that that be in the LPA.
    0:17:54 Talk about how you guys have made it work or divided, you know, divided and conquered or just your working style or whatever you could share about that.
    0:18:02 We’re co-founders and we work very, very closely together on the strategy and the direction of the firm.
    0:18:06 But, like, the CEO position is a chain of command position.
    0:18:10 And, you know, that’s me.
    0:18:13 I’m the CEO of the firm in that sense.
    0:18:19 You know, Mark doesn’t try to override, like, you know, these kinds of chain of command decisions.
    0:18:22 It’s not, you know, his thing.
    0:18:25 And then, you know, like, Mark, of course, does things that I can’t do.
    0:18:27 He’s just like a much bigger celebrity.
    0:18:37 He’s, you know, kind of, I always say, like, a little bit of a magic trick. People in the firm call him Mark GPT because he knows everything about everything.
    0:18:39 So, like, he’s got very unique things that he does.
    0:18:44 Mark initially recruited me and then said, hey, Ben, Eric, you guys figure out the details.
    0:18:45 Yeah.
    0:18:46 So, that’s a good example.
    0:18:52 So, you know, Mark had kind of been on this thing that, look, the world has moved.
    0:19:02 And the way we kind of market the firm and think about the media hasn’t changed nearly as much as the world has moved.
    0:19:05 And so, we need to bring somebody in.
    0:19:08 And, you know, I was like, do you have someone in mind?
    0:19:10 And he had you in mind.
    0:19:12 And so, I was like, okay, good.
    0:19:15 So, I listened to, you know, and I had been on the show, so I knew who you were and whatnot.
    0:19:19 But I went back and listened to a lot of the turpentine stuff and so forth.
    0:19:22 And I was like, yep, that seems like a good idea.
    0:19:27 And then it was on kind of me as in my kind of CEO job to put the thing together.
    0:19:34 And, Mark, maybe just give us a couple minutes on how the world had changed.
    0:19:36 We’ll do a whole separate episode, deep dive on it.
    0:19:43 But maybe just preview what was sort of the main change that you identified of like, hey, the world has changed from a media perspective.
    0:19:45 Yeah.
    0:19:55 So, you know, a lot of my thinking on this is from a book from our friend Martin Gurri that he wrote back in 2015 called The Revolt of the Public and the Crisis of Authority in the New Millennium.
    0:20:05 And so, basically, it’s like the world really did change, like how information flows through the world really did change, not just with the arrival of the Internet, but specifically with the arrival of social media.
    0:20:23 And so, you know, it just so happens that like all of us who grew up over the last 70 or 80 years, like we grew up in an environment of primarily top-down media, you know, in which there’s, you know, these sort of major kind of forces in, you know, broadcast, you know, TV or cable TV or newspapers and magazines where, you know, editors and publishers and reporters.
    0:20:27 And they sort of write all the stories and then, you know, it’s everybody else’s job to kind of read them and keep up.
    0:20:32 You know, to a world that looks, you know, completely different, whereas it’s basically everything is peer-to-peer.
    0:20:37 And so, you know, hierarchy to network and then centralized institution to, you know, decentralized network.
    0:20:41 And, you know, that’s happening throughout the economy and throughout, you know, throughout society.
    0:20:43 And, you know, there’s good and there’s bad to it.
    0:20:47 And there’s, you know, tons of, tons of arguments to be had about it, but it is happening.
    0:20:55 And so just, you know, in the new world, it’s just, you’re not, if you’re running a business or running a movement, like you’re just not going to do it through the traditional method.
    0:20:59 You may still participate to some extent, but you’re going to primarily tell your own story.
    0:21:00 I mean, you’re going to go direct.
    0:21:05 I mean, you’re going to have your own relationship with your constituents or with your fans or with your customers.
    0:21:12 And, you know, like in some sense, that sounds like, it all sounds like a truism and a cliche, but like, I think there were a couple of tipping points where it really started to happen.
    0:21:19 And one was around 2015 because social networking kind of hit mainstream and smartphones hit mainstream around that time, which is when Martin wrote his book.
    0:21:34 But then I think really it’s only been in the last five years when I think almost everybody, like, let’s say, let’s say basically everybody under the age of 70 and a very large number of people over the age of 70 basically have shifted from top-down media as their main source of information to social media as their main source of information.
    0:21:40 And so it actually is relatively new to live in this world in which the information really does flow differently.
    0:21:44 And so, like, it’s just a fundamental shift.
    0:21:50 And so I think, you know, as a firm, we spent, you know, as Ben said, we always had a big focus on marketing and in telling a story.
    0:21:56 You know, we did that primarily through the old centralized channels from 2009 to, you know, probably 2017 or something.
    0:22:01 You know, but really since then, it’s been, you know, at least as effective or more effective to kind of do it the new way.
    0:22:04 And, you know, it’s like the old William Gibson thing.
    0:22:06 It’s like the future is already here.
    0:22:07 It just isn’t evenly distributed yet.
    0:21:09 Like, everything I’m saying, people can, like, nod along.
    0:22:13 But, like, you know, as you know, Eric, like, most companies have still not adjusted to this.
    0:22:13 Right.
    0:22:16 Most politicians have still not adjusted to this.
    0:22:18 Most entertainers have still not adjusted to this.
    0:22:20 Most sports leagues have still not adjusted to this.
    0:22:24 And so it’s very important that we continue to do it.
    0:22:27 And then I think it’s also important that we set an example for our portfolio companies.
    0:22:30 Yeah, and a lot of it has to do with kind of like the apparatus, right?
    0:22:36 Like, so there’s the, you know, from a company standpoint, you know what you do.
    0:22:38 Like, you know, we know how to help entrepreneurs, this and that.
    0:22:42 And the other, what a, you know, product company knows how to build their product and so forth.
    0:22:45 And then it’s like, okay, and now you’ve got to get your message out.
    0:22:45 How do you do that?
    0:22:54 And then the apparatus that gets your message out, all the people, all the kind of tools, all the channels are oriented, at least partially in the old world.
    0:23:06 And so, you know, it actually takes somebody much longer to adjust than it does for the individual consumer who goes, oh, there’s just better stuff over here.
    0:23:16 Yeah, and Mark, also talk about the shift from corporations to individuals in terms of kind of where brands went and who people want to hear from.
    0:23:20 Not to say there isn’t, of course, a role for the corporation, but talk a little bit about that for the corporate brand.
    0:23:24 Yeah, so rewind history a little bit.
    0:23:25 So it’s actually pretty striking.
    0:23:29 Like, this sort of decentralized media environment that we’re entering is not new.
    0:23:30 It’s actually very old.
    0:23:35 And correspondingly, the centralized media environment we all grew up in is not the historical norm.
    0:23:43 And so basically, the way that we think about centralized top-down media today is basically an artifact of basically the period of the 1940s through, you know, essentially the 1970s.
    0:23:47 Like, before the 1940s, you didn’t have top-down media in the same way.
    0:23:50 If you go back to the 1930s or before, you had a much larger number of newspapers.
    0:23:53 You had a much larger number of radio stations.
    0:23:58 You had a much larger number of sort of fly-by-night publishing operations, you know, pamphlets and so forth.
    0:24:06 And then if you go back even further, one of the most interesting things to study on this is just go back to the American Revolution, you know, the time of the colonies in the 1760s through, like, the 1790s.
    0:24:15 And it basically, I’ve been rereading some of this stuff recently, like, basically the media environment of the colonial American era was a lot like today’s social media environment.
    0:24:18 You would have 15, 20, 30 little newspapers per city.
    0:24:21 They would be, like, they would occupy every micro-sliced niche.
    0:24:22 You know, talk about echo chambers or whatever.
    0:24:25 Like, they each have their own little echo chamber.
    0:24:28 You know, the founding fathers would write all these columns and essays.
    0:24:31 They would fight things out by writing essays, and then they would publish the essays under pseudonyms.
    0:24:37 And you’d have these characters like Benjamin Franklin or Alexander Hamilton that would literally have, like, a dozen or two dozen pseudonyms at a time.
    0:24:40 They’d actually write – they’d actually get into fights with themselves.
    0:24:42 They’d actually – they’d have their pseudonyms actually fight with each other.
    0:24:48 Ben Franklin used to set up arguments against his different pseudonyms to drive – you know, to really, like, litigate out an issue and to drive newspaper sales.
    0:24:55 But, like, really serious stuff also, like the Federalist Papers, which were kind of the explanation of the new Constitution in 1789.
    0:24:57 Hamilton and Madison wrote those under pseudonyms, right?
    0:25:01 And so this idea of, like, the Internet Anon is, like – that’s an old idea.
    0:25:03 And the idea of a pseudonym is an old idea.
    0:25:06 And the idea of, like, self-publishing is an old idea.
    0:25:12 And the idea of, like, basically these pitched, smash-mouth battles, you know, with very little centralized control over what people say.
    0:25:22 Like, you know, if you read about, like, how, you know, like, Hamilton and Jefferson and then also Jefferson and Adams had these just, like, absolute – they had their own, basically, pet newspapers.
    0:25:24 And it was just, like, absolute level of smash-mouth politics.
    0:25:31 Like, I would say even more, like, extreme and deranged than even what we have today, which people kind of can’t believe.
    0:25:40 But, like, if you read about the election of 1800, like, it was maybe – I think it was more extreme than certainly any election in my lifetime in terms of, like, what – you know, it’s literally John Adams and Thomas Jefferson, like, just, like, slandering –
    0:25:41 So, polarization is the norm.
    0:25:44 Like, really, like, on every conceivable front.
    0:25:49 Yeah, so, like, polarization, you know, sort of, as they say, unfettered conversations were the norm.
    0:25:51 Anonymity was the norm.
    0:25:54 You know, rumor, you know, scurrilous, you know, accusations were the norm.
    0:25:58 You know, pitched battles, you know, sort of the Overton window being wide open, was the norm.
    0:26:06 And so – and just for people who want to read about this, the best book on this is called Infamous Scribblers, which was sort of the name for journalists in those days.
    0:26:09 And so, like, you know, this has happened before.
    0:26:19 And so, anyway, so the point is, like, this sort of centralized media thing that we’ve been living in that we grew up in or, you know, people my age grew up in, like, it’s a historical aberration off the norm.
    0:26:21 And, again, it’s a consequence of technology change.
    0:26:28 It’s a consequence of this sort of mass publishing, mass media, mass radio, mass television, mass newspaper kind of thing that only started in the 1940s.
    0:26:39 And then correspondingly, therefore, you know, Eric, to your question, like, everything that we think of as corporate branding is an artifact of just a specific point in time, of the sort of 1940s through, call it, the 1980s or something.
    0:26:54 Like, all of, like, brand marketing, corporate brands, corporate messaging, corporate crisis management, like, all these playbooks that they teach at business school were very specific to a time and place that had a very small number of centralized media outlets with tremendous influence and control.
    0:26:58 But, and therefore, the corporate brand, like, why does the corporate brand exist?
    0:27:01 Like, why does a Procter & Gamble brand or any of these brands exist?
    0:27:06 It exists because if you have centralized media, you know, information is going through this very narrow straw, right?
    0:27:09 There’s very little bandwidth to get something on a TV.
    0:27:13 You have very little bandwidth to get something in the newspaper and, therefore, to get it to consumers’ attention.
    0:27:17 And so you kind of had to wrap up everything about a company into, like, a single word and a single image.
    0:27:22 And then you would just, through advertising, you would just pound that over and over and over again, trying to get people to remember it.
    0:27:24 But that’s because that’s all you could do.
    0:27:34 If you open everything up and everybody can publish and everybody can debate and everybody can be present and everybody can, you know, and then you have these, you know, individual, you know, influencers, you know, with 200 million followers and all this stuff.
    0:27:41 Like, all of a sudden, you have this completely different method of communicating with an audience that can be much more based on personality, right?
    0:27:45 So, authenticity, transparency, and then personality, right?
    0:27:46 That there can actually be a human being.
    0:27:51 And then it just happens, like, because your audience always consists of people.
    0:27:56 People relate much more to other people than they relate to a corporation, right?
    0:28:06 And so, as an individual, am I going to feel a stronger emotional affinity to, like, a person who I follow or to some, like, disembodied corporation with an office tower in New York City?
    0:28:12 And if the communication bandwidth is there where I can interact with both of them, of course, I’m going to have a lot more affinity for the people.
    0:28:14 And so, I think I’m sort of a radical on this.
    0:28:20 I think the whole idea of, like, corporate brands is basically just, like, it kind of, they’re on their way out.
    0:28:23 Like, it’s just as a concept, it just doesn’t make sense in the new media environment.
    0:28:28 And then correspondingly, you know, the terms people use these days, like influencer marketing and so forth.
    0:28:31 But the parasocial relationship is actually a really interesting one.
    0:28:34 You know, sort of one-to-many personal relationships.
    0:28:40 You know, I just think so much of how this is going to work in the future is this is based on relationships with individuals.
    0:28:42 And obviously, you know, like, this is happening, right?
    0:28:47 Everything I’m describing is what’s happening in the entertainment industry and is happening, you know, consumer brands.
    0:28:53 And you’ve got, you know, Kim Kardashian with these, you know, with these multibillion-dollar businesses, you know, being direct marketed online.
    0:28:54 You know, many people doing this.
    0:28:56 Many politicians, you know, are now adapting to this.
    0:28:57 And so, this is happening.
    0:29:00 But I just, I still feel like it’s underestimated.
    0:29:07 And if we project forward 10 years, you know, most people are going to think about the people they relate to as opposed to the companies they relate to.
    0:29:18 It’s very interesting how you bring that up, Mark, about that, you know, there were no kind of centralized medias and no corporate brands or corporate brands weren’t the thing kind of pre-1940s.
    0:29:30 Because as a kid, I always was surprised that I knew more entrepreneurs from, like, pre-1940 than after. So I knew Thomas Edison and Henry
    0:29:34 Ford and J.P. Morgan, but who are the entrepreneurs after that?
    0:29:35 And they weren’t.
    0:29:36 They were just corporations, right?
    0:29:41 Like, you didn’t know, actually, who ran any of those things, you know, even the new companies at the time.
    0:29:48 Just, you know, it would leak out slowly and so forth, but it wasn’t really a thing.
    0:29:55 And then now, you know, we’re getting all these celebrity CEOs again are kind of, that idea is re-emerging, which is fascinating.
    0:29:56 Yeah, that’s right.
    0:30:05 And people kind of can’t believe it, but, like, before, like, 1930, literally you would just go to the store, or it was just, like, the corner store.
    0:30:10 And then you would buy like, you know, a pan and they weren’t branded.
    0:30:18 Like, you know, maybe it was, like, Joe’s store, but, you know, consumer brands didn’t exist in the modern sense.
    0:30:27 And then to Ben’s point, to the extent you knew any business at scale, you know, prior to, like, 1930, they were almost all named after their founders, kind of for that reason, right?
    0:30:28 It was the Ford Motor Company.
    0:30:38 And then there’s this whole school of psychology. It was actually, I think, Freud’s nephew, if I remember correctly, Edward Bernays, who was sort of the father of public relations.
    0:30:43 You know, it was the new field in the 1920s when, when radio and newspapers took off and centralized kind of media started to take off.
    0:30:49 And, you know, they sort of created this whole psychological theory of creating these sort of abstract brands for the reasons that I described.
    0:30:57 But, you know, by the way, the methods are very linked to political propaganda, which became, you know, kind of very successful in those days.
    0:31:08 But it is amazing to me, as, you know, there were hundreds of years of industrialization before that, and modern economic activity before that, where those things, you know, essentially didn’t exist.
    0:31:19 And that’s why I’m so confident in sort of pegging all this to technology shifts, which is that, you know, the thing that shifted how we think about companies happened as a consequence of the shifts in communication technology.
    0:31:25 And then correspondingly, if the communication technology unwinds, which is what’s happening, then you’re actually going to go back to the future.
    0:31:26 And yeah.
    0:31:28 And then of course there’s, there’s more data points to support that every day.
    0:31:32 And is this something you guys had internalized in 2009?
    0:31:37 Is that why you called the firm Andreessen Horowitz when every other firm was going for some big deal?
    0:31:38 No, that was a different thing.
    0:31:44 So what happened then, so when we were raising the money and it was, you have to remember it’s 2009.
    0:31:47 So it was a difficult year to raise venture capital.
    0:31:50 In fact, I think there were only two new funds raised that year.
    0:31:51 There was ours and Khosla.
    0:31:59 So the number one objection we got on the fund was, you guys are very successful entrepreneurs.
    0:32:06 What’s going to stop you from quitting this and just going out and starting a company?
    0:32:11 And then we’re going to be left holding the bag and nobody’s going to be investing or watching our money.
    0:32:13 And we had no plan to do that.
    0:32:19 So we got the idea of, well, one easy way around that is just name the firm after ourselves.
    0:32:22 Then they’ll know that we’re going to be tied to it forever.
    0:32:24 And we did that.
    0:32:30 And then I had the idea that since nobody could spell Andreessen Horowitz, we should have this a16z thing.
    0:32:31 And that was the name of the firm.
    0:32:45 And of course, immediately, all the competitors said that we were egomaniacs and like narcissistically insane because we named the firm after ourselves, which we just ignored.
    0:32:47 Like, what can we do?
    0:32:48 You know, maybe they have a point.
    0:32:50 Yeah, it’s kind of true.
    0:32:50 Yeah.
    0:32:59 Well, the irony is that you’re still running the firm, you know, 16 years later, still as active as you were beforehand, whereas a lot of other folks have retired.
    0:33:01 That is also true.
    0:33:03 Yeah, no, it worked.
    0:33:04 It did tie us to the firm.
    0:33:05 Yes.
    0:33:06 Yeah.
    0:33:13 And is it as simple as, you know, you guys have had, you know, billions in distributions?
    0:33:15 You don’t obviously need to be doing this anymore.
    0:33:17 Is it as simple as, hey, this is your baby.
    0:33:18 This is where you have the most fun.
    0:33:23 What’s kept you going, you know, far after you guys need to, going at this pace?
    0:33:27 You know, look, I think that one, the firm always had a mission.
    0:33:32 So it was never like, the mission of Andreessen Horowitz was never like, let’s make a lot of money.
    0:33:41 That wasn’t, you know, we actually, both of us had, you know, enough money for a normal person, you know, to be happy in life before we started the firm.
    0:33:42 So that was never the thing.
    0:33:48 It was always, you know, could we make it much easier and better?
    0:33:53 Like, could we make it both easier to build great companies and then with those, could we make those companies better?
    0:34:01 And then, like, what in the world, what possible activity could either of us have that would be more important than that?
    0:34:16 Because, you know, one thing Mark and I both share is that, you know, maybe the single best thing that you can do to improve the world is to build a company that, you know, delivers some product or something that improves the world.
    0:34:20 Like that is actually, that’s the thing.
    0:34:30 It’s actually better than, has a better impact than any kind of activism or political activity or anything else is just like literally just making things that make the world better.
    0:34:40 And then, you know, kind of doing something larger than yourself, where you bring a lot of people together to do that and they all kind of grow and improve their lives through it.
    0:34:47 So, you know, what could be better than helping people do this single best human endeavor possible?
    0:34:52 Like neither of us ever thought there was anything we wanted to do with our time that was better than that.
    0:34:59 And so there’s no reason to stop because we don’t have any better ideas, I would say.
    0:35:01 You know, like this is the best idea.
    0:35:13 Mark told me there’s a story about Larry Page, I think, if I understand correctly, where Larry Page says, I see no better use of my money than giving it all to Elon Musk to build more tech companies.
    0:35:15 Yeah, so that, it’s a little bit of that.
    0:35:15 Yeah, exactly.
    0:35:18 You know, as a philanthropic idea, for sure.
    0:35:25 One other idea I wanted to bring back to the idea of people as corporations is it’s not only the CEOs, right?
    0:35:28 It’s, I see us, you know, as building a cinematic universe, right?
    0:35:36 It’s the CEOs, but it’s also the surrounding cast. It’s Chris Dixon, it’s Katherine Boyle, it’s Martin Casado, it’s Alex Rampell.
    0:35:42 Like you guys have, you know, basically done a phenomenal job of sort of building stars and building, you know, a sort of collection of people.
    0:35:49 Yeah, so, and I would say about that, you know, we’re not really, you know, we’re not a company, we’re kind of, we’re a firm.
    0:35:56 And, you know, those people who we were able to recruit in, like, very, like, hyper talented people.
    0:36:12 Really, it’s just, like, a platform for those people, you know, and we’re two of them, but it’s certainly, you know, not that hierarchical in that sense, as, you know, you’ve probably observed since you’ve been here.
    0:36:25 Like, everybody is kind of doing their thing, but in a common context with a common culture and, you know, kind of a mostly common set of investors and so forth.
    0:36:29 And so, it’s much more like a team.
    0:36:35 It functions more like a team than, than a normal kind of hierarchy, you know, in that sense.
    0:36:57 And, you know, it’s great because we were able, like, if you look at the top people, you know, if you look at Martin Casado and Chris Dixon and Alex Rampell and David Ulevitch and so forth, like, that team is better, like, IQ-wise, capability-wise than the executive teams of Meta or Google or Apple or any of them.
    0:37:05 And it’s just because, you know, in a way, they’re all the boss and they all act like the boss and that works.
    0:37:09 But that’s, that’s just kind of been like a nice outcome of, of the platform.
    0:37:19 Talk about how you guys developed this, this idea of a platform because most, most firms don’t have that, didn’t have that, or you guys moved to this sort of almost federated model.
    0:37:23 Talk, talk a little bit about how the evolution of the firm and how you guys figured that out or what that was like.
    0:37:26 Yeah, so it’s, it’s pretty interesting.
    0:37:31 So one of the things, so that, it came in two pieces.
    0:37:48 So the first thing was, when we started the firm, the history of venture capital, like if you had done, like, a back test on it, what you find is there were never, ever more than 15 companies in a year that would ever make it to a hundred million dollars in revenue.
    0:37:53 Because, you know, the technology industry, that was like the general size of it.
    0:37:59 That’s the, the amount of new technology that the world could absorb, you know, in those days.
    0:38:05 But, you know, Mark had an idea which he wrote up in, I think, 2011 called Software is Eating the World.
    0:38:21 And the idea behind that was, well, every company that was going to be worth anything was going to be a technology company because software was able to just, to make anything so much better.
    0:38:26 And so there were going to be not 15 companies, but 150 or 200 companies.
    0:38:34 Now, the result of 15 companies meant the optimal venture capital firm was like six or eight people going after those 15 companies.
    0:38:36 You know, each one gets two and you’ve got a monopoly.
    0:38:39 So there was no need for it to ever be bigger.
    0:38:51 And as a result of that, the way they kind of set up their organizations was basically with what I’d say is called shared economics, but also shared control.
    0:38:59 And that shared control made sense if you’re only going to be eight or 10 partners or six or 10 partners or whatever.
    0:39:04 Because if you’re never getting bigger than that, then you don’t have to reorganize.
    0:39:09 You don’t have to make difficult management decisions that people are going to disagree with.
    0:39:19 We knew or like we thought software was going to eat the world and we were going to need to be way bigger, way bigger than, you know, six or 10 partners.
    0:39:34 And so we were going to have to be able to reorganize, decompose the problem, set up the organization in a way where very smart teams of people could work independently and address the different facets of the industry that needed to be addressed.
    0:39:41 And you can see it with American Dynamism and infrastructure and apps and crypto and bio and so forth.
    0:39:44 And so we never had shared control.
    0:39:46 We always had centralized control.
    0:39:53 And this is something, we got that advice from, you know, Herb Allen, who was super helpful in us understanding why that would be important.
    0:40:04 And then, you know, also actually Mark’s father-in-law, the late, amazing John Arrillaga, was, like, just very, very clear on, like, if you’re going to run something, you know, eventually there’s going to be conflict.
    0:40:07 There are going to be these issues and you’ve got to have control.
    0:40:08 And that’s going to be important.
    0:40:13 You know, it’s not important until it is important and then it’s the only thing that matters.
    0:40:25 And so, you know, with that control, we’ve been able to kind of reorganize, reimagine the firm and then go address every single kind of vertical where you need.
    0:40:34 Like, the people who know American Dynamism have to know that in depth, everything from, like, rare earth minerals to rockets to these kinds of things.
    0:40:39 There’s no way those same six people are going to know everything about crypto.
    0:40:40 It’s not even possible.
    0:40:42 Like these fields are too deep.
    0:40:46 And not only the technology, but also the whole entrepreneurial ecosystem.
    0:40:51 And so you need separate teams to address these separate, very large markets.
    0:40:54 Whereas before you just needed a person on that.
    0:40:57 Like you could have a person on crypto would be fine or a person on AI would be fine.
    0:40:58 No, no, no.
    0:40:59 That’s never going to work again.
    0:41:08 And so our ability to field a whole team against that and restructure things and say, okay, you were doing consumer internet.
    0:41:11 Like that’s not going to be relevant in the next 10 years and so forth.
    0:41:15 These kinds of things are very hard to do if you don’t have control.
    0:41:23 And those two examples, crypto and American Dynamism are also interesting because these are examples where you guys helped create the categories.
    0:41:29 Where I believe you’re the first big venture firm to have dedicated crypto and AD practices.
0:41:37 You’re also creating a firm, and you’ve created a firm that can be adaptive to new theses, new ideas, new trends, and build teams against them.
    0:41:38 Yeah.
0:41:47 And that was something, you know, that Mark kind of identified early on. One of the things he used to say when we started the firm is venture capital is a young man’s game.
0:41:52 That was one of the things he got out of the many conversations we had.
0:42:00 And what he was really saying is it’s a young person’s game because, you know, the technology is always changing.
0:42:07 And the people who know the new technology best turn out to often be new people.
0:42:14 And so what you see in many venture capital firms is what happens once whatever they exploited runs out.
    0:42:19 So they did network effects and consumer internet, and they were amazing at that.
    0:42:23 But then when that stopped being the thing, they didn’t get to the next thing.
    0:42:26 And we were able to get to the next.
    0:42:28 So, you know, one, we’re always watching for the next thing.
    0:42:36 But then as soon as we see it, like, and we have such brilliant people, you know, Chris Dixon saw crypto, and we’re like, Chris, go get it.
0:42:42 And, you know, David Ulevitch actually saw Katherine Boyle, who saw American Dynamism.
0:42:44 And Katherine was like, this is a very important thing.
    0:42:46 And so we just go do it.
    0:42:51 And we can do it because we don’t have to repurpose our old people.
    0:42:52 We can build a whole new team.
    0:42:54 We can change the organizational structure.
    0:43:05 In an offline conversation, we were talking about how some firms look the same as they did, you know, 30 years ago from a structure perspective.
    0:43:07 And the world has changed.
    0:43:13 And, you know, firms need to change to meet those sort of the evolving needs.
    0:43:18 And this is one example where the sort of stuff has gotten so much more complex.
    0:43:20 There’s been this great complexification.
0:43:32 And so to your point, a generalist firm used to be able to cover the entire landscape, but now no individual can have deep knowledge on all the fields, you know, bio, crypto, all the fields we cover.
    0:43:40 And so that’s one great example of how the world has changed and that leads to a need in venture firms to change as well.
    0:43:50 Are there other examples that come to mind around how the asset class has evolved or should have evolved to meet the needs of the world changing?
    0:43:57 Well, you know, it’s probably changed more since we started the firm than it did in the whole history before then.
    0:43:58 And there’s so many ways.
0:44:06 So, you know, one of the things is, as Mark said earlier, angel investing kind of became a real category.
    0:44:21 And then the public markets have become, I would say, very difficult and dysfunctional to the extent that, you know, OpenAI just did a giant raise in the private markets, which I don’t think they could have done in the public markets.
0:44:38 So now, the fact that you can raise more money in the private markets than the public markets in one shot just speaks to the expansion of the private markets to deal with the fact that the public markets are just not a great environment anymore for, you know, for companies.
    0:44:41 And so that changes venture capital because we’re the private market.
    0:44:44 So our market just, you know, got much more enormous.
    0:44:50 And then, you know, as you said, like media change, like how you go to market.
    0:44:54 Like we actually were the first ones to market a firm in venture capital.
0:45:05 And, you know, Margit Wennmachers did like an amazing job of, you know, creating a brand for a firm that, you know, popped up out of nowhere.
    0:45:07 And that had never happened before.
    0:45:10 But then the way you market it changed entirely, as we just discussed.
    0:45:12 So, you know, it’s evolving.
    0:45:14 The world is changing really fast just in general.
    0:45:26 And now, look, I think AI, just the way we work, the way we operate as a firm is changing very fast due to AI and, you know, like what we can automate, you know, how big a reach, how many entrepreneurs we can know.
    0:45:27 All these things are very different now.
    0:45:36 Let’s double click on the brand point because you guys in 2009, you came out and you were alluding to it earlier, but you made a lot of noise, right?
    0:45:39 You know, some people have different opinions, but everyone had an opinion.
0:45:44 And you thought deliberately about, hey, we’re going to build a brand in a new way, you guys and Margit and the team.
    0:45:46 And you kind of crushed it.
    0:45:53 Talk about what that strategy was and what perception it was and what was it like as you were building out the brand?
0:46:00 Well, look, you know, it was a conversation Mark and I had, and, you know, that’s kind of one of the ways we understand each other.
    0:46:07 So, Mark asked me, he said, you know, like, I’ve been studying the history of venture capital.
    0:46:10 I’ve been trying to figure out why they don’t do any marketing.
    0:46:21 And it turns out, like, the industrialists and venture capitalists, the Rothschilds, you know, J.P. Morgans and so forth, were sometimes, like, funding both sides of a war.
    0:46:26 And so, like, any kind of publicity, like, might get them killed.
    0:46:37 And that just kind of carried through to modern venture capital so that the original rationale for not doing it was no longer really valid.
    0:46:46 And they told themselves other things, like, we’re very humble, so we don’t market and this kind of nonsense, which is always a rationalization for laziness.
    0:46:48 So, he said, you know, like, what do you think?
    0:46:48 Should we market it?
    0:46:56 And, you know, like, sometimes when Mark asks a question like that, and I already know what he thinks and I haven’t thought about it that much, I just go, yeah, of course, like, let’s market it.
    0:46:57 And that was kind of that conversation.
0:47:07 And then, you know, he had worked with Margit, you know, prior at Ning, and he thought super highly of her.
0:47:13 And so, what happened is, you know, he said, well, let’s, you know, let’s talk to Margit.
    0:47:14 Let’s see what we can do and so forth.
    0:47:20 And, you know, we spoke to her and this was kind of a hilarious thing.
    0:47:25 And you have to remember that this is the days when, like, magazines were a big deal, which, you know, they’re not so much anymore.
    0:47:34 And so, when we launched the firm, she said to us, she said, do you want to be on the cover of Fortune or Forbes?
    0:47:36 And we were like, Fortune, of course.
    0:47:39 And that’s exactly what happened.
    0:47:41 So, that was the beginning of it.
    0:47:46 And you guys were able to recruit amazing people early on.
    0:47:49 Talk about what it was like to get one of the first big partners.
    0:47:55 Like, you know, you’ve got some partners like Chris who’ve been here, you know, over 12 years.
    0:48:00 What was it like in terms of how you thought about recruiting the early partners and landing them?
    0:48:06 I would say that’s, like, kind of one of the things we got wrong in our thinking.
    0:48:08 Well, we got right and we got wrong.
0:48:15 So, like, one of the things we got very right was the first person we hired was Scot Kupor, who we knew super well and had worked with for years.
0:48:20 And he was just brilliant and really fundamental to building the firm.
    0:48:25 He’s recently joined the presidential administration in the White House.
    0:48:29 But he was just kind of invaluable and fantastic.
    0:48:35 And he didn’t want to join when we started the firm because he was worried we wouldn’t be able to raise the fund.
    0:48:37 So, we raised the fund and then we hired him.
    0:48:39 He was employee number one.
0:48:48 Then the second idea that we had was that only founders or CEOs were allowed to be general partners.
    0:49:01 And the reason for that was, you know, a little bit what Mark said earlier, which was we were counter-programming what had happened in the industry where you had a lot of people who were smart but didn’t understand founders.
    0:49:04 So, we wanted everybody in the firm to understand founders.
    0:49:09 But that profile turned out to be not perfect in many ways.
    0:49:11 But we hired some really great people.
    0:49:16 You know, one of the early people is Peter Levine, who still works with us now and so forth.
0:49:25 And then, you know, the first thing we relaxed was, okay, you had to have founded a company or been a CEO, but it didn’t have to be that great a company.
    0:49:28 Like, if the company did okay, then that was okay.
    0:49:33 And that kind of gave us permission, which was controversial at the time, to hire Chris Dixon.
    0:49:40 And one of the things Mark and I recognized early was Chris Dixon was a far better investor than either of us.
    0:49:46 And so, that was like a little bit of an indication that maybe we were too rigid in our criteria.
    0:49:48 And that started to open it up quite a bit.
    0:49:57 I want to go back to one of the unique insights you had was going back to, you know, Mark’s software’s eating the world piece was that there were going to be more winners.
    0:50:00 And those winners were going to be much bigger.
    0:50:03 And there’s a lot of implications that stem from that.
    0:50:05 You’ll raise bigger funds.
    0:50:10 You’ll have this decentralized team or sort of federated model.
0:50:17 You’ll be able to invest at higher valuations if these companies are going to get bigger and bigger.
0:50:28 And it feels like that was something that you guys saw relatively early that, you know, other firms, even later-stage firms, then sort of got on board with.
0:50:41 Yeah, so the big thing on that is, you know, this really important transformation that’s happened in tech that sort of went kind of unremarked on as a pattern, although you started to see it in the early 2010s, which is, kind of, up until roughly 2010.
    0:50:49 Like if you make a list of all the big winners in tech over the preceding 60 years, they were basically all a form of a tool company, you know, technology tool.
    0:50:55 So, you know, they would build personal computers or microchips or operating systems or databases or routers or web browsers or whatever.
    0:50:59 But fundamentally, they were, you know, building components of a computer system.
    0:51:03 And then, you know, they would sell, you know, those tools to either consumers or businesses.
    0:51:06 And then the consumer or business would figure out what to do with the tools.
    0:51:08 And that had been the pattern.
0:51:23 In fact, Ben will recall, when we first started the firm, one of my early investing rules was no verticals, because you just look at that list and you’re like, basically, the big winners have all been these big horizontal tech companies building general-purpose tools that many, many downstream industries pick up and use.
    0:51:30 But, you know, the big winners, like historically, if you had a tech startup that was focused on a vertical, it just meant that you were a small tools company.
0:51:42 The classic example is: I am a tech company and I want to be in the boutique hospitality industry, and so therefore I start a software company that makes, you know, booking software for bed and breakfast hotels.
    0:51:46 And, you know, such things existed, by the way, and they were just like very tiny companies.
0:51:51 Fast forward to 2010, and you have this shift, which basically, I think, is that the Internet really started to work.
    0:51:55 Broadband really kicked in, you know, a bunch of things, you know, kind of really catalyzed.
0:52:00 And what you started to see was that the tech companies that went into verticals actually started to get to be huge.
    0:52:06 And probably the first two of those, you know, that really, you know, kind of made this clear for me were, you know, Uber and Airbnb, right?
    0:52:13 Where, you know, Airbnb, okay, like, how about we not only build the booking software for the bed and breakfast, but how about like we run the entire service?
    0:52:16 Like, how about we run the entire booking engine? How about we run the entire search engine?
    0:52:21 How about we do all the transactions? How about we do all the customer, you know, all the customer service, like the entire end to end experience?
    0:52:35 Or the same thing for, you know, Uber or Uber and Lyft, which is, you know, you could have a small boutique software company doing taxi dispatch software for taxi limo operators, or you could build Uber or Lyft and build, actually build a giant transportation network with, you know, drivers and riders and money flowing through.
0:52:48 And then, you know, more recently, a company like Anduril, right? Like, you know, our companies for many years have sold many, many, you know, parts of computers and software into the Defense Department and into the defense contractors.
    0:52:56 But, you know, Palmer Luckey came along and said, let’s just build a defense contractor. Like, let’s build a direct competitor to the big defense primes and actually, you know, build defense systems.
    0:53:05 You know, Tesla, another one, right? Like, you know, instead of building, you know, embedded, you know, whatever power management software for, you know, for cars, you know, how about just like build the car?
0:53:13 You know, SpaceX, we keep going. But basically, in the last 15 years, if you do that same list again, you know, many or most of the big winners have been companies that have gone into a vertical.
    0:53:26 But what they’ve done is they’ve gone in and they’ve tried to basically eat the entire vertical, right? They’ve provided an end-to-end experience with everything required to basically, you know, service that vertical, often in direct competition with the incumbents in that vertical, right?
0:53:35 So Anduril competes head-to-head with existing defense primes, you know, Uber competed head-to-head with taxi and limo operators, Airbnb famously competed head-to-head with hotels.
0:53:42 They got extremely angry about that, right? And so, you know, Netflix competed directly with cable channels, right? And movie theaters.
    0:53:50 And so basically, it’s just like, all right, you’re going to have more and more of these companies that are going to use technology to go insert into an end market and then just try to go take that end market.
    0:53:54 Those companies, good news, those companies can get to be gigantic, right?
0:54:06 Because if you crack the mother lode, like Netflix has, for example, in entertainment, or like Tesla has in cars, you can build a company that’s maybe multiples the size of that entire industry the way it existed before.
    0:54:10 The challenge is that those kinds of companies are much different than historical tech companies.
    0:54:14 Those are like full service. And so they’re like much more complex, right?
    0:54:17 They have like a lot more moving parts on the operating side.
    0:54:19 They require a different kind of discipline on the part of the management team.
    0:54:26 You know, they’re going to be operating in a lot of cases, they’re operating in regulated industries, right, where there’s a completely different political dynamic.
    0:54:32 And by the way, they’re going up against entrenched competitors, right, who certainly have no intention of just turning the business over.
    0:54:42 And so I think in many ways, that’s been the defining theme of the last 15 years in the Valley is kind of the evolution from just tools companies to, you know, what we used to call full stack, like just do the whole thing.
0:54:53 It’s fascinating. The one knock against Andreessen that I’ve heard over the years is, hey, they think of their firm as a product, or there’s like a machine, as if that’s not, you know, a great thing.
0:55:03 Like if a startup said, hey, we have no moat, you know, I’m just a smart guy, you’d say, hey, that doesn’t feel super defensible, doesn’t feel like you’ve really built something of power.
    0:55:10 And yet, sort of when people think about their venture firms, they sort of run them the opposite of ways that they want their startups to run.
    0:55:17 They really think about structural advantages or durable advantages or network effects or all of these things that they want their startups to have.
    0:55:19 It’s been an interesting sort of contrast there.
0:55:27 There’s a kernel of truth in the critique. The kernel of truth is like, look, at the end of the day, as an entrepreneur, you do want the personal touch from your VC.
    0:55:29 And the reason for that is you’re going to have somebody on your board, right?
    0:55:34 Like you’re going to have somebody on your board. You’re going to have somebody you call at 4 a.m. when like, you know, the world is caving in.
    0:55:36 You’re going to have somebody who you, you’re, you know, you’re dealing with.
0:55:41 And, you know, you’re going to be dealing with that person in high-tension situations.
    0:55:43 You’re going to want to really rely on them. You’re going to want them to really know what they’re talking about.
    0:55:46 You’re going to want them to, you know, have, have throw weight in the industry.
0:55:51 And so that really does matter. There is that personal relationship.
0:55:55 And so I don’t think what would work is just trying to not provide that,
0:55:58 and instead just provide, as you said, a machine.
    0:56:02 But what I think works incredibly well is to provide that and provide the machine.
    0:56:05 Well, and, and the team.
0:56:15 So the thing that I would say really distinguishes what we do from what we experienced is: we always had a person.
    0:56:20 And when we tried to reach through that person to the rest of the team, they were like, not my company.
    0:56:22 I’m not making that introduction. I’m not doing that.
    0:56:30 Whereas, you know, like almost on a daily basis, you know, we’ll have a company who’ll run into something and they’ll go, oh, wow.
    0:56:31 You know, yeah.
    0:56:33 Like you should talk to Joe Morrissey.
    0:56:35 He dealt with that sales issue over here.
    0:56:36 You should talk to Ben.
    0:56:40 Like he knows how to like deal with a crisis like this.
0:56:53 In fact, we just had one this week, you know, where one of our partners was like, well, this seems like a bad crisis, let’s bring in the guy who lived through all the crises.
0:57:00 And, you know, that’s me, and I can really help in that because I not only understand what to do, I understand what it feels like.
0:57:06 And, you know, so much of that kind of thing is having a deep understanding.
    0:57:10 And in the firm, we understand almost every situation you would be in.
    0:57:18 And there’s somebody who’s a great expert who will be there in like a flash, even if that’s not the person on your board.
    0:57:26 And that, that I think is probably the thing I’m most proud of in the organization is people always get their money’s worth from that perspective.
    0:57:31 This is the big industry structural transformation thing that we think has taken place.
0:57:36 And this is one where we did predict it. We have been talking about it for a long time, but I think it’s really happened.
    0:57:41 Like it’s really played out over the last 15 years and it’s still playing out, which is there’s this pattern.
    0:57:48 There’s this pattern in industries as they mature, which is they often start with what you would call like basically a strategy that’s kind of like being in the middle.
    0:57:57 So classic example using this is like retail, you know, retail shopping where, you know, once upon a time there were these things called department stores, you know, it’s just names like Sears and JCPenney.
    0:58:01 And then you would go to the department store and the thing about the department store is it would have a pretty good selection of products at a pretty good price.
    0:58:04 And, and, you know, growing up, that’s, you know, that’s where we would always go shopping.
    0:58:08 You know, by the time you hit the eighties and nineties, you know, basically the department stores stopped working.
    0:58:11 And, and, you know, for the most part, they’ve gone under at this point.
0:58:18 And what happened was they got replaced by competitors that were not in the middle, but on the far end of one side or the other of the spectrum of strategies.
0:58:20 That’s why we call the outcome the barbell.
0:58:23 And so the department store was replaced by two sets of companies.
0:58:26 So first of all, high scale, right?
0:58:28 High scale: Amazon, Walmart, right?
0:58:32 Where what you get is like an incredible selection at absolutely fantastic prices.
0:58:33 Right.
0:58:36 But, to your point, like, it’s a very machine experience.
0:58:40 It’s, you know, high scale.
0:58:44 You go to Walmart and, like, you know, the shelves go up to the ceiling and the whole thing, like, it’s a specific thing.
    0:58:47 But like that, like wiped out a huge part of the department stores.
    0:58:59 And then the other thing on the other side was basically specialist boutiques, where for the thing that you care about the most, whether that’s, you know, fashion or jewelry or consumer electronics or candles or whatever, right?
    0:59:03 Whatever is the thing that you actually care about the most, you go to the boutique, right?
    0:59:07 And you say, I was talking about it, you go to the Gucci store, you know, to buy your scarf, you go to the Apple store to buy your iPhone.
    0:59:11 And what the boutique offers is a very narrow selection at a very high price.
    0:59:13 But what you’re getting is a very specialized experience.
    0:59:16 And your point, you’re often getting the personal touch, right?
    0:59:20 So you go into a, I don’t know, you go into like a, you know, wrist press boutique or something.
    0:59:21 And it’s just like, it’s just like, great.
    0:59:23 It’s like, wow, would you like some champagne?
    0:59:24 You know, we’re doing the whole thing.
    0:59:25 Oh, let me get you a comfortable chair.
    0:59:27 It’s like, you know, here’s all the espresso.
    0:59:29 You know, it’s just like the whole thing.
    0:59:29 Oh, you want to stay late?
    0:59:30 Great.
    0:59:30 We’ll lock the doors.
    0:59:32 You can stay for another, you know, half hour and browse through everything.
0:59:38 You know, you just get this very, you know, kind of personal-touch kind of experience.
    0:59:41 And so what happened was the department stores just died because they didn’t offer either one.
    0:59:46 They didn’t offer scale and they didn’t offer the boutique personal touch experience.
    0:59:52 And what you find if you look at the history of business is basically as industries professionalize and mature, many of them go through this.
    0:59:56 And so what I’m describing also happened in advertising agencies.
1:00:02 By the way, this is a big theme of the TV show Mad Men, because they were right in the middle of it. If I remember, there’s a certain point in the show where
1:00:07 they’re running this kind of midsize ad agency and then they actually sell it to McCann, which was one of the big scale players.
1:00:09 And then they got frustrated there because it was just a big machine.
1:00:11 And so then they went and started their own boutique.
1:00:14 And so it was kind of during that era.
1:00:16 And beyond ad agencies, it happened to law firms.
    1:00:18 It happened with Hollywood talent agencies.
    1:00:22 Michael Ovitz catalyzed this when he was in Hollywood in the 70s and 80s.
    1:00:24 It happened in the financial, it happened in banks.
    1:00:26 It happened in investment banking, commercial banking.
    1:00:27 It happened in hedge funds.
    1:00:28 It happened in private equity.
1:00:34 So we’ve just seen this pattern happen over and over again, but it hadn’t happened in venture capital.
1:00:40 And so when we entered the field, basically what we observed was that they’re all department stores.
1:00:50 And the venture capital version of the department store is six to eight general partners with a, you know, 300, 400, $500 million fund, doing the sushi boat strategy, right?
1:00:53 Like sitting and waiting. And, you know, by the way, no website, right?
    1:00:58 Because like, oh, you know, God forbid that you like, you know, ever tell your story to anybody or make yourself visible.
    1:01:01 And then you just, you basically sit on a sandhill road and you basically wait for the deals to come through.
    1:01:04 And then, you know, it had run that way for a long time.
    1:01:06 And so it was kind of this cartel self-referential thing.
    1:01:09 And so it just, it just kind of ran that way.
1:01:21 And basically our bet, when we went for scale and went to build out the kind of teams that Ben described and sort of this machine that results from it, was basically that the death of the middle was going to happen.
    1:01:22 The barbell was going to play out.
    1:01:29 And so there was going to be an opportunity for a handful of firms to go for high scale, but only a handful, right?
    1:01:32 Because what you get on the other side of this is you don’t get 50 at high scale.
    1:01:34 You get, you know, you get a bunch, but like it’s not that many.
1:01:36 That’s where scale economics kick in.
1:01:41 And then what would happen on the other side is the rise of the angel investor and the seed investor.
    1:01:42 And of course, we had been part of that, right?
    1:01:43 We had been on that side of the barbell.
1:01:51 And this was part of the transformation that had happened in venture, which is the original venture firms, in the 50s, 60s, 70s,
1:01:52 were first money in, right?
1:01:54 They were the first check, right?
1:01:55 Into a company like Intel or Apple.
    1:01:59 By the time the 80s and 90s rolled around, they were no longer the first money in.
    1:02:02 They were often the second or third check after the angels and the seed investors.
    1:02:08 And so, you know, we put two and two together and said, aha, what’s going to happen is this field is going to bifurcate just like every other field.
    1:02:11 We’re going to go for scale and then we’re going to encourage the seed investors.
1:02:15 And we’ve been very actively trying to invest in seed investors and trying to help them,
1:02:16 and, you know, trying to be very friendly with them.
1:02:22 And then basically the structural question posed is: what’s the point of having a department store, right?
    1:02:24 Of having a sort of mid-sized firm.
    1:02:26 And the answer is, by the way, there’s no point.
    1:02:28 Like for the same reason there’s no point to department store.
    1:02:34 There’s no point to the mid-sized firm for the reason that Ben described, which is that, you know, they don’t have any – they’re not the first money in.
    1:02:36 They’re not at scale.
    1:02:37 They don’t have any depth.
    1:02:45 And so at the end of the day, there’s really fundamentally no value proposition to the thing if you have access to seed investors on the one side and the scale platforms on the other side.
    1:02:51 And I would say, you know, 10 or 15 years ago we would say this and everybody would get mad, you know, because it sounds like we’re predicting everybody’s going to die.
    1:02:54 But like sitting here today, you know, this has really played out.
    1:02:59 And many of the mid-sized firms that I grew up with are gone.
    1:03:02 And in some cases, they’re gone because they failed.
    1:03:04 But in a lot of cases, they’re actually gone because they succeeded.
    1:03:06 You know, the partners made a lot of money.
    1:03:09 And then at some point, just the rationale for being in business started to fade away.
    1:03:12 And, you know, maybe they had to start working a little bit harder and that wasn’t fun.
    1:03:14 And so they just kind of folded up shop.
    1:03:17 And then the LPs correspondingly have adapted to this.
    1:03:26 And so if you talk to the LPs now, increasingly, they are focusing capital either on the scale platforms or they’re focusing capital into, you know, this very specific kind of early stage seed angel strategy.
1:03:30 And their interest in funding the department store equivalent of the VCs, you know, has really faded.
    1:03:35 Anyway, so I view this as like, like, this is one of those things, like, this is a very natural evolution.
    1:03:36 This was destined to happen.
    1:03:38 It’ll happen in many other industries in the future.
    1:03:44 You know, it’s a somewhat, you know, it’s a process that plays out in response to customer demand, right?
    1:03:48 Because the customers of venture firms are the entrepreneurs on the one hand and the LPs on the other hand.
    1:03:50 And if they both want this change to happen, then it’s going to happen.
    1:03:52 And so it’s a very natural process.
    1:03:55 But, you know, it’s disconcerting to be on the wrong side of this.
    1:03:59 And it’s an adaptation process for people to kind of figure out that this is happening.
    1:04:00 But I think now it’s pretty clear.
    1:04:03 That’s well said.
    1:04:06 And that’s one example of how the asset class has evolved.
    1:04:08 Let’s get into other ones.
    1:04:14 I mean, one is that there’s been, you know, as your thesis has proven true, software has eaten the world.
    1:04:18 There’s been more demand on the LP side to get into the space.
    1:04:22 Much more money has flooded into the space, which means more venture capital firms, which means more competition.
    1:04:34 And of course, when supply is constrained, people are sort of competing on that axis, you know, VCs have the power and founders are clamoring to get onto the conveyor belt.
    1:04:37 And they’re, you know, pretending not to care by, you know, not having websites.
    1:04:45 But when there’s an explosion of venture firms, now founders are the ones picking and VC firms have to change their tune.
    1:04:50 You guys were early on to it, but it also changes sort of the types of LPs that want to be involved.
    1:04:58 And then, yeah, talk more about how the asset class has evolved from more capital flooding into the space, or any other changes that emerged from it.
    1:04:59 Yeah.
    1:05:00 So look, there’s like a couple of things.
    1:05:09 So first of all, we used to have this discussion with our friend, Andy Rachleff, you know, who’s kind of the master of venture and, as I said, co-founder of Benchmark, and then actually taught venture later at Stanford.
    1:05:11 And very analytical on the topic.
    1:05:12 Yeah.
    1:05:13 Extremely thoughtful.
    1:05:18 And, you know, because we had this discussion with him of like, wow, you know, money comes whipping in and out of venture, and these dynamics really change, right?
    1:05:25 Of who, as they say in Seinfeld, has hand, you know, the upper hand, in every relationship, somebody has hand.
    1:05:28 And is it the founders or the VCs?
    1:05:29 And we asked him, you know, how should we think about this?
    1:05:31 And Andy made this very interesting observation.
    1:05:40 He said, basically for as long as he had been in the field, I think, you know, going back now decades, venture has always been overfunded as an asset class.
    1:05:44 There’s really never been a time in which venture has been underfunded.
    1:05:49 Maybe, maybe a little bit in the extreme crises, like, you know, maybe 2009 as an example, but like generally venture is overfunded.
    1:05:55 I think he said at the time, his rough back-of-the-envelope math was that it’s roughly always overfunded by like a factor of four.
    1:06:00 You know, I think you, you know, maybe these days it’s like a factor of 40 or 400 or something.
    1:06:03 You know, the Sequoia guys are always famous for complaining.
    1:06:06 Anytime Sequoia guys give an interview, they always talk about how there’s just like way too much money in venture.
    1:06:08 As they’re always trying to talk to LPs.
    1:06:11 Yeah, they’re always trying to discourage people from doing any venture, yeah.
    1:06:18 Yeah, they’re trying to talk LPs into stopping the money flow, you know, because more competition. But then his question is, okay, why is it always overfunded?
    1:06:24 And he said, basically, it’s a consequence of the broader financial landscape.
    1:06:26 So, you know, what are LPs?
    1:06:33 LPs are large pools of institutional capital being invested for many reasons, but a lot of it ultimately is retirement, they’re ultimately retirement funds.
    1:06:36 Like the ultimate theme is one form or another, they’re retirement funds.
    1:06:43 So, so they’re large pools of capital that need to generate a certain level of return over the next 50 or 100 years to be able to pay for, you know, people’s retirement.
    1:06:47 And, you know, in order to do that, they need to hit a certain level of return.
    1:06:50 And, you know, the nature of the modern economy is, you know, population decline.
    1:06:53 You have a lot more older people, a lot fewer younger people.
    1:07:03 And so you have this sort of fundamental issue, which is like how, as a steward of institutional capital, how do you generate the long-term returns that you need in an environment in which actually that’s, that’s actually not so easy.
    1:07:09 And so, you know, you invest in stocks and bonds and whatever, and, you know, you often still can’t get the math to pencil out.
    1:07:10 You’re not going to hit your return target.
    1:07:15 And then there’s this asset class called venture capital where, you know, sometimes it works and sometimes it doesn’t.
    1:07:17 But when it works, it blows the lights out, right?
    1:07:20 Like when venture capital works, it’s the top performing asset class.
    1:07:28 And, you know, there are individual venture capital funds that have been, you know, just absolutely spectacular, you know, returns that have driven a lot of the return for an entire institutional portfolio.
    1:07:37 And so there’s this, you know, and the way I describe it is, you know, venture capital is never the majority of the money in an institutional pool, but it’s like the, you know, it’s like the cherry on the top of the sundae.
    1:07:43 It’s the thing that, you know, it’s the small position of the thing, but if it works, it might make the entire formula work.
    1:07:49 And then you just look at like how many, how many pools of capital are there like that out there, right?
    1:07:51 How many, how many LPs are there out there like that?
    1:07:52 And the answer is there’s a lot.
    1:08:00 And then basically what happens is all the LPs basically read the Swensen book, which describes how to run, you know, these institutional capital pools, which is a great book.
    1:08:09 And they basically say, oh, David Swensen says you put, you know, X percent in venture capital, but David Swensen says the key to it is you only invest in the top venture capital firms.
    1:08:13 Because venture capital is a feast or famine business and you only want to be in the top 10 percentile of firms.
    1:08:20 And then basically what they do is they go out and they talk to, you know, the firms and then they find out they basically can’t get into most of the firms they want to invest in.
    1:08:23 And then they sort of develop a theory of how these other firms are actually in the top 10 percent.
    1:08:30 And you can actually pick that up because if you ask LPs who are their top, who do they think are the top 10 percent firms, they often have very different lists.
    1:08:34 And, you know, part of it is a function of maybe they’ve sniffed something out.
    1:08:37 And a part of it is just because like they have to allocate the money.
    1:08:40 And so they kind of convince themselves that there, there are sort of undiscovered gems out there.
    1:08:44 And so as a result, they just overfund the asset class.
    1:08:47 And then, you know, too many LPs managing too much money leads to too many VCs, leads to too many startups getting funded,
    1:08:59 which leads to the phenomenon that founders experience, which is: I start a company and not only do I have three venture competitors, I often have 30.
    1:09:00 Right.
    1:09:02 And it’s like, you know, basically like what the hell.
    1:09:06 And so anyway, so Andy’s point is like, look, like that’s just an artifact of the world.
    1:09:12 Like we, we are the, we are the tail on a much larger dog and the dog is large scale institutional money flows.
    1:09:19 Like venture is a rounding error in the global financial system, but it’s one that’s just prone to be overfunded for very long periods of time.
    1:09:29 And what Andy said was, until there’s a new approach to investing these large pools of capital, we should basically assume that this process persists over a long period of time.
    1:09:36 So I think it just is the case. You know, would it be better if the amount of money was, you know, equalized to what it should be relative to the opportunity set?
    1:09:38 I mean, you know, for people like us, yes, that would be better.
    1:09:40 For the world, it would be worse.
    1:09:41 Yeah.
    1:09:42 I was going to say, yeah.
    1:09:43 So that’s the other thing.
    1:09:47 It’s like, if you had less money in the space, would entrepreneurs be able to take as many swings?
    1:09:47 No.
    1:09:48 Right.
    1:09:55 And, you know, look, should I have the arrogance to sit here and say that we’re going to invest in all the great companies and that we’re not going to say no to people who we ought to be funding?
    1:09:58 And obviously we, you know, we make that mistake all the time.
    1:10:03 And so like, if you’re going to have an asset class that is going to be overfunded, this probably is the one to overfund, right?
    1:10:10 In other words, there’s a societal surplus from all of the swings that entrepreneurs get to take that they wouldn’t get to take if the sector wasn’t overfunded.
    1:10:12 And some of those work, right?
    1:10:16 Like, and you have founders come out of nowhere and they raise money from no-name VCs and like they end up building huge successful companies.
    1:10:26 And so, on a societal basis, I actually think it’s like a form of dysfunction that maybe is not optimal financially, but on a societal basis, I think it’s probably net positive.
    1:10:38 Yeah. I mean, like what could be better in terms of wasting money than taking money from people who have too much and giving it to people who want to change the world and make it a better place?
    1:10:45 I mean, it seems like, you know, they’re building a company to do so, and that seems like a pretty good idea.
    1:10:58 You know, the other thing I’d add to that is venture capital is a little bit unique, you know, from our point of view in that it’s the only asset class where the top managers tend to persist for decades.
    1:11:10 So like, if you look at stocks or bonds or anything else, like the pickers, because they’re all, you know, kind of picking against the same thing and they all have equal rights to invest in everything.
    1:11:18 It tends to like, there’s some amount of randomness or, or whatever that puts somebody on top and then they’re no longer on top the next decade and so forth.
    1:11:24 But in venture capital, the top firms often remain the top firms for a very, very long time.
    1:11:29 And the reason is the best entrepreneurs will only take money from the best venture capital firms.
    1:11:39 And so, you know, if this was the NFL draft, which I think is today, you know, we’d have the number one draft pick every single year, despite already kind of having the best team.
    1:11:43 And so that doesn’t matter if there’s too much money.
    1:11:47 If you always get to pick first, you still can win very consistently.
    1:11:49 And that, that’s sort of what happens.
    1:11:52 So, so it’s a great system from our perspective.
    1:11:53 Good for the world.
    1:11:53 Good for us.
    1:11:54 We love it.
    1:11:54 Yeah.
    1:12:03 Some people will say things like, oh, there’s too many founders or too many people want to be founders, as if it’s already an efficient market.
    1:12:05 And there aren’t people out there in the world who whom.
    1:12:06 That’s a bit.
    1:12:11 Yeah, it’s the best thing in the world for like people to try, you know, to do something larger than yourself.
    1:12:16 And try and make the world a better place and, you know, get people along the ride with you.
    1:12:19 And everybody’s got a great purpose and they’re all working hard.
    1:12:23 And like, and maybe there’s a great outcome for them in the world.
    1:12:27 Like, why wouldn’t you want to fund as much of that as you can?
    1:12:30 Like, it’s, I never understood the argument that there’s too much venture capital.
    1:12:31 Yeah, it’s crazy.
    1:12:33 It can never be too much.
    1:12:40 When did you guys realize that you were entering the, like, when did you realize, hey, this is really working?
    1:12:47 Like, what was sort of the biggest inflection point in AC’s history of when you guys felt you reached that point?
    1:13:00 So, like, very early on, we realized we could win what we thought were very high quality A rounds from top tier VCs.
    1:13:06 And as soon as we could do that, we were like, oh, we could be top tier.
    1:13:08 We could definitely be top tier.
    1:13:15 We thought, you know, in our original, like, kind of world domination plan, we thought, you know, that was going to take 10 years or whatever.
    1:13:18 But it happened really early on, like, right in fund one.
    1:13:23 And by the time we got to fund three, it was in full effect.
    1:13:27 So, it just happened much faster.
    1:13:30 Now, like, we’re in a whole other world now than we were then.
    1:13:41 But we knew it was, like, as soon as we could beat, you know, in those days, Kleiner, Benchmark, or Sequoia in a deal, that was a very clear indication that we could be top tier.
    1:13:51 Yeah, look, I think it was basically, you know, this is sort of the advice I’d give people, not how to compete in venture, but how to compete in other spaces that are potentially ripe for transformation.
    1:13:53 It’s really, it’s two things we were able to do.
    1:13:54 I think it’s two things.
    1:13:58 One is just, like, having been a customer, you just have a perspective on these things.
    1:14:04 And so, there is a real knowledge advantage: if you’ve been a customer of something, you really understand the shortfalls and the opportunities.
    1:14:05 So, that’s one lens.
    1:14:07 But you actually have, you know, you have to do that.
    1:14:07 Like, I think.
    1:14:11 Yeah, that was a hell of a hard lesson that we had to learn that way.
    1:14:12 It was.
    1:14:15 Building a company is a lot of knowledge gathering.
    1:14:19 Yes, 15 years, 15 years of pain and glory.
    1:14:25 And then, yeah, look, the other thing is, you know, we’ve been talking about this the whole discussion, but the other thing is, you know, to take a structural view of the industry, right?
    1:14:32 Which is like, you know, as we talked about before, but like, these industries are not, the structures are not permanent and timeless.
    1:14:36 Like, you know, just because things work a certain way today doesn’t mean that’s how they’ve always worked.
    1:14:37 In fact, almost certainly that’s not the case.
    1:14:42 Almost certainly the structure of any industry has changed a lot over time as circumstances have changed.
    1:14:48 And then, therefore, the structure of whatever industry is today is not going to be the same structure it’s going to have in 10 or 20 or 30 years.
    1:14:56 But incumbents, especially incumbents that no longer have their founders, incumbents are highly likely to underestimate the amount of structural change and they’re going to have a hard time adapting to it.
    1:15:03 And so, if you adopt a structural approach, you can kind of get a, you know, you can get a little bit of a crystal ball, you know, and then combine that with the customer mindset.
    1:15:07 You can kind of look at a little bit of the crystal ball and say, okay, well, I’m going to, you know, it’s going to kind of change this way.
    1:15:13 And then it’s the gap between the way that the incumbents are currently doing it and the future way that it ought to work.
    1:15:15 I mean, that’s where you have the insertion opportunity.
    1:15:21 There’s a related quote to this, Mark, in a New Yorker profile on you many years ago.
    1:15:28 There’s this quote that says, Mark Andreessen sometimes wonders if Naval Ravikant is onto something, the founder of AngelList.
    1:15:33 He’s asked Horowitz, what if we’re the most evolved dinosaur and Naval is a bird?
    1:15:38 So this was in, this was in the middle we call, this is in the heyday of AngelList.
    1:15:39 That was a good question.
    1:15:40 Yes.
    1:15:45 Well, so first of all, it’s a, it’s a question that’s totally ruined because we now know that the dinosaurs were birds.
    1:15:55 So that, you know, T-Rex is running around with feathers and a beak, which my, you know, my, my 12 year old self is deeply disappointed by, you know, Jurassic Park, you know, the next Jurassic Park reboot is going to be very sad and depressing.
    1:16:11 But, you know, the specific point when I said that, whatever, a decade ago, Ben will recall, was when AngelList was basically aspiring to structurally replace venture the way that we were doing it by having it be, you know, essentially an online marketplace approach.
    1:16:14 And so that, you know, that was one, you know, kind of disruptive opportunity.
    1:16:18 And, you know, by the way, crowdfunding, you know, there’s, there’s a bunch of these and, you know, there, there are cases where that’s worked really well.
    1:16:21 So that, that, that’s one form of structural change.
    1:16:28 The other form of structural change, of course, is like, okay, you know, AI, you know, which, which I wasn’t, didn’t have in mind a decade ago applying to venture.
    1:16:37 But, you know, today you certainly asked that question, which was like, all right, smart guys, like, you know, you’re sitting around and like doing all this analysis and you have all these smart people and they’re doing all this modeling and all this, you know, research and so forth.
    1:16:43 And then like, you know, why can’t you just plug this into, you know, Claude or ChatGPT or Gemini and have it tell you what to invest in?
    1:16:46 And so that, you know, I would say that’s, that’s, that’s the new version of the question.
    1:16:46 Yeah.
    1:16:50 There was also crypto a few years ago or, you know, ICOs or DAOs.
    1:16:51 Oh, yeah. ICOs.
    1:16:52 Is that going to be disrupted?
    1:17:00 I mean, look, ICOs were outlawed, basically, but had ICOs stayed legal, you know, then you have just a totally different, you know, kind of way things happen.
    1:17:05 By the way, it turns out to Ben’s point that the main thing that actually happened was the private markets grew up.
    1:17:15 And so what actually happened, you know, played to the benefit of VCs just through happenstance, I think, in this particular case, which is that firms like ours raised much larger growth funds and, you know, played an even bigger and more important role.
    1:17:20 But like there, there’s absolutely no guarantee in life that the next structural change like that will work on our, on our behalf.
    1:17:26 And so, you know, Ben will tell you, I’m always a little bit of an obsessive paranoid about, you know, what happens when the next change happens.
    1:17:27 Yeah. Yeah. No, it’s interesting.
    1:11:37 And, you know, I would just say AI, like, I think it might eventually be kind of better than us at picking.
    1:17:42 But I would just say that the great thing about venture capital is picking is a small part of the game.
    1:11:46 Who gets to pick is just as important.
    1:11:54 And, you know, how much of that can be done with AI?
    1:18:00 And I think so much of what a venture capital firm is, what are its relationships with the world?
    1:18:03 And, you know, do you get that benefit?
    1:18:07 Because to build a company, you just end up needing a lot of relationships.
    1:18:14 And, you know, and that’s, that’s what I say, like 90% of the activity at the firm is.
    1:12:26 Yeah. And then, Eric, you may know Tyler Cowen has talked about, you know, there is this long-term pattern that actually goes back literally, you know, 400, 500 years of what I think he calls project selectors, project pickers.
    1:12:34 You know, so like, the story’s been told many times, but the origin of the concept of carried interest in the venture capital, private equity world, which is kind of how we get paid,
    1:18:38 It actually, you know, goes all the way back 400 years ago to the whaling industry.
    1:18:41 How much whale can you carry?
    1:18:43 How much, how much whale can you carry?
    1:18:56 And so what would happen is literally you would have these project pickers, you would have basically angel investors in whaling expeditions and a whaling expedition, like in Moby Dick, it’s like literally a ship and a captain and a crew and they’re going to like go out and they’re going to try to like go get a whale and bring it back.
    1:19:03 Right. Like, and like, you know, it’s like, I don’t know, in the early days of whaling, it was like two thirds of the time the ship comes back, you know, the other third of the time the ship doesn’t come back.
    1:19:07 Right. So like, you know, high risk, high return, you know, occupation.
    1:19:22 And then so basically there were these guys who were the money, you know, the capital suppliers, and they would sit in these coffee houses or pubs and then the captains would come in and pitch and they pitch the project and they say, I’m going to buy this ship and I’m going to go to this spot and this is me and my approach and here’s how I’m going to staff my crew.
    1:19:27 And then the project pickers, you know, the financiers had to decide whether to back the captain.
    1:19:31 And then if they did, they give the captain the money to go buy the ship and hire the crew.
    1:19:34 And then if the ship, you know, didn’t come back, they’d lose all their money.
    1:19:41 If the ship came back with a whale, the carry, the carried interest was the 20% of the whale that the captain and the crew got to keep.
    1:19:42 And that was how they got paid.
    1:19:47 Right. And so, but like venture capital, like literally they’re doing venture capital.
    1:19:51 I mean, you know, Queen Isabella did venture capital when she financed, you know, Christopher Columbus, right?
    1:19:52 Exact same thing.
    1:19:55 You know, actually the Puritan founders of America.
    1:19:56 By the way, that paid off like massively.
    1:20:01 It had some negative consequences or side effects, but it was a good investment.
    1:20:03 It was a very good venture bet.
    1:20:10 You know, the original colonists, the original Puritan colonists of, you know, Plymouth Rock, you know, they actually spent 20 years actually exiled in the Netherlands,
    1:20:15 actually essentially raising venture capital, raising money to be able to buy land and come to the U.S. and create the new colonies.
    1:20:23 And so, and then, you know, we’re also describing the process of, you know, what are called A&R people at record labels who pick new music.
    1:20:27 We’re also describing book publishers, you know, who pick new, you know, new novelists.
    1:20:31 We’re also describing movie studio executives who decide what movies get made, right?
    1:20:43 And so, you know, basically what Tyler says, I think, is basically like anytime you have a part of the economy in which you have this, you have an entrepreneur going on a high risk, high return endeavor where it is far from clear what’s going to work.
    1:20:46 And there are many more aspirants to do that than there is money to fund them.
    1:20:50 And it’s this like multifaceted, you know, kind of skill set that’s required to do it.
    1:20:53 And, you know, and then by the way, funding them, to Ben’s point, you’re not just funding them.
    1:20:57 Like you have to then actually work with them to help them actually execute the entire project.
    1:20:59 Like that’s, that’s art.
    1:21:00 Like that, that’s not science.
    1:21:01 That’s art.
    1:21:04 Like we would, we would, we would like it to be science, but like it’s art.
    1:21:08 And by the way, how do we know that it’s, how do we know that it’s art and not science?
    1:21:15 Every great venture capitalist in the last 70 years has missed most of the great companies of his generation, right?
    1:21:22 Like, so the great VCs have a success, you know, record of getting, I don’t know, two out of 10 or something of the great companies of the decade, right?
    1:21:27 And so like, if like, and that was true of all these guys, all the legends, you know, that I mentioned earlier.
    1:21:32 And so, you know, if it was a science, you could eventually have somebody who just like dials in and gets eight out of 10.
    1:21:34 But in the real world, it’s not like that.
    1:21:37 You know, you’re in the fluke business.
    1:21:40 And so there’s an intangibility to it.
    1:21:43 There’s a taste aspect, the human relationship aspect, the psychology.
    1:21:45 By the way, a lot of it is psychological analysis.
    1:21:47 Like who are these people?
    1:21:48 How do they react under pressure?
    1:21:50 How do you keep them from falling apart?
    1:21:52 How do you, you know, how do you keep them from going crazy?
    1:21:53 How do you keep from going crazy yourself?
    1:21:56 You know, you, you end up being a psychologist half the time.
    1:21:59 And so like, it is possible.
    1:22:02 I don’t want to be definitive, but like, it’s possible that that is quite literally timeless.
    1:22:07 And when, you know, when the AIs are doing everything else, like that may be one of the last remaining fields that, that people are still doing.
    1:22:08 Yeah.
    1:22:16 Ever since I co-founded a firm in 2016, but I’m sure before that too, people were talking about how software was going to disrupt venture completely.
    1:22:22 And whether it was crypto or whether it was AI or something else, it, well, the asset class has changed in a bunch of the ways that we described.
    1:22:30 It hasn’t been sort of fundamentally disrupted in the same way that we think about disruptive innovation or the Clayton Christensen term, perhaps, as in other industries.
    1:22:30 Yeah.
    1:22:31 Not yet.
    1:22:32 Yeah.
    1:22:32 Not yet.
    1:12:40 But it could, but again, you know, it could, but we could be doing a podcast, you know, next year and be like, oh, oh, hell.
    1:12:49 Well, this is great to get some of the history of the firm, and in future episodes, we’ll talk about where we’re going, among other topics. The Ben & Marc Show is back.
    1:22:50 Mark, Ben, thanks so much.
    1:22:51 Yes.
    1:22:51 Okay.
    1:22:52 Thank you.
    1:22:52 Yep.
    1:22:53 And welcome.
    1:22:53 Welcome, Eric.
    1:22:54 Yeah.
    1:22:54 Welcome, Eric.

    On this episode, taken from The Ben & Marc Show, a16z co-founders Marc Andreessen and Ben Horowitz dive deep into the unfiltered story behind the founding of Andreessen Horowitz—and how they set out to reinvent venture capital itself. 

    For the first time, Marc and Ben walk through the origins, strategy, and philosophy behind building a world-class venture capital firm designed for the future—not just the next fund. They reveal how they broke industry norms with a bold brand, a full-stack support model, and a long-term commitment to backing exceptional builders—anchored in the radical idea that founders deserved real support, not just checks. 

    Joining them to guide the conversation is Erik Torenberg—Andreessen Horowitz’s newest General Partner—who makes his Ben & Marc Show moderating debut. Erik is a technology entrepreneur, investor, and founder of the media company Turpentine.

    Together, they explore: 

    – Why traditional VC needed reinvention 

    – How a16z scaled with a platform model, not a partner model 

    – The “barbell strategy” reshaping venture capital today 

    – Why venture remains a human craft, even in the age of AI 

    Timecodes: 

    00:00 – Intro 

    01:00 – Why Traditional Venture Capital Was Broken 

    03:05 – Marc on Discovering VC and Its Legends 

    05:12 – Surviving the Dot-Com Crash and Angel Investing Collapse 

    07:05 – Helping Founders Raise Venture / Fix VC Relationships 

    08:47 – The a16z Strategy: Building a Support Platform 

    12:07 – First Fund Wins: Skype, Instagram, Slack, Okta 

    12:50 – Building a ‘World-Dominating Monster’

    15:00 – The Sushi Boat VC Problem 

    18:07 – Treating LPs Differently 

    21:40 – Marc and Ben’s Working Relationship 

    23:30 – Updating a16z’s Media Strategy for the Social Era 

    27:20 – History of the Decentralized Media Environment

    30:36 – Decline of Corporate Brands and Going Direct 

    36:06 – Naming the Firm 

    40:13 – Building the a16z ‘Cinematic Universe’ of Talent 

    42:16 – Creating a Federated Model 

    51:02 – Deciding to Market the Firm 

    53:26 – Recruiting General Partners 

    56:33 – Evolution to Full-Stack Companies 

    01:03:53 – The Barbell Theory: The Death of Mid-Sized VCs

    01:11:50 – Why Venture Capital Should Stay Overfunded 

    01:19:50 – When a16z Knew It Could Be Top Tier 

    01:25:58 – Venture Capital is an Art, Not a Science

    Resources:

    Marc on X: https://twitter.com/pmarca 

    Marc’s Substack: https://pmarca.substack.com/

    Ben on X: https://twitter.com/bhorowitz 

    Erik on X: https://x.com/eriktorenberg 

    Erik’s Substack: https://eriktorenberg.substack.com/

  • Enabling Agents and Battling Bots on an AI-Centric Web

    AI transcript
    0:00:05 50% of traffic is already bots, it’s already automated
    0:00:07 and agents are only really just getting going.
    0:00:10 Most people are not using these computer use agents
    0:00:13 because they’re too slow right now, they’re still at previews
    0:00:16 but it’s clear that’s where everything is going.
    0:00:19 Then we’re going to see an explosion in the traffic
    0:00:21 that’s coming from these tools and just blocking them
    0:00:24 just because they’re AI is the wrong answer.
    0:00:27 You’ve really got to understand why you want them,
    0:00:29 what they’re doing, who they’re coming from
    0:00:30 and then you can create these granular rules.
    0:00:34 AI agents are changing how people interact with the web
    0:00:36 but most sites still treat them like bots.
    0:00:40 In this episode, taken from the AI plus A16Z podcast,
    0:00:44 A16Z partner Joel De La Garza talks with ArcJet CEO David Mytton
    0:00:47 about building internet infrastructure for this new era.
    0:00:49 Here’s Derrick to kick things off.
    0:00:55 Thanks for listening to the A16Z AI podcast.
    0:00:56 If you’ve been listening for a while
    0:00:58 or if you’re at all plugged into the world of AI,
    0:01:00 you’ve no doubt heard about AI agents
    0:01:03 and all the amazing things they theoretically can do.
    0:01:04 But there’s a catch.
    0:01:07 When it comes to engaging with websites,
    0:01:10 agents are limited by what any given site allows them to do.
    0:01:14 If, for example, a site tries to limit all non-human interactions
    0:01:17 in an attempt to prevent unwanted bot activity,
    0:01:20 it might also prevent an AI agent from working on a customer’s behalf,
    0:01:24 say, making a reservation, signing up for a service, or buying a product.
    0:01:29 This broad strokes approach to site security is incompatible with the idea of what some call
    0:01:35 agent experience, an approach to web and product design that treats agents as first-class users.
    0:01:41 In this episode, A16Z infra partner Joel De La Garza dives into this topic with David Mytton,
    0:01:46 the CEO of ArcJet, a startup building developer-native security for modern web frameworks,
    0:01:50 including attack detection, sign-up spam prevention, and bot detection.
    0:01:54 Their discussion is short, sweet, and very insightful.
    0:01:56 And you’ll hear it after these disclosures.
    0:02:01 As a reminder, please note that the content here is for informational purposes only,
    0:02:05 should not be taken as legal, business, tax, or investment advice,
    0:02:08 or be used to evaluate any investment or security,
    0:02:14 and is not directed at any investors or potential investors in any A16Z fund.
    0:02:19 For more details, please see A16Z.com slash disclosures.
    0:02:22 It seems like what once was old is new again.
    0:02:28 And we’d love to get your thoughts on this new emergence of bots
    0:02:31 and how, while we know all the bad things that happen with them,
    0:02:33 there’s actually a lot of good and really cool stuff that’s happening
    0:02:35 and how we can maybe work towards enabling that.
    0:02:37 Right, well, things have changed, right?
    0:02:40 The DDoS problem is still there,
    0:02:43 but it’s just almost handled as a commodity these days.
    0:02:46 The network provider, your cloud provider,
    0:02:47 they’ll just deal with it.
    0:02:49 And so when you’re deploying an application,
    0:02:51 most of the time you just don’t have to think about it.
    0:02:54 The challenge comes when you’ve got traffic
    0:02:57 that just doesn’t fit those filters.
    0:02:59 It looks like it could be legitimate,
    0:03:01 or maybe it is legitimate,
    0:03:02 and you just have a different view
    0:03:04 about what kind of traffic you want to see.
    0:03:06 And so the challenge is really about
    0:03:08 how do you distinguish between the good bots and the bad bots?
    0:03:11 And then with AI changing things,
    0:03:15 it’s bots that might even be acting on behalf of humans, right?
    0:03:17 It’s no longer a binary decision.
    0:03:21 And as the amount of traffic from bots increases,
    0:03:23 in some cases, the majority of traffic
    0:03:26 that sites are receiving is from an automated source.
    0:03:29 And so the question for site owners is,
    0:03:31 well, what kind of traffic do you want to allow?
    0:03:33 And when it’s automated,
    0:03:36 what kind of automated traffic should come to your site?
    0:03:38 And what are you getting in return for that?
    0:03:41 And in the old days, I mean, I guess the old providers,
    0:03:43 we’ll say, the legacy providers in this space,
    0:03:46 like it was very much using a hammer, right?
    0:03:50 So they would say, hey, if this IP address is coming in,
    0:03:51 it’s probably a bot.
    0:03:53 Or they would say, if this user agent is coming in,
    0:03:54 it’s probably a bot.
    0:03:55 Very imprecise.
    0:03:57 And I think the downside of that is that
    0:03:59 you probably blocked a lot of legitimate traffic
    0:04:01 along with the illegitimate traffic.
    0:04:04 And now there are very real consequences
    0:04:06 because some of these AI bots could be acting
    0:04:08 on behalf of actual users
    0:04:10 who are looking to purchase your products.
    0:04:11 This is the challenge.
    0:04:13 So a volumetric DDoS attack,
    0:04:15 you just want to block that at the network.
    0:04:16 You never want to see that traffic.
    0:04:20 But everything else needs the context of the application.
    0:04:21 You need to know where in the application
    0:04:23 the traffic is coming to.
    0:04:25 You need to know who the user is, the session,
    0:04:27 and to understand in which case
    0:04:28 you want to allow or deny that.
    0:04:31 And so this is the real issue for developers,
    0:04:34 for site owners, for security teams,
    0:04:36 is to make those really nuanced decisions
    0:04:40 to understand whether the traffic should be allowed or not.
    0:04:43 And the context of the application itself is so important
    0:04:44 because it depends on the site.
    0:04:46 If you’re running an e-commerce operation,
    0:04:47 an online store,
    0:04:50 the worst thing you can do is block a transaction
    0:04:51 because then you’ve lost the revenue.
    0:04:54 Usually you want to then flag that order for review.
    0:04:57 A human customer support person is going to come in
    0:04:59 and determine based on various signals
    0:05:01 whether to allow it.
    0:05:03 And if you just block that at the network,
    0:05:05 then your application will never see it.
    0:05:08 You never even know that that order failed in some way.
    0:05:11 There’s been a lot of media releases
    0:05:14 about companies that have released solutions in this space.
    0:05:17 But largely they were based on sort of those
    0:05:19 old kind of approaches using network telemetry.
    0:05:23 Is that generally how they’re working now?
    0:05:26 Or is there some other capabilities that they’ve released?
    0:05:28 Because they give them AI names
    0:05:29 and you just immediately assume
    0:05:30 that they’re doing something fancy.
    0:05:31 That’s right, yeah.
    0:05:32 So blocking on the network
    0:05:36 is basically how the majority of these old school products work.
    0:05:39 They do analysis before the traffic reaches your application
    0:05:42 and then you never know what the result of that was.
    0:05:44 And that just doesn’t fly anymore.
    0:05:47 It’s insufficient for being able to build modern applications.
    0:05:49 Particularly with AI coming in
    0:05:51 where something like OpenAI
    0:05:54 has four or five different types of bots
    0:05:56 and some of them you might want to make
    0:05:58 a more restrictive decision over.
    0:06:00 But then others are going to be taking actions
    0:06:01 on behalf of a user search.
    0:06:05 And we’re seeing lots of different applications
    0:06:06 getting more signups.
    0:06:08 Businesses actually getting higher conversions
    0:06:10 as a result of this AI traffic.
    0:06:13 And so just blocking anything that is called AI
    0:06:15 is too blunt of an instrument.
    0:06:17 You need much more nuance.
    0:06:18 And the only way you can do that
    0:06:20 is with the application context,
    0:06:22 understanding what’s going on inside your code.
    0:06:24 I mean, I’d say we’re seeing across the industry
    0:06:27 that AI is driving incredible amounts
    0:06:28 of new revenue to companies.
    0:06:30 And if you use an old world tool
    0:06:31 to just block any of that traffic,
    0:06:33 you’re probably dooming your business.
    0:06:33 That’s right.
    0:06:36 Or you’re putting it into some kind of maze
    0:06:37 where it’s seeing irrelevant content.
    0:06:39 And then by doing that,
    0:06:41 you are kind of downranking your site
    0:06:44 because the AI crawler is never going to come back.
    0:06:46 It’s kind of like blocking Google
    0:06:47 from visiting your site.
    0:06:49 It’s like, yeah, if Google doesn’t get in,
    0:06:51 then you’re no longer in Google’s index.
    0:06:54 And so anyone searching
    0:06:56 is not going to find you as a result.
    0:06:59 Well, and I believe we had sort of standards
    0:07:00 in the old days that developed
    0:07:03 or quasi standards like robots.txt, right?
    0:07:04 Which would tell the crawlers, like,
    0:07:06 hey, don’t crawl these directories.
    0:07:08 Are we doing something similar
    0:07:09 for this new age agentic world?
    0:07:13 So robots.txt is still the starting place.
    0:07:15 And it’s kind of a voluntary standard.
    0:07:19 It emerged several decades ago now.
    0:07:20 It’s been around a long time.
    0:07:22 Bots have been a problem for a long time.
    0:07:24 And the idea is you describe
    0:07:25 the areas of your application
    0:07:29 and tell any robot that’s coming to your site
    0:07:31 whether you want to allow that robot
    0:07:33 to access that area of the site or not.
    0:07:35 And you could use that to control
    0:07:36 the rollout of new content.
    0:07:39 You could protect certain pages of your site
    0:07:40 that you just don’t want to be indexed
    0:07:41 for whatever reason.
    0:07:43 And you can also point the crawler
    0:07:44 to where you do want it to go.
    0:07:46 You can use the sitemap for that as well.
    0:07:49 But the robots.txt file format
    0:07:50 has evolved over time
    0:07:53 to provide these signals
    0:07:55 to crawlers like the search engines
    0:07:56 from Google and so on.
    0:07:59 The challenge with that is it’s voluntary
    0:08:01 and there’s no enforcement of it.
    0:08:04 So you’ve got good bots like Googlebot
    0:08:05 that will follow the standard
    0:08:07 and you’ll be able to have full control
    0:08:08 over what it does.
    0:08:11 But there are newer bots that are ignoring it
    0:08:13 or even sometimes using it as a way
    0:08:15 to find the parts of your site
    0:08:16 that you don’t want it to access
    0:08:18 and they will just do that anyway.
    0:08:20 And so this becomes a control problem
    0:08:21 for the site owner.
    0:08:23 And you really want to be able to understand
    0:08:26 not just what the list of rules are
    0:08:27 but how they are enforced.
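    The voluntary standard discussed here can be sketched with Python’s standard-library parser. The policy below is hypothetical (the bot names are real crawler names, but the rules are made up for illustration): a training crawler is disallowed everywhere, a search crawler is kept out of one directory, and everything else falls through to a default rule.

    ```python
    # Sketch: checking what a hypothetical robots.txt policy allows,
    # using Python's standard-library parser. Rules are illustrative.
    from urllib.robotparser import RobotFileParser

    ROBOTS_TXT = """\
    User-agent: GPTBot
    Disallow: /

    User-agent: Googlebot
    Disallow: /private/

    User-agent: *
    Disallow: /admin/
    """

    parser = RobotFileParser()
    parser.parse(ROBOTS_TXT.splitlines())

    # A training crawler is told to stay out entirely...
    print(parser.can_fetch("GPTBot", "/blog/post"))      # False
    # ...while a search crawler may index public pages.
    print(parser.can_fetch("Googlebot", "/blog/post"))   # True
    print(parser.can_fetch("Googlebot", "/private/x"))   # False
    ```

    As the conversation notes, nothing enforces this: a misbehaving bot can simply skip the check, which is why the layered detection discussed later exists.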
    0:08:28 Totally.
    0:08:30 Maybe it’d be great to walk through
    0:08:32 what these agents are.
    0:08:34 Maybe get some more understanding
    0:08:36 of sort of how they operate,
    0:08:38 what people are using them for,
    0:08:40 perhaps go through a couple of the use cases.
    0:08:42 And then it’d be great to understand
    0:08:44 sort of like how you do control it
    0:08:47 because it seems like a far more complicated problem
    0:08:48 than just bad IP addresses.
    0:08:48 Right.
    0:08:51 So if we think about OpenAI as an example,
    0:08:53 because they have four or five different crawlers,
    0:08:56 they all have different names
    0:08:57 and they all identify themselves
    0:08:58 in different ways.
    0:09:01 So one actually is crawling to train
    0:09:03 the OpenAI models on your site.
    0:09:05 And that’s the one that probably everyone
    0:09:07 is thinking about when they’re thinking about
    0:09:09 I want to block AI, the training.
    0:09:11 And you have different philosophical approaches
    0:09:12 to how you want to be included
    0:09:14 in the training data.
    0:09:15 The others are more nuanced
    0:09:17 and will require more thought.
    0:09:19 So there’s one that will go out
    0:09:23 when a user is typing something into the chat
    0:09:24 and is asked a question
    0:09:26 and OpenAI will go out and search.
    0:09:29 It’s built up its own search index.
    0:09:31 And so that’s equivalent of Googlebot.
    0:09:33 You probably want to be in that index
    0:09:35 because as we’re seeing,
    0:09:37 sites are getting more signups,
    0:09:37 they’re getting more traffic.
    0:09:40 The discovery process is being part
    0:09:42 of just another search index is super important.
    0:09:42 Gotcha.
    0:09:44 So like when I ask OpenAI,
    0:09:46 when is John F. Kennedy’s birthday?
    0:09:47 If it doesn’t know the answer,
    0:09:48 it goes out and searches the web.
    0:09:49 Yeah, that’s right.
    0:09:50 Or if it’s trying to get open hours
    0:09:51 for something,
    0:09:53 it might go to a website for a cafe or whatever
    0:09:55 and parse it and then return the results.
    0:09:57 So that’s really just like a classic
    0:09:58 search engine crawler
    0:10:01 except it’s kind of happening behind the scenes.
    0:10:02 The other one is something
    0:10:04 that’s happening in real time.
    0:10:06 So you might give the agent
    0:10:07 a specific URL
    0:10:09 and go and ask it to summarize it
    0:10:11 or to look up a particular question
    0:10:13 in the docs for a developer tool
    0:10:14 or something like that.
    0:10:15 And then that’s a separate agent
    0:10:16 that will go out,
    0:10:17 it will read the website
    0:10:18 and then it will return
    0:10:20 and answer the query.
    0:10:22 For both of these two examples,
    0:10:23 OpenAI and others
    0:10:26 are now starting to cite those sources.
    0:10:27 And you’ll regularly see,
    0:10:28 and this is kind of the recommendation,
    0:10:31 is you get the result from the AI tool
    0:10:33 but you shouldn’t trust it 100%.
    0:10:34 You go and then verify
    0:10:35 and you look at the docs.
    0:10:37 And maybe it’s like
    0:10:38 when you used to go to Wikipedia
    0:10:39 and you’d read the summary
    0:10:40 and then you’d look at the references
    0:10:41 and you’d go to all the references
    0:10:43 and check to make sure
    0:10:44 what had been summarized
    0:10:45 was actually correct.
    0:10:46 But all three of those examples,
    0:10:48 you clearly could see
    0:10:48 why you would want them
    0:10:49 accessing your site.
    0:10:50 Right.
    0:10:52 Why like blocking all of OpenAI’s crawlers
    0:10:53 is probably a very bad idea.
    0:10:54 Yeah, it’s too blunt.
    0:10:55 It’s too blunt an instrument.
    0:10:56 You need to be able to distinguish
    0:10:57 each one of these
    0:10:59 and determine which parts of your site
    0:11:00 you want them to get into.
    0:11:03 And this then comes to the fourth one
    0:11:04 which is the actual agent.
    0:11:06 This is the end agent,
    0:11:08 the computer operator type feature.
    0:11:09 Headless web browsers.
    0:11:11 Headless web browsers, yeah.
    0:11:12 But even a web browser,
    0:11:12 a full web browser
    0:11:14 operating inside a VM.
    0:11:16 And those are the ones
    0:11:17 that require more nuance
    0:11:20 because maybe you’re booking a ticket
    0:11:22 or doing some research
    0:11:23 and you do want the agent
    0:11:24 to take actions on your behalf.
    0:11:26 Maybe it’s going to your email inbox
    0:11:27 and triaging things.
    0:11:30 From the application builder’s perspective,
    0:11:32 that’s probably a good thing.
    0:11:33 You want more transactions,
    0:11:35 you want more usage of your application.
    0:11:37 But there are examples
    0:11:39 where it might be a bad action.
    0:11:40 So for example,
    0:11:41 if you’re building a tool
    0:11:42 that is going to try and
    0:11:44 buy all of the concert tickets
    0:11:46 and then sell them on later,
    0:11:47 that becomes a problem
    0:11:48 for the concert seller
    0:11:50 because they don’t want to do that.
    0:11:50 They want the true fans
    0:11:52 to be able to get access to those.
    0:11:53 And again, you need the nuance.
    0:11:55 Maybe you allow the bot
    0:11:56 to go to the homepage
    0:11:57 and sit in a queue.
    0:11:58 But then when you get
    0:11:59 to the front of the queue,
    0:12:00 you want the human
    0:12:01 to actually make the purchase
    0:12:02 and you want to rate limit that
    0:12:03 so that maybe the human
    0:12:04 can only purchase,
    0:12:05 let’s say, five tickets.
    0:12:06 You don’t want them
    0:12:07 to purchase 500 tickets.
    0:12:08 And so this gets into
    0:12:10 the real details of the context,
    0:12:11 each one,
    0:12:12 about what you might want to allow
    0:12:13 and what you might want to restrict.
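    The ticket-queue example above (let the bot queue, but cap what any one buyer can purchase) comes down to a per-buyer counter. This is a minimal sketch; the class name, buyer IDs, and the five-ticket limit are all made up to mirror the numbers in the conversation.

    ```python
    # Sketch of the per-human purchase cap described above: a hypothetical
    # checkout guard that rate-limits purchases so one buyer can't walk
    # away with 500 tickets. All names and limits are illustrative.
    from collections import defaultdict

    MAX_TICKETS_PER_BUYER = 5  # hypothetical cap from the example

    class TicketCounter:
        def __init__(self, limit: int):
            self.limit = limit
            self.purchased = defaultdict(int)

        def try_purchase(self, buyer_id: str, qty: int) -> bool:
            """Allow the purchase only if it keeps the buyer under the cap."""
            if self.purchased[buyer_id] + qty > self.limit:
                return False
            self.purchased[buyer_id] += qty
            return True

    counter = TicketCounter(MAX_TICKETS_PER_BUYER)
    print(counter.try_purchase("alice", 4))   # True: within the cap
    print(counter.try_purchase("alice", 2))   # False: would exceed 5
    print(counter.try_purchase("bob", 5))     # True: caps are per buyer
    ```

    A production version would key on a verified identity or session rather than a raw string, and expire counts per sales window, but the allow/deny decision has this shape.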
    0:12:15 That’s incredibly complicated.
    0:12:16 I mean, if I remember back
    0:12:18 why we made a lot
    0:12:18 of the decisions we made
    0:12:19 on blocking bots
    0:12:21 was strictly because of scale.
    0:12:23 So, you know,
    0:12:25 you’ve got 450,000 IP addresses
    0:12:26 sending you terabits of traffic
    0:12:27 through a link
    0:12:28 that only can do gigabit
    0:12:30 and you’ve got to just
    0:12:31 start dropping stuff, right?
    0:12:32 And you take, you know,
    0:12:34 it’s the battlefield triage
    0:12:35 of the wounded, right?
    0:12:36 It’s like some of you
    0:12:37 aren’t going to make it
    0:12:39 and it becomes a little brutal.
    0:12:40 That sounds incredibly sophisticated.
    0:12:43 How do you do that sort of
    0:12:44 fine-grained control
    0:12:45 of traffic flow
    0:12:47 at internet scale?
    0:12:48 So this is about
    0:12:49 building up layers of protections.
    0:12:51 So you start with the robots.txt,
    0:12:53 just managing the good bots.
    0:12:54 Then you look at IPs
    0:12:56 and start understanding,
    0:12:57 well, where’s the traffic coming from?
    0:12:58 In an ideal scenario,
    0:12:59 you have one user per IP address,
    0:13:00 but we all know that
    0:13:02 that doesn’t happen.
    0:13:02 That never happens.
    0:13:04 And so you can start to build up
    0:13:05 databases of reputation
    0:13:06 around the IP address
    0:13:08 and you can access
    0:13:09 the underlying metadata
    0:13:10 about that address
    0:13:11 knowing which country
    0:13:12 it’s coming from
    0:13:13 or which network it belongs to.
    0:13:15 And then you can start
    0:13:16 building up these decisions
    0:13:17 thinking, well,
    0:13:18 we shouldn’t really be getting
    0:13:20 traffic from a data center
    0:13:22 for our signup page.
    0:13:23 And so we could block
    0:13:24 that network.
    0:13:26 But it becomes more challenging
    0:13:27 if we have that agent example.
    0:13:30 The agent with a web browser
    0:13:31 or headless browser
    0:13:31 is going to be running
    0:13:32 on a server somewhere.
    0:13:33 It’s probably in a data center.
    0:13:35 And then you have
    0:13:36 the compounding factor
    0:13:37 of the abusers
    0:13:39 will purchase access
    0:13:39 to proxies
    0:13:41 which run on residential
    0:13:41 IP addresses.
    0:13:43 So you can’t easily rely
    0:13:44 on the fact
    0:13:45 that it’s part of
    0:13:47 a home ISP block anymore.
    0:13:48 And so you have to build up
    0:13:49 these patterns
    0:13:50 understanding the reputation
    0:13:51 of the IP address.
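    The “no data-center traffic on the signup page” rule described here is a membership check against known network ranges. A minimal sketch, assuming you have some reputation feed of data-center networks; the ranges below are documentation/test prefixes, not a real feed.

    ```python
    # Sketch of an IP-reputation rule: deny signups from networks
    # classified as data centers. The ranges are placeholders
    # (RFC 5737 documentation prefixes), not a real reputation feed.
    import ipaddress

    DATA_CENTER_NETWORKS = [
        ipaddress.ip_network("203.0.113.0/24"),   # hypothetical cloud range
        ipaddress.ip_network("198.51.100.0/24"),  # hypothetical hosting range
    ]

    def allow_signup(client_ip: str) -> bool:
        """Allow the signup unless the client IP sits in a data-center block."""
        ip = ipaddress.ip_address(client_ip)
        return not any(ip in net for net in DATA_CENTER_NETWORKS)

    print(allow_signup("203.0.113.45"))  # False: data-center IP, blocked
    print(allow_signup("192.0.2.10"))    # True: not in a flagged range
    ```

    As the discussion notes, residential proxies undermine this signal on its own, which is why it is only one layer among several.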
    0:13:52 Then you have
    0:13:53 the user agent string
    0:13:55 that is basically
    0:13:56 a free text field
    0:13:57 that you can fill in
    0:13:58 with whatever you like.
    0:13:58 There is kind of
    0:13:59 a standard there,
    0:14:00 but the good bots
    0:14:01 will tell you who they are.
    0:14:02 It’s been surprising
    0:14:04 getting into the details
    0:14:05 of this how many bots
    0:14:06 actually tell you
    0:14:06 who they are.
    0:14:07 And so you can block
    0:14:07 a lot of them
    0:14:08 just on that heuristic
    0:14:10 combined with the IP address.
    0:14:11 Or allow them.
    0:14:12 Or allow them.
    0:14:13 Yeah, I’m the shopping bot
    0:14:13 from OpenAI.
    0:14:14 Right.
    0:14:14 Come on in,
    0:14:15 buy some stuff.
    0:14:15 Exactly.
    0:14:16 And Googlebot,
    0:14:16 OpenAI,
    0:14:18 they tell you who they are
    0:14:19 and then you can verify that
    0:14:20 by doing a reverse DNS
    0:14:21 lookup on the IP address.
    0:14:22 So even though
    0:14:23 you might be able
    0:14:24 to pretend to be Googlebot,
    0:14:25 you can check to make sure
    0:14:26 that that’s the case or not
    0:14:28 with very low latency lookups.
    0:14:30 So we can verify that,
    0:14:30 yes, this is Google,
    0:14:31 I want to allow them.
    0:16:33 Yes, this is the OpenAI bot
    0:14:35 that is doing the search indexing,
    0:14:36 I want to allow that.
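    The verification just described is forward-confirmed reverse DNS: resolve the IP to a hostname, check it belongs to a trusted domain, then resolve that hostname back and confirm it returns the same IP. This sketch injects the resolver functions so the logic can be shown without live DNS; the stubbed hostname follows Google’s published pattern but is illustrative.

    ```python
    # Sketch of forward-confirmed reverse DNS for crawler verification.
    # Resolver functions are injectable so the flow can run without
    # network access; by default they use real socket lookups.
    import socket

    def verify_crawler(ip, allowed_suffixes,
                       reverse=lambda ip: socket.gethostbyaddr(ip)[0],
                       forward=lambda host: socket.gethostbyname(host)):
        try:
            hostname = reverse(ip)              # e.g. crawl-....googlebot.com
        except OSError:
            return False
        if not hostname or not hostname.endswith(tuple(allowed_suffixes)):
            return False                        # not a domain we trust
        try:
            return forward(hostname) == ip      # forward-confirm the claim
        except OSError:
            return False

    # Stubbed lookups standing in for real DNS answers:
    fake_reverse = {"66.249.66.1": "crawl-66-249-66-1.googlebot.com"}.get
    fake_forward = {"crawl-66-249-66-1.googlebot.com": "66.249.66.1"}.get

    print(verify_crawler("66.249.66.1", [".googlebot.com"],
                         reverse=fake_reverse, forward=fake_forward))  # True
    print(verify_crawler("1.2.3.4", [".googlebot.com"],
                         reverse=fake_reverse, forward=fake_forward))  # False
    ```

    The forward-confirm step is what stops a spoofer: anyone can claim a Googlebot user agent, but they cannot make Google’s DNS point a googlebot.com hostname at their IP.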
    0:14:37 The next level from that
    0:14:39 is building up fingerprints
    0:14:40 and fingerprinting
    0:14:41 the characteristics
    0:14:42 of the request.
    0:14:43 And this started
    0:14:45 with the JA3 hash
    0:14:45 which was invented
    0:14:46 at Salesforce
    0:14:47 and has now been developed
    0:14:48 into a JA4.
    0:14:50 Some of these algorithms
    0:14:50 are open source,
    0:14:52 some of them are not.
    0:14:53 So essentially you take
    0:14:53 all of the metrics
    0:14:54 around a session
    0:14:55 and you create a hash of it
    0:14:56 and then you stick it
    0:14:56 in a database.
    0:14:57 Exactly.
    0:14:58 And you look for matches
    0:14:58 to that hash.
    0:14:59 You look for matches
    0:15:00 and then the idea
    0:15:01 is that the hash
    0:15:02 will change based
    0:15:03 on the client
    0:15:04 so you can allow
    0:15:05 or deny certain clients
    0:15:06 but if you have
    0:15:07 a huge number
    0:15:07 of those clients
    0:15:08 all spamming you
    0:15:09 then they all
    0:15:10 look the same,
    0:15:10 they all have
    0:15:11 the same fingerprint
    0:15:12 and you can just
    0:15:13 block that fingerprint.
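    The hash-then-count idea can be sketched in a few lines. This is in the spirit of JA3/JA4, not the real algorithms: hash the stable characteristics of a client into one string, count occurrences, and block fingerprints that flood you. The field values and the threshold are made up. (JA3 does use an MD5 digest, which is why it appears here despite being unsuitable for security-grade hashing elsewhere.)

    ```python
    # Sketch of fingerprint-and-count blocking, loosely modeled on the
    # JA3 idea: hash a client's TLS characteristics into a stable ID,
    # then block IDs that exceed a request threshold. Values are made up.
    import hashlib
    from collections import Counter

    def fingerprint(tls_version, ciphers, extensions):
        """Hash the client's handshake characteristics into a stable ID."""
        raw = f"{tls_version}|{','.join(ciphers)}|{','.join(extensions)}"
        return hashlib.md5(raw.encode()).hexdigest()

    seen = Counter()
    BLOCK_THRESHOLD = 100  # hypothetical requests-per-window cutoff

    def should_block(fp: str) -> bool:
        seen[fp] += 1
        return seen[fp] > BLOCK_THRESHOLD

    bot_fp = fingerprint("771", ["4865", "4866"], ["0", "10", "11"])
    for _ in range(150):
        blocked = should_block(bot_fp)
    print(blocked)  # True: identical clients past the threshold get blocked
    ```

    The point of the hash is exactly what the conversation says: a thousand distributed clients on different IPs still collapse to one fingerprint if they run the same software, so one rule covers all of them.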
    0:15:14 So this is almost like
    0:15:15 if you think of
    0:15:17 I always think of things
    0:15:18 in terms of the classic
    0:15:18 sort of network stack
    0:15:20 like layer 0 up to layer 7
    0:15:22 like this is almost like
    0:15:24 layer 2 level identity
    0:15:25 for devices, right?
    0:15:26 Right.
    0:15:27 It’s looking at the TLS
    0:15:28 handshake on the network level
    0:15:30 and then you can go up
    0:15:31 the layers
    0:15:32 and there’s one called
    0:15:33 the JA4H
    0:15:35 which looks at the HTTP headers
    0:15:37 and the earlier versions
    0:15:39 of this would be working
    0:15:40 on the ordering
    0:15:40 of the headers
    0:15:41 for instance
    0:15:42 so an easy way
    0:15:43 to work around it
    0:14:44 is just to reorder
    0:14:44 the headers.
    0:14:46 The hashing has improved
    0:15:47 over time
    0:15:48 so that even changing
    0:15:49 the ordering of the headers
    0:15:51 doesn’t change the hash
    0:15:52 and the idea is
    0:15:53 that you can then combine
    0:15:54 all of these different signals
    0:15:56 to try and come to a decision
    0:15:57 about who you think
    0:16:00 is basically making the request
    0:16:01 and if it’s malicious
    0:16:02 you can block it
    0:16:02 based on that
    0:16:03 and if it’s someone
    0:16:04 that you want to allow
    0:16:05 then you can do so.
    0:16:05 And this is before
    0:16:06 you even get into
    0:16:08 kind of the user level
    0:16:09 what’s actually happening
    0:16:10 in the application, right?
    0:16:10 That’s right, yeah.
    0:16:12 So this is the logic
    0:16:13 on top of that
    0:16:14 because you have to identify
    0:16:14 who it is first
    0:16:16 before you apply the rules
    0:16:17 about what you want them to do.
    0:16:18 Gotcha, so it’s almost like
    0:16:19 you’re adding an authentication layer
    0:16:21 or an identity layer
    0:16:22 to sort of the transport side.
    0:16:23 That’s right, yeah.
    0:16:25 Or the application side
    0:16:26 I guess you should say.
    0:16:27 Yeah, the application, yeah.
    0:16:29 But it’s throughout the whole stack
    0:16:30 the whole OSI model
    0:16:31 and the idea is
    0:16:32 you have this
    0:16:33 consistent fingerprint
    0:16:34 that you can then
    0:16:36 apply these rules to
    0:16:36 and identity
    0:16:37 kind of layers
    0:16:38 on top of that
    0:16:39 and we’ve seen
    0:16:40 some interesting developments
    0:16:42 in fingerprinting
    0:16:43 and providing signatures
    0:16:45 based on
    0:16:45 who the request
    0:16:46 is coming from.
    0:16:47 So a couple of years ago
    0:16:48 Apple announced
    0:16:49 Privacy Pass
    0:16:52 which is a hash
    0:16:52 that is attached
    0:16:54 to every request
    0:16:54 you make
    0:16:55 if you’re in the
    0:16:55 Apple ecosystem
    0:16:56 using Safari
    0:16:57 on iPhone
    0:16:58 or on Mac
    0:16:59 then there is a way
    0:17:01 to authenticate
    0:17:01 that the request
    0:17:02 is coming from
    0:17:03 an individual
    0:17:04 who has a subscription
    0:17:05 to iCloud
    0:17:06 and Apple has
    0:17:07 their own fraud analysis
    0:17:08 to allow you
    0:17:09 to subscribe to iCloud
    0:17:09 so it’s a very
    0:17:11 easy assumption
    0:17:11 to make
    0:17:12 that if you have
    0:17:12 a subscription
    0:17:14 and this signature
    0:17:15 is verified
    0:17:16 then you’re a real person.
    0:17:17 There’s a new one
    0:17:18 that Cloudflare
    0:17:19 recently published
    0:17:21 around doing the same thing
    0:17:23 for automated requests
    0:17:24 and having a fingerprint
    0:17:25 that’s attached
    0:17:25 to a signature
    0:17:26 inside every single request
    0:17:27 which you can then use
    0:17:29 public key cryptography
    0:17:30 to verify
    0:17:31 these are all emerging
    0:17:32 as the problem
    0:17:33 of being able
    0:17:34 to identify
    0:17:35 automated clients
    0:17:36 increases
    0:17:37 because you want
    0:17:38 to be able to know
    0:17:39 who the good ones are
    0:17:40 to allow them through
    0:17:41 whilst blocking
    0:17:42 all the attackers.
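    The schemes mentioned here (Privacy Pass, Cloudflare’s signed automated requests) attach a cryptographic signature to every request, verified with public-key cryptography. As a simplified stand-in, this sketch uses a shared-secret HMAC from the standard library just to show the sign-on-send, verify-on-receive flow; the real proposals use asymmetric keys so the verifier never holds a signing secret. Everything here is illustrative.

    ```python
    # Simplified stand-in for per-request signing. Real schemes use
    # public-key signatures; this uses an HMAC shared secret purely to
    # show the verify-every-request flow. Names and keys are made up.
    import hashlib
    import hmac

    SECRET = b"demo-key-known-to-both-sides"  # stand-in for a key pair

    def sign_request(method: str, path: str) -> str:
        """The client attaches this signature to each request it sends."""
        return hmac.new(SECRET, f"{method} {path}".encode(),
                        hashlib.sha256).hexdigest()

    def verify_request(method: str, path: str, signature: str) -> bool:
        """The site recomputes the signature and compares in constant time."""
        expected = sign_request(method, path)
        return hmac.compare_digest(expected, signature)

    sig = sign_request("GET", "/products")
    print(verify_request("GET", "/products", sig))  # True: signature matches
    print(verify_request("GET", "/admin", sig))     # False: signed for another path
    ```

    Binding the signature to the request contents is what prevents replaying one valid signature against arbitrary endpoints.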
    0:17:43 Yeah it’s just like
    0:17:43 the old days
    0:17:44 with Kerberos right
    0:17:45 every large vendor
    0:17:46 is going to have
    0:17:46 their flavor
    0:17:48 and if you’re a shop
    0:17:48 and you’re trying
    0:17:49 to sell to everybody
    0:17:50 you’ve got to kind of
    0:17:50 work with all of them.
    0:17:51 That’s right
    0:17:52 and you just need
    0:17:53 to be able to understand
    0:17:54 is this a human
    0:17:55 and is our application
    0:17:56 built for humans
    0:17:57 and then you allow them
    0:17:58 or is it
    0:17:59 that we’re building
    0:17:59 an API
    0:18:00 or do we want
    0:18:01 to be indexed
    0:18:01 and we want
    0:18:02 to allow this traffic
    0:18:03 it’s just giving
    0:18:04 the site owner
    0:18:05 the control.
    0:18:06 Yeah I mean I think
    0:18:07 what’s really interesting
    0:18:09 to me is that
    0:18:10 in my own use
    0:18:11 and in my own life
    0:18:13 like I interact
    0:18:13 with the internet
    0:18:15 less and less directly
    0:18:16 like almost every day
    0:18:17 and I’m going through
    0:18:18 some sort of
    0:18:19 AI type thing
    0:18:20 it could be an agent
    0:18:20 it could be
    0:18:21 a large language model
    0:18:22 it could be
    0:18:23 any number of things
    0:18:24 but I generally
    0:18:25 don’t query stuff
    0:18:26 directly as much
    0:18:27 as I used to
    0:18:28 and it seems like
    0:18:28 we’re moving
    0:18:29 to a world
    0:18:29 where almost
    0:18:31 the layer you describe
    0:18:32 the agent type
    0:18:34 activity you describe
    0:18:34 will become
    0:18:35 the primary consumer
    0:18:36 of everything
    0:18:36 on the internet.
    0:18:38 Well if 50%
    0:18:38 of the traffic
    0:18:39 is already
    0:18:40 bots
    0:18:41 it’s already
    0:18:41 automated
    0:18:42 and agents
    0:18:43 are only really
    0:18:44 just getting going
    0:18:45 most people
    0:18:45 are not using
    0:18:46 these computer use
    0:18:47 agents
    0:18:48 because they’re
    0:18:48 too slow
    0:18:49 right now
    0:18:49 they’re not
    0:18:50 they’re still
    0:18:51 like previews
    0:18:52 but it’s clear
    0:18:52 that’s where
    0:18:53 everything is going
    0:18:54 then we’re going
    0:18:55 to see an explosion
    0:18:57 in the traffic
    0:18:57 that’s coming
    0:18:58 from these tools
    0:18:58 and just blocking
    0:18:59 them just because
    0:19:00 they’re AI
    0:19:01 is the wrong answer
    0:19:02 you’ve really got
    0:19:03 to understand
    0:19:04 why you want them
    0:19:05 what they’re doing
    0:19:06 who they’re coming
    0:19:06 from and then
    0:19:07 you can create
    0:19:08 these granular rules
    0:19:08 I mean I hate
    0:19:09 to use the analogy
    0:19:09 but these things
    0:19:10 are almost like
    0:19:11 avatars right
    0:19:12 they’re running
    0:19:13 around on someone’s
    0:19:13 behalf
    0:19:14 and you need
    0:19:14 to figure out
    0:19:15 who that someone
    0:19:16 is and what
    0:19:16 the objectives
    0:19:17 are
    0:19:18 and control them
    0:19:19 very granularly
    0:19:20 and the old
    0:19:20 school methods
    0:19:22 of doing that
    0:19:23 assume malicious
    0:19:23 intent
    0:19:25 which isn’t
    0:19:26 always the case
    0:19:26 and increasingly
    0:19:27 is going to be
    0:19:28 not the case
    0:19:28 because you want
    0:19:29 the agents
    0:19:30 to be doing things
    0:19:31 and the signals
    0:19:31 just no longer
    0:19:33 work when you’re
    0:19:33 expecting traffic
    0:19:34 to come from
    0:19:35 a data center
    0:19:35 or you’re expecting
    0:19:36 it to come from
    0:19:37 an automated
    0:19:38 Chrome instance
    0:19:39 and being able
    0:19:40 to have the
    0:19:41 understanding
    0:19:42 of your application
    0:19:43 to dig into
    0:19:44 the characteristics
    0:19:45 of the request
    0:19:45 is going to be
    0:19:46 increasingly important
    0:19:47 in the future.
    0:19:48 In terms of distinguishing
    0:19:49 how criminals
    0:19:50 are using
    0:19:51 AI,
    0:19:52 what we’ve seen
    0:19:52 so far
    0:19:52 is either
    0:19:53 training
    0:19:54 and people
    0:19:55 have that opinion
    0:19:55 of whether they
    0:19:56 want to train
    0:19:56 or not
    0:19:57 or it’s bots
    0:19:58 that maybe
    0:19:58 have got something
    0:19:59 wrong
    0:20:00 they’re accessing
    0:20:00 the site
    0:20:01 too much
    0:20:01 because they
    0:20:02 haven’t thought
    0:20:02 about throttling
    0:20:04 or they’re ignoring
    0:20:04 robots.txt
    0:20:05 rather than looking
    0:20:06 at agents.txt
    0:20:07 which is distinguishing
    0:20:09 between an agent
    0:20:09 you want to access
    0:20:10 your site
    0:20:11 and some kind
    0:20:11 of crawler
    0:20:13 and the examples
    0:20:14 that we’ve seen
    0:20:15 are just bots
    0:20:15 coming to websites
    0:20:16 and just downloading
    0:20:17 the content
    0:20:17 continuously
    0:20:19 there’s no world
    0:20:20 where that should
    0:20:20 be happening
    0:20:22 and this is
    0:20:23 where the cost
    0:20:23 is being put
    0:20:24 on the site owner
    0:20:25 because they currently
    0:20:26 have no easy way
    0:20:26 to manage
    0:20:27 and control
    0:20:30 the traffic
    0:20:30 that’s coming
    0:20:31 to their site
    0:20:32 directionally
    0:20:33 things are improving
    0:20:34 because
    0:20:34 you might have
    0:20:35 looked back
    0:20:36 18 months
    0:20:37 and the bots
    0:20:38 have no rate
    0:20:38 limiting
    0:20:39 they’re just
    0:20:39 downloading content
    0:20:40 all the time
    0:20:42 today we know
    0:20:42 that these bots
    0:20:43 can be verified
    0:20:44 they are identifying
    0:20:45 themselves
    0:20:46 they are much
    0:20:47 better citizens
    0:20:48 of the internet
    0:20:48 and they are
    0:20:49 starting to follow
    0:20:50 the rules
    0:20:51 and so over the
    0:20:52 next 18 months
    0:20:53 I think we’ll see
    0:20:54 more of that
    0:20:55 more of the AI
    0:20:56 crawlers that we want
    0:20:57 following the rules
    0:20:58 doing things in the right way
    0:20:59 and it will start
    0:21:00 to split into
    0:21:01 making it a lot easier
    0:21:02 to detect the bots
    0:21:03 with criminal intent
    0:21:04 and those are the ones
    0:21:05 that we want to be blocking
    0:21:06 So with the transition of bots from being these entities on the internet that represent third parties and organizations, to this new world where these AI agents could be representing organizations, they could be representing customers, they could be representing any number of people, and this is probably the wave of the future.
    0:21:24 It seems to me like detecting that it's AI or a person is going to be an incredibly difficult challenge, and I'm curious how you are thinking about proving humanness on the internet.
    0:21:34 Right. Proofing is a tale as old as time. There's a NIST working group on proofing identity that's been running, I think, for 35 years, and it still hasn't really gotten to something that's implementable. There are 15 companies out there, right?
    0:21:48 The first wave of ride share services and gig economy type companies needed to have proofing, right, because you're hiring these people in remote places where you don't have an office, and it's still not a solved problem.
    0:21:59 I'm curious: it feels like maybe AI can help get us there, or maybe there's something happening in that space.
    0:22:06 Well, the pure solution is digital signatures, right? But we've been talking about that for so long, and the UX around it is basically impossible for normal people to figure out. It's why, with something like email encryption, no one encrypts their email. You have encrypted chat because it's built into the app, and it can do all the difficult things, like the key exchange, behind the scenes. So that solution isn't really going to work.
    0:22:31 But AI has been used in analyzing traffic for at least a decade; it's just that it was called machine learning. And so you start with machine learning, and the question is: what does the new generation of AI allow us to do? The challenge with the LLM-type models is the speed at which they are doing analysis, because you often want to take a decision on the network or in the application within a couple of milliseconds; otherwise you're going to be blocking the traffic and the user's going to become annoyed. And so you can do that with kind of classic machine learning models and do the inference really quickly.
    0:23:03 Where I think the interesting thing in the next few years is going to be is how we take this new generation of generative AI, using LLMs or other types of LLM-like technology, to do analysis on huge traffic patterns. I think that can be done in the background initially, but we're already seeing new edge models designed to be deployed to mobile devices and IoT that use very low amounts of system memory and can provide inference responses within milliseconds. I think those are going to start to be deployed to applications over the next few years.
    0:23:38 I think you're exactly right. So much of what we're seeing now is just being restricted by the cost of inference, and that cost is dropping incredibly fast, right? We saw this with cloud, where S3 went from being the most expensive storage you could buy to being essentially free; Glacier is essentially free (free as in beer, right, whatever). And we're seeing that at an even more accelerated rate for inference: the cost is just falling incredibly fast.
    0:24:01 And then when you look at the capabilities of these new technologies: drop a suspicious email into ChatGPT and ask if it's suspicious, and it's like 100% accurate, right? If you want to find sensitive information, you ask the LLM if it's sensitive information, and it's like 100% accurate. It's amazing.
    0:24:21 As you squint and look at the future, you can start to see these really incredible use cases, right? Like, to your point of inference on the edge: do you think we all end up eventually with an LLM running locally that's basically going to be Clippy, but for CISOs? It pops up and says, hey, it looks like you're doing something stupid. Is that kind of where you think we land?
    0:24:40 that’s what we’re
    0:24:41 working on
    0:24:41 is getting
    0:24:42 this analysis
    0:24:43 into the process
    0:24:43 so that for
    0:24:44 every single request
    0:24:45 that comes through
    0:24:45 you can have
    0:24:46 a sandbox
    0:24:47 that will analyze
    0:24:48 the full request
    0:24:49 and give you a response
    0:24:50 whereas now
    0:24:50 you can wait
    0:24:51 maybe two to five
    0:24:53 seconds to delay
    0:24:53 an email
    0:24:54 and do the analysis
    0:24:55 and decide whether
    0:24:57 to flag it for review
    0:24:57 or send it
    0:24:58 to someone’s inbox
    0:24:59 delaying an HTTP request
    0:25:00 for five seconds
    0:25:02 that’s not going to work
    0:25:02 and so I think
    0:25:05 the trend that we’re seeing
    0:25:05 with the improvement
    0:25:06 cost
    0:25:08 the inference cost
    0:25:09 but also the latency
    0:25:10 in getting the inference
    0:25:11 decision
    0:25:12 that’s going to be the key
    0:25:14 so we can embed this
    0:25:14 into the application
    0:25:16 you’ve got the full context
    0:25:16 window
    0:25:17 so you can add
    0:25:17 everything you know
    0:25:18 about the user
    0:25:19 everything about the session
    0:25:20 everything about your application
    0:25:22 alongside the request
    0:25:23 and then come to decision
    0:25:24 entirely locally
    0:25:25 on your web server
    0:25:26 on the edge
    0:25:27 wherever it happens
    0:25:27 to be running
    0:25:28 As I listen to you say that and describe this process, all I can think is that advertisers are going to love this. It just seems like the kind of technology built for, hey, he's looking at this product, show him this one, right?
    0:25:38 Yeah, super fast inference on the edge, coming to a decision. And for advertisers, stopping click spam, that's a huge problem, and being able to come to that decision before it even goes through your ad model and the auction system.
    0:25:52 Who would have ever thought that non-deterministic, incredibly cheap compute would solve these use cases, right? We're in a weird world.
    0:26:01 that’s it for this episode
    0:26:03 thanks again for listening
    0:26:04 and remember to keep listening
    0:26:05 for some more great episodes
    0:26:07 as the AI space matures
    0:26:08 we need to start thinking
    0:26:09 more practically
    0:26:10 about how the technology
    0:26:11 coexists with the systems
    0:26:12 and platforms
    0:26:13 we already use
    0:26:15 that’s what we try to do here
    0:26:16 and we’ll keep examining
    0:26:16 these questions
    0:26:17 in the weeks to come
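
    The bot etiquette discussed in this episode (identifying yourself, honoring robots.txt, and throttling requests) can be sketched in a few lines. The following is a minimal illustration using only the Python standard library; the user-agent string, delay value, and rules are placeholder assumptions, not anything prescribed by Arcjet.

```python
import time
import urllib.robotparser

# Placeholder identity: a well-behaved crawler identifies itself honestly.
USER_AGENT = "ExampleAgent/1.0"  # hypothetical name, not a real product

class PoliteFetcher:
    """Checks robots.txt permission and enforces a minimum delay between requests."""

    def __init__(self, robots_txt: str, min_delay: float = 1.0):
        self.parser = urllib.robotparser.RobotFileParser()
        # Parse the rules directly from a string (no network call needed here).
        self.parser.parse(robots_txt.splitlines())
        self.min_delay = min_delay
        self._last_request = 0.0

    def allowed(self, url: str) -> bool:
        # Honor robots.txt instead of ignoring it.
        return self.parser.can_fetch(USER_AGENT, url)

    def throttle(self) -> None:
        # Sleep just long enough to keep at least min_delay between requests.
        elapsed = time.monotonic() - self._last_request
        if elapsed < self.min_delay:
            time.sleep(self.min_delay - elapsed)
        self._last_request = time.monotonic()

# Example rules: everything under /private/ is off-limits to all agents.
rules = "User-agent: *\nDisallow: /private/"
fetcher = PoliteFetcher(rules)
print(fetcher.allowed("https://example.com/products"))   # True
print(fetcher.allowed("https://example.com/private/x"))  # False
```

    An agents.txt-style policy of the kind David mentions would extend rules like these to distinguish an agent acting for a real customer from a bulk crawler.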

    Taken from the AI + a16z podcast, Arcjet CEO David Mytton sits down with a16z partner Joel de la Garza to discuss the increasing complexity of managing who can access websites and other web apps, and what they can do there. A primary challenge is determining whether automated traffic is coming from bad actors and troublesome bots, or perhaps AI agents trying to buy a product on behalf of a real customer. Joel and David dive into the challenge of analyzing every request without adding latency, and how faster inference at the edge opens up new possibilities for fraud prevention, content filtering, and even ad tech. Topics include:

    • Why traditional threat analysis won’t work for the AI-powered web
    • The need for full-context security checks
    • How to perform sub-second, cost-effective inference
    • The wide range of potential actors and actions behind any given visit

    As David puts it, lower inference costs are key to letting apps act on the full context window — everything you know about the user, the session, and your application.
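
    That full-context decision can be illustrated with a toy sketch. The signals, weights, and threshold below are invented for illustration only; a real deployment would replace the heuristic `risk_score` with a fast local model's inference.

```python
from dataclasses import dataclass

@dataclass
class RequestContext:
    """Everything known at decision time: the user, the session, and the request."""
    user_reputation: float   # 0.0 (unknown/bad) .. 1.0 (trusted); assumed signal
    requests_last_minute: int
    verified_bot: bool       # bot identified itself and passed verification
    path: str

def risk_score(ctx: RequestContext) -> float:
    """Toy heuristic standing in for a fast local model's inference."""
    score = 0.0
    if ctx.requests_last_minute > 60:                     # hammering the site
        score += 0.5
    if not ctx.verified_bot and ctx.user_reputation < 0.2:
        score += 0.4                                      # anonymous and unknown
    if ctx.path.startswith("/admin"):
        score += 0.3                                      # sensitive surface
    return min(score, 1.0)

def decide(ctx: RequestContext, threshold: float = 0.7) -> str:
    # The decision is made entirely locally: no network round trip required.
    return "block" if risk_score(ctx) >= threshold else "allow"

# A verified agent shopping on behalf of a customer sails through...
agent = RequestContext(user_reputation=0.9, requests_last_minute=3,
                       verified_bot=True, path="/products/42")
# ...while an anonymous scraper hitting the site continuously does not.
scraper = RequestContext(user_reputation=0.0, requests_last_minute=500,
                         verified_bot=False, path="/admin/export")
print(decide(agent), decide(scraper))  # allow block
```

    The structural point is that everything needed for the decision travels with the request, which is what makes a sub-millisecond, on-the-edge verdict plausible.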

     

    Follow everyone on social media:

    David Mytton

    Joel de la Garza

    Check out everything a16z is doing with artificial intelligence here, including articles, projects, and more podcasts.

     

    Stay Updated: 

    Let us know what you think: https://ratethispodcast.com/a16z

    Find a16z on Twitter: https://twitter.com/a16z

    Find a16z on LinkedIn: https://www.linkedin.com/company/a16z

    Subscribe on your favorite podcast app: https://a16z.simplecast.com/

    Follow our host: https://x.com/eriktorenberg

    Please note that the content here is for informational purposes only; should NOT be taken as legal, business, tax, or investment advice or be used to evaluate any investment or security; and is not directed at any investors or potential investors in any a16z fund. a16z and its affiliates may maintain investments in the companies discussed. For more details please see a16z.com/disclosures. 

  • Marc Andreessen on Startup Timing

    AI transcript
    0:00:05 This is not saying that every startup should be an AI startup or whatever, but there is this incredible new tool that can be used.
    0:00:10 And it’s just sort of general fact that big companies, big incumbents have a very hard time reacting to these platform changes.
    0:00:12 Sometimes they pull it off, but a lot of times they really struggle.
    0:00:17 And so it’s sort of the best possible opportunity for a startup going up against an incumbent is when there’s this kind of shift happening.
    0:00:21 What if right now is the best time in decades to start a company?
    0:00:28 In this episode taken from Speedrun, a three-month accelerator powered by A16Z designed to help founders move fast,
    0:00:35 Marc Andreessen joins Games’ general partner, Jonathan Lai, to make the case that we’re entering a once-in-a-generation window for builders.
    0:00:38 From the explosive rise of AI to shifting global policies,
    0:00:43 Marc breaks down why the next four years present a rare and urgent opportunity for founders
    0:00:47 and what separates the entrepreneurs who seize the moment from those who miss it.
    0:00:48 Let’s get into it.
    0:00:55 As a reminder, the content here is for informational purposes only.
    0:01:00 Should not be taken as legal, business, tax, or investment advice, or be used to evaluate any investment or security,
    0:01:05 and is not directed at any investors or potential investors in any A16Z fund.
    0:01:10 Please note that A16Z and its affiliates may also maintain investments in the companies discussed in this podcast.
    0:01:17 For more details, including a link to our investments, please see A16Z.com forward slash disclosures.
    0:01:29 How much do you think disagreeability or just the ability to sort of fight and sort of disagree publicly is a necessary trait for founders,
    0:01:30 as exemplified by Steve Jobs?
    0:01:33 Yeah, so start with Steve, who I knew.
    0:01:36 So Steve is one of the most disagreeable people in the history of humankind.
    0:01:40 You know, Steve would disagree with you over the shape of the glass on the table in front of you.
    0:01:41 Like, he was going to argue about everything.
    0:01:46 So it’s where a lot of the genius came from, which is he was just not going to take the status quo for granted under any circumstances.
    0:01:50 Elon uses the term first principles thinking, and Steve had a lot of that as well.
    0:01:51 So there’s a lot of genius in there.
    0:01:54 But if you read the books on Steve or you hear the stories,
    0:01:56 there’s basically two stories about Steve you hear.
    0:01:59 One is that he was a saint and he was perfect in all regards, which is somewhat true.
    0:02:03 But the other story that you hear is basically he was like a screaming lunatic,
    0:02:06 and he would just run around and yell at people in elevators and fire people in meetings.
    0:02:10 And just he was like, you know, people have all these kind of awful, horrible things.
    0:02:12 It’s like angel devil kind of thing.
    0:02:14 And I think on this, I think the reality was somewhere in the middle,
    0:02:17 at least what I saw and what I heard from people who worked with him for a very long time,
    0:02:21 was basically he was just like absolutely intolerant of anything less than first class work.
    0:02:26 If you brought him first class work, and if you were top in your field and super diligent and on top of everything
    0:02:29 and had all the details figured out and knew what you were doing and were really good,
    0:02:33 he was like the best manager you were ever going to work with and the best CEO you were ever going to work with.
    0:02:36 And the thing that comes up when people who worked with him closely talk about him is,
    0:02:39 I did the best work of my life working for him, right?
    0:02:42 And part of that is because he really appreciated, understood the quality of great work,
    0:02:47 and he didn’t tolerate anything less than that, which meant that everybody around you also hit that bar.
    0:02:50 That’s sort of his approach to performance management is everybody’s going to be doing top end work.
    0:02:54 They’re not going to be here. As a consequence, the best people in the world are going to love being here
    0:02:56 because they’re surrounded by the best people in the world.
    0:02:59 And by the way, that’s also something, of course, that Elon shares.
    0:03:03 And so the stories that you hear about the screaming or firing people in meetings or whatever,
    0:03:06 it was always a side effect of it’s not first class work.
    0:03:09 And it’s either somebody who really should have been able to do it, who didn’t,
    0:03:12 who didn’t work hard enough, or it’s somebody who was just not capable of doing it, right?
    0:03:14 Who needed to go find something else.
    0:03:18 But the result, the core thing always, and this is why there was so much love for him, right?
    0:03:21 The core thing was I do the best work of my life working for him.
    0:03:24 Having said that, also, Steve himself matured, right?
    0:03:27 And his career kind of got, time dilation kind of happens.
    0:03:30 And so you kind of think, well, Steve built Apple, the company that we all know and love today.
    0:03:32 And it’s like, well, yes, but it took a very long time.
    0:03:34 And there were twists and turns along the way.
    0:03:37 And the big twist and turn, of course, is he got fired, right?
    0:03:39 So he started the company in 76 with Steve Wozniak.
    0:03:41 You know, they built Apple I, Apple II.
    0:03:43 They built the Lisa, actually, which failed.
    0:03:47 That was the other thing, by the way, is not every Steve Jobs project succeeded, right?
    0:03:48 And he learned a lot from the failures.
    0:03:51 And now everybody’s forgotten the failures, but you can read about them on Wikipedia.
    0:03:54 The Lisa was before the Macintosh and was a complete failure.
    0:03:56 But then the Macintosh started to work.
    0:03:59 But before the Macintosh really took off, he brought in the wrong CEO.
    0:04:01 And then there was a revolt inside the company.
    0:04:02 And he got fired.
    0:04:04 And then he spent 10 years out of the company, right?
    0:04:07 Which is actually when I got to know him, is when he was off doing NeXT.
    0:04:09 NeXT was like a complete wipeout.
    0:04:11 Like, it was years of real pain and agony.
    0:04:14 And then he had this little side project nobody understood, which is this little graphics company,
    0:04:15 Pixar, it turned out.
    0:04:19 Anyway, the point is, by the time he came back into Apple in like 97, I think he'd maybe been
    0:04:20 out for like 12 years.
    0:04:24 And again, what people who knew him better than I did said, he learned how to actually be a
    0:04:25 great CEO, not at Apple, but at NeXT.
    0:04:29 Because he spent 12 years actually doing it the hard way, where he wasn’t
    0:04:30 being showered with praise.
    0:04:31 He didn’t have the magic touch.
    0:04:33 The product fundamentally didn’t take.
    0:04:35 He had to like, as we say now, pivot.
    0:04:37 Because we have this great new term, pivot.
    0:04:39 We used to just say, f*** up.
    0:04:42 You know, pivoted NeXT.
    0:04:47 By the way, when I was in college, I got to Illinois in '89, and we were
    0:04:48 one of the early adopters of the NeXT computer.
    0:04:50 NeXT was the name of the company.
    0:04:52 The computer was called the NeXT Cube.
    0:04:54 And it was sort of the post-Macintosh.
    0:04:57 And so it was like, literally, it was like a $15,000 Macintosh.
    0:05:00 And it was like fully modern, not only graphical, but it was like full Unix at a time when that
    0:05:02 was really unusual for a desktop computer.
    0:05:04 It had this incredible state-of-the-art.
    0:05:05 It had the first CD-ROM drive.
    0:05:08 And then one of the great stories is it was a perfect cube.
    0:05:10 It was a perfect 12-inch cube, 12 by 12 by 12.
    0:05:14 And there’s a famous story of his designers at Next, when they were designing it, they came
    0:05:19 in with a sort of optimal, Pareto optimal hardware design, built for manufacturing, cost optimization,
    0:05:21 which was 12 inches by 12 inches by 13 inches.
    0:05:22 And he said, f*** you.
    0:05:26 Go back and make it a perfect cube.
    0:05:28 And they’re like, Steve, that’s going to double the cost.
    0:05:29 And he’s like, I don’t f***ing care.
    0:05:31 Make it a cube.
    0:05:33 And so it was like a perfect cube.
    0:05:34 It was slowish.
    0:05:36 Like, it took forever.
    0:05:38 It was like walking through molasses to use the thing.
    0:05:40 Completely flopped.
    0:05:41 Nobody wanted it.
    0:05:42 Pivoted the company to software.
    0:05:43 Nobody wanted the software.
    0:05:45 Anyway, the point is, like, that was really hard.
    0:05:47 And he had to make every part of the company work.
    0:05:48 He had to try to figure out how to optimize it the hard way.
    0:05:51 He had to retain a team through 12 years of basically failure.
    0:05:55 What basically people say is he had this incredible growth and innovation skill set from Apple phase
    0:05:58 one, and then he had an incredible management skill set from the sort of wilderness years.
    0:06:01 So by the time he came back to Apple in 97, he was both.
    0:06:01 And he was also at that point a great CEO, but he maybe would not ever have become the Steve Jobs
    0:06:08 that we know had he not gone through the hard period.
    0:06:11 So a big undercurrent to what you’ve been discussing is the concept of, like, market timing, right?
    0:06:13 Like, what’s the right time to sort of enter a market?
    0:06:18 Is the technology truly ready for sort of the consumer or the enterprise version of that technology?
    0:06:24 And so thinking back to, like, the 90s, you wanted to be not Pets.com, but Chewy, which eventually
    0:06:25 did it right, right?
    0:06:27 Like, some of these companies that came 10, 15 years later.
    0:06:32 If you’re a builder right now and you’re selecting, like, markets right now to build in,
    0:06:35 how would you think about what the right time is?
    0:06:37 Is it too early for certain markets, and so on and so forth?
    0:06:41 Yeah, so one of my observations, it sounds crazy, but I think might be true, is I think
    0:06:42 actually all the ideas might be good.
    0:06:46 Qualified founders who know what they’re doing, I think, generally are right about the opportunity
    0:06:49 that they’re going after and that it’s just, okay, if you’re, like, a qualified founder doing
    0:06:52 what you’re doing, you should be so deep in the domain that when you think about, okay,
    0:06:55 this technology now makes this use case possible and we can build a product that does that,
    0:06:56 people are going to want that.
    0:06:59 I think the accuracy rate on that is almost 100%.
    0:07:00 The problem is the timing.
    0:07:04 And early is the same as being wrong, right, in practice, right?
    0:07:07 And you see this in the pattern, which is basically, is anytime there’s an overnight success,
    0:07:11 like a technology company that just, like, lights the world on fire, is doing great,
    0:07:14 what you find is basically there were a set of companies that tried to do that same thing
    0:07:17 five years earlier and failed, five years before that and failed, five years before that and
    0:07:18 failed, before that and failed.
    0:07:21 Often goes back, by the way, I mentioned the e-commerce and the phone thing.
    0:07:25 The first actual smartphone came out in 1987, right?
    0:07:29 And so there were literally 20 years of failed attempts to do smartphones until the iPhone,
    0:07:31 you know, 20 years, right?
    0:07:34 And so I knew many of the founders trying to do smartphones before the iPhone and they were
    0:07:35 very, very smart, capable people.
    0:07:39 Many of them very successful and in other domains are very successful later on.
    0:07:40 It was just too early.
    0:07:41 The technology wasn’t ready yet.
    0:07:43 And then why is this hard?
    0:07:45 It’s hard for basically two reasons.
    0:07:48 One is because to be a successful founder, by definition, you have to live in the future,
    0:07:48 right?
    0:07:51 So you have to envision a world that doesn’t exist yet in which the thing that you’re building
    0:07:53 is something that everybody’s going to use.
    0:07:56 And like people around you can’t see that yet because it doesn’t exist yet.
    0:08:00 And so like you see a vision of the world that doesn’t yet exist and that may be right and
    0:08:03 may not be right, or it may happen now and it may not happen now.
    0:08:05 And then the other thing about it is the world gets a vote.
    0:08:10 And, you know, I think one of the really interesting kind of conceptual questions is there’s like
    0:08:13 the push side of you trying to make something happen in the world and trying to get the
    0:08:15 world to understand something and want to do something and buy something.
    0:08:18 But then there’s like some sociological thing on the other side,
    0:08:20 which is, okay, when is the world actually ready for the new idea?
    0:08:22 Here’s another just like really amazing thing.
    0:08:25 Somebody got one of the small versions of Llama to run on Windows 98.
    0:08:30 They booted up, literally, a Dell desktop computer from, I think, 1998 with a fresh copy of
    0:08:31 Windows 98 and they got Llama running on it.
    0:08:36 And so like all of those old PCs literally could have been smart this whole time.
    0:08:37 They really could have been.
    0:08:39 And like we had neural networks, right?
    0:08:40 There were lots of people working on AI.
    0:08:44 But like we could have been talking to our computers in English for the last basically
    0:08:45 almost 30 years.
    0:08:46 We now know.
    0:08:47 Crazy alternate history.
    0:08:48 Crazy alternate history.
    0:08:49 And like we didn’t.
    0:08:50 There was an AI startup boom in the 80s.
    0:08:53 There were a lot of really smart people in the 80s who thought this was all going to happen
    0:08:53 then.
    0:08:56 There were people in the 50s who thought it was going to happen then.
    0:08:59 And so this timing thing is just like incredibly difficult.
    0:09:03 And then just practically speaking, as a startup, you only have so long to either prove it right
    0:09:03 or not.
    0:09:06 And basically, you know, give or take five years total, right?
    0:09:09 If you don’t get traction in a startup in the first five years, your odds of success are
    0:09:11 very low just because it’s very hard to hold the team together.
    0:09:12 You’re in a race against time.
    0:09:14 Yeah, you’re in a race against time.
    0:09:17 And then you end up in this frustrating, you know, I have a lot of friends who’ve ended
    0:09:19 up in this frustrating circumstance where you started a company in 2010.
    0:09:22 You thought now was the time your company failed in 2014.
    0:09:25 And then a company started in 2015 that just hit it and went.
    0:09:26 And you’re just, right?
    0:09:29 And so the timing part of it is very hard.
    0:09:33 I don’t think there’s an actual answer to the timing thing other than I think that’s a really
    0:09:35 key part of what they call entrepreneurial judgment.
    0:09:40 Like, I think that’s a big thing that the founder has to really work hard about and think hard
    0:09:40 about.
    0:09:42 And again, it’s this thing of like reconciling.
    0:09:43 I live in the future and I can see it happening.
    0:09:47 But like, can I actually deliver it in the form of a product that actually works the way
    0:09:48 people expect?
    0:09:53 And then is there enough precondition in the world such that they’re going to want to actually
    0:09:54 do this thing?
    0:09:59 And I just, I mean, I'll tell you, like, I don't
    0:10:01 even try to forecast these things myself anymore.
    0:10:05 Like in meetings, I just figured, let's just assume the founder is more
    0:10:08 likely to be right than we are, just because the founder's best
    0:10:11 guess is probably the best guess that exists, because the founders have the most information
    0:10:13 to kind of synthesize a view on this.
    0:10:17 But having said that, you know, for, I don't know,
    0:10:19 like a third of the companies or whatever that we're going to back in a given
    0:10:20 year,
    0:10:21 it's just going to turn out that they were too early.
    0:10:25 By the way, the other corollary to this is, if it's going to work, it almost
    0:10:26 always feels like you’re too late.
    0:10:32 So as a founder, it almost always feels like, I wish I had started this
    0:10:32 company.
    0:10:36 If only I had started this company two years earlier, right?
    0:10:39 Because, because it’s like, you’re in a situation where you’re like, oh my God, like it’s going
    0:10:41 to happen and the market’s coming and it’s going to happen.
    0:10:42 But like, my product’s not quite ready yet.
    0:10:44 I’m like, oh, like I’m running out of time.
    0:10:45 Right.
    0:10:49 And like, my experience at least is that that feels terrible in the moment, because
    0:10:52 it feels like you're not rising to the occasion, but that's actually the
    0:10:53 positive sign.
    0:10:56 What excites you the most when you think about AI and sort of what it can do for
    0:10:58 storytelling and just the entertainment industry at large?
    0:11:05 So aspirationally, the, you know, there’s opportunity for like a complete reinvention of the entire
    0:11:08 process of storytelling and games and immersive worlds.
    0:11:09 Uh, right.
    0:11:12 And it’s, you know, this is the sort of thing it’s, instead of, instead of us making them,
    0:11:15 maybe they’re going to get hallucinated there, you know, they’re going to get dreamed, you
    0:11:16 know, you’re playing a game and it’s going to, it’s going to basically, right.
    0:11:19 It’s going to be an AI sort of a symbiosis with you as the player.
    0:11:22 And it’s going to, you know, adapt, it’s going to learn from you, adapt to you and make you
    0:11:27 the best possible experience, um, you know, with the best possible, you know, the trade-off
    0:11:30 between, you know, the ease and difficulty learning curve and, and the most possible
    0:11:31 entertaining scenario for you.
    0:11:35 And, you know, the best games people will play for, you know, many thousands of hours.
    0:11:38 Um, and, uh, and then by the way, the businesses can be really amazing
    0:11:42 underneath this because they don’t, you, you don’t bear all the cost of, of, uh, of, uh,
    0:11:42 manual content creation.
    0:11:46 And so, you know, you, you build an engine like this and it, it runs, you know, it runs
    0:11:46 forever.
    0:11:50 Um, and so like that, that’s super exciting.
    0:11:54 And then, you know, being able to have virtual worlds populated by
    0:11:57 lots of super smart, incredible personalities.
    0:12:01 It’s just a whole new, amazing kind of storytelling.
    0:12:03 So like, we’re big believers in all this.
    0:12:05 And I think that’s all going to happen.
    0:12:06 It’s going to be fantastic.
    0:12:09 I will say, I am also seeing this through the eyes of my nine-year-old.
    0:12:14 So I’m re-going through the process of learning about gaming and coming
    0:12:15 up to speed as a gamer through my nine-year-old.
    0:12:18 And so, a couple of key lessons.
    0:12:21 One is, well, Warcraft apparently, or not Warcraft, Minecraft is immortal.
    0:12:27 So like, he loves Minecraft more than life itself.
    0:12:30 And whatever those guys did has just worked.
    0:12:34 Um, and by the way, a really interesting thing: the nine-year-olds, at least the nine-year-olds,
    0:12:37 don’t care at all about visual fidelity, visual quality.
    0:12:42 Like, he could not give a... and so Minecraft and Roblox and games that look like
    0:12:43 that are fantastic.
    0:12:47 And he just discovered his first, well, I just learned about IO games, if people have heard
    0:12:48 of those.
    0:12:52 And so he just discovered his first first-person shooter, which
    0:12:54 just thrills me to no end.
    0:12:56 It’s called Shell Shockers.
    0:13:01 And it’s basically Counter-Strike, but with anthropomorphic eggs.
    0:13:02 Oh, wow.
    0:13:05 Uh, so it’s like full military hardware.
    0:13:08 And then when you shoot another player, he explodes in a shower of yolk.
    0:13:12 And it’s a completely browser-based game, right?
    0:13:14 It’s multiplayer, but it’s all browser-based.
    0:13:15 And so it punches through all the school firewalls.
    0:13:19 And it runs great on Chromebooks.
    0:13:21 He’s very frustrated.
    0:13:22 He can’t get it.
    0:13:24 He’s currently on a laptop break.
    0:13:27 When we take away his laptop, he tries very hard to get it working on
    0:13:28 the web browser on his Kindle.
    0:13:34 That’s his main source of frustration in his life right now. And we tell
    0:13:37 him it’s good that Shell Shockers doesn’t run on the web browser on the Kindle,
    0:13:38 because if it did, we’d have to take away the Kindle.
    0:13:44 But anyway, I watched him doing this, and then, you know,
    0:13:48 I had this moment when ChatGPT came out where I was like, wow, this is amazing.
    0:13:50 And, you know, I can’t wait to show this to my kid.
    0:13:52 It felt like Prometheus bringing fire down from
    0:13:54 the heavens to my kid.
    0:13:57 It’s like, I’m bringing him AI, and it’s going to be this tutor and coach, and
    0:13:58 it’s going to be with him forever.
    0:14:00 And it’s going to talk to him, and he can ask good questions, and it’ll teach him about all
    0:14:01 these things he’s interested in.
    0:14:05 And I install it on his laptop and show him: you type in a quick question,
    0:14:08 it’ll answer the question. And I’m like, this
    0:14:11 is the best thing I could ever possibly do for my kid, giving him this capability.
    0:14:18 And, you know, he looks at me and he’s like, so? I was like, no, it took 80 years,
    0:14:21 with the neural networks and the training data and the whole thing,
    0:14:24 and it does the thing. And he’s like, that is a computer.
    0:14:27 Like, if it didn’t answer your questions, what would it do?
    0:14:31 Well, against this backdrop, right?
    0:14:36 Like, we’ve got a new administration, new policies, new technology, sort of a new cultural
    0:14:37 movement potentially.
    0:14:40 It feels like we’re very much at a moment in time.
    0:14:43 I’ve heard you describe it before as like we could be on the precipice of literally
    0:14:44 a golden age, right?
    0:14:47 For America and for the world.
    0:14:54 What are some ways that startups today, like in 2025, could be thinking about capitalizing
    0:14:55 on this moment?
    0:14:56 Yeah.
    0:14:59 So look, I think for at least the next four years, it’s, you know, basically blue
    0:14:59 skies.
    0:15:04 And so I think now’s definitely the time to build. You know,
    0:15:08 as I said earlier, our whole theory of all this is that the prime time for startups
    0:15:09 is when there’s a platform change.
    0:15:13 And by the way, this is not saying that every startup should be an AI startup or
    0:15:17 whatever, but there is this incredible new tool that can be used.
    0:15:21 And again, it’s just a sort of general fact that big companies, big incumbents
    0:15:24 have a very hard time reacting to these platform changes.
    0:15:26 Sometimes they pull it off, but a lot of times they really struggle.
    0:15:31 And so the best possible opportunity for a startup going up against an incumbent
    0:15:33 is when there’s this kind of shift happening.
    0:15:34 So I think that’s happening.
    0:15:35 Yeah.
    0:15:40 There’s going to be four years of clear sailing, I think, in the
    0:15:43 U.S. You know, the rest of the world’s in a funny state.
    0:15:47 You know, the EU has all but made AI illegal at this point.
    0:15:52 They literally, and this is the amazing stuff,
    0:15:55 their senior officials literally said publicly a year ago,
    0:15:59 they’re like, well, we’ve realized that you cannot be the world’s leader in AI innovation.
    0:16:01 And so therefore we will be the world’s leader in AI regulation.
    0:16:06 Which is one of those things that I think you only say if you’re an unelected bureaucrat,
    0:16:08 completely lost in your own thoughts.
    0:16:13 But you know, they passed this very draconian AI law that, as a consequence,
    0:16:18 even the big AI labs now are not launching new AI products in the EU, because they can’t
    0:16:18 figure out how to do it legally.
    0:16:21 The UK actually backslid on this.
    0:16:25 The UK should be a haven for openness, and instead they’ve basically
    0:16:29 matched the EU on bad regulations and seem like they’re going sideways on that.
    0:16:34 And this is important because the main U.S. trading partners are
    0:16:37 China and Europe.
    0:16:40 Globalization in the West is based fundamentally
    0:16:43 on the economic relationship between the U.S. and Europe.
    0:16:47 And then, yeah, I mean, look, the entire world is opening up.
    0:16:49 I mean, you know, kids everywhere are now online.
    0:16:50 They have access to all this.
    0:16:51 They want to use all this technology.
    0:16:52 They want to build it.
    0:16:54 You know, they want to be part of these companies.
    0:16:57 Some of that obviously happens through migration, but a lot of that happens
    0:16:59 through just, like, everybody’s online now.
    0:17:01 And so there’s talent all over the world.
    0:17:06 And hopefully the internet itself, and general economic
    0:17:09 arrangements, stay free enough where people all over the world can come together and build
    0:17:09 things.
    0:17:10 Oh, sure.
    0:17:10 That’s very exciting.
    0:17:13 Definitely feels like a unique moment in time to be building.
    0:17:16 So I think that’s a great note to wrap it on.
    0:17:18 Sarah, thank you, Mike and Jason.
    0:17:18 Let’s give him a hand.
    0:17:19 Good, good, good.
    0:17:20 Thanks, everybody.
    0:17:29 Like, it just drives you nuts.
    0:17:29 Right.
    0:17:31 And by the way, is it okay?
    0:17:33 Is this an adult audience?
    0:17:35 Can I swear?
    0:17:36 Is everybody over the age of 18?
    0:17:37 It’s hard to tell.
    0:17:40 If you’re below the age of 15, it’s just whatever the new thing is.
    0:17:41 It’s just the way the world has always worked.
    0:17:42 It’s totally normal.
    0:17:46 If you’re between the ages of 15 and 35, you know, the new thing is very exciting,
    0:17:48 and you might be able to make a career out of it.
    0:17:52 If you’re above the age of 35, the new thing is unholy and against the social
    0:17:54 order, and it’s going to destroy civilization.
    0:17:59 Thanks for listening to the A16Z podcast.
    0:18:04 If you enjoyed the episode, let us know by leaving a review at ratethispodcast.com slash
    0:18:05 A16Z.
    0:18:07 We’ve got more great conversations coming your way.
    0:18:08 See you next time.

    What if now is the best time in decades to start a company?

    In this episode, taken from Speedrun, a16z’s accelerator for early-stage founders, Marc Andreessen joins games General Partner Jonathan Lai to make the case that we’re entering a once-in-a-generation window for innovation. From the rise of AI to the cultural and policy shifts reshaping the global economy, Marc explains why the next four years present a rare opportunity for builders to seize the moment.

    Along the way, they discuss market timing, platform shifts, and what sets successful founders apart – including lessons from Steve Jobs, insights into AI’s impact on storytelling and games, and why being “too early” can feel just like being wrong.

    Timecodes

    0:00 Lessons from Steve Jobs on Leadership & Innovation

    2:27 The AI Boom: How It’s Changing Everything

    5:52 Market Timing: The #1 Factor in Startup Success

    8:13 Why the Next 4 Years Are Critical for Tech

    11:30 AI & The Future of Gaming, Storytelling & Virtual Worlds

    14:28 Why Some Startups Fail While Others Explode

    17:11 The Role of Founders in the AI Era

    Resources: 

    Find Marc on X: https://x.com/pmarca

    Find Jonathan on X: https://x.com/Tocelot

    Stay Updated: 

    Let us know what you think: https://ratethispodcast.com/a16z

    Find a16z on Twitter: https://twitter.com/a16z

    Find a16z on LinkedIn: https://www.linkedin.com/company/a16z

    Subscribe on your favorite podcast app: https://a16z.simplecast.com/

    Follow our host: https://x.com/eriktorenberg

    Please note that the content here is for informational purposes only; should NOT be taken as legal, business, tax, or investment advice or be used to evaluate any investment or security; and is not directed at any investors or potential investors in any a16z fund. a16z and its affiliates may maintain investments in the companies discussed. For more details please see a16z.com/disclosures.

  • Katherine Boyle: How Tech Can Rebuild America

    Why does seriousness feel radical today? a16z General Partner Katherine Boyle joins The LaBossiere Podcast to explore what it means to build for the national interest—and why that starts with purpose.

    Katherine, part of the American Dynamism team at a16z, shares how we got to a place where public service became uncool, how tech can help rebuild trust in government, and why suffering, friction, and responsibility are essential ingredients for growth. From the collapse of civic duty to the rise of meme-driven politics, they dig into the cultural forces shaping America—and the opportunity to reclaim a sense of mission.

    They also discuss why Silicon Valley is more idea than place, what journalists and investors have in common, and why being laughed at might be the clearest sign you’re on the right path.

    Timecodes: 

    0:00 Intro

    4:48 The Decline in Public Service

    7:47 Making Government Cool Again

    10:07 Silicon Valley’s Aversion to National Security

    13:15 Positive Sum vs Zero Sum Cultures

    16:27 China, Authoritarianism, and Doing Hard Things

    19:27 What Makes America Special?

    23:03 Silicon Valley and the “Real Economy”

    26:28 Investing in Mature Markets

    29:08 Vanna White and The Wheel of Fortune

    30:27 Journalism and Loneliness

    32:52 Time and Suffering

    38:10 Seriousness and Purpose

    41:11 Is Culture Downstream of Technology?

    42:48 Propaganda and Coolness as a Strategic Asset

    44:40 Florida, Texas, and Regulatory Arbitrage

    47:51 DC, Silicon Valley, and Florida

    50:20 What Should More People Be Thinking About

    Resources: 

    Find Katherine on X: https://x.com/KTmBoyle

    Find Alex on X: https://x.com/adlabossiere

    Listen to more from The LaBossiere Podcast:

    YouTube: ⁠https://bit.ly/3QDLQFt⁠

    Apple: ⁠https://apple.co/478Be6M⁠

    Spotify: ⁠https://spoti.fi/3sfiFiE⁠

    Stay Updated: 

    Let us know what you think: https://ratethispodcast.com/a16z

    Find a16z on X: https://twitter.com/a16z

    Find a16z on LinkedIn: https://www.linkedin.com/company/a16z

    Subscribe on your favorite podcast app: https://a16z.simplecast.com/

    Follow our host: https://x.com/eriktorenberg

    Please note that the content here is for informational purposes only; should NOT be taken as legal, business, tax, or investment advice or be used to evaluate any investment or security; and is not directed at any investors or potential investors in any a16z fund. a16z and its affiliates may maintain investments in the companies discussed. For more details please see a16z.com/disclosures.

  • Where We Are in the AI Cycle

    AI transcript
    0:00:03 People are still trying to figure out how everything works.
    0:00:09 What I love about the vibe writing concept actually is it’s a place in which full autonomy can be fulfilled today.
    0:00:15 You’re prompting, although it’s English-like, it turns out you’re just programming.
    0:00:17 And you’re just programming in prompt.
    0:00:22 I’ve had so many conversations with product managers over the last two years about the death of product management.
    0:00:25 It’s the end of the field, why we need PMs.
    0:00:29 It was extreme in 1990, and it’s extreme today.
    0:00:33 Where are we really in the AI computing shift?
    0:00:37 Is this the Windows 3.1 moment or more like the 64K IBM PC?
    0:00:46 In this episode, part of our This Week in Consumer series, I’m joined by A16Z general partner Anish Acharya and board partner Steven Sinofsky,
    0:00:50 former Microsoft president and one of the most influential product thinkers in tech,
    0:00:54 to unpack where we are in the AI platform cycle and what’s coming next.
    0:00:59 We dig into the frameworks shaping this moment: partial autonomy, jagged intelligence,
    0:01:02 vibe coding versus vibe writing, what builders get wrong about agents,
    0:01:05 what Google I/O signals about platform strategy,
    0:01:09 and why the future might be less about killer apps and more about control sliders.
    0:01:14 We begin by discussing this week’s talk from Andrej Karpathy on why software is changing again.
    0:01:16 Let’s get into it.
    0:01:21 As a reminder, the content here is for informational purposes only.
    0:01:24 Should not be taken as legal, business, tax, or investment advice,
    0:01:27 or be used to evaluate any investment or security,
    0:01:31 and is not directed at any investors or potential investors in any A16Z fund.
    0:01:37 Please note that A16Z and its affiliates may also maintain investments in the companies discussed in this podcast.
    0:01:40 For more details, including a link to our investments,
    0:01:44 please see A16Z.com forward slash disclosures.
    0:01:54 Anish, Steven, we were having such a good conversation offline that I wanted to get this on the podcast.
    0:01:55 There are a few topics we wanted to discuss.
    0:02:00 First, we were all fascinated by Andrej Karpathy’s talk at Startup School.
    0:02:02 Steven, what did you find so interesting?
    0:02:04 What were your takeaways or reactions from it?
    0:02:07 Well, I totally loved the talk.
    0:02:12 He did an unbelievable, like a philosopher king version of where we are.
    0:02:16 And I just found his metaphors really compelling.
    0:02:21 In fact, what I might do is even take it further back and just say,
    0:02:24 since he used an analogy of where we are in computing,
    0:02:27 talking about the Windows 3 era and stuff like that.
    0:02:35 And having lived through all of them, I tend to think we’re at the 64K IBM PC era of the microcomputer.
    0:02:39 And the reason I think that is actually a technical one,
    0:02:44 which is that we’re at the point where people are still trying to figure out how everything works.
    0:02:54 And all the coding and all of the energy is working around like these very basic working problems.
    0:02:57 Like with the PC, it was like, okay, we have 64K of memory.
    0:02:58 And our programs are all too big.
    0:03:00 And we have no display and all these problems.
    0:03:03 And with AI, people are like, it’s going to replace search.
    0:03:05 It’s going to replace Excel.
    0:03:06 And it’s going to replace all these things.
    0:03:08 But it doesn’t add very well.
    0:03:10 It gives you a lot of errors.
    0:03:14 Like the thing that you say it’s going to do, it just doesn’t even do yet.
    0:03:19 So I feel like we’re at a point that is just so, so early.
    0:03:24 And he did a fantastic job of sort of making that arc.
    0:03:26 You know, the thing that struck me the most was
    0:03:29 he talked a lot about our relationship with this new tool.
    0:03:32 You know, and in a sense, we want to use it in the same way that we’ve used
    0:03:35 all the other computing tools and technologies we’ve used in the past.
    0:03:40 But he really talked about this kind of inversion of the relationship of LLMs as people,
    0:03:43 spirits, the fact that they have jagged intelligence.
    0:03:46 So to me, that sort of meta point he made was one of the most interesting.
    0:03:51 We have to relearn how to use this type of tool before we know how to be productive with it.
    0:03:55 I think tools is a super interesting point because the talk is anchored in tools,
    0:03:57 but the world itself is anchored in tools.
    0:04:01 And the early stages of a platform are always about tools.
    0:04:04 And so you kind of get a little confused.
    0:04:07 Like right now, of course, he was talking about vibe coding, clearly,
    0:04:10 because he pioneered the term, invented the concept, and is living it.
    0:04:14 And it’s very interesting because I actually think coding is one domain
    0:04:17 that always works best early in a platform because, well,
    0:04:20 all the customers of the platform are developers,
    0:04:23 and they’re going to make their tooling kind of work and come along.
    0:04:26 But I really think that the most interesting thing for me,
    0:04:30 what’s being underestimated in the near term is sort of vibe writing.
    0:04:33 I mean, it seems weird to say anything with AI is underestimated
    0:04:36 because Lord knows that’s not where we are.
    0:04:40 But the thing is, is that vibe writing is so here.
    0:04:44 Like if you’re in college, you’re already vibe writing.
    0:04:48 And businesses are still working through the, well, can we use this?
    0:04:49 Doesn’t seem appropriate.
    0:04:52 And that’s a thing I’ve definitely lived through with word processors.
    0:04:55 You know, I had to get permission from the dean in college
    0:04:56 to use a computer to write papers.
    0:05:00 But this vibe writing is absolutely a thing.
    0:05:04 And it is really, really no different than when calculators showed up
    0:05:08 and all of a sudden just doing math homework involved using a calculator.
    0:05:11 And people are like, well, you’re not going to know how to do math in the future.
    0:05:13 And it’s like, I won’t have to know how to do math.
    0:05:15 That’s like the whole point of a tool.
    0:05:20 Like I have a power drill, so I do not know how to use like one of those Amish drill things,
    0:05:22 you know, and the world moves up the stack.
    0:05:24 And so that’s where we are.
    0:05:25 And it’s just super exciting.
    0:05:28 What I love about the vibe writing concept actually is
    0:05:31 it’s a place in which full autonomy can be fulfilled today.
    0:05:35 So you can ask the model to vibe write something, you know,
    0:05:38 really detailed and compelling, and it’ll do a great job.
    0:05:41 Whereas with vibe coding, I think there’s a ton of constraints
    0:05:44 as to what the model can actually do versus what it can conceptually do.
    0:05:47 And understanding those boundaries and constraints
    0:05:50 is going to define a lot of the text-to-code stuff for the next two years.
    0:05:52 Well, I’d push back a little bit on that
    0:05:54 because, of course, I agree on the coding side.
    0:05:58 And I think one of the things developers do early in a platform is they love to tell you
    0:06:03 that they’re doing something every day and it’s working, but it actually just isn’t.
    0:06:05 And that’s just what happens early in a platform.
    0:06:09 They tell you all these things that they say are easy and they’re actually not.
    0:06:12 And they spent 18 hours struggling with something that didn’t work.
    0:06:17 But on the vibe writing side, it also hits a point that I just think is so, so important,
    0:06:21 which is, yeah, you can prompt it to spew out a bunch of stuff.
    0:06:26 But if you have a job and your salary depends on you submitting that,
    0:06:28 or you’re a student and your grade depends on you submitting that,
    0:06:30 it actually better be right.
    0:06:34 And you can’t just say, look, vibe wrote this and here you go.
    0:06:37 And I think people don’t get confused when it comes to like math.
    0:06:40 Like everybody knows you have to go check to the math if you ask it to do a table
    0:06:42 and then add a column that does math.
    0:06:49 But we’re just going to see endless, endless human-wasn’t-in-the-loop vibe-written things.
    0:06:53 And it’s just that with programs, you can’t really see that right away
    0:06:56 because in order to actually distribute it or get someone to use it,
    0:06:58 you have to at least fix the initial bugs.
    0:07:02 We’ll only see them later when there are security bugs, authentication bugs,
    0:07:05 passwords stored in plain text, or a zillion other problems
    0:07:07 that are going to happen from vibe coding.
    0:07:08 In a sense, we’ve seen this already, right?
    0:07:12 We saw a bunch of lawsuits that were citing case precedent from cases that don’t exist.
    0:07:16 So maybe this is actually the operative point, which is there’s full autonomy,
    0:07:17 there’s partial autonomy.
    0:07:21 I mean, maybe partial autonomy in writing is moving us from writer to editor,
    0:07:22 but you still have to be the editor.
    0:07:24 Yeah, we should also give him credit.
    0:07:28 Many people have talked about this, but he did a fantastic job using the Iron Man analogy
    0:07:34 of how we’re going to have autonomy, partial autonomy, and a slider to control what you want.
    0:07:38 I actually think that’s a fantastic analogy and a way of thinking
    0:07:40 that gives you a very clear picture from the movies.
    0:07:46 But at the same time, people are very, very aggressive on their timeline of agents.
    0:07:50 And there’s a very, very long history in trying to automate things
    0:07:54 that turn out to be very, very difficult to automate.
    0:07:56 And he did a fantastic job.
    0:07:58 He said, people are talking about like the year of agents.
    0:08:01 Yeah, that’s a good consultant phrase.
    0:08:03 Just like he said, we’re in the decade of agents,
    0:08:10 and it’s going to take a decade for things to be anywhere near living up to agentification as a meme.
    0:08:12 It’s an interesting point.
    0:08:14 I think a lot about agents as applied to financial services.
    0:08:19 And I think there’s a set of problems in financial services that are high friction, low judgment.
    0:08:23 So for example, when I want to go refinance my personal loan,
    0:08:26 I don’t really feel attached to any specific brand of a personal loan provider.
    0:08:28 I just want the cheapest rate.
    0:08:30 So it’s actually a very low judgment decision.
    0:08:34 But going and researching and applying for a personal loan is a high friction process.
    0:08:36 That’s something I would love to delegate to an agent.
    0:08:38 I think it can do a nice job.
    0:12:42 Whereas doing my taxes, wow, like, Steven, how much risk do you want to take on your taxes?
    0:08:44 How many things do you want to report or not report?
    0:08:46 That requires an enormous amount of judgment.
    0:08:48 And of course, it also is high friction.
    0:08:51 So when I think of the two by two of where is automation going to come first,
    0:08:53 I think a lot about high friction, low judgment.
    0:08:57 I want to build on that because I actually think it’s super important to also consider
    0:09:01 that for anyone to offer the alternatives to the market,
    0:09:05 there has to be an ability to differentiate, to explain.
    0:09:08 And so you end up with this kind of thing where I just want the cheapest flight.
    0:09:13 And of course, for 20 years, all of the flight searches and stuff has worked on the cheapest.
    0:09:16 But it turns out that’s not actually what you want.
    0:09:16 That’s right.
    0:09:21 Plus, a lot of people want to intervene in presenting your choices to you.
    0:09:21 Yes.
    0:09:27 And so this idea that all choice in life is going to be reduced to some headless API.
    0:09:27 Right.
    0:09:28 I don’t understand.
    0:09:31 People have to go build that and make a living building those things.
    0:09:36 So to your example of refinancing a home, like the only reason that it can exist as a search
    0:09:42 problem today is because the different people who want to refinance you can target you with an ad
    0:09:46 and attract you as a customer and differentiate themselves on that offering.
    0:09:51 And if you can’t do that, then your ability to actually automate that task isn’t going to exist
    0:09:54 because there’s no economic incentive to just be,
    0:09:59 hi, I’m the headless, faceless, nameless, low-priced mortgage leader.
    0:09:59 That’s right.
    0:10:00 It’s not really a business.
    0:10:02 There’s nothing there.
    0:10:05 Just headless, faceless, nameless food isn’t a thing.
    0:10:07 It doesn’t show up in a white can labeled food.
    0:10:10 And then you consume it and you’re, okay, all good.
    0:10:11 I have food now.
    0:10:13 Well, maybe Soylent, but yes.
    0:10:16 In the future, in the dystopian future of Repo Man, that’s where we end up.
    0:10:17 But that’s not going to happen.
    0:10:20 I want the cheapest flight as long as it’s not on Spirit Airlines.
    0:10:20 Right.
    0:10:23 I want the cheapest flight, but I’m traveling with a family of three.
    0:10:24 I don’t want to leave at 5 a.m.
    0:10:25 No red eye.
    0:10:25 Yeah.
    0:10:27 Like, I want miles on this airline.
    0:10:31 A lot of things don’t add up to that.
    0:10:32 This is a real thing in business.
    0:10:35 It’s a thing on the producer and the consumer side.
    0:10:39 Consumers really, really want much more choice than they often think they do.
    0:10:44 And anyone who’s bought anything on Amazon knows they complain about the choice, but they
    0:10:48 really don’t want just, like, phone case to show up as the thing because it was $6.
    0:10:54 I think this is a real through line through the talk, which is partial autonomy, jagged
    0:10:55 intelligence.
    0:11:00 Karpathy is just talking a ton about the constraints of the technology, which I think is the right
    0:11:02 thing for us to be thinking through trade-offs around as builders.
    0:11:06 And he does a great job, very much as this philosopher that I love.
    0:11:08 His delivery, his tone.
    0:11:09 Don’t just go read the summaries.
    0:11:11 Don’t read a post.
    0:11:13 Go just watch the video immersively.
    0:11:18 Well, I want to get to automation and employment, particularly on the entry-level side.
    0:11:23 But first, I just want to ask the broader question of, there was this idea of AI plus human,
    0:11:26 I think it was chess, could beat AI for some period of time.
    0:11:28 And that was kind of the co-pilot view of the world.
    0:11:30 Human plus AI will have a better product.
    0:11:33 And then it turned out, I think it was chess, maybe it was Go, or maybe it was both, that
    0:11:36 actually that was a temporary thing and AI is just better.
    0:11:42 And then there’s the question as to how much of the world is like chess, where a human plus
    0:11:47 AI is only better for a certain period of time and then the models get better, or how much
    0:11:48 the world is like something else.
    0:11:51 We’re always going to want that human plus AI is just better, or we’re just always going
    0:11:52 to want humans to do it.
    0:11:56 Look, my view is that in a domain in which you have a formal definition of correctness,
    0:12:00 the path will be no autonomy, partial autonomy, full autonomy.
    0:12:05 In domains where you don’t have a formal definition of correctness or where a ton of human judgment
    0:12:10 is necessary and human choice and sort of a human direction, we’re just the right product
    0:12:12 design is not to go all the way to full autonomy.
    0:12:15 I would argue that chess and Go do have a formal definition for correctness.
    0:12:18 So it makes sense that those were fully automated over time.
    0:12:23 We’re back to the early stages of where things are, which means that a bunch of programmers
    0:12:26 are sort of defining what success looks like.
    0:12:31 And programmers are very good at either works or it doesn’t work, or I just want to automate
    0:12:36 this, or I’m going to reduce your job to a tiny shell script kind of mentality.
    0:12:40 And I just look at the world as everything is gray.
    0:12:45 And everything is much harder than it looks when you don’t actually have to do it.
    0:12:51 Ages and ages ago, I visited a really giant hospital in Minnesota to help them figure out
    0:12:54 how to use Excel within the medical profession.
    0:12:58 And the doctor just looked at me and he’s like, I don’t think you understand.
    0:13:02 He was like, my job is all uncertain.
    0:13:05 Every aspect of what I do is uncertain.
    0:13:12 So adding something that pretends to be certain, like a spreadsheet, to my uncertainty doesn’t
    0:13:13 actually help me.
    0:13:18 And so fast forward, first, I’ve spent 25 years with a doctor, but that’s a different story. But
    0:13:21 there was a story this week about radiologists.
    0:13:26 And very early on, actually, if you go back to ImageNet, everybody immediately said radiology is doomed.
    0:13:29 Oh, like you never need to get a skin cancer biopsy.
    0:13:32 You’ll just take pictures of your mole and it will just tell you.
    0:13:34 And then you find out, wow, there’s judgment there.
    0:13:38 And there’s even judgment in how to do the biopsy, and then what to biopsy,
    0:13:39 and all this.
    0:13:43 But it turns out the radiologists have, like, fully embraced AI.
    0:13:50 But they embraced it no different than they embraced the latest MRI technology or the latest
    0:13:53 software update from GE for a CAT scan.
    0:13:59 Like, I just think there are so many things like that, and so many jobs are either very,
    0:14:05 very uncertain, or most of the job is basically exception handling.
    0:14:05 Right.
    0:14:08 And, like, people are like, oh, we’re going to automate our taxes.
    0:14:13 Okay, taxes are literally a giant cascade of if and switch statements of exceptions.
    0:14:14 Yes.
    0:14:20 And so the idea that you will just automate that, well, you have to know the answer to
    0:14:20 all the exceptions.
    0:14:25 And if you’re going to prompt it with the answer to all the exceptions, then you’re doing your
    0:14:26 taxes manually.
    0:14:30 It’s sort of like once you reach a certain income, you have to get help from an accountant
    0:14:30 to do your taxes.
    0:14:34 And the first thing the accountant does is ask you for your tax planner.
    0:14:38 And as a software person, I look at it, I’m like, the tax planner really, really looks like
    0:14:42 the input fields of the software you’re using.
    0:14:46 So maybe I could just buy that software and then type it in.
    0:14:50 And I said that, and he’s like, well, you’re welcome to, but you will go to jail.
    0:14:55 And he explains: because every time I give him a number, there’s a whole decision about where
    0:14:56 to apply it.
    0:14:57 Does it work?
    0:15:00 And I’m like, well, you’re not really a farmer, so don’t fill anything in on that form and stuff
    0:15:01 like that.
    0:15:03 Automation is extremely difficult.
    0:15:08 And it’s exception bound, it’s judgment bound, and it’s all uncertain.
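Stephen’s point about taxes being exception bound can be sketched in a few lines. Everything below (the function, the rules, the thresholds) is invented for illustration, not real tax law; the point is that every branch demands a human answer, so "just automating" still means supplying the answer to every exception.

```python
# A toy illustration of taxes as cascading exceptions.
# All rules, names, and thresholds here are made up.

def toy_tax(income: float, filing_status: str, is_farmer: bool = False,
            foreign_income: float = 0.0) -> float:
    """Compute tax under a deliberately simplified, fictional rule set."""
    if is_farmer:
        # Exception: a special schedule applies; which one is a judgment call.
        raise NotImplementedError("Farm income needs its own schedule")
    if foreign_income > 0:
        # Exception to the exception: credits vs. exclusions is a decision,
        # not a computation -- the "where to apply it" problem.
        raise NotImplementedError("Foreign income needs an election")

    rate = 0.10 if filing_status == "single" else 0.08
    if income > 100_000:
        rate += 0.05  # yet another branch; real tax codes have thousands
    return income * rate

print(toy_tax(50_000, "single"))  # the easy path automates fine: prints 5000.0
```

Only the straightforward path computes; every other input immediately hits a branch that needs human judgment, which is the shape of the automation problem being described.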
    0:15:12 You know, a field in which this question comes up a ton is product management.
    0:15:16 I’ve had so many conversations with product managers over the last two years about the
    0:15:17 death of product management.
    0:15:20 It’s the end of the field, why we need PMs.
    0:15:25 And I think our sort of developer generation has developed a real resentment towards product
    0:15:26 managers, which is a different conversation.
    0:15:31 With that said, I think that the product management job is the job of addressing ambiguity.
    0:15:34 And it’s ambiguity that prevents progress from being made.
    0:15:37 Sometimes it’s execution, decision making, product design.
    0:15:39 That will not change.
    0:15:44 The nature of business and human interaction and companies is these complex adaptive systems
    0:15:46 where there will always be ambiguity.
    0:15:49 I think you’ll always need judgment, and you’ll always need somebody who looks like a product
    0:15:50 manager.
    0:15:54 Yeah, I think that really gets to the vibe coding challenge we’re dealing with, which
    0:15:57 is like, how fast can we go text to app?
    0:16:03 And I think here, what’s so interesting in the long arc of platform transitions is that we’re
    0:16:08 also having this platform transition happen not just out on the open.
    0:16:09 We’ve had that before.
    0:16:13 Like back when, in the earliest days of computing, these platform transitions happened in user group
    0:16:17 meetings, like at the Cubberley Community Center down the street, or in magazines or newsletters,
    0:16:20 and then with news groups, then the internet and so on.
    0:16:24 The whole internet was all ICQ, and it was all in the open.
    0:16:27 But now it’s like happening on CNN, on the nightly news.
    0:16:33 Everyone knows about the platform transition that’s happening, in particular on social,
    0:16:34 in Discord.
    0:16:38 And so what’s happening is you’re getting a lot of like vibe coding for clout.
    0:16:44 And so you’re getting a lot of this, I had an idea, I prompted it, and it worked.
    0:16:45 And here I am.
    0:16:47 At some point, I just go, I’m calling BS on that.
    0:16:48 That’s like not a thing.
    0:16:50 And then I sound like an old person.
    0:16:54 And because some people think I am, I don’t, but some people think I am.
    0:16:54 I don’t either.
    0:16:57 It looks like, hey, you’re just being old.
    0:16:57 Yes.
    0:17:02 But then you dig in and you find out like, wow, you’re prompting.
    0:17:06 Although it’s English-like, it turns out you’re just programming.
    0:17:06 Yes.
    0:17:08 And you’re just programming in prompts.
    0:17:08 Yes.
    0:17:12 And people are like, oh, this is what we’re going to do is we’re just going to get
    0:17:15 the model to require a little bit more structure.
    0:17:17 And I’m like, you’re writing a new programming language.
    0:17:18 Yes.
    0:17:22 And this path of text to app and vibe coding is just developing a new language,
    0:17:24 which is super cool.
    0:17:27 Lord knows the world is built on programming languages.
    0:17:32 In the 80s, if you drove slowly past the computer science department,
    0:17:35 the PhD students would invent a new programming language right then and there
    0:17:37 if you stood outside the building for too long.
    0:17:42 But we can’t lose sight of the fact that the arc of programming has been one of basically
    0:17:44 over-promise and under-deliver.
    0:17:49 When I was in college, like, the theory was the market was going to need so many programmers
    0:17:53 that the whole employment force, the whole workforce would just be software people.
    0:17:55 And that never happened.
    0:17:57 And now here we are, we’re not going to need any.
    0:17:58 They’re all just going to go away.
    0:18:02 And I think it was extreme in 1990 and it’s extreme today.
    0:18:07 And I think that the big thing is this over-promising at each transition.
    0:18:09 Even just most recently, low-code.
    0:18:11 Who even says that word anymore?
    0:18:13 Like, we’re not allowed to even mention it.
    0:18:17 And it’s because it’s always the same thing, which is, yes, if all you’re doing
    0:18:21 is a very straightforward app that looks like all the other straightforward apps,
    0:18:24 but with a domain spin or a branding logo or something.
    0:18:28 We see this with Wix and with website templates, like it’s possible.
    0:18:31 But you’re not going to run a company on any of those.
    0:18:33 I totally agree.
    0:18:37 Where I disagree, actually, is I think that the language, the language model in this case,
    0:18:42 but the language in your metaphor, is improving at a dramatic rate underneath these things.
    0:18:46 So while I think almost all these products today, they’re good at prototyping,
    0:18:50 they’re trying to push into refinement, they’re not really usable as things that you can actually
    0:18:52 deploy to production at all.
    0:18:55 In fact, most of the cool demos you see on Twitter don’t work three days later.
    0:18:59 So they’re very much in the prototyping phase, but the programming language in the metaphor
    0:19:01 is improving dramatically.
    0:19:06 I think we’ll get there, or at least make more progress than we think. Versus a traditional
    0:19:10 programming language shift like object-oriented, it didn’t feel like it 100x’d the number of programmers
    0:19:13 or cut the time to ship something to production to one one-hundredth.
    0:19:17 We just got new tools and sort of new problems to solve.
    0:19:19 Well, of course, you’re benefiting from hindsight.
    0:19:20 Yes.
    0:19:21 And that’s a key thing.
    0:19:22 First, I agree.
    0:19:26 We’re in an exponential improvement cycle with the models.
    0:19:26 Yes.
    0:19:29 So any predictive power goes out the window.
    0:19:29 Correct.
    0:19:33 And anyone who says something negative is going to be the next person who said
    0:19:36 the internet is going to be a passing fad like the fax machine.
    0:19:39 And that’s bad, you just don’t want to be there.
    0:19:43 And it turns out also, that’s a case where having lived through them, you get very shy
    0:19:48 about making predictions because you see how foolish people look for a long time.
    0:19:50 But take something like object-oriented.
    0:19:53 I mean, this thing was hyped to the moon.
    0:19:55 This is a wave of programming languages.
    0:19:58 Just to give you an idea, again, how the speed things move.
    0:20:00 They started in 1980.
    0:20:05 And by 1990, they finally reached like peak hype.
    0:20:05 Right.
    0:20:07 So it was like 10 years of incremental improvement.
    0:20:10 And by then, they were also over.
    0:20:15 Like any programmer would have kind of said, eh, it’s sort of just changing the old programming
    0:20:18 paradigms of abstraction and polymorphism and stuff like that.
    0:20:23 But meanwhile, the magazines, which were the key measure of success at the time, there was
    0:20:27 one magazine that had a drawing of a baby in diapers on the
    0:20:28 cover, about programming.
    0:20:30 How programming would be made easy.
    0:20:35 I remember seeing it at the newsstand, and I was working on the C++ compiler at the time.
    0:20:40 C++ was a brand new language in 1990, and it didn’t work yet.
    0:20:46 And here was a baby who was going to make programming possible for other babies at baby care or something.
    0:20:52 And whether it was that or all the database programming languages like Delphi or PowerBuilder,
    0:20:56 in algorithmic sense, they were all constant improvement.
    0:21:01 Like just, they added a constant factor, like plus seven, onto programming.
    0:21:04 None of them changed the mathematical order of magnitude.
    0:21:09 And what I believe is, with writing right now, it’s changing the order of magnitude.
    0:21:11 And so it’s here, it’s happening.
    0:21:13 Accuracy isn’t there.
    0:21:18 But one of the things about writing is, like, actually, when you read it, most of it in business
    0:21:19 is not really accurate already.
    0:21:21 It’s very much like autocorrect.
    0:21:27 Autocorrect fixed all the common typos, like T-E-H to T-H-E in English, and then replaced
    0:21:32 them with these wild new errors that swap what you typed for a word that
    0:21:36 has no meaning in the context of the sentence, which is what we face on phones all the time.
    0:21:41 So what we’re going to see is a whole different set of errors in business writing or academic
    0:21:46 writing in schools that just replace other errors that have always creeped in.
    0:21:46 Totally.
    0:21:49 I remember Smalltalk was the hot language, right?
    0:21:51 Well, Smalltalk was the start of it, and it was called Smalltalk 80.
    0:21:52 Yes.
    0:21:56 And then it really didn’t ever achieve any momentum outside of Palo Alto.
    0:21:57 Yes.
    0:21:59 But then C++ came along.
    0:21:59 Yes.
    0:22:02 And there were 50 languages in the middle that people don’t talk about.
    0:22:02 Yes.
    0:22:06 Like Objective-C being one of them, that was the iPhone language, which was really all Steve
    0:22:06 Jobs.
    0:22:11 There was Object Pascal and Pascal, Pascal with a relational database attached.
    0:22:13 But this was my master’s degree.
    0:22:14 Then I quit grad school.
    0:22:17 I could go on about this one for far too long, so I’ll just stop now.
    0:22:22 Do you think there will be best-selling novels that are entirely AI-generated or nearly entirely
    0:22:23 in the next few years?
    0:22:24 A hundred percent.
    0:22:28 I don’t think Stephen King is going to do that, but I think there’ll be some new writer who
    0:22:29 will probably write it under a pseudonym.
    0:22:34 And a year after the novel is written and has been made into a movie, they’ll say, oh, by
    0:22:39 the way, I got the plot idea from a prompt, and then I just had it do the writing and I
    0:22:40 was editing it along the way.
    0:22:41 Absolutely.
    0:22:45 The copyright suit that follows from training models and stuff, that’s a different issue.
    0:22:48 I think there’s two things on this, actually, that are really interesting.
    0:22:52 So one is these language models are these averaging machines.
    0:22:56 And with art, you almost definitely don’t want the average of all the novels or all the writing
    0:22:57 or all the authors.
    0:22:59 You want something that’s at the edge.
    0:23:04 So how do we actually point them in a direction such that they can be at the edge of culture,
    0:23:06 which I think is important for making great art?
    0:23:11 I think the other thing is a lot of the artists don’t yet know how to use any new tools, and
    0:23:13 we’re going to see artists that are native in the technology.
    0:23:17 Instead, what we’re seeing a lot out there, what’s called the slop, has just been a lot
    0:23:22 of this low barrier to entry art that’s being created, which is great because it gives people
    0:23:24 the sort of fulfillment of creative generation.
    0:23:29 I think what we’re talking less about is, hey, how is the ceiling being raised for artists
    0:23:30 because they have access to these technologies?
    0:23:36 Without going all Duchamp on what is art, I mean, bad sitcoms are part of society too,
    0:23:38 but I think it’s important.
    0:23:44 We tend to focus on like the very, very best of things, but most everything isn’t only the
    0:23:45 very best.
    0:23:47 In business writing, it’s all slop.
    0:23:48 Yeah.
    0:23:53 I mean, this is why, look, I’ve written a lot of business writing, so I can say this confidently
    0:23:56 about what I’ve written and what gets written.
    0:24:01 But take something completely mundane that a lot of people in Silicon Valley spend a lot
    0:24:04 of time working on, like the enterprise software case study.
    0:24:12 I’m telling you, GPT generates a better enterprise case study faster than the typical marketing
    0:24:15 associate at a company does, with like one-millionth the effort.
    0:24:16 Yes.
    0:24:18 And does the content need to exist?
    0:24:19 It actually does.
    0:24:21 It’s just an important part of the selling process.
    0:24:22 Yes.
    0:24:26 And so at the extreme, like with something like medical diagnosis, we tend to think about
    0:24:32 the most obscure diseases, the most difficult to understand problems, with the finest hospitals,
    0:24:33 with the most resources.
    0:24:39 But you have to remember, like 80% of the world has no access to anything.
    0:24:39 Right.
    0:24:40 Yes.
    0:24:48 So wherever you think medical LLMs are on the slop scale, most people don’t have
    0:24:50 access to anything even average.
    0:24:55 So we have to just make sure that the whole debate does not center around, like, what is
    0:24:58 Francis Ford Coppola using as the book?
    0:24:59 And who are the actors?
    0:24:59 Yes.
    0:25:01 And who is the cinematographer?
    0:25:06 Because that corporate case study, well, they often go and interview the person and film it.
    0:25:09 Well, like, all of a sudden, we see it today.
    0:25:10 Those things are done over Zoom.
    0:25:11 Yes.
    0:25:17 So suddenly, instead of flying in or booking a satellite feed, we’ve changed our view of excellence
    0:25:19 because we wanted more access.
    0:25:22 And I think that’s absolutely going to happen.
    0:25:24 Should you get graded on slop in school?
    0:25:25 That’s a different problem.
    0:25:27 But most stuff is pretty average.
    0:25:29 The world needs more slop, says Stephen.
    0:25:33 I feel like this is an oppressive interview where you can put words in my mouth like that.
    0:25:33 Yeah.
    0:25:34 The world needs more slop.
    0:25:35 That’ll be the title.
    0:25:37 Well, so actually, Mark makes this point.
    0:25:40 I think it’s a really good one, you know, which is, is the bar for success perfection?
    0:25:43 Is the bar for success what people can do today?
    0:25:46 Or is the bar for success just something that’s better than the alternative?
    0:25:50 And in your case of 80% of the world that has access to no medical knowledge,
    0:25:54 no medical services, no medical opinion, of course, this is dramatically better.
    0:25:54 Yeah.
    0:25:58 I look at it like when I had to get permission to use a word processor in college,
    0:26:05 one of the stumbling blocks was that my printer was like an Epson MX-80 dot-matrix printer.
    0:26:09 And it looked like a printer, like a computer printer,
    0:26:13 which the rules for the papers were they had to be written on a typewriter.
    0:26:13 Yes.
    0:26:17 And then the Macintosh came out in the spring and only had the ImageWriter,
    0:26:19 which is another dot-matrix printer.
    0:26:25 So all of a sudden the standard changed because the value of being able to revise and edit and
    0:26:32 update and copy paste and use fonts was just so much higher than the fidelity of the teacher
    0:26:35 reading it on bond paper with courier.
    0:26:37 And that’s going to happen with content as well.
    0:26:41 What I would love to talk to you about, Stephen, is actually just hearing your take on
    0:26:44 Google I/O.
    0:26:46 Oh, yeah, yeah.
    0:26:51 So essentially there was a lot of conversation around Google and how Google had sort of fallen
    0:26:54 behind and lost their ability to make new things.
    0:26:57 They released a ton of new software at every part of the stack in I.O.
    0:27:00 What do you think that says for Google about Google?
    0:27:02 Do you think the sort of demise of Google is overstated?
    0:27:06 Well, of course, I think the demise of Google is an absurd proposition.
    0:27:11 The demise of a giant company is a crazy thing to say.
    0:29:17 Driving in, I was listening to CNBC, some investor or whatever talking their book, saying IBM
    0:29:19 is the one to buy.
    0:29:24 I almost wanted to pull over to the side of the road and think, what universe am I in
    0:29:28 where this company has died like nine times in my career?
    0:27:28 Right.
    0:27:30 And so death of is just such a done thing.
    0:27:34 Losing a position of influence, however, is a very real thing.
    0:27:40 In these platform transitions, big companies have an enormous asset, which is the shock and
    0:27:41 awe asset.
    0:27:46 And so they have the ability to tell the story called we’re pivoting our whole company around
    0:27:49 this and we’re a zillion-dollar company.
    0:27:59 And here is like a full assault across the board for every single asset we have and every
    0:28:04 single category the world is talking about that matters.
    0:28:06 And that’s what you could do.
    0:28:11 Someone was asking me on Twitter yesterday about this event Microsoft held in 2000 called
    0:28:12 Forum 2000.
    0:28:18 And it was when we announced like a whole bunch of internet stuff and the early cloud stuff.
    0:28:21 It wasn’t called cloud, but early cloud stuff.
    0:28:24 And nobody in that room understood what we were talking about.
    0:28:25 Not a person.
    0:28:32 But they all left like, oh my God, there is so much stuff here, which is a repeat of five
    0:28:35 years earlier when we did what was called Internet Strategy Day.
    0:28:39 And like the headlines were literally sleeping giant awoke.
    0:28:46 And so it was totally predictable that Google would show up with like literally the B-2 bombers
    0:28:48 of software.
    0:28:52 But the question is really much deeper than that.
    0:28:59 And it’s really, will they alter their context of how they build products and their go-to-market?
    0:29:03 Because that’s really what undermines the big technology companies.
    0:29:09 And so with Microsoft, the interesting thing was all those products that got announced over
    0:29:13 that five-year span or 10-year span, none of them are around today.
    0:29:15 I should be very careful every time I say something like this, I get assaulted.
    0:29:22 But like the big announcement at Forum 2000 was the .NET Framework and C#, which by
    0:29:26 almost any measure one would call a legacy platform today.
    0:29:30 So it like came and went in six or seven years.
    0:29:36 And everything was about virtual machines and clustering and all this stuff that VMware was
    0:29:36 doing.
    0:29:38 And that’s not where anything was.
    0:29:42 And on top of all that, the economic model became SaaS.
    0:29:49 And so what I’m looking at with Google is not, can they present all the technologies in the
    0:29:56 context of Google search and ads, but can they transform the way they think to something
    0:29:56 new?
    0:29:59 Because that’s really where the disruption is going to happen.
    0:30:00 I love that point.
    0:30:00 Cool.
    0:30:01 Awesome.
    0:30:03 Anish, Stephen, thanks so much for this weekend.
    0:30:03 Sure thing.
    0:30:04 Thank you.
    0:30:04 Super fun.
    0:30:09 Thanks for listening to the A16Z podcast.
    0:30:14 If you enjoyed the episode, let us know by leaving a review at ratethispodcast.com slash
    0:30:15 A16Z.
    0:30:18 We’ve got more great conversations coming your way.
    0:30:19 See you next time.

    In this episode of ‘This Week in Consumer’, a16z General Partners Anish Acharya and Erik Torenberg are joined by Steven Sinofsky – Board Partner at a16z and former President of Microsoft’s Windows division – for a deep dive on how today’s AI moment mirrors (and diverges from) past computing transitions.

    They explore whether we’re at the “Windows 3.1” stage of AI or still in the earliest innings, why consumer adoption is outpacing developer readiness, and how frameworks like partial autonomy, jagged intelligence, and “vibe coding” are shaping what gets built next. They also dig into where the real bottlenecks lie, not in the tech, but in how companies, products, and people work.

     

    Resources: 

    Find Anish on X: https://x.com/illscience

    Find Steven on X: https://x.com/stevesi

    Watch Andrej Karpathy’s talk: https://www.youtube.com/watch?v=LCEmiRjPEtQ

     

    Stay Updated: 

    Let us know what you think: https://ratethispodcast.com/a16z

    Find a16z on Twitter: https://twitter.com/a16z

    Find a16z on LinkedIn: https://www.linkedin.com/company/a16z

    Subscribe on your favorite podcast app: https://a16z.simplecast.com/

    Follow our host: https://x.com/eriktorenberg

    Please note that the content here is for informational purposes only; should NOT be taken as legal, business, tax, or investment advice or be used to evaluate any investment or security; and is not directed at any investors or potential investors in any a16z fund. a16z and its affiliates may maintain investments in the companies discussed. For more details please see a16z.com/disclosures.

  • Building Cluely: The Viral AI Startup that raised $15M in 10 Weeks

    AI transcript
    0:00:02 We wrote our first lines of code 10 weeks ago.
    0:00:05 We’re like earlier than the latest YC batch of companies,
    0:00:08 yet we’re like generating probably more revenue than every single one of them.
    0:00:13 It’s been so hard to hear through the noise of everything in AI,
    0:00:15 especially for the consumer-facing stuff.
    0:00:19 To do that consistently is actually way harder, near impossible.
    0:00:22 I heard someone call it rizz marketing, which is a compliment.
    0:00:24 They’re like, I hate this rizz marketing.
    0:00:26 AI is so magical.
    0:00:29 We like built the digital god, locked it in a chatbot.
    0:00:32 Six months ago, I was some random college kid in a dorm,
    0:00:35 and now I feel like I’m at the center of the tech universe.
    0:00:40 What happens when a founder treats virality not as a tactic, but as the product?
    0:00:45 Roy Lee, co-founder and CEO of Cluely, is either a generational entrepreneur
    0:00:47 or a walking internet experiment.
    0:00:48 Maybe both.
    0:00:52 In just a few months, he’s built one of the most talked about startups in tech,
    0:00:54 not with a massive fundraise or a polished product suite,
    0:00:58 but through relentless short-form content, polarizing stunts,
    0:01:02 and a translucent AI overlay that feels more like performance art than interface.
    0:01:07 Joining me today is Roy, along with Brian Kim, who led A16Z’s investment in Cluely.
    0:01:10 We talk about Roy’s approach to building in public,
    0:01:13 what Gen Z understands about attention that tech still doesn’t,
    0:01:15 and why momentum might be the new moat in consumer AI.
    0:01:18 This is a conversation about distribution as design,
    0:01:20 founder market fit at internet speed,
    0:01:24 and what happens when you stop trying to be professional and start trying to win.
    0:01:25 Let’s get into it.
    0:01:30 As a reminder, the content here is for informational purposes only,
    0:01:33 should not be taken as legal, business, tax, or investment advice,
    0:01:36 or be used to evaluate any investment or security,
    0:01:40 and is not directed at any investors or potential investors in any A16Z fund.
    0:01:46 Please note that A16Z and its affiliates may also maintain investments in the companies discussed in this podcast.
    0:01:49 For more details, including a link to our investments,
    0:01:53 please see A16Z.com forward slash disclosures.
    0:02:00 Roy, man, the myth, the legend, the man of the moment,
    0:02:02 Cluely is the current thing.
    0:02:02 Yes.
    0:02:03 How does it feel?
    0:02:07 The announcement the other day, a lot of love, a little bit of hate.
    0:02:07 Yeah.
    0:02:11 How do you react to this?
    0:02:12 I mean, it’s pretty crazy.
    0:02:17 I think, like, literally six months ago, I was some random college kid in a dorm,
    0:02:19 and now I feel like I’m at the center of the tech universe.
    0:02:25 And the more astonishing thing is how correct my assumptions on virality have been.
    0:02:30 I think it’s growing increasingly clear that people on X and LinkedIn are behind,
    0:02:33 and there’s such an extremely small intersection of people who understand
    0:02:36 how developed the algorithm is on, like, Instagram and TikTok,
    0:02:40 and people in tech on, like, Twitter and LinkedIn.
    0:02:44 And there’s just such a small intersection that it’s inevitable that my predictions will be right.
    0:02:46 And it’s been crazy to see that play out in real time.
    0:02:47 Yeah.
    0:02:49 Elon’s reaching out.
    0:02:51 Meta’s offering a billion-dollar acquisition offer.
    0:02:53 Brian, we said not to do it, of course.
    0:02:54 Of course, of course.
    0:02:54 I’m to the moon.
    0:02:57 Brian, what is your reaction seeing this play out and having led the investment?
    0:03:00 Yeah, well, first, he’s a bro of the moment.
    0:03:04 I love the little snippets of, like, you calling Molly bro, like, 20 different times.
    0:03:05 Amazing.
    0:03:07 Look, Roy, this is fantastic.
    0:03:09 It’s so interesting.
    0:03:14 I remember right before the announcement, you had already sent me the video 24 hours ago,
    0:03:16 like, bro, this is what we’re going to show.
    0:03:16 Yeah.
    0:03:18 And I was like, okay, like, this is great.
    0:03:20 And I think it’s going to do well.
    0:03:22 I didn’t think it would do that well.
    0:03:23 Yeah, yeah, yeah.
    0:03:25 And I didn’t think it would create that much controversy.
    0:03:31 But I also, I think, underestimated how positive people will view this.
    0:03:36 There’s a lot of meta-analysis on you out there, which is, like, really, really cool.
    0:03:39 And there are people who are going, like, two to three layers deep.
    0:03:45 And I’ve actually read something that, like, linked a meta-analysis of A16Z with Cluely
    0:03:50 and how there’s, like, this sort of fungibility of thoughts and things like that, which is, like, mind-blowing.
    0:03:51 I have no idea what the fuck they’re talking about.
    0:03:53 So, it’s just been incredible.
    0:03:55 You put people doing literary criticism.
    0:03:57 No, I’m not kidding.
    0:04:02 There’s literally a guy who wrote, like, fungibility of thoughts with A16Z and Cluely.
    0:04:03 Yeah, yeah.
    0:04:04 That’s funny.
    0:04:06 People always over-read into stuff.
    0:04:07 They’re like, oh, this has this master plan.
    0:04:08 A16Z is trying to do this whole.
    0:04:09 That’s what I’m saying.
    0:04:10 Everyone.
    0:04:10 Yeah.
    0:04:12 No, don’t tell them, Eric.
    0:04:13 We are playing 4D chess here.
    0:04:14 Yeah, it’s like, it’s fine.
    0:04:16 10 threads, analyzing the left-hand handshake.
    0:04:18 It’s like, guys, come on.
    0:04:19 Some of the haters, I tweeted something.
    0:04:20 I was talking to Brian.
    0:04:22 I tweeted, how you feel about Cluely is how you feel about yourself.
    0:04:26 But I deleted it because I didn’t want to trigger people on Twitter.
    0:04:27 I’ll talk shit on a podcast.
    0:04:28 Maybe I’ll get clipped to Twitter.
    0:04:30 But I don’t want to inflame people.
    0:04:31 It’s fascinating to see.
    0:04:36 So, I want to go back to the beginning because this isn’t a story that has been told a lot.
    0:04:38 We were talking about the Amazon interview.
    0:04:42 But maybe let’s go back even further when we think through where you are now.
    0:04:47 Talk about, like, the threads or the through lines from your childhood that can help make
    0:04:49 sense of who you are and kind of this moment.
    0:04:55 I think from birth, the most character-defining feature of me has been, like, attention-grabbing
    0:04:56 and provocative.
    0:04:58 This played out elementary, middle, and high school.
    0:05:01 I was always, like, I had a camp of people that loved me and a camp of people that hated
    0:05:02 me.
    0:05:03 I was, like, always the boldest.
    0:05:04 Like, I would say the craziest shit.
    0:05:07 Everything that was on my mind, just no filter, I would say it.
    0:05:10 And this ended up with a lot of people liking me and a lot of people just really disliking
    0:05:11 me.
    0:05:13 And I think things culminated senior year of high school.
    0:05:14 I did well in school.
    0:05:16 I got accepted to Harvard early.
    0:05:19 And then later that year, I was just always doing crazy shit.
    0:05:22 And I snuck out of a school field trip, passed curfew.
    0:05:25 Like, the police had to come and escort me back because we were out at 2 a.m.
    0:05:27 We were all, like, 16, 17, 18.
    0:05:29 And then I got a suspension for that.
    0:05:33 And that was, like, when the camp of Roy haters took the storm and reported everything
    0:05:34 everywhere.
    0:05:36 And it ended up getting me rescinded from Harvard.
    0:05:40 And then that kind of started my journey of, I think, like, entrepreneurial, wanting to
    0:05:41 actually swing big at building companies.
    0:05:44 And that’s when I felt like my life took such a crazy turn.
    0:05:48 For context, my parents literally run, like, a college admissions consulting company.
    0:05:51 So we literally teach kids how to get into Harvard.
    0:05:53 And the youngest son of the family, like, gets fucking rescinded from Harvard.
    0:05:54 It’s, like, the worst thing ever.
    0:05:56 So we decided, let’s keep this quiet.
    0:05:59 Do you still have the same test scores and application and everything?
    0:06:00 Maybe next year you’ll get into a different school.
    0:06:03 So I spent, like, an entire year at home.
    0:06:07 And I underestimated how mentally tormenting that would be.
    0:06:11 Like, I’m probably the most extroverted person you might have ever met in your life.
    0:06:14 I cannot stay maybe, like, eight hours without talking to someone.
    0:06:17 And to spend a year alone, like, it made me think, man, my life is so crazy.
    0:06:22 I might as well just quintuple down on every single crazy belief and thought I have and just live
    0:06:23 the most interesting life ever.
    0:06:26 So that was, like, the moment where I decided, like, I’m all in on building companies.
    0:06:28 There’s no way I can do anything else.
    0:06:32 I was wondering if you were going to have a moment of, like, oh, this has set me back
    0:06:32 in some ways.
    0:06:34 Maybe I should reform or tame it down.
    0:06:37 But you took the opposite of, like, no, this is who I am and it’s going to work.
    0:06:41 Yeah, you just sit in a room by yourself for 12 months and all of a sudden your craziest
    0:06:44 thoughts become logical and there’s no one else.
    0:06:47 Like, the echo chamber is you in your brain and it amplifies everything.
    0:06:50 And I think that’s the reason why I am, like, the person that I am today and willing to
    0:06:51 make the bets that I am today.
    0:06:56 Later, I go to community college in California at the behest of my parents.
    0:06:58 I think California, this is, like, a middle ground.
    0:07:01 In California, I have the chance to build a company, and at community college, I have the chance
    0:07:04 to get the education that my Asian parents always dreamed of.
    0:07:05 So I do that.
    0:07:09 And then later I get into Columbia and I have to go to Columbia for at least a semester to
    0:07:10 appease my parents.
    0:07:13 I go to Columbia and the first thing I’m thinking is, like, can I find a co-founder and a wife?
    0:07:17 Like, those are the only two things that I’m looking for in college and still looking for
    0:07:17 the wife.
    0:07:21 But on pretty much the first day, that’s when I met Neil, my co-founder, started hacking
    0:07:22 on a bunch of things.
    0:07:24 And the one thing that worked was the earliest version of Cluely.
    0:07:25 Wow.
    0:07:29 And were your parents ever trying to reform you or moderate you?
    0:07:31 Or were they kind of like, hey, Roy’s going to be Roy?
    0:07:36 Almost every single day of my life until I got into Harvard, then they calmed down and
    0:07:38 they’re like, wow, this kid really made it.
    0:07:42 And then when I got kicked from Harvard again, they were like on my ass a lot until I got into
    0:07:43 Columbia again.
    0:07:43 And they’re like, wow.
    0:07:46 After all this, he gets back into the Ivies.
    0:07:47 He’s like, I guess I really can trust him.
    0:07:51 And his unorthodox swings will be home runs.
    0:07:52 And is that where they’re at now?
    0:07:52 Yeah.
    0:07:54 That’s actually what I’m going to ask.
    0:07:57 You mentioned before that your parents, they will love you no matter what.
    0:07:58 Yeah, yeah, yeah, yeah.
    0:08:00 However, I am curious what they think now.
    0:08:05 Man, it’s crazy how lax they’ve gotten since I got back into Columbia.
    0:08:06 Like, now they’re okay with anything.
    0:08:08 When I told them, hey, mom, I’m dropping out to do this.
    0:08:10 Like, she’s like, oh, like, okay.
    0:08:11 You know, like, I expected it.
    0:08:12 Why didn’t you drop out sooner?
    0:08:14 Why did it take a semester and a half?
    0:08:16 Because at this point, I was, like, convincing my co-founder to drop out with me.
    0:08:18 And she’s like, man, like, took it long enough.
    0:08:21 So they’re, like, totally on board with all the crazy shit that I do.
    0:08:21 Wow.
    0:08:24 One of the things we were talking about in the context of going viral, I heard you say,
    0:08:28 is that Twitter is two years behind Instagram.
    0:08:28 Yeah.
    0:08:29 Or behind the other platforms.
    0:08:35 Talk a little bit about when your sort of provocativeness sort of translated over to Twitter
    0:08:36 or just, like, the digital mediums.
    0:08:37 Like, how did that strategy evolve?
    0:08:39 And let’s talk about the difference in the platforms.
    0:08:40 Yeah, this goes way back.
    0:08:44 But many years ago, when YouTube first came out as a platform,
    0:08:45 this was, like, the turning point of everything.
    0:08:48 This democratized, essentially, content.
    0:08:49 And now you weren’t paying for commercials.
    0:08:55 And the visibility and publicity of content was not gated by the amount of money you’re willing
    0:08:56 to spend on ads or TV space.
    0:08:58 It’s just gated by the quality of content.
    0:09:03 And five years ago, when TikTok came out and short-form algorithms really started taking over,
    0:09:05 that shifted the frame once again.
    0:09:07 So now it’s not about how much good content you make.
    0:09:10 It’s literally just about how much content can you make.
    0:09:14 There is simply not enough good content out there for the average person to consume,
    0:09:18 which is why you see the same brain rot reels over and over and over again, over and over again.
    0:09:22 You see the same Minecraft parkour video over and over just because there’s literally not enough content
    0:09:23 for the average consumer to consume.
    0:09:26 And people have not caught on to a few things.
    0:09:30 First, you just need to make more content than a consumer can consume.
    0:09:33 Like, most people don’t know how to make viral content.
    0:09:36 Content that any person can watch, consume, and is digestible.
    0:09:40 And everyone on X/LinkedIn is trying to be, like, the most intellectual,
    0:09:41 like, thoughtful person.
    0:09:45 And they’ll generate some slop that maybe, like, 200 people in the world can actually understand and make sense of.
    0:09:49 But, of course, they want to seem like the most interesting, thoughtful, like, intelligent person.
    0:09:50 But this just lacks viral sense.
    0:09:52 There’s not enough viral content to go out.
    0:09:57 And the second thing is that the algorithms really promote the most controversial things.
    0:10:02 And for people on X/LinkedIn, there aren’t enough controversial things being rewarded.
    0:10:07 So when I come out swinging out of the gate, I’ve been on Instagram, TikTok for the past 10 years of my life,
    0:10:10 and I understand what level of controversialness you need.
    0:10:14 I dip the slightest toe into controversialness, and all of a sudden, X/LinkedIn explodes.
    0:10:18 Because the algorithm inherently highly, highly rewards this stuff.
    0:10:21 And as a result, it’s just getting shoved into everyone’s feed, and they don’t understand.
    0:10:26 But I’m just, like, literally applying the same principles of controversialness from IG and TikTok onto X/LinkedIn.
    0:10:29 And they’re just so not ready for it that it feels like the craziest thing ever.
    0:10:36 And this is something that I’ve said before, but I guarantee, like, my videos do not go as viral on Instagram and TikTok.
    0:10:39 And the sole reason is because on those platforms, they are not controversial enough.
    0:10:43 And on those platforms, there are literally people committing felonies in public,
    0:10:45 or at least insinuating, like, they are committing felonies.
    0:10:47 Even then, it’ll be like, oh, good try, bro.
    0:10:48 It’s not interesting enough.
    0:10:50 And, like, X/LinkedIn, they just have not caught on.
    0:10:53 There’s not enough creators out there who are willing to press the controversial button.
    0:10:55 Mark says this sometimes, right?
    0:10:56 Like, supply chain of the meme.
    0:10:57 Yeah.
    0:11:05 It’s like the strangest concept, but the meme actually travels from, like, Reddit, and then it goes to X, and then it goes to Instagram, then LinkedIn, then CNBC.
    0:11:09 And there’s, like, a train that it goes that certain people are just early, late.
    0:11:15 But when you actually flip it with virality and controversialness, maybe that flips a little bit.
    0:11:16 Well, and sometimes it starts in, like, 4chan or something.
    0:11:19 That’s what I mean, like, I didn’t even want to say it.
    0:11:23 It’s 4chan, Reddit, then Twitter, then Instagram and LinkedIn.
    0:11:31 But for you, it’s actually flipped: Instagram comes before Twitter in terms of the raunchiness or craziness of it.
    0:11:32 Yeah, yeah.
    0:11:41 I mean, I just feel like if the average person who’s in the X comments hating about how controversial this is spent one hour looking through the timeline that is my Instagram feed,
    0:11:42 their brains would melt and explode.
    0:11:46 Like, they would not be able to comprehend how are people, like, digesting this at scale.
    0:11:56 It’s funny because ever since Elon took over X, some people start complaining of, oh, this stuff has gotten too controversial or too much dark stuff or too much negative content.
    0:11:59 And, yeah, it turns out it’s just the beginning.
    0:12:00 Yeah, yeah.
    0:12:02 I mean, literally, this is what the future of content is going to be.
    0:12:04 You’re not going to get more millennial founders.
    0:12:06 You’re only going to get more Gen Z founders.
    0:12:09 And I guarantee you, their backgrounds and content are the exact same as mine.
    0:12:10 And they know exactly what I’m doing.
    0:12:12 I’m sort of like the canary that’s leading the way for this.
    0:12:14 But I guarantee you, there’s more of this coming.
    0:12:15 And it’s inevitable.
    0:12:16 And just embrace it.
    0:12:20 When did you realize that this was the way you were going to build a company?
    0:12:24 How did you intuit, hey, distribution is a scarcity, distribution is what matters?
    0:12:26 Because there’s a lot of creators out there, but they’re not combining it with a tech company.
    0:12:28 Like, how and when did you put this together?
    0:12:33 I guess there was a certain point where I kept going viral that I sort of realized that
    0:12:37 I know something that X/LinkedIn people don’t know yet.
    0:12:40 And it is sort of like mastery of the algorithm.
    0:12:43 And I think everything started with the Interview Coder situation.
    0:12:45 Interview Coder was the earliest prototype of Cluely.
    0:12:48 And it was a tool to let you cheat on technical interviews.
    0:12:50 And I used it to cheat my way through an Amazon interview.
    0:12:51 I made it super public.
    0:12:54 I posted it everywhere and ended up getting me like blacklisted from big tech and kicked
    0:12:55 out of school.
    0:12:57 And that situation was inherently viral.
    0:13:00 Like, when’s the last time someone got kicked out of an Ivy League and raised $5 million?
    0:13:02 Like, this has probably never happened in the history of humanity.
    0:13:04 So that situation was inherently viral.
    0:13:08 And at that time, I had no idea that this was like a repeatable thing that I could do.
    0:13:10 But then the launch video happened.
    0:13:12 And I had my intuitions about the virality of launch video.
    0:13:13 And I just kept scrolling on Twitter.
    0:13:17 And I was wondering, like, man, why is nobody doing what Avi Schiffman with friend.com
    0:13:19 showed the world you could do a year ago?
    0:13:20 Like, why has nobody done this yet?
    0:13:21 And it worked.
    0:13:23 And then I did the 50 interns thing, and it worked.
    0:13:25 And like, I kept doing viral video after viral video.
    0:13:29 And at a certain point, I just realized, like, holy shit, people on X/LinkedIn, they have
    0:13:30 not caught on yet.
    0:13:35 And this is the massive alpha that I’m trying to capture here: they have not caught
    0:13:40 on to what it means to master the short-form algorithm.
    0:13:46 And as a result, I am able to dominate the timeline for not the past week, but like, probably
    0:13:49 the past few months, just because people on X/LinkedIn have not caught on.
    0:13:50 And for some reason, they still refuse to catch on.
    0:13:55 And so just to put a finer point on it, the 50 interns, maybe explain this idea that basically
    0:13:58 at your company, you either have engineers or you have creators.
    0:14:00 Yeah, there are only two roles.
    0:14:04 You are either a world-class engineer who is building the product, or you are a world-class
    0:14:05 influencer.
    0:14:09 And for full-timers, every single person has over 100,000 followers on some social media platform.
    0:14:14 It is the only way to prove that you actually have mastery over virality and you understand
    0:14:15 what it takes.
    0:14:19 And I think if any company in the world has a marketing team and the head of marketing does
    0:14:21 not have at least 100,000 followers, you need to replace them.
    0:14:22 Like, the game has changed.
    0:14:26 And so do you think this is a strategy that other companies should also be employing, whether
    0:14:31 it’s intern-based or just like having an army of creators and sort of deploying them towards
    0:14:31 their end?
    0:14:33 Yeah, I’ll go a bit deeper in the interns.
    0:14:36 So Cluely made a pretty viral video announcing that we were hiring 50 interns and you’d be
    0:14:38 in here making content all day.
    0:14:40 And essentially, that’s almost what we do.
    0:14:42 We have like over 60 contractors.
    0:14:46 These contractors get paid per video and they just are forced to sit in front of a camera
    0:14:48 and make TikTok and Instagram videos about Cluely.
    0:14:50 And this is what marketing looks like.
    0:14:52 This job did not exist five years ago.
    0:14:55 Like, how do you explain a job where you sit in front of a camera and make five-to-ten-second
    0:14:59 videos that seemingly make no sense to anybody, but just consistently generate millions
    0:15:00 of views?
    0:15:03 That’s not a job that makes sense to people, but that is our internship.
    0:15:05 That’s what like a modern day marketing internship looks like.
    0:15:09 And we pay very little money for the amount of views that we get.
    0:15:13 And different companies, they’re paying literally millions of dollars for Super Bowl ads when
    0:15:17 you can get the same quality and quantity of views for $20,000.
    0:15:18 Did you see it converting?
    0:15:19 Yes, yes, yes.
    0:15:19 Of course.
    0:15:21 I mean, like, those numbers are real and clear.
    0:15:24 Our only converting videos are the ones that we have on IG and TikTok.
    0:15:24 Yeah.
    0:15:29 Bryan, why don’t you share your story of how you got excited about Cluely, of how you and
    0:15:32 Roy met and built this relationship and how this partnership formed?
    0:15:37 I had a good contact in New York who travels in the young cracked-folks circles.
    0:15:39 And her name is Ali DeBow.
    0:15:42 And one of the lists that she sent had Roy in it.
    0:15:44 And I like read what they’re working on.
    0:15:49 And it sort of reminded me of, oh, like a scout thing or it’s on the edge of reality.
    0:15:50 I’m like, oh, this is interesting.
    0:15:50 I want to talk to him.
    0:15:51 So I reached out.
    0:15:53 Roy, if you remember, I just reached out.
    0:15:54 Hey, I heard your name, blah, blah, blah.
    0:15:54 We should talk.
    0:15:56 You’re like, oh, bro, let’s talk.
    0:16:00 And then a day later, you wrote back, like, actually, you’re multi-stage.
    0:16:01 I don’t want to talk to you.
    0:16:03 My advisor says, don’t want to talk to you.
    0:16:03 Go away.
    0:16:05 And then what do I do?
    0:16:07 Like, I could have just stood down, but I said, okay, fine.
    0:16:09 I promise you will not talk about fundraising.
    0:16:10 Let’s just talk.
    0:16:11 Like, I want to meet you.
    0:16:12 I want to talk to you.
    0:16:13 I want to build a relationship.
    0:16:14 Thankfully, you agreed.
    0:16:15 We got on a quick call.
    0:16:18 We chatted a little bit and you shared your origin story.
    0:16:19 I’m like, oh my God, it’s good.
    0:16:19 It’s amazing.
    0:16:20 It’s so cool.
    0:16:21 Like, I’m glad he’s doing this.
    0:16:24 Sadly, he’s not accepting money, but that’s okay.
    0:16:24 Yeah.
    0:16:26 And then I tracked you, I tracked your Twitter.
    0:16:28 I tracked what you were doing, the 50 interns thing.
    0:16:32 You moved to San Francisco and I had it in my mind.
    0:16:35 Okay, next time opportunity strikes, I’m just going to show up.
    0:16:39 So I think I somehow got your phone number or some such thing and texted you, yo, like I’m
    0:16:40 at your office.
    0:16:41 Can I hang?
    0:16:44 So I come and you say, yeah, like, that’s great.
    0:16:44 Come up.
    0:16:47 And what I actually thought was really, really cool:
    0:16:52 There were a couple of steps, but step one was, there was, like, an engineer who randomly
    0:16:58 found you on Twitter or Instagram and had just come up. Like, you did not know him.
    0:16:59 He did not know you.
    0:17:03 He just came to your office, came in to say hi, wanted to say hello.
    0:17:06 And one of your friends, I think Nicholas was just there hanging out.
    0:17:11 And the quality of the people, the fact that random engineers were knocking on the door
    0:17:15 to come talk to you randomly was just like, oh, there’s something like really strange and
    0:17:16 special happening here.
    0:17:20 And all of your team members sitting in the thing and like doing things and creating content
    0:17:22 and you and Neil working on the product.
    0:17:23 It was like, oh, like there’s something very special.
    0:17:28 And I sort of was thinking, oh, like, is this something that we should sort of back?
    0:17:30 And then I think one more video was made or something.
    0:17:35 And next time I visited, I think I came with some stuff or, you know, you’re eating steak.
    0:17:41 And then you flash some metric or something where I realize that you’re converting this like
    0:17:44 awareness and eyeballs into money, dollars.
    0:17:45 You like drop some numbers.
    0:17:47 You’re like, oh, yeah, we’re doing this many revenue.
    0:17:48 And that’s where we’re going.
    0:17:49 And guess what?
    0:17:51 Like some enterprise customer wants to talk to us.
    0:17:52 I don’t know why.
    0:17:52 Blah, blah, blah, blah, blah.
    0:17:59 And that’s when I sort of realized, oh, he is actually able to like convert this awareness
    0:18:03 and distribution that you’re getting into real dollars.
    0:18:06 And I don’t know many people who know how to do that.
    0:18:11 And during that time, I was already writing this thing called momentum as a moat because
    0:18:17 it’s been so hard to pierce through the noise of everything in AI, especially in the sort
    0:18:19 of consumer-facing, prosumer-facing space.
    0:18:23 To do that consistently is actually way harder, near impossible.
    0:18:29 And so I had the theory that, oh, like companies who know how to do that, companies know how
    0:18:31 to build at that speed are going to be the winners.
    0:18:34 And I felt like I have found a person who was doing that.
    0:18:37 And so I think we moved very quickly.
    0:18:42 I told you, look, like, just hit download, hit download on the Stripe data, hit download,
    0:18:43 send it to us.
    0:18:44 I won’t ask any more questions.
    0:18:45 Yeah.
    0:18:46 And then we’ll have a chat.
    0:18:48 And that hopefully is what we delivered.
    0:18:54 We quickly scrambled to do a very fun small in-person chat with you and some of the partners,
    0:18:57 where you called some of them old, bald, and boring.
    0:19:00 And we were excited to do the deal.
    0:19:06 I think I was at an LP summit running around in Las Vegas, trying to call you to get to
    0:19:07 terms, et cetera.
    0:19:08 And that’s how it all worked out.
    0:19:14 And after a while, I brought you five, six pounds of steak in excitement over the deal.
    0:19:18 I want to double click on the momentum-as-a-moat piece, Bryan, because you have an interesting
    0:19:20 background in that you’ve been doing something similar for a while.
    0:19:22 You worked at Snap beforehand, among other places.
    0:19:28 And I remember Ben Thompson had this post about Snap where he said that Snap has a gingerbread
    0:19:32 strategy where basically if they invent stuff, Meta is going to copy it.
    0:19:34 So they just have to keep inventing stuff.
    0:19:36 And I guess it’s called the breadcrumbs or something.
    0:19:38 And you also have backed a number of these.
    0:19:43 You’ve had a front row view to, hey, these network effects aren’t as necessarily defensible
    0:19:44 as they once were.
    0:19:47 And so companies need to keep innovating, keep pushing.
    0:21:52 And so share more about how this theory of momentum was born, especially as it applies to AI
    0:21:54 companies, and what it really means for defensibility.
    0:19:58 Yeah, this might be somewhat interesting to you, Roy and folks, which is I did not have
    0:19:58 that view before.
    0:20:03 I did not think gingerbread strategy necessarily worked, nor momentum was a moat.
    0:20:03 I did not.
    0:20:09 I actually truly believed in these handcrafted artisan products that really get to the core
    0:20:11 of why people want to use it.
    0:20:15 I still somewhat believe the core of it, but these artisan products just take a while
    0:20:16 to build.
    0:20:17 And it’s like very, very nuanced.
    0:20:21 And I have believed that led to high retention.
    0:20:26 So the thing that I looked at the most always was, is this product highly retained?
    0:20:31 Does the product have, like, a network effect, on the traditional sort of theories of moats?
    0:20:36 And what I realized, and this was like true to some extent in the era of mobile.
    0:20:38 Mobile is like a two decade old platform.
    0:20:41 So a lot of things have been tried and a lot of people try different things.
    0:20:45 And therefore finding something of people came back again and again and again was the most
    0:20:46 important thing in my mind.
    0:20:47 And then AI hit.
    0:20:51 And I still had that framework where, oh, like I’m going to look for things that are
    0:20:53 highly retentive and repeated again and again.
    0:20:54 And guess what?
    0:20:56 Things change too fast.
    0:21:00 Like the underlying model changes every day or every week.
    0:21:06 If you, like, craft this thing and OpenAI or someone builds their new model to include that
    0:21:08 part in their new product, you’re done.
    0:21:08 You’re gone.
    0:21:14 So then it couldn’t be about, like, this highly thoughtful, slow-build product.
    0:21:18 It needed to be something where founders knew how to move extremely quickly.
    0:21:21 And that included product that included distribution.
    0:21:25 And because these fucking things are so magical, AI is so magical.
    0:21:28 We like built the digital God, locked it in a chatbot.
    0:21:34 Because it’s so magical right now, kind of anything goes like people will give it a chance.
    0:21:41 And therefore, what’s really important is to try to build the plane as it’s falling down the cliff.
    0:21:48 And people who enjoy the thrill of the plane going down and actually is excited about building as it goes down,
    0:21:50 I think those are the winners of the next day.
    0:21:57 And so when I think about folks like Roy, it’s the type of founder archetype who gets value and is excited
    0:22:03 and leading the charge in terms of that speed, whether it’s marketing, distribution, or product build.
    0:22:08 And usually all of that needs to come together to build an extremely durable, long-lasting product.
    0:22:13 And that, to me, eventually will turn into a product that needs to be retained, that needs to be used every day.
    0:22:18 But we’re still in this early stage of AI where I think momentum is the moat.
    0:22:22 Going back to Roy, I’d love to hear how you think about it.
    0:22:29 Because we talked about it a little bit over chat where you think like stage one, stage two, stage three of building Cluely
    0:22:31 and sort of the distribution advantage that you have.
    0:22:35 And you keep talking about this, oh, maybe X and LinkedIn people don’t get it right now.
    0:22:37 But that gap may narrow over time.
    0:22:39 So how do you think about the next stage and et cetera?
    0:22:43 But that’s how it links to sort of my theory of momentum as the moat.
    0:22:44 I want to get to Roy’s product strategy.
    0:22:47 But first, I want to add some points, which is it was interesting.
    0:22:53 Paul Graham, when he started Y Combinator, identified that technical founders were undervalued,
    0:22:54 that they were being underappreciated.
    0:22:56 And people thought, oh, you need to have an MBA.
    0:23:03 And you realize, hey, it’s easier to teach technical founders business than it is to teach business people how to build great products or how to code.
    0:23:08 And then what happens over the next 15 years, it becomes way easier to build these things.
    0:23:11 You know, AWS, low-code AI, et cetera.
    0:23:14 And distribution becomes the scarcity in that there’s so much software.
    0:23:20 It’s such a flooded ecosystem, as you quoted in your piece, Andrew Chen’s piece, about how all the marketing channels suck.
    0:23:22 They’ve all been sort of wrung out dry.
    0:23:25 And so distribution is now the scarcity.
    0:23:29 And so in the same way, you know, the technical co-founder, now there’s almost like an audience co-founder.
    0:23:30 How do you really break out?
    0:23:35 And we’ve seen creators start to build business, like Mr. Beast with his Feastables, right?
    0:23:37 You know, Kylie Jenner with her, what was it?
    0:23:37 Lip gloss.
    0:23:38 Yeah, commerce.
    0:23:40 But no one’s really done it with software.
    0:23:42 No one has put the two and two together.
    0:23:48 You know, Justin Bieber tried to do a social network called Shots, or John Shahidi using Bieber.
    0:23:54 But no big creators really built, whether a consumer or enterprise, huge software company.
    0:23:57 They built commerce or physical goods.
    0:23:59 And I always thought, hey, why doesn’t Mr. Beast launch, like, a Square competitor?
    0:24:01 Like, he’s got all these eyeballs.
    0:24:03 Like, I’m sure he’s got to get better margins than Feastables.
    0:24:07 And he has games, and he’s a friend of the firm, and he’s done phenomenally well.
    0:24:10 But what I like is that Roy is putting both of those together.
    0:24:14 And so maybe you can walk through a little bit.
    0:24:16 You talked about your distribution strategy.
    0:24:20 Why don’t you talk about how the product strategy has evolved from the beginning, and then we get to the sequencing.
    0:24:27 Yeah, I think something that a lot of people miss is that the first line of Cluely code was written, like, 10 weeks ago.
    0:24:29 So this is, like, really new.
    0:24:33 And this started with Interview Coder, which is just, like, a product that I coded up over a weekend in my dorm.
    0:24:36 And it was, like, a tool to let you cheat on interviews.
    0:24:43 And what we realized after we got about, like, 250 million impressions on the whole scenario is, like, wow, we just got so many eyeballs on this thing.
    0:24:51 Maybe if we can do this again, but we have a more general-use product with a similar UX because we think we’re really onto something with the UX here, then maybe we can make a lot more money.
    0:24:53 And that’s what we did with Cluely.
    0:24:55 And we just launched it as, like, Interview Coder for everything.
    0:24:55 You know, cheat on everything.
    0:24:59 Let’s just see what happens, and the usage data will tell us what people are using it for.
    0:25:00 That’s, like, exactly what’s happening.
    0:25:05 Now we have this general-purpose cheating tool, which, in reality, is just, like, an invisible AI overlay.
    0:25:07 And here’s a new user experience for AI.
    0:25:09 Let’s push it out to a bunch of people and see what happens.
    0:25:13 And as a result, like, now we’re past, like, a billion views overall on Cluely.
    0:25:15 And we’re probably, like, the most viral startup in the world.
    0:25:20 And we have all this usage data that literally tells us, hey, here’s where this is most sticky, and here’s where the product direction needs to go.
    0:25:29 And I think that’s, like, the core advantage of distribution is you do not have to worry about market fit or anything because your users will tell you where the direction of market fit is headed.
    0:25:33 And their usage data will literally give you the information that you need to know.
    0:25:36 And if you don’t have usage data, then you’re literally shooting blind.
    0:25:41 Like, every person who has built a company before knows that you can’t really know what direction you’re going.
    0:25:42 You have to talk to your users.
    0:25:46 But I feel like if your distribution is strong enough, you don’t need to, like, talk to your users.
    0:25:47 You just need to look at their data.
    0:25:48 You just need to look at the aggregate number.
    0:25:52 You’re also redefining kind of what a minimum viable product is, to some degree.
    0:25:53 Yeah.
    0:25:57 Other people, other companies will spend many months building this thing and then seeing how people use it.
    0:26:03 But for you, if you can sort of draft the right content, you can test out the idea in a much quicker way to see, hey, is this really resonating?
    0:26:04 Yeah, exactly.
    0:26:07 I mean, when we launched the video, like, we barely had a functioning product.
    0:26:12 Like, the day before is when we finished our final test and we’re like, okay, we think this works.
    0:26:14 Now let’s just launch the video as soon as possible.
    0:26:20 And we launched the video and, like, all of a sudden, tens of thousands of people were on it. We just said, hey, let’s just throw sales calls in the videos to see if people use it for sales calls.
    0:26:21 Because that seems like a pretty lucrative space.
    0:26:25 All of a sudden, we have, like, over a million dollars of enterprise revenue coming in for people using it for sales calls.
    0:26:32 And it’s just, like, you can shoot in the dark with distribution a lot quicker and a lot more accurately than you can shoot in the dark with product.
    0:26:35 And you don’t need, like, a million product integrations.
    0:26:36 Like, it’s just so much quicker.
    0:26:48 And what’s even better about it is that the iteration loop is much faster, too, because the algorithm will literally tell you via a number, which is number of views, like, shares, whatever, like, how well your strategy is going to work.
    0:26:51 So it’s much, much easier to test, is this viral?
    0:26:55 Does this have viral fit rather than does this have, like, you know, market fit?
    0:26:59 Roy, does that mean you sort of let the audience guide where the product goes?
    0:27:00 Is that how you sort of think about it?
    0:27:01 Yeah, yeah, exactly, yeah.
    0:27:12 Let’s talk a little bit about the form factor, because one of the things that internally we discussed is, look, Roy’s probably top 1%, now I revise it, the top 0.01% in the world in terms of knowing how to distribute.
    0:27:22 The Venn diagram of that and people who know and had the instinct to build a semi-translucent overlay, it sounds so simple.
    0:27:26 Does disappearing picture sound so simple?
    0:27:26 It’s easy.
    0:27:27 It’s not technically hard.
    0:27:31 Like, half-translucent overlay, that sounds simple.
    0:27:32 It’s not technically hard.
    0:27:32 Yeah, yeah, yeah.
    0:27:38 But that overlap, to me, was what gave me so much excitement around what you’re building.
    0:27:40 I actually have it right now on.
    0:27:42 Eric, did you go to University of Michigan?
    0:27:44 Yeah, I did.
    0:27:45 See, I didn’t know that.
    0:27:46 I did not know that.
    0:27:47 But I have Cluly open.
    0:27:49 I’m like, you went to U of M.
    0:27:49 Got that.
    0:27:52 Oh, you did philosophy, policy, and economics, I think?
    0:27:52 Yeah.
    0:27:53 Great.
    0:27:54 We can sort of bond over that.
    0:27:57 This, all of a sudden, is an incredible tool.
    0:27:57 That’s amazing.
    0:28:07 When you talk more about where you see the product going, or particularly how you think about it, anyone who’s building AI tools is asking themselves the question of, how is this defensible from one of the major players?
    0:28:09 Will OpenAI, et cetera, just build this feature?
    0:28:18 How do you think about making your product truly defensible, especially from the people that, because of their reach, their size, OpenAI is a distribution, too, right?
    0:28:19 So how do you think about this?
    0:28:24 Yeah, I mean, I guess we’re first to move in a pretty novel UX.
    0:28:26 And I think we did get to translucent.
    0:28:29 I think everyone’s going to inevitably get to translucent overlay.
    0:28:31 This is how integrated AI should feel.
    0:28:36 And Apple shows everyone that liquid glass is the translucent overlay that will be the form factor of AI in the future.
    0:28:38 Right now, I feel like it’s just a land grab.
    0:28:45 And if the question is about distribution, then I think there’s actually like a pretty strong case for us to make that we will actually end up distributing better than OpenAI.
    0:28:50 And it’s enough that you could probably bet on us at like, well, like a 30,000x discount.
    0:28:52 I’m actually not worried about distribution.
    0:28:56 And I think the quality of the product, I mean, it’s quite simple.
    0:29:04 I really feel like this is just a land grab right now to see who can convince as many consumers and enterprise first that they are the guy who deserves to win the translucent overlay.
    0:29:06 And right now, we’re making so much noise.
    0:29:09 I mean, like with the translucent overlay, like why would it not be us?
    0:29:11 And when did you figure out the translucent overlay?
    0:29:13 Yeah, I mean, I was just in my dorm with Neil.
    0:29:18 And we were just like, we literally spent every day thinking about how can we make InterviewCoder more invisible to interviews.
    0:29:19 And we played around.
    0:29:24 And there’s probably like 20 to 30 versions of InterviewCoder in the past that just we thought didn’t work.
    0:29:28 Essentially, it feeds you a code answer, like an answer to a coding problem.
    0:29:30 And you need to overlay that on top of your code.
    0:29:33 And we’re just like, man, I really need this integrated into my code.
    0:29:36 I need to see what I’m doing as well as see the answer that AI is giving me.
    0:29:38 And eventually, we just landed on translucency.
    0:29:40 And this was like, wow, this is like a magical moment.
    0:29:42 This is what the product needed.
    0:29:47 And like very soon, we realized, like, why are we only thinking about coding interviews and like software engineering coding interviews?
    0:29:49 This is such a small market.
    0:29:50 This is true for everything.
    0:29:52 AI should not feel like a separate window.
    0:29:54 Like it should be integrated seamlessly.
    0:29:55 And that looks like translucency.
    0:29:59 Brian, you said you were inspired by the Dragon Ball Z Scouter as an example.
    0:30:04 I would love, Roy, to chat about the staged approach of how you’re thinking about it, I think.
    0:30:05 Like right now, we’re distribution first.
    0:30:07 And then we’ll sort of build the product as we go.
    0:30:11 Stage two, here’s how we do it with a bunch of engineering prowess and product development, etc.
    0:30:14 We sort of chatted a little bit about that on text.
    0:30:18 And to me, that was like, oh, okay, like you’ll figure out product as you go.
    0:30:19 Yeah, yeah, yeah.
    0:30:20 I’d love to talk a little bit about that.
    0:30:24 I mean, right now, the internet is up in storms saying, hey, where’s the product?
    0:30:25 Where’s the product?
    0:30:30 And the two things that like we are literally working day and night to build out the product
    0:30:32 that we have in our head that the users are telling us that they want.
    0:30:36 But also every video I make that’s not directly about the product
    0:30:39 drums up so much more hype for the eventual product launch video.
    0:30:41 Like I will guarantee that this will be viral.
    0:30:45 And I guarantee that it will be more viral than if we just launched like earlier.
    0:30:50 And I think there’s truth in the statement of launch early, ship fast, launch before you’re ready.
    0:30:53 But for some reason, when we’re doing that at scale,
    0:30:55 like it feels like everyone is, oh, no, you launched too early.
    0:30:56 Now, what are you talking about?
    0:30:57 This is the playbook.
    0:30:59 Like we wrote our first lines of code 10 weeks ago.
    0:31:02 We’re like earlier than the latest YC batch of companies.
    0:31:05 Yet we’re like generating probably more revenue than every single one of them.
    0:31:09 And like the product is literally two and a half months since being built.
    0:31:10 In my perspective, we are pre-launch.
    0:31:14 And the huge benefit of being like massively distributing pre-launch is that
    0:31:19 you will know what product to build with as much certainty as you could possibly get.
    0:31:23 And if you can distribute and hype it up to an audience of literally millions of people like,
    0:31:25 hey, this product’s coming out, this product’s coming out.
    0:31:29 And we’re screaming to the world, AI overlay, like AI that sees your screen, hears your audio.
    0:31:33 Like the second we make that, like, why would you pick anyone else’s product to use?
    0:31:36 We’ve been screaming it from the top of our lungs since day one, like before day one even.
    0:31:41 And like right now, the stage is distribution, get it into everyone’s mind.
    0:31:41 What is Cluely?
    0:31:44 Cluely is the invisible AI that sees your screen, hears your audio.
    0:31:45 Everyone knows this.
    0:31:48 And as soon as we launch it, like who else will they download?
    0:31:49 Yeah.
    0:31:52 One thing I think that’s fascinating about what you’re doing, and I’ll compare it to our friends
    0:31:58 at TBPN who are kind of rebranding or reclaiming, they call themselves corporate driven media.
    0:32:00 Because everyone was like, we’re independent media.
    0:32:02 And they’re like, no, no, we’re corporate backed.
    0:32:03 We’re more honest that way.
    0:32:08 And they’re just leaning into like all the bits about Ramp and, you know, save 5%.
    0:32:10 And there’s this kind of humor to it.
    0:32:15 And I think similarly, you’re kind of leaning into the controversy or being controversial as
    0:32:19 a strategy where some people think, oh, that’s fake or forced or whatever.
    0:32:23 But you’re at the same time, you’re super authentic in the way that you’re doing it.
    0:32:24 And people feel like they know you.
    0:32:29 And sometimes even if there’s a character, an exaggeration, that also feels authentic in
    0:32:29 some way.
    0:32:31 It’s just kind of a unique style.
    0:32:34 Yeah, I mean, I think this like just reflects a growing shift in society.
    0:32:38 I mean, literally for the past few decades now, there’s just been such this sharp drop
    0:32:39 off in professionalism.
    0:32:43 And I really think it’s because like content creation has been democratized.
    0:32:46 Like the first ever YouTube creators were just people making funny videos in their dorms.
    0:32:48 And this was the most authentic, like people crave this.
    0:32:52 Nobody wants to see another ad or another corporate newspaper like bullshit like that.
    0:32:55 Like they want to see some real person doing real things.
    0:33:00 And the democratization of content creation has allowed authentic people who make content to be
    0:33:01 seen by millions.
    0:33:03 And now authentic people who create content will be seen by millions.
    0:33:07 And for some reason, it’s just like no company gets this, that you’ll have 100 X followers
    0:33:09 and your post will be about introducing blah, blah, blah, blah.
    0:33:11 And it’s like the most corporate bullshit ever.
    0:33:13 You’re not going to get any views and nobody wants to see this.
    0:33:17 Would you rather have a world where everyone was extremely transparent and even the founders
    0:33:20 were honest and like everything you see is just someone’s authentic life?
    0:33:22 Would you rather see like a bunch of corporate bullshit everywhere?
    0:33:26 Like what we envision for the end state distribution of Cluely is that it does not feel like an ad
    0:33:27 at all.
    0:33:30 This is just the story of someone’s life that you want to see.
    0:33:31 And it is like true and authentic.
    0:33:36 And I try to be like, I’m probably more transparent about everything than I’d say probably 99% of
    0:33:37 companies in the Valley are.
    0:33:40 It’s funny because what you’re saying is some people get so mad at, I heard someone call it
    0:33:43 Riz marketing, which is a compliment, but they’re like, I hate this Riz marketing.
    0:33:45 Like it’s got to be product first, get back to fundamentals.
    0:33:48 But first, this is its own fundamentals too of how the world works today.
    0:33:52 But two is, so of course they can’t destroy you, but even the people who want to destroy
    0:33:55 you, the joke in my head is they could try to kill you, but they can’t kill the idea.
    0:33:55 Yeah.
    0:33:59 Like the way that you’re building companies, like there’s something about it that is going
    0:34:02 to have an impact on the next generation and you’re just starting the company journey.
    0:34:06 What I would say the viral marketing, when you say viral marketing is interesting,
    0:34:06 right?
    0:34:08 Like one leads to another.
    0:34:11 I actually think what you’re doing is like anti-fragile marketing.
    0:34:13 Like you’re so controversial, I’m going to cut off your head.
    0:34:16 And three sprang up because one is mad at you.
    0:34:17 One is really happy about you.
    0:34:18 One is neutral.
    0:34:22 Like you get all these, like every time someone comes at you and comes at the idea of Cluely,
    0:34:25 the more aura points it gets.
    0:34:26 Like aura farming.
    0:34:29 Any lessons from the most controversial stuff that you’ve done?
    0:34:30 Oh, is it always triple down?
    0:34:33 How do you think about, is there a line where to cross it?
    0:34:39 So I think the few lessons have been never punch down, like never, ever even remotely
    0:34:40 come close to punching down.
    0:34:46 And I think people reward and the algorithm does reward authenticity more than anything.
    0:34:47 Like it rewards many things.
    0:34:50 But one of the things that it rewards most is like authenticity.
    0:34:53 And I think you’ll see me in Twitter like every once in a while, I’ll make like a genuine
    0:34:54 comment.
    0:34:55 Like, hey, hey, thank you.
    0:34:56 Like, I really respect you or something.
    0:34:58 I think like people love to see that.
    0:35:00 I saw your response to Garry Tan.
    0:35:00 Yeah, yeah.
    0:35:01 I mean, I mean, it’s true.
    0:35:01 Yeah.
    0:35:02 Like I respect that guy.
    0:35:05 And I hope that one day, when the company gets to where I imagine it will be, then he
    0:35:06 will come around.
    0:35:10 And I don’t think the lesson is to triple down on everything.
    0:35:14 But I think the lesson is that if you’re honest, then the algorithm will reward you because
    0:35:18 there’s literally zero other company out there that is being fully honest about everything.
    0:35:20 And there’s zero founder out there that behaves authentically.
    0:35:23 The only other person I might think is like Elon Musk.
    0:35:27 And I think that’s a really good role model to have in business making.
    0:35:34 As you build Cluely into the first destination for consumers and enterprise alike, how does
    0:35:40 like the type of stunts, if you will, and skits actually fit into the type of customer that
    0:35:41 you want to serve?
    0:35:43 I think we’re headed towards a future where this is the new normal.
    0:35:47 I mean, 40 years ago, we were way more professional than we are now.
    0:35:51 If you were even an engineer, you come into work, you come in with a suit and tie.
    0:35:52 And if you don’t, it’s distasteful.
    0:35:52 It’s disgusting.
    0:35:53 I cannot believe it.
    0:35:55 Like you should never, ever show up in a hoodie and sweatpants.
    0:35:59 Now you’re weird if you show up in a suit and you’re not in a hoodie and sweatpants.
    0:36:01 Everyone wants more authentic things.
    0:36:05 Right now, for some reason in society, we’re still lingering onto this image that companies
    0:36:09 need to be like brand friendly and boring and never say anything controversial or whatever.
    0:36:13 Like I don’t understand how this became the societal norm.
    0:36:15 But in reality, people want to see interesting things.
    0:36:17 I mean, like that’s the point of life is you see interesting things.
    0:36:19 There’s just this lack of professionalism.
    0:36:24 And again, with the distribution of short form content, like everyone sees the craziest
    0:36:26 things and you just get desensitized to these things.
    0:36:29 And that’s why sports like sperm racing are able to be so hyperviral.
    0:36:33 Nobody would have aired this on CNN 10 years ago, but you don’t need CNN anymore.
    0:36:34 You have Instagram, TikTok.
    0:36:36 And Instagram, TikTok, they love sperm racing.
    0:36:39 As a result, they’re able to raise at a fucking massive valuation, like genuinely have a shot
    0:36:40 at being like the next legit sport.
    0:36:42 And it’s not just sperm racing.
    0:36:44 I mean, it’s every company in the world.
    0:36:47 There’s Sam Altman talking about how hot the guys are that GPT generates in the timeline.
    0:36:50 There’s Elon Musk doing ravings about his political takes.
    0:36:53 Like every company is getting more and more controversial, less professional.
    0:36:55 This trend is not dying anytime soon.
    0:36:58 I’m just the person that is pushing the envelope perhaps a little further than the world is ready
    0:36:59 for at the moment.
    0:37:02 But I think that the question that everyone should linger on a little bit is, can you imagine
    0:37:04 a world where we do win?
    0:37:05 Can you imagine the world?
    0:37:08 What will the state of the world look like if Cluely does win?
    0:37:13 And we prove to everyone like the bar for professionalism is here where we’ve determined it.
    0:37:17 The corporate culture of America as a whole is going to shift in perhaps the entire world.
    0:37:22 Everyone will realize, oh, we’ve had our, like, panties in a bunch worrying about brand
    0:37:25 image and professionalism when in reality, the world craves something different.
    0:37:30 And I have very strong conviction that I’m right in this because I was right about X.
    0:37:35 And I did not understand for the life of me why nobody is producing the viral videos that
    0:37:37 were so obviously designed to go viral for the algorithm.
    0:37:41 It’s just because nobody has caught on and nobody’s willing to like press that button.
    0:37:44 Now that I’ve pressed the button once, just imagine what the world looks like if Cluely makes
    0:37:45 it.
    0:37:46 You probably like are more interested in that.
    0:37:50 That would be a much more interesting world if every company was being 100% radically transparent
    0:37:53 and doing exactly what the most interesting thing was.
    0:37:57 As Elon says, the most entertaining outcome is the most likely.
    0:37:58 It’s true.
    0:38:01 On that note, this has been a fantastic episode with Roy and Bryan.
    0:38:02 Thanks so much for coming on the podcast.
    0:38:02 Yeah.
    0:38:03 Thank you so much for having me.
    0:38:04 Thank you.
    0:38:09 Thanks for listening to the A16Z podcast.
    0:38:14 If you enjoyed the episode, let us know by leaving a review at ratethispodcast.com slash A16Z.
    0:38:17 We’ve got more great conversations coming your way.
    0:38:18 See you next time.

    What if virality wasn’t a tactic — but the entire product?

    In this episode, a16z General Partners Erik Torenberg and Bryan Kim sit down with Roy Lee, cofounder and CEO of Cluely, one of the most talked-about consumer AI startups of 2025. Cluely didn’t raise a mega round or drop a feature suite to get traction – it broke through by turning distribution into design: launching viral short-form videos, pushing polarizing product drops, and building in public with speed and spectacle.

    We cover:

    – Why virality is Cluely’s moat

    – Building a brand-native AI interface

    – The Gen Z founder mindset

    – What most startups get wrong about attention

    – Why creators are the new product managers

    – Cluely’s long-term vision for ambient AI

    Cluely is a glimpse at the next generation of startups, where the line between product and performance is disappearing.

     

    Timecodes: 

    00:00 Introduction 

    01:07 Early Success

    02:02 Roy’s Journey: From College Kid to Tech Universe

    04:37 The Turning Point: Harvard and Beyond

    06:57 Building Cluely: The Early Days

    08:27 The Viral Strategy: Mastering Algorithms

    13:56 The 50 Interns Experiment

    15:30 The Investment Journey: Roy and Bryan’s Partnership

    19:20 Momentum as a Moat: The Future of AI Companies

    20:32 The Evolution of Product Strategy in the AI Era

    21:19 The Importance of Speed and Adaptability

    22:48 The Role of Distribution in Modern Startups

    24:26 Roy’s Journey and Product Development

    25:25 The Power of User Data and Feedback

    26:58 Innovative Marketing and Distribution Tactics

    28:25 The Future of AI Integration and Translucent Overlays

    32:15 Controversial Marketing and Authenticity

    34:01 The Impact of Radical Transparency

    36:42 The Changing Landscape of Professionalism

    38:26 Concluding Thoughts and Future Vision

    Resources: 

    Find Roy on X: https://x.com/im_roy_lee

    Find Bryan on X: https://x.com/kirbyman01

    Learn more about Cluely: http://cluely.com/

     

    Stay Updated: 

    Let us know what you think: https://ratethispodcast.com/a16z

    Find a16z on Twitter: https://twitter.com/a16z

    Find a16z on LinkedIn: https://www.linkedin.com/company/a16z

    Subscribe on your favorite podcast app: https://a16z.simplecast.com/

    Follow our host: https://x.com/eriktorenberg

    Please note that the content here is for informational purposes only; should NOT be taken as legal, business, tax, or investment advice or be used to evaluate any investment or security; and is not directed at any investors or potential investors in any a16z fund. a16z and its affiliates may maintain investments in the companies discussed. For more details please see a16z.com/disclosures.