AI transcript
0:00:07 Are ads purely destructive or negative to the user experience, or are they actually, if done properly, are they actually either neutral or even positive?
0:00:14 You know, if you’re not Apple, do you really want to be a company that basically sits there and says, yeah, the world’s moving and we’re very deliberately not going to lean as hard as we can into it?
0:00:24 I think there’s a lot of survivorship bias in these kinds of strategy discussions where people look at the one company that’s able to pull this off, and they don’t look at the 50 other companies that are in the graveyard because they didn’t adapt.
0:00:31 Marc Andreessen went live on TBPN this week, and today we’re dropping that full conversation here on the pod.
0:00:40 Marc gets into it all: what’s really happening in AI right now, how Apple is playing its hand, the return of open source, and why perfect products can signal the end, not the peak.
0:00:47 He also shares his take on how to break into venture capital in 2025 and what he’s actually using AI for day to day.
0:00:48 Let’s get into it.
0:00:57 This information is for educational purposes only and is not a recommendation to buy, hold, or sell any investment or financial product.
0:01:05 This podcast has been produced by a third party and may include paid promotional advertisements, other company references, and individuals unaffiliated with A16Z.
0:01:13 Such advertisements, companies, and individuals are not endorsed by AH Capital Management, LLC, A16Z, or any of its affiliates.
0:01:19 Information is from sources deemed reliable on the date of publication, but A16Z does not guarantee its accuracy.
0:01:23 We have Marc Andreessen joining us.
0:01:25 He’s live from the TBPN Ultradome.
0:01:26 Welcome to the stream.
0:01:27 How are you doing, Mark?
0:01:29 Hey, what’s happening?
0:01:30 Great to see you.
0:01:31 Yeah, you too.
0:01:32 A lot.
0:01:38 It’s a little bit of a slow news day, but exciting stuff with GPT open source.
0:01:40 It’s not a slow August.
0:01:41 It’s not a slow August.
0:01:41 We’re glad.
0:01:45 We were just reflecting that we’ve taken exactly one day off this summer.
0:01:51 That was July 4th, and we’re showing the Europeans how American companies work.
0:01:52 American work.
0:01:57 We’re setting an example, and we have proof of work because we exist on the internet, and you
0:01:58 can see us live every day.
0:02:00 So we’re setting an example.
0:02:02 How are you doing?
0:02:02 How’s your summer going?
0:02:03 Fantastic.
0:02:04 Going really well.
0:02:08 So how long is it going to be until you guys put up avatars that make claims that you’re
0:02:11 working hard all through the summer when it turns out you’re on the beach?
0:02:12 You might have caught us.
0:02:15 I think you’ll know better than us as to when the technology gets there.
0:02:17 We’ve been demoing some of the stuff.
0:02:21 People have been doing a lot of deep fakes of us, and fortunately, all of them have been
0:02:24 clockable, so it doesn’t feel like a brand risk.
0:02:28 But they’re getting closer and closer, and I know that there’s going to be a moment where
0:02:32 we have to say, hey, that’s actually using our name and likeness to endorse something
0:02:33 that we don’t necessarily endorse.
0:02:34 Can you please take that down?
0:02:39 So we’re approaching the Turing test, the uncanny valley.
0:02:40 We’re escaping the uncanny valley.
0:02:47 I think a question, looking back over the, you know, maybe 10 or 15 years: were there moments
0:02:50 where you felt like there just was not a lot of action happening?
0:02:56 Because this summer is just the pace from so many different teams has been absolutely insane.
0:02:58 Everybody’s like trying to keep up.
0:03:02 And it didn’t used to feel that way, at least from my point of view.
0:03:10 So my view of it always is there’s like these disconnected, you know, kind of patterns or trends.
0:03:14 There’s sort of the sort of day-to-day phenomenon where like engineers show up every day and they
0:03:15 make things a little bit better.
0:03:19 And then every once in a while, you know, you get a technical breakthrough or a new platform.
0:03:25 And that process, this, you know, kind of sawtooth, up-and-to-the-right process,
0:03:27 kind of plays out over time, regardless of what else is happening in the world.
0:03:31 And so it keeps happening through recessions and depressions and wars and like all kinds
0:03:33 of crazy, crazy, crazy stuff that’s happening.
0:03:35 But basically, you know, the technology keeps getting better.
0:03:37 So there’s kind of that curve.
0:03:42 And then there’s the sort of enthusiasm curve and then the adoption curve, you know, which
0:03:45 is basically like, when do these things actually show up in the world?
0:03:49 And then, by the way, when are people actually ready, you know, for the new thing?
0:03:52 Like if you talk to people who worked on language, I’m sure you guys have talked to people who work
0:03:56 on language models, they will tell you that they were surprised that ChatGPT was the breakthrough
0:03:59 moment because they thought everybody already knew what these models could do for, you know,
0:04:00 three years before that.
0:04:04 And so they were, you know, they were shocked that it was the chatbot interface that made the
0:04:04 thing go.
0:04:10 And so there’s somewhat of a sort of arbitrary disconnection between what’s actually happening
0:04:13 in the substance and then what people are seeing and feeling.
0:04:16 And so it’s just, it’s really hard to predict when these things pop.
0:04:20 But also, if you’re in this day to day, it’s really hard to tell, you know, when things are
0:04:24 going to be hot or not, because it doesn’t necessarily map to how much the technology is improving.
0:04:29 Yeah, we were just talking about that in the context of Google’s new world model.
0:04:33 It’s this like generative video game that you can kind of move around in.
0:04:37 And it feels like DeepMind is just absolutely crushing at the AI research frontier.
0:04:41 They have the best world model simulator that you can walk around in.
0:04:47 The question is like, if they let another lab do the ChatGPT thing and just get it out
0:04:52 into the consumer three months earlier, they might wind up kind of chasing and trying to catch
0:04:57 up if somebody actually figures out how to make it like a dominant consumer product.
0:05:01 Now, in the enterprise, it’s more oligopolistic, but consumer seems to be winner take all.
0:05:07 I guess the question is like, how much value do you place right now in the AI race to just
0:05:14 like moving fast, breaking things, you know, dealing, having like the thick skin to deal
0:05:18 with like the safety constraints and all of the different stuff, obviously not being irresponsible,
0:05:21 but just speeding up the organization as much as possible.
0:05:23 It feels like now’s the time to really push on that.
0:05:24 Yeah.
0:05:25 Well, first of all, I need to correct you.
0:05:27 It’s moving fast and making things.
0:05:28 I don’t know whether that’s right.
0:05:30 I don’t even know where that came from.
0:05:33 I have no idea.
0:05:34 Never heard of it.
0:05:36 I think ChatGPT didn’t really break anything.
0:05:37 I think that’s a good point.
0:05:39 It really did just move fast and make things.
0:05:41 The first things it made were weird, but that was fine.
0:05:44 And it failed and it hallucinated a ton, but it didn’t really break anything.
0:05:45 I don’t know.
0:05:50 Yeah, I believe in this case, total deaths attributable to ChatGPT are still zero.
0:05:51 Zero.
0:05:56 So notwithstanding all of it, notwithstanding all of the, all the caterwauling, but yeah.
0:06:01 So look, I think the AI industry in particular has a very acute version of the
0:06:03 sort of challenge that you identified.
0:06:07 And, and, you know, and I don’t say this negatively, just an observation, which is that they’re, you
0:06:10 know, like in sort of a normal technology company, you’ve kind of got engineers who make
0:06:13 products and then you’ve got, you know, kind of salespeople or marketing people who sell
0:06:16 them, you know, in the, in the AI companies, you have this third tier of, you know, the
0:06:17 quote unquote researchers.
0:06:17 Yeah.
0:06:18 Right.
0:06:21 And so, you know, which is, which has worked out incredibly well.
0:06:24 I mean, the researchers have done, you know, they’ve just done like amazing breakthroughs
0:06:27 at these companies, but you know, the handoff, you know, there’s not necessarily
0:06:29 a clean handoff from the researchers to the, to the market.
0:06:35 And so it kind of raises this question of like, okay, are these companies
0:06:37 therefore kind of three-segment companies, where they have
0:06:41 research and then they have product development and then they have go-to-market.
0:06:43 And I, and I think that’s a really open issue.
0:06:46 I mean, if you, you know, Google’s kind of a case study of this, you know, you alluded
0:06:49 to DeepMind, but even more broadly, Google, you know, Google developed the transformer in
0:06:50 2017.
0:06:52 And then they basically let it sit on the shelf, right.
0:06:54 Cause it was a research project.
0:06:55 They didn’t productize it.
0:06:58 They were very worried about, you know, from people I’ve talked to, they were very worried
0:07:01 about the, you know, brand issues and safety issues, you know, kind of all these, all
0:07:02 these, they had all these reasons to not productize it.
0:07:07 I talked to somebody senior who was there at the time and I, I asked them, you know, when, when,
0:07:12 when could you have had ChatGPT with GPT-4-level, uh, output, um, if you had just
0:07:14 gone, you know, gone flat out starting in 2017.
0:07:17 And they said by 2019; you know, they already knew how to do it.
0:07:21 And then, you know, they’ve now caught up, but it took an extra five
0:07:22 years to catch up.
0:07:25 Um, and so I, I think a lot of these companies kind of have that challenge.
0:07:30 Elon, as usual, of course, is provoking this question, as I’m sure you guys have talked
0:07:34 about, but you know, within xAI, he’s now collapsed, he’s
0:07:36 eliminated the distinction between research and product.
0:07:40 Um, and so, you know, of course he, you know, he’s pushing this as hard as he can.
0:07:42 And I think it’s a, it’s a good question for a lot of these other companies, kind of how
0:07:46 hard they want to push on actually getting these things in fully productized form out to the
0:07:46 market.
0:07:53 Yeah, yeah, on Elon’s, uh, distinction, it feels like there is more research to be
0:07:58 done, but it feels like we’re entering like a new cycle of, you know, just focus on
0:08:02 the engineering, focus on the deployment, the applications, let’s get all this technology
0:08:02 out into the world.
0:08:03 Let’s reap all that benefit.
0:08:09 And yes, there will be a, a different track of fundamental research that’s happening somewhere,
0:08:11 but it’s really, really hard to predict.
0:08:16 And so if you have something that’s working, just double down and just go really aggressive
0:08:16 on it.
0:08:22 Um, I’m wondering, uh, more on that, but also on Apple’s strategy.
0:08:26 It feels like Apple’s been, um, you know, people have been maligning them
0:08:28 for missing the AI opportunity.
0:08:33 And Tim Cook’s just there on the earnings call being like, look, we acquired a couple of
0:08:38 small companies and seven companies, but then, uh, it seems like they’re taking more of like
0:08:39 an American dynamism approach.
0:08:43 Like there was news today in the Journal that they, uh, that they’re investing a hundred billion
0:08:44 dollars in American manufacturing.
0:08:47 They’re certainly doing stuff.
0:08:52 They’re just not chasing the, you know, the shiniest headline, a hundred billion
0:08:53 dollar CapEx.
0:08:59 Um, so I’m wondering about your thoughts on, on when you have a, you know, uh, when, uh, when
0:09:05 you have a platform, uh, how hard is it to resist chasing the new shiny object?
0:09:09 Is that the right move or are, are there any other things that you think Apple should be,
0:09:11 uh, you know, changing their strategy on?
0:09:12 Yeah.
0:09:16 So look, Apple’s always had this, you know, very clearly defined strategy that, you know,
0:09:19 Steve, Steve and Tim, you know, working together, figured out a long time ago, which is, you know,
0:09:22 they, I forget the exact term, but it’s something like: basically they invest
0:09:24 deeply into the core of what they do.
0:09:26 You know, they’ll basically work internally on things for many years.
0:09:29 They all, they only actually release things when they feel like they’re kind of fully baked.
0:09:31 Um, right.
0:09:35 And, and, and so as a consequence, they have this thing where, and Tim says this, right.
0:09:38 Uh, you know, they’re rarely first to market with new technologies, you know, they’re, they’re
0:09:42 more often in the category of what, you know, Peter, Peter Thiel calls last to market, you
0:09:44 know, they’re, you know, they’ll, they’ll, they’ll come out, whatever, three years later,
0:09:48 whatever, five years later, you know, there, you know, there were tablets for years before
0:09:50 the iPad, there were, you know, smartphones for years before the iPhone.
0:09:53 Folding phones, they’re about to do a folding phone.
0:09:55 It’s like 10 years into that technology.
0:09:56 I’m sure if they do it, they’ll hit it.
0:09:58 Yeah, the last mover, the last mover.
0:09:59 Yeah, yeah, yeah.
0:09:59 Sorry.
0:10:00 The last mover.
0:10:03 And I guess, yeah, well, what I would say is like, look, that clearly works if you’re
0:10:03 Apple, right.
0:10:08 Um, and so it, it clearly works if you’re Apple, but I would say there’s a fine line between
0:10:10 that strategy and just, and simply becoming obsolete.
0:10:10 Right.
0:10:14 Um, and so the, the problem is like, if you’re not Apple, um, and you don’t have all the other
0:10:17 kind of super strengths and, you know, kind of now the market position that Apple has,
0:10:21 you know, do you really want to be a company, you know, if you’re not Apple, do you really
0:10:23 want to be a company that basically sits there and says, yeah, the world’s moving and
0:10:26 we’re very deliberately not going to lean as hard as we can into it.
0:10:31 Um, and so I, I think there’s a lot of survivorship bias in these kinds of strategy discussions
0:10:34 where people look at the one company that’s able to pull this off and they don’t look at
0:10:38 the 50 other companies that are in the graveyard, you know, because they, you know, because, because
0:10:38 they didn’t adapt.
0:10:41 I mean, you know, all the other smartphone companies, when the iPhone came out, they
0:10:42 were like, oh yeah, well, we could do touch too.
0:10:43 Right.
0:10:44 You know, we’ll just, you know, we’ll get to it.
0:10:44 Right.
0:10:47 Um, and you know, you know, they’re gone.
0:10:47 Yeah.
0:10:48 What do you, what do you think?
0:10:52 The BlackBerry Bold, I remember, it was like an iPhone knockoff.
0:10:53 What do you think?
0:10:53 Yeah.
0:10:59 You know, right now a variety of, you know, shareholders are annoyed at Apple
0:11:02 around their reaction to AI, LLMs.
0:11:07 John’s annoyed around just like transcription generally, just like super basic stuff, but
0:11:13 it doesn’t feel like the, uh, core business is immediately threatened today.
0:11:17 It feels like it’s still on the horizon around these sort of like, you know, eyewear based
0:11:22 computing, you know, potentially net new devices that we’re, that, that we’ll see from, uh, you
0:11:24 know, companies like open AI over time.
0:11:31 But where do you, like, like how, how real is the threat, you know, this year, uh, versus
0:11:34 10 years from today and, and kind of what’s your framework?
0:11:35 Yeah.
0:11:38 Well, look, I mean, I think the biggest ultimate danger, I mean, the biggest ultimate danger
0:11:42 is very clear, which is just like, at what point do you not carry around a pane of glass
0:11:45 in your hand, you know, called a phone, um, you know, because other things have superseded
0:11:48 it and, you know, like everything, you know, everything becomes obsolete at some point.
0:11:52 Um, so there will, there will come some time, for sure, when we’re not, you know, carrying
0:11:54 phones around and we’ll, we’ll watch movies where people have phones and we’ll be
0:11:56 like, yeah, look at a, look at how primitive they were, right.
0:12:00 Because, because we’ll have moved on to other things and whether those things are eye-based
0:12:04 or, you know, uh, uh, you know, the other kinds of wearables or whether it’s just kind
0:12:08 of, you know, computing happening in the environment, um, or just, you know, entirely voice-based
0:12:11 or, you know, who knows what it is, but, um, you know, there will come a time when that
0:12:15 happens, you know, is that time three years from now, cause there’s like some, you know,
0:12:18 huge breakthrough, you know, from, from some company that figures out the product
0:12:22 that obsoletes the phone right away, or is that 20 years from now because the phone
0:12:24 is just, you know, such a standard platform for everything that we do in our
0:12:27 our lives and everything else, you know, kind of remains a peripheral to the phone.
0:12:30 I mean that, you know, that’s, you know, that, that’s the game of elephants that’s playing
0:12:30 out there.
0:12:34 Um, you know, obviously I think, uh, you know, I think it’s highly likely that we’ll, we’ll
0:12:35 have a phone for a very long time.
0:12:39 Having said that it is, it is exciting that there are companies that are going directly
0:12:39 at that challenge.
0:12:43 Um, and you know, whoever cracks the code on that will be the, will be the next Apple.
0:12:46 And by the way, that, that may in the fullness of time be Apple itself.
0:12:48 You know, they, they, they may be the company that figures that out.
0:12:49 Yeah.
0:12:54 I remember being at a board meeting at Andreessen Horowitz, maybe a decade ago or something.
0:12:59 And Chris Dixon showed me the HoloLens and I was like, okay, we’re one year away from
0:13:00 this being everywhere.
0:13:06 And, and I feel like today I’m still in the, like, yeah, VR, it’s definitely one year away.
0:13:08 The next Quest I’m going to be wearing daily.
0:13:12 Um, and, and it feels like we’re always there, but it does feel like Apple did a lot of work
0:13:17 on the, on the fundamental, uh, the, you know, pixel density of the resolution of the display.
0:13:21 And then Meta has been doing a ton of work on just getting it light and affordable.
0:13:26 Like it feels closer than ever, but, uh, you know, you, you, you always got to wait until
0:13:28 you see the churn numbers until you really call the game.
0:13:28 Right.
0:13:32 Well, you’re saying the same, but you know, I think that’s true, but you’d also say, you
0:13:34 know, I’m, I’m on the, on the Meta board.
0:13:39 So I’ve kind of got a dog in this hunt, but like the Meta Ray-Ban glasses are a big hit.
0:13:39 Oh, totally.
0:13:39 Right.
0:13:43 Like they’re, they’re a big, you know, so I think we, we now have a form factor that we
0:13:46 know works, you know, for, for, for eye-based wearables is, you know, there’s not VR and then
0:13:50 VR, you know, on top of that, but, um, you know, just, uh, you know, the glasses and, you
0:13:53 know, and then the, the glasses of camera, you know, sort of integrated camera, integrated
0:13:54 microphone, integrated speaker.
0:13:54 Yep.
0:13:56 You know, that’s a very interesting platform.
0:13:59 Um, you know, the watch clearly works, by the way, which Apple, of course, you know, has played
0:14:02 a significant role in making happen, you know, that now sells in huge
0:14:05 volume, um, you know, so that’s the second data point.
0:14:08 And then, you know, like, I think these, you know, these, these, I think some form of AI
0:14:09 pin is going to work.
0:14:09 Yep.
0:14:12 Um, I also think, you know, headphones are going to get a lot more sophisticated, which
0:14:13 is already happening.
0:14:13 Yep.
0:14:17 Um, and so, you know, you do have these, you know, kind of data points coming out and then,
0:14:20 yeah, look, the trillion-dollar question ultimately is, are these
0:14:21 peripherals to the phone?
0:14:25 Um, you know, which is what they are today, or are these replacements for the phone?
0:14:28 And, you know, yeah, I would say, I think
0:14:31 we have a lot of invention coming both from new companies and from the incumbents
0:14:32 who are going to try to figure that out.
0:14:32 Yeah.
0:14:37 I always think about the value of like narrowing the aperture on these new technologies, like
0:14:42 with, with the, the meta Ray-Bans, I feel like the fact that they aren’t also trying to
0:14:45 be a screen is actually a feature, not a bug.
0:14:46 And I always go back to the iPhone.
0:14:50 Like it was first and foremost, a phone and people bought it because it could make calls
0:14:52 and then it could make text messages.
0:14:55 And then it was an iPod, but I, do you disagree with that?
0:14:55 Please.
0:14:58 Well, you guys, I don’t, you guys might be too young.
0:15:00 The first iPhone actually was a bad phone.
0:15:02 How so?
0:15:04 Don’t you guys, for the first two years, it couldn’t reliably make phone calls.
0:15:10 I had, I had like the third one and a friend had one, but I feel like it was still like
0:15:14 people were carrying cell phones and that was the, at least of the expectation.
0:15:15 But yeah, I mean, I guess you’re right.
0:15:19 So for the, for the first two, it was a classic Apple move because for the first
0:15:21 two years, the thing couldn’t reliably make phone calls.
0:15:25 And then it turned out there was an issue with the antenna and with, with how you held
0:15:28 it and there was a, I remember that email and you would, and you would disconnect it.
0:15:30 You could basically brick the device.
0:15:30 Yeah.
0:15:32 Based on how you held it.
0:15:35 This is when Steve would, would respond to emails from random people.
0:15:37 And somebody emailed Steve saying, if I, you know, hold the phone this way, it doesn’t
0:15:37 make phone calls.
0:15:39 And he’s like, well, don’t hold it that way.
0:15:41 Yeah.
0:15:41 Right.
0:15:41 Yeah.
0:15:43 So, so even there it was like, yeah.
0:15:45 And people, you know, people forget it.
0:15:47 It took like five years for the iPhone to find its footing.
0:15:50 It took like two years to get the, I remember also the original iPhone didn’t have, it didn’t
0:15:52 have broadband data.
0:15:56 It was on, it was on the old 2G, what was called the AT&T EDGE network.
0:15:57 So it didn’t have broadband data.
0:15:59 And then of course it didn’t have an app store, right?
0:16:00 It was completely locked down, right?
0:16:05 So the challenges, the challenges for Apple now is that people are so used to perfection
0:16:11 with the device that launching a product that isn’t perfect, like is embarrassing, right?
0:16:14 Like you look at the vision pro and it’s like, well, the battery’s big.
0:16:16 Steve would have hated this, right?
0:16:21 Like how he never would have shipped this and that being constrained and not being able
0:16:26 to innovate because you’re tied to this like impossible standard of being on whatever generation
0:16:31 17 of the iPhone and perfecting every element is a real challenge.
0:16:33 So I would say there’s a corollary to that.
0:16:37 One of the things I’ve observed over the years is I think technology products become obsolete
0:16:39 at the precise moment they become perfect.
0:16:42 And let me define what I mean.
0:16:46 What I mean by perfect basically is like, yeah, it’s like the perfect idealized complete product.
0:16:50 Like it does everything you could possibly ever imagine, everything a customer could imagine,
0:16:52 everything you as the technology developer could imagine.
0:16:54 It’s absolutely perfect.
0:16:59 And there’s been tons of examples of this over the last 50 years where it’s like the absolute
0:17:02 perfect version, it seems to be the permanent version of that product.
0:17:05 And then it just turns out that’s actually the point of obsolescence because it means creativity
0:17:07 is no longer being applied right into that platform.
0:17:09 You’re just like, there’s just nothing else to do.
0:17:11 You’re just like, you’re done, right?
0:17:12 The product has been realized.
0:17:14 And then the cycle is what happens to your point.
0:17:18 The cycle is other people come in with completely different approaches, completely different
0:17:23 kinds of products that are broken and weird in all kinds of ways, you know, but are fundamentally
0:17:23 different.
0:17:25 And so, you know, that is one of the time-honored traditions.
0:17:28 And, you know, one of the, you know, one of the, you know, things you could say about,
0:17:31 you know, Tim is, you know, his willingness to kind of break the mold of Apple only ships
0:17:32 perfect products.
0:17:36 But, you know, being willing to ship the, you know, the Vision Pro,
0:17:39 it shows a level of determination to kind of stay in the innovation game, which I think
0:17:40 is very positive.
0:17:40 Yeah.
0:17:41 Yeah, yeah, yeah.
0:17:42 That’s great.
0:17:45 Updated thinking on open source.
0:17:49 Since we last talked, there’s, there’s a lot that’s been happening.
0:17:51 OpenAI is an open source company again.
0:17:52 Yes.
0:17:53 OpenAI is open again.
0:17:53 Yes.
0:17:53 Yeah.
0:17:54 Yeah.
0:17:55 Look, very encouraging.
0:17:58 You know, a year ago, I was very, you know, I was, I was getting very distressed about
0:18:00 open source AI, you know, whether it was going to be allowed.
0:18:00 Yeah.
0:18:01 Right.
0:18:02 Whether it was even going to be illegal.
0:18:04 And so, and I think, you know, we’re basically through that at this point.
0:18:07 We’re through that in the U.S.
0:18:10 You know, we’ll, we’ll see about, we’ll see about the rest of the world.
0:18:14 And then look, you know, the U.S.-China thing is obviously a big deal, but, you know, I think
0:18:17 it’s been net positive for the world that China has been, been so enthusiastic about open
0:18:20 source AI coming out of China, which has been great.
0:18:23 And then, yeah, look, OpenAI leaning hard into this, you know, and releasing what
0:18:27 they did is, I think, fantastic, both because of what they released, which
0:18:30 is great, but also just the fact that they are now, you know, willing to do
0:18:32 that, and then Elon reconfirmed overnight that he’s going to, you know, open source,
0:18:34 you know, start open sourcing previous versions of Grok.
0:18:39 And so, yeah, so we, you know, we seem to be in a timeline where
0:18:41 open source AI is going to happen.
0:18:45 You know, right now, you know, what you, I think what you would say is it kind of lags
0:18:48 the leading edge for proprietary implementations by, you know, six months or something like
0:18:48 that.
0:18:53 But, but I think that, you know, that’s a good, if that’s the status quo that continues, I
0:18:54 think that would be a very good status quo.
0:18:58 What are the rough edges that we need to kind of sand down when we’re thinking about
0:19:00 Chinese open source models specifically?
0:19:06 Is it, we need to do some fine tuning on top of them to add back free speech, or do we need
0:19:11 to watch for backdoors, say it phones home if it runs into this specific thing?
0:19:15 Like the Chinese open source thing, it was remarkable because I feel like it really does accelerate
0:19:18 the pace of innovation because everyone gets to see, oh, this is how reasoning works.
0:19:19 I think that’s great.
0:19:24 At the same time, it made me very, it made me much more appreciative of AI safety research
0:19:29 and capability research and actually being able to interpret what’s going on and say definitively,
0:19:32 this model is going to behave weird in this weird way.
0:19:34 Like the Manchurian candidate problem.
0:19:37 We haven’t found any of that, but it certainly seems like something we’d want to keep an eye
0:19:38 on.
0:19:42 But from your perspective, like what are the risks that we need to be aware of going into a
0:19:45 world where China is really pushing hard into open source?
0:19:49 Yeah, there’s two, there’s two, and you identified them, but let’s, let’s, let’s talk about both
0:19:49 of them.
0:19:54 So the phone, so the phone home thing is the, is the easy one, which is you, you
0:19:57 know, you can packet sniff, you know, a network and you can tell when the thing is doing that,
0:20:00 and plus you can go in, you can go in the code and you can see what it’s doing
0:20:00 there.
0:20:04 And so you can validate, you can even validate that that’s either happening or not happening.
0:20:05 And I think that, you know, that’s important.
0:20:09 But I, you know, I think people are going to, people are going to, are going to figure that
0:20:09 out.
0:20:12 You can kind of gate that problem practically.
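[Editor’s sketch] The gating check described here can be illustrated as a pure filter over connection records. This is a minimal, hypothetical sketch — the `Conn` record shape, the process IDs, and the addresses are illustrative assumptions standing in for what a packet sniffer or a `netstat`/`psutil`-style tool would actually report:

```python
from typing import List, NamedTuple

class Conn(NamedTuple):
    pid: int      # owning process
    raddr: str    # remote IP address
    rport: int    # remote port
    status: str   # e.g. "ESTABLISHED"

# A locally run model should only talk to local clients.
LOCAL = {"127.0.0.1", "::1"}

def suspicious_egress(conns: List[Conn], model_pid: int) -> List[Conn]:
    """Flag established connections from the model's process to any non-local host."""
    return [c for c in conns
            if c.pid == model_pid
            and c.status == "ESTABLISHED"
            and c.raddr not in LOCAL]

# Hypothetical snapshot while the model (pid 4242) is running:
observed = [
    Conn(4242, "127.0.0.1", 8080, "ESTABLISHED"),    # local API client: fine
    Conn(4242, "203.0.113.9", 443, "ESTABLISHED"),   # outbound HTTPS: flag it
    Conn(9999, "198.51.100.2", 443, "ESTABLISHED"),  # some other process: ignored
]
print(suspicious_egress(observed, model_pid=4242))   # only the 203.0.113.9 entry
```

On a real system the `observed` list would come from a live source such as `psutil.net_connections()` or a packet capture; the point is only that phoning home is externally observable, so it can be checked and gated.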
0:20:17 The, the, the, the bigger issue is we, we have this term in the field right now called
0:20:18 open weights.
0:20:21 And open weights is a loaded term.
0:20:26 It uses the open term from open source, but of course with open source, the thing is you,
0:20:27 you can actually read the code.
0:20:32 You know, with open weights, you have, you know, just a giant file full of numbers, as you
0:20:33 said, that you, you can’t really interpret.
0:20:37 And then what you don’t, what you don’t have, what, what most, what most of the open source,
0:20:41 the open weights models don’t have, including, you know, deep seek specifically, what they
0:20:43 don’t have is they don’t have open data, right.
0:20:44 Or open corpus, right.
0:20:47 So you, you, you can’t actually see the training data that went, went into them.
0:20:50 And of course, you know, most of the people building models are kind of obscuring what
0:20:52 that, you know, what that training data is in various ways.
0:20:57 And, and so when you get an open weight model, you know, the good news is the, the, the software
0:20:58 source is open.
0:21:00 The good news is you can run it in your machine.
0:21:04 You can verify that it doesn’t phone home, but you don’t actually know what’s happening inside
0:21:04 the weights.
0:21:08 And so I, I think that that is going to be a bigger and bigger issue, which is like,
0:21:13 okay, how the thing behaves, like, yeah, what, what has it actually been trained to do?
0:21:17 And what restrictions or directives has it been given in the training, you know, that are
0:21:19 embedded in the weights that you need to be able to see?
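[Editor’s sketch] The “giant file full of numbers” point can be made concrete with a toy example. The mini-format below is hypothetical, not safetensors or any real checkpoint layout; it just shows that open weights are fully recoverable as numbers while revealing nothing about the data or directives that produced them:

```python
import array
import struct

# Toy "open weights" file: a count header followed by raw float32 values.
# (Hypothetical format for illustration; real checkpoints are larger and
# structured, but equally silent about their training data.)
weights = array.array("f", [0.5, -1.5, 3.25, 0.0])
blob = struct.pack("<I", len(weights)) + weights.tobytes()

# Anyone holding the file can recover every number...
(count,) = struct.unpack_from("<I", blob)
recovered = array.array("f")
recovered.frombytes(blob[4:4 + 4 * count])
print(list(recovered))  # -> [0.5, -1.5, 3.25, 0.0]

# ...but nothing in the bytes says what corpus, objectives, or restrictions
# shaped these values. That is the open-weights vs. open-data gap.
```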
0:21:24 You know, this is, I would say this is coming up as sort of, I would say a global issue, you
0:21:26 know, which, you know, we worry about when these models come from China, other countries
0:21:30 worry when these models come from the U.S., right?
0:21:32 So one of the, one of the phrases you’ll hear when you talk to people kind of outside the
0:21:35 U.S. is kind of this, this phrase people are kicking around, which is not my weights, not
0:21:35 my culture.
0:21:36 Right.
0:21:37 Right.
0:21:39 Or by the way, for that matter, not my weights, not my laws.
0:21:40 Yeah.
0:21:40 Right.
0:21:45 Which is like, okay, like what actually is this thing going to do?
0:21:46 Right.
0:21:50 And to your point that Chinese models, for example, might, you know, never criticize, you
0:21:50 know, communism or something.
0:21:54 I can tell you the American models have all kinds of constraints also.
0:21:54 Yeah.
0:21:55 Right.
0:21:59 Implemented, you know, usually by a very specific kind of person in a very specific location
0:21:59 in the U.S.
0:21:59 Yep.
0:22:04 And so, you know, I think that this is a general issue, and we’re going
0:22:09 to have to see basically people’s tolerance for running open weights models
0:22:11 where they don’t fundamentally have access to the data.
0:22:14 And then correspondingly, I think what we’ll see is more open source developers also doing
0:22:15 open corpus, open data.
0:22:16 So you can see what’s actually in them.
0:22:17 Yeah.
0:22:23 Obviously open source is very important in terms of just distributing intelligence broadly,
0:22:29 giving people the ability to run their own models and really fine tune them and have control.
0:22:35 There’s also the big push just to make frontier models and high capability models free.
0:22:39 One model is you charge for the premium, you give the free away.
0:22:40 It’s a freemium model.
0:22:42 That’s what we’re seeing at most of the labs right now.
0:22:48 There’s also this kind of specter on the horizon of potentially putting ads in LLMs,
0:22:49 and what that would do to the world.
0:22:55 Jordy got in a little dust-up with Mark Cuban on the timeline, debating whether or not it
0:22:58 would be a net good to put advertising in LLMs.
0:23:00 What might happen that might be bad there?
0:23:01 What’s your take?
0:23:01 Yeah.
0:23:08 My point broadly was that ads have been an incredible way to make a variety of products
0:23:10 and services online free.
0:23:17 And just saying like default, just no ads would potentially, you know, be incredibly destructive.
0:23:21 But, uh, yeah, curious about your framework.
0:23:22 Yeah.
0:23:26 So I should start by saying, like, whenever I personally use an internet service, I always
0:23:28 try to buy the premium version of it that doesn’t have ads.
0:23:33 And so if I can, like, personally live inside an ad-free universe and pay for it, like, that’s
0:23:33 great.
0:23:37 And I’ll freely admit, you know, whatever level of, you know, hypocrisy
0:23:39 or incongruence, you know, kind of results from that.
0:23:40 But the point is choice.
0:23:41 The point is choice.
0:23:44 Well, the point is exactly what you said.
0:23:44 It’s affordability.
0:23:48 So the problem is, if you really want to get to a
0:23:52 billion and then 5 billion people, you can’t do that with a paid offering.
0:23:55 Like, at any sort of reasonable price point, it’s just not possible.
0:23:58 Uh, the, you know, global per capita GDP is not high enough for that.
0:24:00 People don’t have enough income for that, at least today.
0:24:05 And so if you want the Google
0:24:10 search engine or the Facebook social app or the, you know, frontier AI model
0:24:15 to be available to 5 billion people for free, you need to have
0:24:16 a business model.
0:24:19 You need to have an indirect business model, and ads are the obvious one.
0:24:22 And so I do think, if you take
0:24:25 some principled stand against ads, I think you unfortunately are also taking a stand
0:24:28 against broad access, just in the way the world works today.
0:24:32 And then look, the other really salient question is, um, you know, the
0:24:34 same question that companies like Google and Facebook have been dealing with for a long
0:24:40 time, which is: are ads purely destructive or negative to the user experience, or are they
0:24:43 actually, if done properly, either neutral or even positive?
0:24:44 Right.
0:24:46 And this was something that, you know, Google, I think to their credit, figured out very
0:24:51 early, which is: a well-targeted ad at a specifically relevant point
0:24:53 in time is actually content.
0:24:55 Like, it actually enhances the experience, right?
0:24:57 Because it’s the obvious case: you’re searching on a product.
0:24:59 There’s an ad, you can buy the product, you click, buy the product.
0:25:01 That was actually a useful piece of functionality.
0:25:06 Um, and so, you know, can you have ads, or other things that are like ads
0:25:09 or look like ads, you know, different kinds of referral mechanisms or whatever?
0:25:12 Can you have them in such a way that they’re actually additive to the product
0:25:13 experience?
0:25:17 Um, and you can imagine, just like with search and with social networking, lots of
0:25:17 examples of that.
0:25:22 People will, you know, whine about it in lots of different
0:25:25 ways, but I think, you know, that hasn’t been a bad outcome overall.
0:25:29 Um, and I think it’s entirely possible that that’s what happens with
0:25:30 these models as well.
0:25:31 Yeah.
0:25:38 So, uh, kind of a similar question: what should be legal? We’re kind of trying
0:25:41 to create legal frameworks on a number of issues with AI.
0:25:46 Uh, there have been a number of IP cases that have been working their way through the courts:
0:25:49 what can labs use to train models, et cetera.
0:25:51 There have been some good outcomes recently.
0:25:58 Sam also was talking about how a lot of people are using AI as, like, a confidant, like a, you
0:25:59 know, a friend, things like that.
0:26:03 And he mentioned that currently your chats are not privileged.
0:26:09 They can be used in a lawsuit or other situations.
0:26:16 Uh, how optimistic are you that our legal system in the U.S. can get some of these
0:26:22 issues right, where maybe it can’t just be, you know, total free markets, kind of lawless,
0:26:23 anything goes?
0:26:24 Yeah.
0:26:28 So in the case of training data, I mean, there are a bunch of these copyright
0:26:29 lawsuits happening right now.
0:26:32 There’s, you know, the big New York Times versus OpenAI one, and there have been a bunch of others.
0:26:37 Um, but for that particular problem, my guess is that it ultimately
0:26:38 has to be solved through legislation.
0:26:41 It’s ultimately a legislative question.
0:26:44 And the reason is because it goes to the nature of copyright law itself, you know, which
0:26:45 is legislation.
0:26:49 And of course, you know, the content industry is already claiming that
0:26:52 using copyrighted data to train, you know, without permission
0:26:57 or without paying, is, they believe, illegal on its face, you know,
0:27:00 as a violation of copyright law. The counterargument to that, which, you know, we
0:27:02 believe, is: well, it’s not copying, right?
0:27:06 There’s a distinction between training and copying, just like in the real world there’s
0:27:09 a distinction between reading a book and copying the book, you know, as a person.
0:27:12 And so, you know, the courts are trying to
0:27:13 grapple with that.
0:27:17 There are a whole bunch of cases, there are jurisdictional questions, and, you know, probably ultimately Congress
0:27:20 is going to have to figure out an answer on that.
0:27:24 And by the way, the president has kind of, you know, thrown down that gauntlet in, I think,
0:27:27 the speech he gave last week or two weeks ago, um, you know, where he said that, you know,
0:27:29 Washington probably needs to deal with that as an issue.
0:27:34 Um, so that’s one. On the privacy thing,
0:27:37 I think that one feels like it’s a Supreme Court thing.
0:27:40 Um, to me, it feels like that’s the kind of issue that’s headed to the Supreme Court.
0:27:45 In other words, whether, for example, your transcripts are considered your
0:27:48 property, and whether they’re protected against, you know, warrantless search and seizure.
0:27:52 Um, and the observation I would make there is, if you look at the march of technology
0:27:57 over time: the Constitution has, like, very clear Fourth and Fifth Amendments, you
0:28:00 know, very specific rights around the things that are yours, such
0:28:03 as, you know, being in your home, and, by the way, the thoughts
0:28:04 in your head, right?
0:28:08 Uh, you know, the government can’t just, like, come in and take things; they can’t, you
0:28:10 know, just come in and search your house without a warrant.
0:28:14 You know, they can’t, like, put you in a jail cell and beat you until you fess
0:28:14 up.
0:28:18 Like, you know, we have constitutional protections against
0:28:21 the government being able to basically, you know, take information, fundamentally,
0:28:24 um, you know, as well as possessions.
0:28:28 Um, and then basically what happens is, every time there’s a new technology that creates
0:28:34 a new kind of, you know, thing that you own, thing that’s yours, thing
0:28:37 that you would consider to be private, thing that you wouldn’t want the government to be
0:28:41 able to take without a warrant, you know, out of the gate, law enforcement agencies just
0:28:44 naturally go try to get those things, because they are ways to solve crimes.
0:28:46 And, you know, it feels like that’s a legal thing to do.
0:28:49 And then basically the courts come in later and they, you know, rule one way or the other,
0:28:53 and basically say, no, that actually is also a thing that is protected against,
0:28:58 you know, warrantless search, for example, or warrantless wiretapping.
0:29:01 And so I feel like, you know, this is the latest of probably, I don’t know, 20 of
0:29:02 those over the last hundred years.
0:29:06 Um, and, you know, I don’t know which way it’ll go, but I think it’s going to be
0:29:07 a key thing.
0:29:10 Because, as you know, people are already telling these models lots of things
0:29:12 that are very personal.
0:29:15 Okay. Lightning round, quick questions.
0:29:16 We’re letting you get out of here in a couple of minutes.
0:29:21 Um, we’re in this age of spiky intelligence models are great at some things and then terrible
0:29:21 at others.
0:29:24 Where are you actually getting value out of AI right now?
0:29:27 Where is it falling down for you?
0:29:29 Where are you, how are you using AI day to day?
0:29:30 Yeah.
0:29:33 So I have kind of a, I don’t know, barbell approach.
0:29:35 Um, one is, for serious stuff,
0:29:37 I love the deep research capabilities.
0:29:37 Yeah.
0:29:41 Um, and I’m doing this in a bunch of models, but, like, the ability to basically
0:29:42 say, I’m interested in this topic,
0:29:46 and then just tell it, write me a book. And, you know, I’m kind of hoping
0:29:47 for the longest book I can get.
0:29:49 I always tell it, like, go longer, go longer, more sophisticated.
0:29:53 Um, you know, but the leading edge models now, they’re getting up to, like, 30-page PDFs,
0:29:56 um, you know, that are, like, completely well-formulated, basically
0:29:59 long-form essays, um, you know, with just, like, incredible richness and depth.
0:30:03 Um, and, you know, if it’s 30 pages today, I’m sort of crossing my fingers
0:30:05 it’ll get to, you know, 300 pages coming up here in the next few years.
0:30:10 Um, and so I’m able to basically have the thing generate enormous amounts
0:30:14 of reading material with, I think, incredible richness and depth and complexity.
0:30:17 Um, and then on the other side of the barbell is humor.
0:30:20 Um, and I’ve posted some of these to my X feed over the last couple of years,
0:30:24 but I think these models are already much funnier than people give them credit for.
0:30:24 Really?
0:30:28 Um, yeah, I think they’re actually quite highly entertaining.
0:30:31 Um, a while ago I posted that.
0:30:33 Is it specific formats?
0:30:37 Like, you know, the "be Marc Andreessen" greentext, you know, that format.
0:30:40 Take a dip in my pool, in my office.
0:30:41 They’re really good.
0:30:43 So they’re really good at greentext.
0:30:44 Uh, that works really well.
0:30:47 But for some reason, the ones I find hysterical are when I have it
0:30:51 write screenplays, um, you know, for, like, TV shows or plays or movies.
0:30:53 Um, and, um, I posted it.
0:30:57 I had it write a new season of the HBO Silicon Valley, you know, set 10 years later.
0:30:57 Yep.
0:31:01 Um, and I had it write, like, 10 scripts for a complete
0:31:02 season.
0:31:04 And of course I just said, you know, make it like Silicon Valley, except, you know, it’s
0:31:07 happening in 2021, at kind of peak woke.
0:31:11 Um, and I thought it was just, you know, I’ll sit there at two
0:31:13 in the morning just, like, laughing my ass off at how funny this thing is.
0:31:18 Um, and so I think these things are actually already, like, extremely funny.
0:31:21 They’re extremely entertaining when they’re used
0:31:21 in that way.
0:31:23 And I do enjoy that a lot.
0:31:26 And I generate a lot of those, uh, that I don’t post.
0:31:28 Stay in the group chats.
0:31:30 It’s probably good idea.
0:31:31 They’re your property.
0:31:31 Yeah.
0:31:34 Hopefully the fourth amendment holds on these.
0:31:34 Yeah.
0:31:34 That’s great.
0:31:36 I have one last question.
0:31:36 Go for it.
0:31:37 And then I’ve got one more.
0:31:40 Uh, how do you get a job as a venture capitalist in 2025?
0:31:45 Um, so I think, I mean, look, the best way to do it is to have a track
0:31:49 record early as somebody who is, like, in the loop specifically on product development.
0:31:53 Um, and so, you know, be deeply in the trenches, um, at one of these
0:31:56 new companies in one of these spaces, um, you know, participate in the creation of a great
0:32:00 new product and a great new company, and, you know, really demonstrate
0:32:01 that you know how to do that.
0:32:04 Um, you know, there are great VCs who have not done that,
0:32:07 but, you know, I think that is sort of a foundational skillset,
0:32:10 uh, you know, for working with the kinds of founders that you want to work with,
0:32:13 who are going to want you to have, you know, kind of very interesting
0:32:14 things to say on that.
0:32:16 Um, so I think that’s still the best way to do it.
0:32:17 Yeah.
0:32:21 Like, feel the growth, immerse yourself in the
0:32:24 aggressive growth environment, and then you’ll be able to identify it when
0:32:25 you see it from afar.
0:32:26 Yeah.
0:32:26 That’s right.
0:32:32 Last question for me: state of M&A. In your mind, how are you advising, you know, companies
0:32:37 where you’re on the board, or just the portfolio broadly, around what they should
0:32:40 expect now and in the near future?
0:32:44 Uh, you mean in terms of whether you can get things approved or basically, yeah.
0:32:45 Yeah.
0:32:45 Yes.
0:32:47 So look, approval is still not a slam dunk.
0:32:50 There was, uh, you know, as I just saw, a medical device
0:32:53 company this morning where the acquisition was not allowed by the FTC.
0:32:55 So, um, you know, look, there is still scrutiny.
0:32:58 It’s obviously a very different political regime in Washington, but, you know,
0:33:02 by their own statements, this is not an administration
0:33:06 that believes in total laissez-faire M&A. It definitely wants to, you know,
0:33:09 in their view, maintain a very healthy level of market competition.
0:33:16 Um, so, do you expect certain companies to be negatively impacted by
0:33:18 the Figma story, right?
0:33:24 You have this deal get blocked, then a successful, you know, IPO, and Lina Khan is taking a victory lap.
0:33:30 Uh, you know, many people are responding and joking, saying, you know, Lina
0:33:34 cuts off the arm of a pianist, and they endure and can create a masterpiece.
0:33:40 And then you look at the example with, you know, Roomba, I
0:33:43 think it was, where Roomba had a deal with Amazon.
0:33:47 It was blocked, and the company has just been in shambles ever since.
0:33:52 So my concern is that people look at Figma and say, you should be independent.
0:33:53 You just figure it out.
0:33:53 Nothing can go wrong.
0:33:55 Yes.
0:33:55 Yeah.
0:33:57 I mean, her kind of taking a victory lap was very disconcerting,
0:34:01 um, and for exactly the reason you said, which is survivorship bias, um, right?
0:34:05 Which is, you pick the one that worked out, and then, you know, it’s
0:34:09 the airplane with the red dots: you ignore the 50 that are
0:34:10 in the ground, uh, that you’ve never heard of.
0:34:15 Um, and so that was very disconcerting, because, you know, it’s sort of the central planning
0:34:17 fallacy, which is, like, we make centrally planned economic decisions
0:34:20 and we have one example. You know, it’s like in Europe: it’s like, yeah, well, the bottle
0:34:22 caps actually don’t fall off the bottle, right?
0:34:25 Like, you know, it works.
0:34:27 Right.
0:34:29 It’s like, okay.
0:34:32 But do you want to live in an economic regime in which the, you
0:34:34 know, the government has dictated bottle cap design?
0:34:35 The answer is clearly no,
0:34:37 uh, because of the downside consequences.
0:34:42 Or even looking at, uh, you know, the Chinese model, where, you know, people
0:34:47 can say they’re picking winners, but to get to maybe picking a winner, you have this intense
0:34:53 bloodbath of competition where, you know, teams need to rise to the top and sort of prove themselves
0:34:58 before they get any of that real, like, you know, meaningful state benefit.
0:34:59 Uh, yeah, that’s right.
0:35:03 And so you just have this adverse selection, survivorship
0:35:06 bias thing where you just don’t pay attention to all the collateral damage.
0:35:10 So I do think that mentality is, like, super, super dangerous.
0:35:14 Um, and so, yeah, look, I think companies just have to be very thoughtful about this, both
0:35:18 acquirers and acquirees. Um, you know, the big thing is, if you’re
0:35:21 selling a company, you just need to anticipate that you might not get it through.
0:35:24 And if you don’t, it’s, like, okay, number one, is there a big
0:35:25 enough breakup fee, right?
0:35:28 Are you going to get, you know, paid for
0:35:31 the damage that you’re going through, and how is that
0:35:32 structured, on the one hand?
0:35:34 And then two is, yeah, look, do you have the kind of company culture that’s going
0:35:37 to be able to withstand that, and is your business
0:35:39 strong enough to be able to get through that?
0:35:42 It is a real risk and something worth, you know, taking very seriously.
0:35:42 Yeah.
0:35:44 And that’s why it felt emotional.
0:35:46 We were at NYC last week.
0:35:52 It felt emotional that the Figma team was able to, like, effectively just
0:35:56 restart the business and say, like, we’re taking this all the way.
0:36:02 So if you talk to any really successful company, what they’ll tell you is, yeah, over the
0:36:05 years we had these, like, crucible moments in which, like, we almost died,
0:36:06 right?
0:36:08 But we, like, pulled together and we pulled it off.
0:36:10 And then that became, like, you know, one of these central kind of mythical events in the
0:36:12 history of the company that we always refer to.
0:36:15 And it’s like, my God, we got through that, and we’re so strong and tough, and we’ve been
0:36:16 forged in fire, and now we can do anything.
0:36:18 And it’s like, yeah, that’s great.
0:36:22 And then there are 50 other companies that had those crucible moments, blew up, and died.
0:36:28 And so, yeah, all of the, quote, lessons learned on this stuff,
0:36:30 they’re all conditional on survival.
0:36:33 And so these things need to be taken incredibly seriously,
0:36:35 you know, which the great CEOs do.
0:36:36 Yeah.
0:36:37 Well, thanks so much for joining.
0:36:38 We’ll let you get back to your day.
0:36:40 We are five minutes over. Next time
0:36:42 we have to book five hours, because this is fantastic.
0:36:44 I got 10% of the way through.
0:36:45 Let’s do the first 24 hour TV.
0:36:46 Yeah.
0:36:47 We would love to have you again.
0:36:47 Marathon.
0:36:49 Enjoy the rest of your day.
0:36:50 We’ll talk to you soon, Mark.
0:36:50 Have a good day.
0:36:51 Bye.
0:36:52 Thank you guys.
0:36:52 Thank you.
0:36:57 Thanks for listening to the a16z podcast.
0:37:02 If you enjoyed the episode, let us know by leaving a review at ratethispodcast.com/a16z.
0:37:05 We’ve got more great conversations coming your way.
0:37:07 See you next time.

In this episode, Marc Andreessen joins TBPN for an unfiltered conversation spanning everything from ads in LLMs to why Apple’s AI strategy may be risky for anyone not named Apple.

Marc breaks down the current state of AI: why open source is resurging, how foundational research is (or isn’t) turning into product, and whether we’ve hit the moment when phones start to fade as dominant platforms. He also shares his candid thoughts on Meta’s wearable wins, Vision Pro’s imperfections, and how humor and deep research are his two favorite use cases for AI today.

Timecodes:

0:00 Intro  

2:41  The Pace of AI and Technology Cycles  

4:03  Research vs. Productization in AI Companies  

5:15  Apple’s Strategy: Last Mover Advantage  

7:09  The Future Beyond Smartphones  

10:23  Open Source AI: Progress and Challenges  

13:49  Ads in AI: Business Models and User Experience  

15:52  Legal Frameworks for AI and Data  

17:53  Lightning Round: How Marc Uses AI  

19:01  Breaking into Venture Capital in 2025  

20:34  M&A, Survivorship Bias, and Company Resilience  

Resources

Watch TBPN: https://www.tbpn.com/

Marc on X:   https://x.com/pmarca

Marc’s Substack: https://pmarca.substack.com/
