AI Eats the World: Benedict Evans on the Next Platform Shift

AI transcript
0:00:03 ChatGPT has got eight or nine hundred million
0:00:03 weekly active users.
0:00:06 And if you’re the kind of person who is using this
0:00:10 for hours every day, ask yourself why five times more people
0:00:12 look at it, get it, know what it is, have an account,
0:00:15 know how to use it, and can’t think of anything to do with it
0:00:17 this week or next week.
0:00:19 The term AI is a little bit like the term technology.
0:00:22 When something’s been around for a while, it’s not AI anymore.
0:00:24 Is machine learning still AI? I don’t know.
0:00:27 In actual general usage, AI seems to mean new stuff.
0:00:30 And AGI seems to mean new, scary stuff.
0:00:32 AGI seems to be a little bit like this.
0:00:35 Like either it’s already here and it’s just more software
0:00:38 or it’s five years away and will always be five years away.
0:00:41 We don’t know the physical limits of this technology
0:00:43 and so we don’t know how much better it can get.
0:00:46 You’ve got Sam Altman saying, we’ve got PhD level researchers right now.
0:00:48 And Demis Hassabis says, no we don't, shut up.
0:00:51 Very new, very, very big, very, very exciting,
0:00:53 world-changing things tend to lead to bubbles.
0:00:55 So yeah, if we’re not in a bubble now, we will be.
0:00:59 Is AI just another platform shift
0:01:01 or the biggest transformation since electricity?
0:01:05 Benedict Evans, technology analyst and former A16Z partner,
0:01:08 has spent years studying waves like PCs, the internet, and cell phones
0:01:11 to understand what actually changed and who captured the value.
0:01:14 Now he’s turned that same lens on AI
0:01:17 and the picture is far more complex than benchmarks or hype cycles suggest.
0:01:20 Some industries may be rewritten from the ground up.
0:01:22 Others may barely notice.
0:01:25 Tech giants like Google, Meta, Amazon, and Apple
0:01:27 are racing to reinvent themselves before someone else does.
0:01:29 Yet for all the excitement,
0:01:33 most people still struggle to find something they truly need AI for every single day.
0:01:36 A disconnect Benedict thinks is an important signal
0:01:37 about where we really are in the curve.
0:01:40 In today’s episode, we get into where bottlenecks emerge,
0:01:42 why adoption looks the way it does,
0:01:44 what kinds of products still haven’t shown up,
0:01:46 and how history can actually guide us here.
0:01:49 And finally, what would have to happen over the next few years
0:01:51 for us to look back and say,
0:01:52 AI wasn’t just another wave.
0:01:54 It was bigger than the internet.
0:01:59 Benedict, welcome back to the A16Z podcast.
0:02:00 Good to be back.
0:02:02 We’re here to discuss your latest presentation,
0:02:03 AI Eats the World.
0:02:05 So for those who haven’t read it yet,
0:02:07 maybe we can share the high-level thesis
0:02:11 and maybe contextualize it in light of recent AI presentations.
0:02:13 I’m curious how your thinking has evolved.
0:02:16 Yeah, it's funny, one of the slides in the deck references
0:02:18 a conversation I had with a big company CMO
0:02:21 who said, we’ve all had lots of AI presentations now.
0:02:22 The Google one.
0:02:24 We’ve had the Google one and the Microsoft one.
0:02:26 We’ve had the Bain one and the BCG one.
0:02:29 We’ve had the one from Accenture and the one from our ad agency.
0:02:30 So now what?
0:02:33 So there’s sort of 90-odd slides.
0:02:35 So there’s a bunch of different things I’m trying to get at.
0:02:37 One of them is, I think, just to say,
0:02:40 well, if this is a platform shift or more than a platform shift,
0:02:42 how do platform shifts tend to work?
0:02:43 What are the things that we tend to see in it?
0:02:47 And how many of those patterns can we see being repeated now?
0:02:51 And, of course, some of the patterns that come out of that
0:02:52 are things like bubbles,
0:02:56 but others are that lots of stuff changes inside the tech industry.
0:02:58 And there are winners and losers
0:03:00 and people who were dominant end up becoming irrelevant.
0:03:03 And then there were new billion, trillion-dollar companies created.
0:03:07 But then there’s also, what does this mean outside the tech industry?
0:03:11 Because if we think back over the last waves of platform shifts,
0:03:13 there were some industries where this changed everything
0:03:15 and created and uncreated industries.
0:03:18 And there are others where this was just kind of a useful tool.
0:03:20 So, you know, if you’re in the newspaper business,
0:03:23 the last 30 years looked very different to if you were in the cement business,
0:03:25 where the internet was just kind of useful,
0:03:27 but didn’t really change the nature of your industry very much.
0:03:33 And so what I tried to do is give people a sense of,
0:03:36 well, what is it that’s going on in tech?
0:03:37 How much money are we spending?
0:03:38 What are we trying to do?
0:03:39 What are the unanswered questions?
0:03:43 What might or might not happen within the tech industry?
0:03:46 But then outside technology,
0:03:48 how does this tend to play out?
0:03:50 What seems to be happening at the moment?
0:03:54 How is this manifesting into tools and deployment
0:03:57 and new use cases and new behaviors?
0:04:01 And then as we kind of step back from all of this,
0:04:03 again, how many times have we gone through all of this before?
0:04:05 It’s funny, I went on a podcast this summer
0:04:07 and my sort of opening line,
0:04:07 I said something like,
0:04:09 well, I’m a centrist.
0:04:11 I think this is as big a deal as the internet or smartphones,
0:04:13 but only as big a deal as the internet or smartphones.
0:04:16 And there's like 200 YouTube commenters under this
0:04:18 saying, this moron doesn't understand how big this is.
0:04:21 And I think, well, the internet was kind of a big deal.
0:04:23 It was kind of a big deal.
0:04:25 And, you know, I sort of finished the deck
0:04:26 by looking at elevators
0:04:28 because I live in an apartment building in Manhattan
0:04:30 and we have an attended elevator,
0:04:32 which means there’s no buttons,
0:04:33 there’s an accelerator and a brake.
0:04:35 And the doorman gets in and drives you to your floor,
0:04:35 like a streetcar.
0:04:40 And in the 50s, Otis deployed automatic elevators
0:04:42 and then you get in and you press a button.
0:04:44 And they marketed it by saying,
0:04:46 it’s got electronic politeness,
0:04:48 which meant it had an infrared beam.
0:04:52 And today when you get into an elevator,
0:04:52 you don’t say,
0:04:56 ah, I’m using an electronic elevator.
0:04:57 It’s automatic.
0:04:58 It’s just a lift,
0:05:01 which is what happened with databases
0:05:04 and with the web and with smartphones.
0:05:06 And I kind of think now,
0:05:06 this is funny,
0:05:08 I’ve done a couple of polls on this in LinkedIn and threads
0:05:10 of is machine learning still AI?
0:05:13 The term AI is a little bit like the term technology
0:05:15 or automation.
0:05:17 It only kind of applies when something’s new.
0:05:19 When something’s been around for a while,
0:05:20 it’s not AI anymore.
0:05:21 So like databases certainly aren’t AI.
0:05:23 Is machine learning still AI?
0:05:24 I don’t know.
0:05:25 And there’s obviously,
0:05:26 there’s like an academic definition
0:05:28 where people say this guy’s an idiot.
0:05:28 No, of course,
0:05:30 I’m going to explain the definition of AI.
0:05:32 But then in actual general usage,
0:05:33 AI seems to mean new stuff.
0:05:38 AGI seems like new, scary stuff.
0:05:40 Yeah, it’s funny.
0:05:41 I was thinking about this.
0:05:42 There’s an old theologian’s joke
0:05:43 that the problem for Jews
0:05:44 is that you wait and wait and wait
0:05:46 for the Messiah and he never comes.
0:05:47 And the problem for Christians
0:05:48 is that he came and nothing happened.
0:05:50 You know, the world didn’t change.
0:05:51 There was still sin.
0:05:52 For all practical purposes, nothing happened.
0:05:55 And AI seems to be a little bit like this.
0:05:57 Like either it’s already here.
0:05:58 And so you’ve got Sam Altman saying,
0:06:00 we’ve got PhD level researchers right now.
0:06:02 And Demis Hassabis says,
0:06:02 what?
0:06:03 No, we don’t.
0:06:03 Shut up.
0:06:06 And so either it’s already here
0:06:08 and it’s just more software
0:06:10 or it’s five years away
0:06:12 and will always be five years away.
0:06:13 Yeah, yeah.
0:06:16 Let's compare back to previous platform shifts
0:06:17 because some people look at,
0:06:18 you know,
0:06:19 some of the internet and say,
0:06:20 hey, there were net new
0:06:22 trillion dollar companies,
0:06:23 Facebook and Google
0:06:24 that were created from it
0:06:26 and just sort of all sorts of
0:06:28 new emerging winners.
0:06:29 Whereas they look at something like mobile
0:06:30 and say, hey,
0:06:30 you know,
0:06:31 there were big companies
0:06:32 like Uber and Snap
0:06:33 and Instagram and WhatsApp,
0:06:35 but these were billion dollar outcomes
0:06:37 or tens of billion dollar outcomes.
0:06:38 But really the big winners were,
0:06:39 were in fact,
0:06:40 Facebook and Google.
0:06:41 And so in some sense,
0:06:44 mobile perhaps was sustaining.
0:06:45 You feel free to quibble
0:06:46 with the definition
0:06:47 of sustaining disruptive,
0:06:48 but sustaining in the sense
0:06:49 that maybe more of the value
0:06:51 went to incumbents,
0:06:51 companies that existed
0:06:53 prior to the shift.
0:06:53 I’m curious
0:06:55 how you think about AI
0:06:56 in light of that
0:06:57 in terms of
0:06:58 is more of the gains
0:06:59 going to come to net new companies
0:07:01 like OpenAI and Anthropic
0:07:02 and others that follow
0:07:04 or are more of the gains
0:07:04 going to be captured
0:07:05 by Microsoft
0:07:06 and Google
0:07:07 and Meta
0:07:07 and companies
0:07:08 that existed prior?
0:07:11 So I think there’s several answers
0:07:11 to this.
0:07:11 One of them is like
0:07:13 you kind of have to be careful
0:07:14 about like framings
0:07:15 and structures and things
0:07:15 because you end up
0:07:16 arguing about the framing
0:07:17 and the definition
0:07:18 rather than arguing
0:07:19 about what’s going to happen.
0:07:21 And they’re all useful,
0:07:22 but they’ve all got holes in them.
0:07:24 And, you know,
0:07:26 what mobile did was,
0:07:26 you know,
0:07:27 there’s a bunch of things
0:07:28 that it changed fundamentally.
0:07:29 It shifted us from the web
0:07:30 to apps, for example.
0:07:32 And it gave everybody
0:07:32 in the world
0:07:33 a pocket computer.
0:07:34 So even today,
0:07:35 there are fewer than a billion
0:07:36 consumer PCs on Earth
0:07:36 and there's something
0:07:38 between five and six billion
0:07:38 smartphones.
0:07:41 And it made possible things
0:07:42 that would not have been
0:07:43 possible without it,
0:07:44 whether that’s TikTok
0:07:45 or arguably,
0:07:46 I think,
0:07:47 things like online dating.
0:07:49 And you can map those
0:07:50 against dollar value.
0:07:51 You can also map those
0:07:53 against kind of structural change
0:07:54 in consumer behavior
0:07:55 and access to information
0:07:56 and things.
0:07:58 And I think you could
0:07:59 certainly argue that Meta
0:08:00 would be a much smaller company
0:08:01 if it wasn’t for mobile,
0:08:02 for example.
0:08:04 So you can kind of argue
0:08:05 the puts and calls
0:08:06 on this stuff a lot.
0:08:07 There’s certainly
0:08:08 not all platform shifts
0:08:09 are the same.
0:08:10 And, you know,
0:08:11 you can do the sort of
0:08:12 standard sort of teleology
0:08:12 and say, well,
0:08:13 there were mainframes
0:08:14 and then PCs
0:08:14 and then the web
0:08:15 and then smartphones.
0:08:16 But you kind of want
0:08:17 to put SAS in there somewhere
0:08:18 and you kind of want
0:08:19 to put open source in there
0:08:19 and maybe you want
0:08:20 to put databases.
0:08:22 And so these are kind
0:08:22 of useful framings,
0:08:24 but they’re not predictive.
0:08:24 They don’t tell you
0:08:25 what’s going to happen.
0:08:27 They just kind of give you
0:08:28 one way of understanding
0:08:30 what seems some of the
0:08:31 patterns that we have here.
0:08:32 And of course,
0:08:33 the big debate around
0:08:34 generative AI
0:08:34 is just another
0:08:35 platform shift
0:08:36 or is it something
0:08:36 more than that?
0:08:37 And of course,
0:08:38 the problem is we don’t
0:08:38 know and we don’t have
0:08:39 any way of knowing
0:08:40 other than waiting to see.
0:08:41 So this may be as big
0:08:43 as PCs or the web
0:08:44 or SAS or open source
0:08:44 or something,
0:08:45 or it may be as big
0:08:46 as computing
0:08:46 and then you’ve got
0:08:47 the very overexcited
0:08:48 people living in group
0:08:49 houses in Berkeley
0:08:50 who think this is
0:08:50 as big as fire
0:08:51 or something.
0:08:52 Well, great.
0:08:53 But does this
0:08:53 bring new companies?
0:08:54 I mean,
0:08:54 you go back to the mobile,
0:08:55 there was a time
0:08:56 when people thought
0:08:57 that blogs were going
0:08:58 to be a different thing to the web,
0:08:59 which seems weird now.
0:09:00 Like Google needed
0:09:02 like a separate blog search.
0:09:02 This was seriously,
0:09:03 this was a thing.
0:09:04 There was a time
0:09:06 when it was really not clear
0:09:06 and I think you kind of
0:09:07 generalize this point.
0:09:09 You go back to the internet
0:09:11 in the mid-90s,
0:09:12 we kind of knew
0:09:13 this was going to be
0:09:13 a big thing.
0:09:15 We didn’t really know
0:09:16 it was going to be the web.
0:09:17 So before that,
0:09:17 we didn’t know
0:09:18 it was going to be the internet.
0:09:19 We knew there were
0:09:20 going to be networks.
0:09:20 It wasn’t clear
0:09:21 it was going to be the internet.
0:09:21 Then it wasn’t clear
0:09:22 it was going to be the web.
0:09:23 Then it wasn’t really clear
0:09:24 how the web was going to work.
0:09:27 And when Netscape launched,
0:09:28 like Mark Zuckerberg
0:09:29 was in junior high
0:09:30 or elementary school
0:09:30 or something
0:09:32 and Larry and Sergei
0:09:32 were students
0:09:34 and Amazon
0:09:34 was a bookstore.
0:09:37 So you can know it
0:09:38 but not know it.
0:09:39 And you could make
0:09:39 the same point
0:09:40 about smartphones.
0:09:41 Like it was,
0:09:42 we knew everyone
0:09:42 was going to have
0:09:43 an internet connected
0:09:44 thing in their pocket,
0:09:45 but it was not clear
0:09:46 it was basically
0:09:47 going to be a PC
0:09:48 from this has-been
0:09:49 PC company
0:09:50 from the 80s
0:09:52 and a search engine company.
0:09:52 It was not clear
0:09:53 it wasn’t going to be
0:09:54 Nokia or Microsoft.
0:09:54 See, I think you have
0:09:55 to be super careful
0:09:56 in making kind of
0:09:57 deterministic predictions
0:09:58 about this.
0:09:59 What you can do
0:10:00 is say,
0:10:00 well,
0:10:02 when this stuff happens
0:10:03 everything changes
0:10:04 and that’s happened
0:10:06 five or ten times before.
0:10:07 I’m curious how you got
0:10:08 conviction in this idea
0:10:09 or what you got
0:10:09 as a prediction
0:10:10 that hey,
0:10:11 AI is going to be
0:10:12 as big as the internet
0:10:12 which of course
0:10:13 is pretty big
0:10:15 but I’m not yet,
0:10:16 I’m not yet at the conviction
0:10:17 that it’s going to be
0:10:17 any bigger.
0:10:18 I’m curious
0:10:19 what sort of
0:10:21 inspires that
0:10:21 sort of
0:10:24 statement
0:10:24 and then also
0:10:26 what might change
0:10:27 your mind either way
0:10:28 that it might not
0:10:28 be as big as the internet
0:10:29 because of course
0:10:29 the internet
0:10:30 was obviously very big
0:10:31 but also that
0:10:31 hey,
0:10:33 perhaps it might be bigger.
0:10:34 Well,
0:10:35 so I think
0:10:36 I don’t want to
0:10:37 I made a diagram
0:10:38 of kind of S-curves
0:10:39 kind of going up
0:10:39 and someone said
0:10:40 well,
0:10:40 what’s the axis
0:10:41 on this diagram?
0:10:42 I don’t want to
0:10:43 kind of get into
0:10:45 is this 5% bigger
0:10:46 than internet
0:10:47 or is it 20% bigger?
0:10:48 I think the question
0:10:48 is more like
0:10:49 is it another
0:10:50 of these industry cycles
0:10:51 or is it a much
0:10:52 more fundamental
0:10:52 change
0:10:53 in what technology
0:10:54 can be?
0:10:54 Is it more like
0:10:55 computing
0:10:56 or electricity
0:10:57 as a sort of
0:10:58 structural change
0:11:00 rather than
0:11:01 here’s a whole bunch
0:11:01 more stuff we can do
0:11:02 with computers?
0:11:03 I think that’s sort of
0:11:04 the question
0:11:05 and there’s a funny
0:11:06 sort of disconnect
0:11:07 I think in looking
0:11:07 at debates about this
0:11:08 within tech
0:11:09 because I watched
0:11:10 this is one of
0:11:12 the open AI
0:11:12 live streams
0:11:13 a couple of
0:11:13 weeks ago
0:11:16 and they spend
0:11:17 the first 20 minutes
0:11:17 talking about
0:11:18 how they’re going
0:11:18 to have like
0:11:18 human level
0:11:19 PhD level
0:11:20 AI researchers
0:11:21 like next year
0:11:22 and then the
0:11:23 second half
0:11:23 of the stream
0:11:23 is
0:11:24 oh and here’s
0:11:25 our API stack
0:11:26 that’s going to
0:11:26 enable hundreds
0:11:27 of thousands
0:11:27 of new software
0:11:28 developers
0:11:28 just like Windows
0:11:29 and in fact
0:11:30 literally quote
0:11:30 Bill Gates
0:11:31 and you think
0:11:31 well those can’t
0:11:32 kind of both
0:11:33 be true
0:11:34 like either
0:11:35 I’ve got a thing
0:11:35 which is a
0:11:36 PhD level
0:11:37 AI researcher
0:11:38 which by implication
0:11:39 is like a PhD
0:11:40 level CPA
0:11:42 or I’ve got
0:11:43 a new piece
0:11:43 of software
0:11:44 that does my
0:11:44 taxes for me
0:11:45 and well
0:11:45 which is it
0:11:47 either this thing
0:11:47 is going to be
0:11:48 like human level
0:11:49 and some
0:11:50 and that’s a very
0:11:51 very challenging
0:11:52 problematic
0:11:53 complicated statement
0:11:55 or this is
0:11:56 going to let us
0:11:57 make more software
0:11:57 that can do
0:11:58 more things
0:11:58 that software
0:11:59 couldn’t be
0:12:00 and I think
0:12:01 there’s a real
0:12:02 like schizophrenia
0:12:04 in conversations
0:12:04 around this
0:12:05 because like
0:12:06 scaling laws
0:12:06 and it’s going
0:12:07 to scale all
0:12:07 the way
0:12:07 and meanwhile
0:12:08 I’m going
0:12:08 to hear
0:12:09 look how good
0:12:09 it is at
0:12:10 writing code
0:12:10 and again
0:12:10 like well
0:12:10 is it
0:12:11 writing code
0:12:12 or do we
0:12:12 not need
0:12:13 software anymore
0:12:13 because in
0:12:14 principle
0:12:15 if the models
0:12:15 keep scaling
0:12:16 nobody’s going
0:12:16 to write code
0:12:17 anymore
0:12:17 you’ll just
0:12:18 say to the
0:12:18 model like
0:12:18 hey can you
0:12:19 do this
0:12:19 thing for me
0:12:21 is it a little
0:12:21 bit of a hedge
0:12:22 or like a
0:12:23 sequencing thing
0:12:23 or
0:12:24 well
0:12:25 some of it’s
0:12:26 a sequencing
0:12:26 thing
0:12:27 but you know
0:12:27 in principle
0:12:28 if you think
0:12:28 this stuff is
0:12:29 going to keep
0:12:29 scaling
0:12:30 like why are
0:12:30 you investing
0:12:31 in a software
0:12:31 company
0:12:33 because you know
0:12:34 people just have
0:12:34 this like
0:12:37 and I think
0:12:38 this is
0:12:39 kind of
0:12:40 the funny
0:12:40 kind of
0:12:40 challenge
0:12:41 and this is
0:12:41 I think
0:12:42 the fundamental
0:12:43 way that this
0:12:43 is different
0:12:43 from previous
0:12:44 platform shifts
0:12:45 is that with
0:12:46 the internet
0:12:47 or with mobile
0:12:49 you didn’t
0:12:50 know what
0:12:50 was going to
0:12:50 happen in the
0:12:51 next couple
0:12:51 of years
0:12:52 you didn’t
0:12:52 know what
0:12:53 Amazon would
0:12:53 become
0:12:53 and you didn’t
0:12:54 know how
0:12:54 Netscape was
0:12:55 going to work
0:12:55 out
0:12:56 and you didn’t
0:12:56 know what
0:12:56 next year’s
0:12:57 iPhone was going
0:12:57 to be
0:12:58 and 10 years
0:12:58 ago when we
0:12:59 cared about
0:12:59 that
0:12:59 you kind of
0:13:00 knew the
0:13:00 physical limits
0:13:00 it’s like
0:13:01 you knew
0:13:02 in 1995
0:13:02 you knew
0:13:03 that telcos
0:13:03 were not
0:13:03 going to
0:13:04 give everybody
0:13:05 gigabit fiber
0:13:05 next year
0:13:07 and you knew
0:13:07 that the iPhone
0:13:08 wasn’t going
0:13:08 to like have
0:13:09 a year’s
0:13:09 battery life
0:13:10 and unroll
0:13:10 and have a
0:13:11 projector
0:13:11 and fly
0:13:12 or whatever
0:13:13 but we don’t
0:13:14 know the
0:13:15 physical limits
0:13:15 of this
0:13:15 technology
0:13:16 because we
0:13:16 don’t really
0:13:17 have a good
0:13:17 theoretical
0:13:18 understanding
0:13:18 of why
0:13:18 it works
0:13:19 so well
0:13:20 nor indeed
0:13:20 do we have
0:13:20 a good
0:13:21 theoretical
0:13:21 understanding
0:13:21 of what
0:13:21 human
0:13:22 intelligence
0:13:22 is
0:13:23 and so we
0:13:24 don’t know
0:13:24 how much
0:13:25 better it can
0:13:25 get
0:13:27 so you can
0:13:28 do a chart
0:13:28 and you could
0:13:29 say well
0:13:31 this is the roadmap for modems,
0:13:31 and this is
0:13:32 the roadmap
0:13:32 for DSL
0:13:33 and this is
0:13:33 how fast
0:13:34 DSL will be
0:13:34 and then you
0:13:34 can make
0:13:35 some guesses
0:13:36 about how
0:13:37 quickly telcos
0:13:37 will deploy
0:13:38 DSL
0:13:38 and then you
0:13:39 can say well
0:13:39 clearly we’re
0:13:39 not going to
0:13:40 be able to
0:13:41 replace broadcast
0:13:41 TV
0:13:42 with streaming
0:13:43 in 1998
0:13:45 but we don’t
0:13:46 have an
0:13:47 equivalent way
0:13:47 of modelling
0:13:48 this stuff
0:13:50 to know
0:13:51 what is the
0:13:51 fundamental
0:13:52 capability
0:13:52 of it
0:13:53 going to
0:13:53 look like
0:13:54 in three
0:13:54 years
0:13:55 which gets
0:13:56 you to
0:13:56 these kind
0:13:56 of slightly
0:13:57 vibes based
0:13:58 forecasting
0:13:59 where no one
0:14:00 really knows
0:14:00 so you
0:14:01 know
0:14:01 Geoff Hinton
0:14:02 says well
0:14:02 I feel
0:14:03 like
0:14:03 and Demis
0:14:03 Hassabis
0:14:04 says well
0:14:04 I feel
0:14:05 like
0:14:05 but no one
0:14:06 knows
0:14:06 and then
0:14:07 Karpathy
0:14:07 goes on
0:14:08 Dwarkesh's
0:14:08 podcast
0:14:09 and says
0:14:09 I feel
0:14:09 like
0:14:10 you know
0:14:10 it’s
0:14:10 a decade
0:14:11 out
0:14:11 yeah I
0:14:12 know
0:14:12 when I saw
0:14:13 this meme
0:14:14 of what’s his
0:14:14 name
0:14:14 Ilya
0:14:15 Sutskever
0:14:15 but he
0:14:16 says like
0:14:16 the answer
0:14:17 will reveal
0:14:17 itself
0:14:18 and somebody
0:14:18 like
0:14:20 memed
0:14:20 I’m going
0:14:20 to say
0:14:21 photoshopped
0:14:21 but of course
0:14:22 it wouldn’t
0:14:22 have been
0:14:22 photoshopped
0:14:23 turned him
0:14:23 into a
0:14:23 Buddhist
0:14:23 monk
0:14:24 wearing
0:14:24 like an
0:14:24 orange
0:14:25 like an
0:14:25 orange
0:14:25 outfit
0:14:26 the future
0:14:27 will reveal
0:14:27 itself
0:14:29 but this is
0:14:30 the problem
0:14:30 we don’t
0:14:30 know
0:14:31 we don’t
0:14:31 have a way
0:14:32 of modeling
0:14:32 this
0:14:32 yeah
0:14:33 and so
0:14:35 let’s connect
0:14:36 this to sort
0:14:36 of the
0:14:37 you know
0:14:37 the upfront
0:14:37 investment
0:14:38 that some
0:14:38 of these
0:14:38 companies
0:14:39 are making
0:14:40 because we
0:14:40 don’t know
0:14:40 you know
0:14:41 is there
0:14:41 a risk
0:14:42 of over
0:14:43 investment
0:14:43 leading to
0:14:44 some
0:14:44 you know
0:14:45 potential
0:14:46 you know
0:14:46 bubble like
0:14:46 mechanics
0:14:47 or how
0:14:48 do you
0:14:48 think about
0:14:48 that question
0:14:50 well
0:14:51 deterministically
0:14:52 very new
0:14:53 very very
0:14:53 big
0:14:54 very very
0:14:54 exciting
0:14:55 world changing
0:14:55 things tend
0:14:56 to lead
0:14:56 to bubbles
0:14:58 and you
0:14:58 don’t think
0:14:59 anybody would
0:14:59 dispute that
0:15:00 you can see
0:15:00 some bubbly
0:15:01 behavior now
0:15:02 and you
0:15:02 know
0:15:02 you can
0:15:03 argue about
0:15:03 what kind
0:15:04 of bubble
0:15:04 but again
0:15:05 like that
0:15:05 doesn’t have
0:15:06 very much
0:15:06 predictive power
0:15:08 and you
0:15:09 know one
0:15:09 of the
0:15:10 features of
0:15:11 bubbles is
0:15:11 that when
0:15:11 everything’s
0:15:12 going
0:15:13 everything goes
0:15:14 up all at
0:15:14 once and
0:15:15 everyone looks
0:15:15 like a genius
0:15:16 and everyone
0:15:17 leverages and
0:15:18 cross leverages
0:15:18 and does
0:15:19 circular revenue
0:15:19 and that’s
0:15:20 great until
0:15:20 it’s not
0:15:22 and then
0:15:22 you get
0:15:23 kind of a
0:15:23 ratchet effect
0:15:24 as it goes
0:15:24 back down
0:15:24 again
0:15:26 so yeah
0:15:26 if we’re not
0:15:26 in a bubble
0:15:27 now we will
0:15:28 be I remember
0:15:28 Mark
0:15:29 Andreessen
0:15:29 saying you
0:15:29 know
0:15:30 1997 was not
0:15:31 a bubble
0:15:31 98 was not
0:15:32 a bubble
0:15:32 99 was a
0:15:33 bubble
0:15:34 are we in
0:15:35 97 now
0:15:36 or 98 or
0:15:36 99
0:15:37 you know
0:15:38 if we could
0:15:39 predict that
0:15:39 you know we’d
0:15:40 live in a
0:15:40 parallel universe
0:15:42 I think
0:15:43 you know
0:15:44 there’s I
0:15:45 suppose maybe
0:15:46 kind of two
0:15:46 more specific
0:15:47 more tangible
0:15:48 answers to this
0:15:49 the first
0:15:49 of them
0:15:50 is we
0:15:50 don’t
0:15:51 really
0:15:51 know
0:15:53 what
0:15:53 the
0:15:54 compute
0:15:54 requirements
0:15:55 of this
0:15:55 stuff
0:15:55 are going
0:15:56 to be
0:15:57 and
0:15:58 forecasting
0:15:58 that
0:15:59 except like
0:16:00 more
0:16:01 and forecasting
0:16:02 that feels a lot
0:16:03 like trying to
0:16:03 forecast like
0:16:04 bandwidth use
0:16:05 in the late
0:16:06 90s
0:16:07 imagine if you were
0:16:08 trying to do the
0:16:09 algebra on that
0:16:09 you say well
0:16:10 this many users
0:16:11 you know how much
0:16:12 bandwidth does a
0:16:12 web page use
0:16:13 how will that
0:16:14 change how will
0:16:14 that change
0:16:15 as bandwidth
0:16:15 gets faster
0:16:16 what happens
0:16:16 with video
0:16:17 what kind
0:16:17 of video
0:16:18 what bandwidth
0:16:19 what bit rate
0:16:19 of video
0:16:20 how long
0:16:20 do people
0:16:21 watch a video
0:16:21 how much
0:16:22 video
0:16:23 and then
0:16:24 you’d like
0:16:25 you could
0:16:25 build the
0:16:26 spreadsheet
0:16:26 and it would
0:16:27 tell you
0:16:27 what bit rate
0:16:28 what global
0:16:29 bandwidth consumption
0:16:30 would be in
0:16:30 10 years
0:16:31 and then you
0:16:31 could try and
0:16:31 use that to
0:16:32 back calculate
0:16:32 how many
0:16:33 routers is this
0:16:34 going to sell
0:16:35 and you could
0:16:36 get a number
0:16:37 but it wouldn’t
0:16:37 be the number
0:16:38 you know
0:16:38 there’d be a
0:16:39 hundred fold
0:16:40 range of possible
0:16:41 outcomes from that
0:16:42 and you could
0:16:42 you know
0:16:42 you could make
0:16:43 the same point
0:16:44 about the algebra
0:16:45 of consumption
0:16:45 now
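[A toy version of that spreadsheet, as a Python sketch; every number below is invented for illustration, nothing here is from the episode. The point is that a few individually plausible assumptions, each uncertain by a factor of ten, multiply into the hundred-fold range Benedict describes.]

    # Toy late-90s-style bandwidth forecast. All figures are made up.
    def daily_demand(users, hours_per_day, mbps):
        # Aggregate megabits consumed per day across all users.
        return users * hours_per_day * 3600 * mbps

    low = daily_demand(users=50e6, hours_per_day=0.5, mbps=0.5)
    high = daily_demand(users=500e6, hours_per_day=0.5, mbps=5.0)
    print(high / low)  # 100.0: two 10x uncertainties compound to 100x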
0:16:47 so you know
0:16:48 right now
0:16:49 we have a bunch
0:16:50 of rational
0:16:50 actors saying
0:16:51 well this stuff
0:16:52 is transformative
0:16:53 and a huge threat
0:16:55 and we can’t
0:16:56 keep up with
0:16:56 demand for it
0:16:57 now
0:16:58 and as far
0:16:58 as we know
0:16:59 the demand
0:16:59 is going
0:16:59 to keep
0:17:00 going up
0:17:02 and you know
0:17:02 we’ve had a
0:17:03 variety of quotes
0:17:03 from all of the
0:17:04 hyperscalers
0:17:05 basically saying
0:17:05 the downside
0:17:06 of not investing
0:17:06 is bigger
0:17:07 than the downside
0:17:08 of over investing
0:17:13 that kind of
0:17:14 thing always
0:17:14 works well
0:17:15 until it doesn’t
0:17:16 yeah
0:17:17 and I saw
0:17:18 slightly strange
0:17:18 quote from
0:17:19 Mark Zuckerberg
0:17:20 saying well
0:17:21 if it turns out
0:17:21 that we’ve
0:17:21 over invested
0:17:22 we can just
0:17:23 resell the
0:17:23 capacity
0:17:24 and I thought
0:17:24 let me just
0:17:26 like stop you
0:17:27 there Mark
0:17:28 because if it
0:17:28 turns out
0:17:29 you can’t
0:17:29 use your
0:17:30 capacity
0:17:30 everybody else
0:17:31 is going to
0:17:31 have loads
0:17:31 of spare
0:17:32 capacity
0:17:32 as well
0:17:33 yeah
0:17:33 all these
0:17:34 people
0:17:34 now who are
0:17:35 desperate for
0:17:35 more capacity
0:17:36 if it turns
0:17:37 out we can
0:17:37 get the same
0:17:38 results for
0:17:38 a hundredth of
0:17:39 the compute
0:17:41 that will be
0:17:41 true for
0:17:42 everyone else
0:17:42 too, not
0:17:43 just you
0:17:46 so yeah
0:17:47 you know
0:17:48 in an investment
0:17:49 cycle like this
0:17:49 you tend to
0:17:50 get over
0:17:50 investment
0:17:52 but then
0:17:53 after that
0:17:54 there’s very
0:17:55 limited predictions
0:17:55 you can make
0:17:56 about what’s
0:17:56 going to happen
0:17:57 I think the
0:17:58 more useful
0:18:00 kind of way
0:18:01 to look at
0:18:02 this is to
0:18:03 think well
0:18:05 you’ve got
0:18:06 these kind
0:18:07 of transformative
0:18:08 capabilities
0:18:10 that are
0:18:11 already
0:18:12 increasing
0:18:13 the value
0:18:13 of your
0:18:13 existing
0:18:14 products
0:18:14 if you’re
0:18:15 Google or
0:18:15 Meta or
0:18:16 Amazon
0:18:17 and you’re
0:18:17 going to be
0:18:18 able to use
0:18:19 them to build
0:18:19 a bunch
0:18:20 more stuff
0:18:22 and
0:18:24 why would
0:18:24 you want
0:18:25 to let
0:18:25 somebody else
0:18:26 do that
0:18:27 rather than
0:18:27 you doing
0:18:28 it
0:18:28 as long
0:18:28 as you’re
0:18:29 able to
0:18:29 keep
0:18:30 funding
0:18:31 and selling
0:18:31 what you’re
0:18:32 building
0:18:35 and it
0:18:35 will turn
0:18:36 out that
0:18:37 we have
0:18:37 an evolution
0:18:38 of models
0:18:38 in the next
0:18:39 year
0:18:39 that means
0:18:39 you can
0:18:40 get the
0:18:40 same result
0:18:41 for
0:18:42 a hundredth
0:18:43 of the
0:18:43 compute
0:18:43 that you’re
0:18:44 using
0:18:44 today
0:18:45 bearing in
0:18:45 mind that
0:18:46 it’s already
0:18:47 going down
0:18:48 pick your
0:18:49 numbers
0:18:49 20,
0:18:49 30,
0:18:50 40 times
0:18:50 a year
0:18:53 but then
0:18:53 the usage
0:18:54 is going
0:18:54 up
0:18:54 so you’re
0:18:54 in this
0:18:55 very
0:18:55 as I said
0:18:55 it’s like
0:18:56 trying to
0:18:56 predict
0:18:56 bandwidth
0:18:57 consumption
0:18:57 in the
0:18:57 late
0:18:57 90s
0:18:57 early
0:18:58 2000s
0:18:59 you can
0:19:00 throw
0:19:00 all the
0:19:00 parameters
0:19:00 out
0:19:01 but it
0:19:01 doesn’t
0:19:01 get you
0:19:01 something
0:19:01 useful
0:19:02 you just
0:19:02 kind of
0:19:02 need to
0:19:03 step back
0:19:03 and say
0:19:04 yeah but
0:19:04 is this
0:19:04 internet
0:19:05 thing
0:19:05 any good
0:19:07 well
0:19:08 yeah because
0:19:08 I’m curious
0:19:08 I’m curious
0:19:09 if the
0:19:09 bottlenecks
0:19:09 are
0:19:10 if you
0:19:10 see them
0:19:10 as more
0:19:10 on the
0:19:10 supply
0:19:11 side
0:19:11 or the
0:19:11 demand
0:19:12 side
0:19:12 you know
0:19:13 more
0:19:14 technical
0:19:14 constraints
0:19:15 or is
0:19:15 just
0:19:16 is AI
0:19:16 any good
0:19:17 are there
0:19:17 enough
0:19:18 use cases
0:19:19 to justify
0:19:22 what are you
0:19:22 seeing and
0:19:23 what are you
0:19:23 predicting
0:19:24 so
0:19:26 maybe two
0:19:26 answers to
0:19:27 this question
0:19:27 the first
0:19:27 of them
0:19:28 is I
0:19:28 think we’ve
0:19:28 had the
0:19:29 sort of
0:19:30 a bifurcation
0:19:30 of what
0:19:31 all the
0:19:31 questions
0:19:31 are
0:19:32 so there
0:19:32 are now
0:19:33 very very
0:19:34 detailed
0:19:34 conversations
0:19:35 about chips
0:19:35 and then
0:19:36 very very
0:19:36 detailed
0:19:37 conversations
0:19:38 about data
0:19:38 centers
0:19:38 and about
0:19:39 funding
0:19:39 for data
0:19:40 centers
0:19:40 and then
0:19:41 about
0:19:42 what is
0:19:43 a new
0:19:44 enterprise
0:19:44 SaaS
0:19:44 company
0:19:45 built on
0:19:45 AI
0:19:46 what margins
0:19:47 will it
0:19:47 have
0:19:48 and how
0:19:49 much money
0:19:49 does it
0:19:49 need to
0:19:50 raise
0:19:50 and so
0:19:50 there are
0:19:50 venture
0:19:51 capital
0:19:51 conversations
0:19:52 and so
0:19:52 there are
0:19:52 many
0:19:53 different
0:19:53 conversations
0:19:54 within which
0:19:55 like I don’t
0:19:55 know anything
0:19:56 about chips
0:19:56 you know I
0:19:56 can spell
0:19:57 ultraviolet but
0:19:58 like I don’t
0:19:58 know what
0:19:59 like an
0:19:59 ultraviolet
0:20:00 process is
0:20:01 it’s like
0:20:02 it's more
0:20:02 violet?
0:20:03 I don’t
0:20:03 know
0:20:06 and so
0:20:07 you’ve got
0:20:07 this you know
0:20:08 it’s like the
0:20:08 Milton Friedman
0:20:09 line
0:20:09 no one knows
0:20:10 how to build
0:20:10 a pencil
0:20:10 you've got
0:20:11 all of these separate conversations;
0:20:12 it's turned
0:20:12 into deployment
0:20:13 I think
0:20:14 a second
0:20:15 answer might
0:20:16 be I think
0:20:17 there’s two
0:20:17 kinds of
0:20:18 AI deployment
0:20:19 generative AI
0:20:19 deployment
0:20:20 one of them
0:20:21 is there
0:20:22 are places
0:20:23 where it’s
0:20:24 very easy
0:20:25 and obvious
0:20:25 right now
0:20:26 to see what
0:20:26 you would
0:20:27 do with
0:20:27 this
0:20:28 which is
0:20:29 basically
0:20:29 software
0:20:30 development
0:20:31 marketing
0:20:33 point
0:20:33 solutions
0:20:35 for many
0:20:36 very boring
0:20:37 very specific
0:20:38 enterprise
0:20:38 use cases
0:20:40 and also
0:20:40 basically
0:20:41 people like
0:20:41 us
0:20:42 which are
0:20:43 people who
0:20:43 have kind
0:20:43 of very
0:20:44 open
0:20:45 very free
0:20:45 form
0:20:46 very flexible
0:20:46 jobs
0:20:47 with many
0:20:47 different
0:20:48 things
0:20:49 and people
0:20:50 who are always
0:20:50 looking for ways
0:20:51 to optimize
0:20:51 that
0:20:52 and so
0:20:53 you get
0:20:53 people in
0:20:53 Silicon Valley
0:20:54 who are like
0:20:54 you know
0:20:54 I spend
0:20:55 all my
0:20:55 time in
0:20:56 ChatGPT
0:20:56 I don’t
0:20:56 use
0:20:57 Google
0:20:57 anymore
0:20:58 you know
0:20:58 I’ve
0:20:59 replaced
0:20:59 my
0:20:59 CRM
0:21:00 with
0:21:00 this
0:21:02 and
0:21:02 you
0:21:03 kind of
0:21:03 and then
0:21:04 obviously
0:21:04 people who
0:21:04 write
0:21:05 if you’re
0:21:05 writing
0:21:05 code
0:21:05 this works
0:21:06 really well
0:21:06 if you’re
0:21:07 in marketing
0:21:07 you know
0:21:07 all these
0:21:08 stories of
0:21:08 big companies
0:21:08 where you know
0:21:09 they’re making
0:21:10 300 assets
0:21:10 where they would
0:21:10 have made
0:21:11 30
0:21:13 and then
0:21:13 Accenture
0:21:14 and Bain
0:21:14 and McKinsey
0:21:15 and Infosys
0:21:16 and so on
0:21:16 sitting and
0:21:17 solving very
0:21:18 specific problems
0:21:18 inside big
0:21:18 companies
0:21:20 then there’s a
0:21:20 whole bunch
0:21:21 of other
0:21:22 people who
0:21:22 look at it
0:21:23 and they’re
0:21:23 like
0:21:25 it’s
0:21:25 okay
0:21:28 and
0:21:28 you go
0:21:28 and look
0:21:29 at the
0:21:30 usage data
0:21:31 and you
0:21:31 see
0:21:32 okay
0:21:33 ChatGPT
0:21:33 has got
0:21:34 800 or
0:21:34 900 million
0:21:35 weekly
0:21:35 active users
0:21:37 5% of
0:21:37 people are
0:21:38 paying
0:21:40 and
0:21:40 then you
0:21:41 go and look
0:21:42 at all the
0:21:42 survey data
0:21:43 and you know
0:21:43 it’s very
0:21:44 fragmented
0:21:44 and inconsistent
0:21:45 but it all
0:21:45 sort of
0:21:46 points to
0:21:46 like
0:21:47 something like
0:21:48 10 or 15%
0:21:49 of people
0:21:49 into the
0:21:50 developed world
0:21:50 are using
0:21:51 this every
0:21:51 day
0:21:53 another 20 or
0:21:53 30% of people
0:21:54 are using it
0:21:55 every week
0:21:56 and if you’re
0:21:56 the kind of
0:21:56 person who
0:21:57 is using
0:21:57 this for
0:21:58 hours every
0:21:58 day
0:21:59 ask yourself
0:21:59 why
0:22:01 five times
0:22:02 more people
0:22:02 look at it
0:22:03 get it
0:22:03 know what
0:22:04 it is
0:22:04 have an
0:22:04 account
0:22:05 know how
0:22:05 to use it
0:22:06 and can’t
0:22:06 think of
0:22:07 anything to do
0:22:07 with it
0:22:08 this week
0:22:08 or next
0:22:09 week
0:22:11 why is
0:22:11 that
0:22:13 is it
0:22:13 because
0:22:13 it’s
0:22:13 early
0:22:14 and it’s
0:22:14 not like
0:22:15 a young
0:22:15 people
0:22:15 thing
0:22:16 either
0:22:17 incidentally
0:22:17 so is
0:22:18 that just
0:22:18 because
0:22:18 it’s
0:22:19 early
0:22:20 is it
0:22:20 because
0:22:21 of the
0:22:22 error
0:22:22 rates
0:22:23 is it
0:22:23 because
0:22:24 you have
0:22:25 to map
0:22:25 it against
0:22:26 what you
0:22:26 do
0:22:26 every
0:22:26 day
0:22:27 and
0:22:29 one of
0:22:29 the
0:22:30 analogies
0:22:30 I used
0:22:30 to
0:22:30 use
0:22:30 which
0:22:31 isn’t
0:22:31 in the
0:22:31 current
0:22:31 presentation
0:22:32 I’ve
0:22:32 used
0:22:32 in
0:22:32 previous
0:22:33 presentations
0:22:33 is
0:22:34 imagine
0:22:34 you’re
0:22:34 an
0:22:35 accountant
0:22:35 and you
0:22:35 see
0:22:35 software
0:22:36 spreadsheets
0:22:36 for the
0:22:36 first
0:22:36 time
0:22:37 this
0:22:37 thing
0:22:38 can
0:22:38 do
0:22:39 a month
0:22:39 of work
0:22:39 in 10
0:22:40 minutes
0:22:40 almost
0:22:41 instantly
0:22:43 you want
0:22:43 to change
0:22:44 you want
0:22:45 to recalculate
0:22:45 that DCF
0:22:46 that 10 year
0:22:47 DCF
0:22:47 with a different
0:22:48 discount rate
0:22:48 I’ve done it
0:22:49 before you
0:22:50 finished asking
0:22:50 me to
0:22:50 and that
0:22:51 would have
0:22:51 been like
0:22:51 a day
0:22:52 or two
0:22:52 days
0:22:52 or three
0:22:52 days
0:22:53 of work
0:22:53 to recalculate
0:22:53 all these
0:22:54 numbers
0:22:55 great
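[As a toy illustration of that moment, with hypothetical cash flows: rerunning a ten-year DCF at a new discount rate becomes one function call, where by hand it was days of recalculation.]

    # Ten years of hypothetical cash flows of 100 each.
    cash_flows = [100] * 10

    def dcf(flows, rate):
        # Discounted cash flow: each year's flow discounted back to today.
        return sum(cf / (1 + rate) ** t for t, cf in enumerate(flows, start=1))

    print(round(dcf(cash_flows, 0.08), 1))  # 671.0 at an 8% discount rate
    print(round(dcf(cash_flows, 0.12), 1))  # 565.0 at 12%, recomputed instantly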
0:22:55 now imagine
0:22:56 you’re
0:22:56 a lawyer
0:22:57 and you
0:22:57 see it
0:22:58 and you
0:22:58 think
0:22:59 well that’s
0:23:00 great
0:23:01 my accountant
0:23:01 should see
0:23:02 it
0:23:02 maybe I’ll
0:23:03 use it
0:23:03 next week
0:23:04 when I’m
0:23:04 making a
0:23:05 table of
0:23:05 my billable
0:23:06 hours
0:23:06 but that’s
0:23:07 not what
0:23:07 I do
0:23:07 all day
0:23:09 and Excel
0:23:10 doesn't
0:23:11 do the things
0:23:12 that a
0:23:12 lawyer does
0:23:12 every
0:23:13 day
0:23:14 and I
0:23:14 think
0:23:15 there’s
0:23:15 this other
0:23:16 class
0:23:16 of person
0:23:17 that’s
0:23:17 like
0:23:18 I’m not
0:23:19 sure what
0:23:19 to do
0:23:19 with this
0:23:20 and some
0:23:20 of that
0:23:21 is habit
0:23:21 some
0:23:22 of that
0:23:22 is like
0:23:23 realizing
0:23:23 no
0:23:24 instead
0:23:24 of doing
0:23:24 it
0:23:25 that way
0:23:25 I could
0:23:25 do it
0:23:26 this way
0:23:27 but that’s
0:23:28 also what
0:23:29 products are
0:23:30 like every
0:23:30 entrepreneur
0:23:31 who comes
0:23:32 into A16Z
0:23:33 when I was
0:23:33 there from
0:23:34 2014 to
0:23:35 2019
0:23:35 and I’m
0:23:36 sure now
0:23:37 like you
0:23:37 could look
0:23:38 at any
0:23:38 company that
0:23:39 comes in
0:23:39 and say
0:23:40 that’s
0:23:40 basically
0:23:41 a database
0:23:42 that’s
0:23:43 basically
0:23:44 a CRM
0:23:45 that’s
0:23:46 basically
0:23:47 Oracle
0:23:47 or Google
0:23:48 Docs
0:23:49 except
0:23:50 that they
0:23:51 realize
0:23:51 there’s
0:23:51 this
0:23:52 problem
0:23:52 or this
0:23:53 workflow
0:23:53 inside
0:23:54 this
0:23:54 industry
0:23:55 and
0:23:55 worked
0:23:55 out
0:23:56 how
0:23:56 to
0:23:56 use
0:23:56 a
0:23:57 database
0:23:58 or CRM
0:23:59 or basically
0:23:59 concepts
0:24:00 from 5,
0:24:00 10,
0:24:01 20 years
0:24:01 ago
0:24:02 and solve
0:24:03 that problem
0:24:03 for people
0:24:04 in that
0:24:04 industry
0:24:05 and go
0:24:05 in and
0:24:06 sell it
0:24:06 to them
0:24:06 and work
0:24:07 out how
0:24:07 they can
0:24:07 get them
0:24:07 to use
0:24:08 it
0:24:09 and
0:24:10 so this
0:24:10 is why
0:24:11 you look
0:24:11 at data
0:24:12 on this
0:24:12 that depending
0:24:13 on how
0:24:13 you count it
0:24:13 the typical
0:24:14 big company
0:24:14 today has
0:24:15 400 to
0:24:15 500
0:24:15 SaaS
0:24:15 apps
0:24:16 in the
0:24:17 US
0:24:18 400 to
0:24:18 500
0:24:18 SaaS
0:24:19 applications
0:24:20 and they’re
0:24:20 all
0:24:21 basically
0:24:21 doing
0:24:22 something
0:24:22 you could
0:24:22 do in
0:24:22 Oracle
0:24:23 or Excel
0:24:24 or email
0:24:28 and that’s
0:24:29 the other
0:24:29 side
0:24:30 I’m monologuing
0:24:31 I’m afraid
0:24:31 but this is
0:24:32 the other
0:24:32 side of
0:24:33 what do
0:24:33 you do
0:24:34 with these
0:24:34 things
0:24:34 do you
0:24:35 just go
0:24:35 to the
0:24:35 bot
0:24:36 and ask
0:24:36 it to
0:24:36 do a
0:24:36 thing
0:24:37 for you
0:24:38 or does
0:24:39 an enterprise
0:24:40 salesperson
0:24:40 come to
0:24:41 your boss
0:24:41 and sell
0:24:42 you a
0:24:42 thing
0:24:43 that means
0:24:44 now you
0:24:44 press a
0:24:44 button
0:24:45 and it
0:24:45 analyzes
0:24:46 this process
0:24:47 that you
0:24:47 needed
0:24:48 that you
0:24:49 never realized
0:24:49 you were
0:24:50 even doing
0:24:52 and I
0:24:53 feel like
0:24:53 that’s
0:24:54 I mean
0:24:55 that’s
0:24:56 why there
0:24:56 are AI
0:24:57 software
0:24:57 companies
0:24:59 really
0:25:00 isn’t that
0:25:00 what they’re
0:25:00 doing
0:25:01 they’re
0:25:01 unbundling
0:25:01 ChatGPT
0:25:02 just as
0:25:03 the
0:25:03 enterprise
0:25:03 software
0:25:04 company
0:25:04 of 10
0:25:04 years ago
0:25:04 was
0:25:05 unbundling
0:25:05 Oracle
0:25:05 or Google
0:25:06 or Excel
0:25:07 do you
0:25:07 have the
0:25:07 view
0:25:07 that
0:25:08 what
0:25:08 Excel
0:25:09 did
0:25:09 for
0:25:10 accountants
0:25:14 AI
0:25:14 is now
0:25:14 doing
0:25:15 for
0:25:15 coders
0:25:16 and
0:25:17 developers
0:25:17 but hasn’t
0:25:18 quite
0:25:18 figured out
0:25:19 that daily
0:25:20 critical
0:25:20 workflow
0:25:20 for
0:25:21 other
0:25:21 job
0:25:21 positions
0:25:22 and so
0:25:22 it’s
0:25:22 unclear
0:25:23 for
0:25:23 people
0:25:23 who
0:25:23 aren’t
0:25:24 developers
0:25:25 why I
0:25:25 should
0:25:25 be
0:25:25 using
0:25:25 this
0:25:26 for
0:25:26 many
0:25:26 hours
0:25:27 a
0:25:27 day
0:25:28 I
0:25:29 think
0:25:29 there’s
0:25:29 a lot
0:25:29 of
0:25:30 people
0:25:30 who
0:25:31 don’t
0:25:31 have
0:25:32 tasks
0:25:32 that
0:25:33 work
0:25:33 very
0:25:33 well
0:25:33 with
0:25:34 this
0:25:35 and
0:25:35 then
0:25:35 there’s
0:25:36 a lot
0:25:36 of
0:25:36 people
0:25:37 who
0:25:37 need
0:25:37 it
0:25:37 to
0:25:37 be
0:25:38 wrapped
0:25:38 in
0:25:38 a
0:25:39 product
0:25:39 and
0:25:39 a
0:25:39 workflow
0:25:40 and
0:25:40 tooling
0:25:40 and
0:25:41 UX
0:25:41 and
0:25:41 someone
0:25:41 to
0:25:42 come
0:25:42 and say
0:25:43 hey
0:25:43 have you
0:25:43 realized
0:25:43 you
0:25:43 could
0:25:44 do
0:25:44 it
0:25:44 with
0:25:44 this
0:25:46 I
0:25:46 had
0:25:46 this
0:25:46 conversation
0:25:47 in the
0:25:47 summer
0:25:48 with
0:25:48 Balaji
0:25:49 who’s
0:25:49 another
0:25:50 former
0:25:50 A16Z
0:25:51 person
0:25:52 and
0:25:53 he was
0:25:53 making
0:25:53 this
0:25:53 point
0:25:54 about
0:25:54 validation
0:25:55 that
0:25:55 can
0:25:55 you
0:25:56 because
0:25:56 these
0:25:56 things
0:25:57 still get
0:25:57 stuff
0:25:57 wrong
0:25:57 and
0:25:58 people
0:25:58 in
0:25:58 the
0:25:58 valley
0:25:58 often
0:25:58 kind
0:25:59 of
0:25:59 hand
0:25:59 wave
0:25:59 this
0:26:00 away
0:26:01 but
0:26:01 you
0:26:01 know
0:26:01 there
0:26:01 are
0:26:02 questions
0:26:02 that
0:26:03 have
0:26:04 specific
0:26:04 answers
0:26:05 where it
0:26:05 needs
0:26:05 to be
0:26:06 the right
0:26:06 answer
0:26:06 or one
0:26:07 of a
0:26:07 limited
0:26:08 set
0:26:08 of
0:26:08 right
0:26:08 answers
0:26:09 can
0:26:09 you
0:26:10 validate
0:26:10 that
0:26:10 mechanistically
0:26:12 if
0:26:13 not
0:26:13 is it
0:26:13 efficient
0:26:14 to
0:26:14 validate
0:26:14 it
0:26:14 with
0:26:15 people
0:26:16 so
0:26:16 you
0:26:16 know
0:26:16 the
0:26:17 marketing
0:26:17 use
0:26:17 case
0:26:17 it’s
0:26:18 a lot
0:26:18 more
0:26:18 efficient
0:26:18 to
0:26:18 get
0:26:19 a
0:26:19 machine
0:26:19 to
0:26:19 make
0:26:19 you
0:26:20 200
0:26:20 pictures
0:26:20 and
0:26:21 have
0:26:21 a
0:26:21 person
0:26:21 look
0:26:21 at
0:26:21 them
0:26:22 and
0:26:22 pick
0:26:22 10
0:26:22 that
0:26:22 are
0:26:22 good
0:26:23 than
0:26:23 to
0:26:23 have
0:26:24 people
0:26:25 make
0:26:25 10
0:26:25 good
0:26:26 images
0:26:26 or
0:26:26 100
0:26:26 even
0:26:27 more
0:26:27 if
0:26:27 you’re
0:26:27 going
0:26:27 to
0:26:27 make
0:26:28 500
0:26:28 images
0:26:28 and
0:26:28 pick
0:26:28 100
0:26:29 that
0:26:29 are
0:26:29 good
0:26:29 that’s
0:26:30 a lot
0:26:30 more
0:26:30 efficient
0:26:31 than
0:26:31 having
0:26:31 a
0:26:31 person
0:26:31 make
0:26:32 100
0:26:32 images
0:26:33 but
0:26:33 on
0:26:33 the
0:26:34 other
0:26:34 hand
0:26:34 if
0:26:34 you’re
0:26:34 doing
0:26:34 something
0:26:34 like
0:26:35 data
0:26:35 entry
0:26:35 and
0:26:35 I
0:26:36 wrote
0:26:36 something
0:26:36 about
0:26:36 this
0:26:37 about
0:26:38 when OpenAI
0:26:38 launched
0:26:39 Deep
0:26:39 Research
0:26:41 their
0:26:41 whole
0:26:42 marketing
0:26:42 case
0:26:42 is
0:26:42 it
0:26:42 goes
0:26:43 off
0:26:43 and
0:26:43 collects
0:26:44 data
0:26:44 about
0:26:44 the
0:26:44 mobile
0:26:45 market
0:26:45 I
0:26:45 used
0:26:45 to be
0:26:45 a
0:26:45 mobile
0:26:46 analyst
0:26:46 the
0:26:46 numbers
0:26:46 are
0:26:46 wrong
0:26:48 their
0:26:49 use
0:26:49 case
0:26:49 of
0:26:50 look
0:26:50 how
0:26:50 useful
0:26:50 this
0:26:51 is
0:26:51 their
0:26:51 numbers
0:26:51 are
0:26:52 wrong
0:26:53 and
0:26:53 in
0:26:53 some
0:26:54 cases
0:26:54 they’re
0:26:54 wrong
0:26:55 because
0:26:55 they’ve
0:26:56 literally
0:26:56 transcribed
0:26:57 the
0:26:57 number
0:26:58 incorrectly
0:26:58 from
0:26:58 the
0:26:58 source
0:26:59 in
0:26:59 other
0:27:00 cases
0:27:00 it’s
0:27:00 wrong
0:27:00 because
0:27:01 they’ve
0:27:01 used
0:27:01 a
0:27:01 source
0:27:02 they
0:27:02 shouldn’t
0:27:02 have
0:27:03 used
0:27:03 but
0:27:04 like
0:27:04 if
0:27:04 I
0:27:04 asked
0:27:04 an
0:27:05 intern
0:27:05 to
0:27:05 do
0:27:05 it
0:27:05 for
0:27:05 me
0:27:06 then
0:27:06 an
0:27:06 intern
0:27:06 would
0:27:06 probably
0:27:06 have
0:27:07 picked
0:27:07 that up
0:27:08 and
0:27:08 to
0:27:09 the
0:27:09 point
0:27:09 about
0:27:10 verification
0:27:11 if you’re
0:27:12 going to do
0:27:12 data
0:27:12 entry
0:27:13 if I’m
0:27:13 going to
0:27:14 ask
0:27:14 a
0:27:14 machine
0:27:14 to
0:27:15 copy
0:27:15 200
0:27:15 numbers
0:27:16 out
0:27:16 of
0:27:16 200
0:27:17 PDFs
0:27:17 and
0:27:17 then
0:27:17 I’m
0:27:17 going to
0:27:17 have
0:27:18 to
0:27:18 check
0:27:18 all
0:27:18 200
0:27:18 of
0:27:19 those
0:27:19 numbers
0:27:19 I
0:27:19 might
0:27:19 as well
0:27:20 just
0:27:20 do
0:27:20 it
0:27:20 myself
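[A back-of-envelope sketch of this validation point, with invented unit costs. Generate-and-filter wins when checking an output is much cheaper than producing it by hand, and loses for data entry, where checking a copied number costs about as much as copying it yourself.]

    # All costs are invented, purely to illustrate the tradeoff.
    def machine_plus_review(n_outputs, review_cost):
        # Treat machine generation as ~free; a person reviews every output.
        return n_outputs * review_cost

    def human_only(n_needed, make_cost):
        return n_needed * make_cost

    # Marketing images: reviewing 500 beats hand-making 100.
    print(machine_plus_review(500, review_cost=1) < human_only(100, make_cost=50))  # True

    # Data entry: checking 200 copied numbers costs what copying them would.
    print(machine_plus_review(200, review_cost=1) < human_only(200, make_cost=1))   # False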
0:27:22 so
0:27:22 you’ve
0:27:23 got
0:27:23 like
0:27:23 a
0:27:24 whole
0:27:24 swirling
0:27:25 matrix
0:27:26 of
0:27:27 how
0:27:27 do
0:27:27 you
0:27:28 map
0:27:28 this
0:27:29 against
0:27:30 existing
0:27:30 problems
0:27:31 but
0:27:31 the
0:27:31 other
0:27:32 side
0:27:32 of
0:27:32 it
0:27:32 is
0:27:33 how
0:27:33 do
0:27:33 you
0:27:34 map
0:27:34 this
0:27:34 against
0:27:35 new
0:27:35 things
0:27:35 that
0:27:35 you
0:27:36 couldn’t
0:27:36 have
0:27:36 done
0:27:36 before
0:27:38 and
0:27:39 this
0:27:39 comes
0:27:39 back
0:27:39 to
0:27:40 my
0:27:40 point
0:27:40 about
0:27:40 platform
0:27:41 shifts
0:27:41 because
0:27:41 you know
0:27:42 I see
0:27:42 people
0:27:42 looking
0:27:43 at
0:27:44 generative
0:27:44 AI
0:27:44 and
0:27:45 saying
0:27:45 well
0:27:45 this
0:27:45 is
0:27:45 useless
0:27:46 because
0:27:46 it
0:27:46 makes
0:27:46 mistakes
0:27:47 and
0:27:47 I
0:27:47 think
0:27:48 that’s
0:27:48 kind
0:27:48 of
0:27:48 like
0:27:49 looking
0:27:49 at
0:27:49 an
0:27:50 Apple
0:27:50 II
0:27:50 in
0:27:50 the
0:27:50 late
0:27:51 70s
0:27:51 and
0:27:51 saying
0:27:51 could
0:27:52 you
0:27:52 use
0:27:53 these
0:27:53 to
0:27:53 run
0:27:53 banks
0:27:54 to
0:27:54 which
0:27:54 the
0:27:54 answer
0:27:54 is
0:27:55 no
0:27:56 but
0:27:56 that’s
0:27:56 kind
0:27:57 of
0:27:57 the
0:27:57 wrong
0:27:58 question
0:27:59 could
0:27:59 you
0:27:59 build
0:28:00 professional
0:28:01 video
0:28:01 editing
0:28:02 inside
0:28:02 Netscape
0:28:03 no
0:28:04 but
0:28:04 that’s
0:28:05 the
0:28:05 wrong
0:28:06 question
0:28:06 and
0:28:07 later
0:28:07 20
0:28:08 years
0:28:08 later
0:28:08 you
0:28:08 can
0:28:08 but
0:28:09 meanwhile
0:28:09 it
0:28:09 does
0:28:09 a
0:28:10 whole
0:28:10 bunch
0:28:10 of
0:28:10 other
0:28:10 stuff
0:28:10 the
0:28:11 same
0:28:11 with
0:28:11 mobile
0:28:12 can
0:28:12 you
0:28:12 use
0:28:13 mobile
0:28:13 to
0:28:13 replace
0:28:14 your
0:28:15 five
0:28:15 screen
0:28:16 professional
0:28:16 programming
0:28:16 rig
0:28:17 no
0:28:17 therefore
0:28:17 it
0:28:17 can’t
0:28:18 replace
0:28:18 PCs
0:28:18 well
0:28:20 guess
0:28:20 what
0:28:20 5 billion
0:28:21 people
0:28:21 have got
0:28:21 a
0:28:21 smartphone
0:28:21 and
0:28:22 700 or 800
0:28:22 million
0:28:22 people
0:28:22 have got
0:28:23 a
0:28:23 consumer
0:28:23 PC
0:28:23 so
0:28:24 it
0:28:24 kind
0:28:24 of
0:28:24 did
0:28:25 but
0:28:25 did
0:28:25 a
0:28:25 different
0:28:25 thing
0:28:26 and
0:28:27 the
0:28:27 new
0:28:27 thing
0:28:27 is
0:28:30 generally
0:28:30 not
0:28:31 very
0:28:31 good
0:28:31 or
0:28:32 terrible
0:28:32 at
0:28:32 the
0:28:32 stuff
0:28:32 that
0:28:32 was
0:28:33 important
0:28:33 to
0:28:33 the
0:28:33 old
0:28:33 thing
0:28:34 but
0:28:34 it
0:28:34 does
0:28:35 something
0:28:35 else
0:28:36 right
0:28:37 and
0:28:38 a lot
0:28:38 of
0:28:38 the
0:28:39 question
0:28:39 is, there are
0:28:44 old
0:28:44 tasks
0:28:44 that
0:28:45 generative
0:28:45 AI
0:28:45 is
0:28:45 good
0:28:46 at
0:28:46 there’s
0:28:46 also
0:28:47 many
0:28:47 more
0:28:47 old
0:28:48 tasks
0:28:48 that
0:28:48 generative
0:28:49 AI
0:28:49 is
0:28:49 maybe
0:28:49 not
0:28:49 very
0:28:50 good
0:28:50 at
0:28:51 but
0:28:51 then
0:28:51 there’s
0:28:51 a whole
0:28:51 bunch
0:28:52 of
0:28:52 other
0:28:52 things
0:28:52 that
0:28:53 you
0:28:53 would
0:28:53 never
0:28:53 have
0:28:54 done
0:28:54 before
0:28:55 that
0:28:55 generative
0:28:56 AI
0:28:56 is
0:28:56 really
0:28:57 really
0:28:57 good
0:28:57 at
0:28:59 and
0:28:59 then
0:28:59 how
0:28:59 do
0:28:59 you
0:29:00 find
0:29:00 those
0:29:00 or
0:29:01 think
0:29:01 of
0:29:01 those
0:29:01 and
0:29:01 how
0:29:01 much
0:29:02 of
0:29:02 that
0:29:02 is
0:29:02 the
0:29:02 user
0:29:03 thinking
0:29:03 of it
0:29:03 faced
0:29:03 with
0:29:03 the
0:29:04 general-purpose
0:29:14 tool, and how much is someone selling them a
0:29:14 product
0:29:15 with
0:29:15 a
0:29:15 button
0:29:15 that
0:29:16 will
0:29:16 do
0:29:16 it
0:29:16 for
0:29:16 you
0:29:18 and
0:29:18 that’s
0:29:19 why
0:29:19 there
0:29:19 are
0:29:19 software
0:29:20 companies
0:29:20 right
0:29:21 and
0:29:21 on
0:29:22 mobile
0:29:22 some
0:29:22 of
0:29:22 the
0:29:23 new
0:29:23 use
0:29:23 cases
0:29:24 getting
0:29:24 in
0:29:25 strangers'
0:29:25 cars,
0:29:26 as we
0:29:26 mentioned,
0:29:26 Lyft
0:29:26 and Uber
0:29:27 or
0:29:27 sort
0:29:27 of
0:29:28 you
0:29:28 know
0:29:28 dating
0:29:28 people
0:29:28 you
0:29:28 met
0:29:28 via
0:29:29 an
0:29:29 app
0:29:29 or
0:29:31 sort
0:29:31 of
0:29:32 you
0:29:32 know
0:29:32 renting
0:29:32 your
0:29:33 spare
0:29:33 bedroom
0:29:34 out
0:29:35 etc
0:29:36 and
0:29:36 those
0:29:37 were
0:29:37 net
0:29:37 new
0:29:37 companies
0:29:38 that
0:29:39 were
0:29:39 built
0:29:39 around
0:29:39 those
0:29:40 behaviors
0:29:40 and
0:29:40 I
0:29:40 think
0:29:40 for
0:29:41 AI
0:29:41 there’s
0:29:41 still
0:29:41 questions
0:29:41 of
0:29:42 what
0:29:42 are
0:29:43 those
0:29:43 net
0:29:43 new
0:29:43 behaviors
0:29:44 we’re
0:29:44 starting
0:29:44 to see
0:29:45 some
0:29:45 in
0:29:45 terms
0:29:46 of
0:29:46 people
0:29:47 engaging
0:29:47 and
0:29:48 talking
0:29:48 with
0:29:49 chat
0:29:49 bots
0:29:49 instead
0:29:50 of
0:29:50 humans
0:29:50 or
0:29:51 in
0:29:52 addition
0:29:53 and
0:29:53 then
0:29:53 there’s
0:29:53 a
0:29:53 question
0:29:53 of
0:29:54 are
0:29:54 these
0:29:54 done
0:29:55 by
0:29:55 the
0:29:56 model
0:29:56 providers
0:29:57 that
0:29:57 currently
0:29:57 exist
0:29:58 or
0:29:58 are
0:29:58 these
0:29:58 done
0:29:58 by
0:29:59 net new companies.
0:30:03 And part of that question is
0:30:03 how
0:30:04 far
0:30:04 up
0:30:04 the
0:30:04 stack
0:30:04 does
0:30:04 the
0:30:04 new
0:30:05 thing
0:30:05 go
0:30:07 and
0:30:08 I
0:30:08 was
0:30:08 talking
0:30:08 about
0:30:08 this
0:30:09 with
0:30:09 another
0:30:09 former
0:30:10 A16Z
0:30:10 person
0:30:10 who
0:30:11 pointed
0:30:11 out
0:30:11 that
0:30:11 in
0:30:12 the
0:30:12 mid
0:30:13 90s
0:30:14 people
0:30:14 argued
0:30:15 that
0:30:15 the
0:30:15 operating
0:30:16 system
0:30:16 does
0:30:16 all
0:30:16 of it
0:30:17 and
0:30:18 Windows
0:30:18 apps
0:30:18 are
0:30:18 basically
0:30:19 just
0:30:19 thin
0:30:20 Win32
0:30:20 wrappers
0:30:21 and
0:30:22 Office
0:30:22 is
0:30:23 basically
0:30:23 just
0:30:23 a
0:30:23 thin
0:30:24 Win32
0:30:24 wrapper
0:30:24 like
0:30:25 all the
0:30:25 important
0:30:25 stuff
0:30:25 is
0:30:26 being
0:30:26 done
0:30:26 by
0:30:26 the
0:30:26 OS
0:30:27 whether
0:30:27 it’s
0:30:27 the
0:30:28 document
0:30:28 management
0:30:29 and
0:30:29 printing
0:30:29 and
0:30:29 storage
0:30:30 and
0:30:30 display
0:30:31 which
0:30:31 is all
0:30:31 stuff
0:30:31 that
0:30:31 used
0:30:31 to be
0:30:31 done
0:30:32 by
0:30:32 apps
0:30:32 like
0:30:33 on
0:30:33 DOS
0:30:33 the
0:30:33 apps
0:30:33 had
0:30:33 to
0:30:34 do
0:30:34 printing
0:30:34 the
0:30:34 apps
0:30:34 had
0:30:34 to
0:30:35 manage
0:30:35 the
0:30:35 display
0:30:36 we
0:30:36 moved to
0:30:36 Windows
0:30:37 90%
0:30:37 of the
0:30:37 stuff
0:30:37 that
0:30:37 the
0:30:37 app
0:30:38 used
0:30:38 to do
0:30:38 is
0:30:38 now
0:30:38 being
0:30:38 done
0:30:39 by
0:30:39 Windows
0:30:40 and
0:30:41 so
0:30:41 Office
0:30:42 is
0:30:42 just
0:30:42 like
0:30:42 a
0:30:42 thin
0:30:43 Win32
0:30:43 wrapper
0:30:44 and
0:30:44 all the
0:30:44 stuff
0:30:44 has
0:30:44 been
0:30:45 done
0:30:45 by
0:30:45 the
0:30:46 OS
0:30:46 and
0:30:46 it
0:30:46 turned
0:30:46 out
0:30:46 well
0:30:46 that
0:30:47 was
0:30:47 again
0:30:48 frameworks
0:30:48 are
0:30:48 useful
0:30:48 but
0:30:48 that's
0:30:49 maybe
0:30:49 not
0:30:49 a
0:30:50 useful
0:30:50 way
0:30:50 of
0:30:50 thinking
0:30:50 about
0:30:50 what’s
0:30:50 going
0:30:51 on
0:30:51 and
0:30:52 the
0:30:52 same
0:30:52 thing
0:30:53 now
0:30:53 like
0:30:54 how
0:30:54 much
0:30:54 does
0:30:55 this
0:30:55 need
0:30:56 a single,
0:30:57 dedicated
0:30:58 understanding
0:30:59 of how
0:30:59 that
0:30:59 market
0:31:00 works
0:31:00 or
0:31:00 what
0:31:01 that
0:31:01 market
0:31:01 is
0:31:01 and
0:31:01 what
0:31:02 you
0:31:02 would
0:31:02 do
0:31:02 with
0:31:02 that
0:31:03 I
0:31:04 remember
0:31:04 when
0:31:04 we
0:31:04 were
0:31:04 at
0:31:05 A16Z
0:31:05 there
0:31:05 was
0:31:05 an
0:31:06 investment
0:31:06 in
0:31:06 a
0:31:06 company
0:31:07 called
0:31:07 Everlaw
0:31:08 which
0:31:08 is
0:31:10 legal
0:31:10 discovery
0:31:11 in
0:31:11 the
0:31:11 cloud
0:31:12 and
0:31:13 so
0:31:13 machine
0:31:13 learning
0:31:14 happens
0:31:14 and
0:31:14 so
0:31:14 now
0:31:15 they
0:31:15 can
0:31:15 do
0:31:15 translation
0:31:16 are
0:31:16 they
0:31:17 worried
0:31:17 that
0:31:17 lawyers
0:31:17 are
0:31:18 going
0:31:18 to
0:31:18 say
0:31:18 we
0:31:18 don’t
0:31:18 need
0:31:18 you
0:31:19 guys
0:31:19 anymore
0:31:19 we’re
0:31:19 just
0:31:19 going
0:31:19 to
0:31:20 get
0:31:20 a
0:31:20 translate
0:31:21 app
0:31:21 and
0:31:21 a
0:31:21 sentiment
0:31:22 analysis
0:31:22 app
0:31:22 from
0:31:22 AWS
0:31:26 No, law firms
0:31:26 want to
0:31:26 buy a
0:31:27 thing
0:31:28 legal
0:31:28 discovery
0:31:29 software
0:31:29 management
0:31:29 they
0:31:30 don’t
0:31:30 want to
0:31:31 write
0:31:31 their
0:31:31 own
0:31:31 do
0:31:32 API
0:31:32 calls
0:31:32 I
0:31:32 mean
0:31:32 very
0:31:33 big
0:31:33 law firms
0:31:33 might
0:31:34 but
0:31:34 typical
0:31:35 law firm
0:31:35 isn’t
0:31:35 going
0:31:35 to
0:31:35 do
0:31:36 that
0:31:36 people
0:31:36 buy
0:31:37 solutions
0:31:37 they
0:31:37 don’t
0:31:37 buy
0:31:37 technologies
0:31:39 And the same thing here, like, how far up the stack do these models go?
0:31:45 How much can you turn things into a widget?
0:31:51 How much can you turn things into an LLM request,
0:31:55 and how much does it turn out that you need that dedicated UI?
0:31:59 The fun thing is you can see this around Google,
0:32:00 because Google had this whole idea that everything would just be a Google query
0:32:03 and Google would work out what the query was.
0:32:05 And guess what, now you've got Google Flights, which is not a Google query;
0:32:09 you know, they reached a certain point.
0:32:12 And one of the interesting things about this, it's interesting to think about what a GUI is doing.
0:32:19 The obvious thing that a GUI is doing is that it enables Office to have 500 features,
0:32:25 and you can find them all, or at least you don't have to memorize keyboard commands.
0:32:30 You can now have effectively infinite features, and you can just keep adding menus and dialogue boxes,
0:32:35 and eventually you run out of screen space for dialogue boxes,
0:32:38 but you can have hundreds of features without people needing to memorize keyboard commands.
0:32:42 But the other side of it is, you're in that dialogue box, or you're in that screen, in that workflow,
0:32:48 in Workday or Salesforce or whatever the enterprise software is,
0:32:53 or the airline website or Airbnb or whatever it is,
0:32:56 and there aren't 600 buttons on the screen, there are seven buttons on the screen,
0:32:59 because a bunch of people at that company have sat down and thought:
0:33:02 what is it that the users should be asked here, what questions should we give them,
0:33:07 what choices should there be at this point in the flow?
0:33:11 And that reflects a lot of institutional knowledge and a lot of learning and a lot of testing
0:33:17 and a lot of really careful thought about how this should work.
0:33:20 And then you give somebody a raw prompt and you just say, okay,
0:33:24 you just tell the thing how to do the thing.
0:33:28 And you've kind of got to shut your eyes, screw your eyes up,
0:33:31 and think from first principles: how does all of this work?
0:33:34 it’s kind
0:33:34 of like
0:33:34 I always
0:33:35 used to talk
0:33:35 about machine
0:33:36 learning as
0:33:36 giving you
0:33:36 infinite
0:33:37 interns
0:33:38 you know
0:33:38 imagine
0:33:39 imagine you’ve
0:33:39 got a
0:33:39 task
0:33:40 and you’ve
0:33:40 got an
0:33:41 intern
0:33:42 and the
0:33:42 intern
0:33:43 doesn’t
0:33:43 know what
0:33:44 venture
0:33:44 capital
0:33:45 is
0:33:47 how
0:33:47 are they
0:33:47 going to be
0:33:48 and like
0:33:50 they and
0:33:50 they don’t
0:33:51 know that
0:33:53 companies publish
0:33:54 quarterly reports
0:33:55 and that
0:33:56 we’ve got a
0:33:57 Bloomberg account
0:33:58 that lets us
0:33:58 look up
0:33:59 multiples
0:34:00 and that
0:34:01 then you
0:34:01 should probably
0:34:02 use
0:34:04 pitch book
0:34:04 for this
0:34:05 data
0:34:06 and rather
0:34:06 than using
0:34:07 Google
0:34:07 this is my
0:34:08 point about
0:34:08 deep research
0:34:09 like no you
0:34:09 should use
0:34:10 this source
0:34:10 and not
0:34:11 that source
0:34:14 Do you want to have to work that out from scratch,
0:34:16 or do you want a bunch of people who know a lot about this stuff
0:34:19 to have spent five years working out what the choices should be on the screen for you to click on?
0:34:24 I mean, it's the old user interface saying:
0:34:26 the computer should never ask you a question that it should be able to work out by itself.
0:34:30 You go to a blank, raw chatbot screen: it's asking you literally everything.
0:34:36 It's not just asking you one question,
0:34:38 it's asking you absolutely everything about what it is that you want
0:34:42 and how you're going to work out how to do it.
0:34:45 And so, you're mentioning ChatGPT isn't so much a product as a chatbot disguised as a product.
0:34:54 I'm curious, when we sort of look back at this platform shift,
0:35:01 do you think that there will be another sort of iPhone, sort of Excel, product
0:35:07 that kind of defines the platform shift in a way that ChatGPT won't?
0:35:13 Or is it sort of that the world has to catch up to how to use ChatGPT, or something like that?
0:35:19 So both of these can be true, because there was a lot of, like,
0:35:23 it took time to realize how you would use Google Maps, and what you could do with Google,
0:35:27 and how you could use Instagram, and all of these products have evolved a huge amount over time.
0:35:31 So some of it is, like, you grow towards realizing what you could do with this,
0:35:35 like, you realize that's just a Google query now, you realize that you could just do it like that.
0:35:57 ...10,000 really clever people sitting and trying to work out what those things are,
0:36:00 and then showing it to you as a product.
0:36:03 I think another side of this is, like, you know, there were always these precursors.
0:36:09 So, like, there were lots of other things before Instagram.
0:36:13 YouTube didn't start as YouTube, it started as video dating, I think.
0:36:18 There were lots of attempts to do online dating that all kind of worked,
0:36:23 until Tinder kind of pulled the whole thing inside out.
0:36:26 And so there were always lots of things. What's the phrase? Local maxima.
0:36:30 In fact, this is where we were, particularly, with the iPhone before,
0:36:35 because I was working in mobile for the previous decade.
0:36:39 It didn't feel like we were waiting for a thing. It felt like it was kind of working.
0:36:44 Like, every year the network got faster and the phone got better,
0:36:47 and we had apps and we had app stores and we had cameras,
0:36:52 and stuff seemed to be, every year, a bit better.
0:36:55 And then the iPhone arrives and it just blew up the chart:
0:37:00 kind of, you've got this line doing this, and then there's a line that does that.
0:37:04 Although, remember also, the iPhone took two years before it worked,
0:37:07 because the price was wrong and the feature set was wrong
0:37:09 and the distribution model didn't quite work.
0:37:12 And so, yeah, you can think everything is going well,
0:37:17 and then something comes along and you realize, no, no, no, that.
0:37:21 Which is the same for Google: search was a thing before Google, it just wasn't very good.
0:37:26 There was lots of social stuff before Facebook, and that was the thing that catalyzed it.
0:37:31 So I just think, deterministically, this whole thing is so early
0:37:34 that it feels like, of course, there are going to be, you know, dozens, hundreds of new things.
0:37:38 Otherwise A16Z would just kind of shut down and give the money back to the LPs,
0:37:42 because the models will just do the whole thing.
0:37:45 And I don't think you're going to do that. At least I hope not.
0:37:47 No. If we have...
0:38:01 ...sub-sector, that there would be net new companies created,
0:38:04 that they would be better than the model providers,
0:38:08 that there would even be multiple model providers, that in every category...
0:38:13 One thing: in the Web 2.0 era, we've always bet on the category winner.
0:38:19 But these markets are so big, and there's so much expertise and specialization,
0:38:27 that, one, there can be winners in every category,
0:38:29 it's not just the model providers taking everything,
0:38:32 but that even in every category, including the model providers, there can be multiple winners,
0:38:36 and increasing specialization, and the markets are just big enough to contain multiple winners.
0:38:43 I think that's right, and I think the categories themselves aren't clear.
0:38:49 And many things, you think this is a category, and it turns out, no, it was actually that whole other thing,
0:38:54 and the categories kind of get unbundled and recombined in different ways.
0:38:57 I remember I was a student in 1995,
0:39:03 and I think I had like four or five different web browsers on my PC,
0:39:08 because Tim Berners-Lee's original web browser had a web editor in it,
0:39:11 because he thought this was kind of like a network drive,
0:39:13 and it was a sharing system and not really a publishing system.
0:39:17 So you would have your web pages on your PC, left turned on,
0:39:22 and that would be how your colleagues would look at your Word documents or your web pages.
0:39:26 And so, again, we just don't know how...
0:39:34 Picking up on a strand within what you just said, though:
0:39:38 one of the things I'm sort of thinking about a lot is looking at OpenAI,
0:39:44 because I'm sort of fascinated by disconnections,
0:39:46 and we've got this interesting disconnect now,
0:39:49 which is, you know, if you look at the benchmark scores...
0:39:51 So you've got these general purpose benchmarks where the models are basically all the same.
0:39:54 And if you're spending hours a day, and you've got this opinion about,
0:39:58 I like GPT 5.1 more than GPT 4.9, or whatever the hell it's called:
0:40:05 if you're using this once a week, you really don't notice this stuff,
0:40:08 and the benchmark scores are all roughly the same.
0:40:11 But the usage isn't.
0:40:12 Basically, Claude has basically no consumer usage, even though on the benchmark score it's the same.
0:40:19 And then it's ChatGPT, and then halfway down the chart it's Meta and Google.
0:40:25 And the funny thing is, if you read all the AI newsletters, then Meta is lost,
0:40:29 they're out of the game, they're dead,
0:40:32 Mark Zuckerberg is spending a billion dollars on researchers to get back in the game.
0:40:35 But from the consumer side, well, it's distribution.
0:40:39 And the interesting thing here, what I'm kind of circling around, is:
0:40:45 if the model, for a casual consumer user certainly, is a commodity,
0:40:51 and there's no network effects or winner-takes-all effects yet,
0:40:54 those may emerge, but we don't have them yet,
0:40:56 and things like memory aren't network effects, they're stickiness, but they can be copied...
0:41:04 how is it that you compete?
0:41:06 Do you just compete on being the recognized brand,
0:41:09 and adding more features and services and capabilities, and people just don't switch away?
0:41:14 Which is kind of what happened with Chrome, for example:
0:41:17 there's not a network effect for Chrome, and it's not actually any better than Safari,
0:41:23 but you use Chrome because you use Chrome.
0:41:26 Or is it that you get left behind, on distribution, or network effects that emerge somewhere else,
0:41:39 and meanwhile you don't have your own infrastructure?
0:41:42 So I suppose what I'm getting at is, you've got these 800 or 900 million weekly active users,
0:41:47 but that feels very fragile,
0:41:49 because all you've really got is the power of the default and the brand.
0:41:52 You don't have a network effect, you don't really have feature lock-in,
0:41:56 you don't have your own infrastructure, so you don't control your cost base,
0:42:02 you don't have a cost advantage: you get a bill every month from Satya.
0:42:09 So you've kind of got to scramble as fast as you can in both of those directions.
0:42:15 On the one side, build product, build stuff on top of the model,
0:42:20 which is our earlier conversation: is it just the model?
0:42:23 You've got to build stuff on top of the model in every direction:
0:42:27 it's a browser, it's a social video app, it's an app platform, it's this, it's that.
0:42:32 It's like, you know, the meme of the guy with the map with all the strings on it:
0:42:36 it's all of these things, we're going to build all of them, yesterday.
0:42:38 And then, in parallel, it's infrastructure:
0:42:43 we've got to deal with NVIDIA, with Broadcom, with AMD, with Oracle, and with petrodollars.
0:42:53 Because you're kind of scrambling to get from this amazing technical breakthrough,
0:43:00 and these 800, 900 million WAUs,
0:43:03 to something that has really sticky, defensible, sustainable business value and product value.
0:43:10 Yeah. And so, as you're evaluating the competitive landscape among the hyperscalers,
0:43:18 what are the questions that you're asking that you think are going to be most important
0:43:23 in determining who's going to gain durable competitive advantages,
0:43:28 or how this competition is going to play out?
0:43:32 Well, this kind of comes back to your point about sustaining advantage, and we talked about Google.
0:43:37 Like, if we think about the shift to mobile: for Meta, this turned out to be transformative,
0:43:43 like, it made the products way more useful.
0:43:46 For Google, it turned out mobile search is just search.
0:43:53 And Maps changed, probably, and YouTube changed a bit,
0:43:55 but basically for Google, search is search, and the web and mobile just mean
0:43:59 more people doing more search, more of the time.
0:44:03 And the default view now would seem to be, well, Gemini is as good as anybody else.
0:44:10 Next week, like, the new model... I haven't looked at the benchmarks for GPT 5.1, which is out today.
0:44:15 Is it better than Gemini? Probably. Will it still be better next month? No.
0:44:20 So that's a given: like, you've got a frontier model, fine.
0:44:25 What does that cost? It costs, you pick a number, 250 billion dollars a year, 100 billion dollars a year;
0:44:31 that's our earlier conversation about CapEx.
0:44:33 Okay, so Google can pay that, because they've got the money,
0:44:36 they've got the cash flow from everything else.
0:44:39 And so you do that, and your existing products: you optimize search, you optimize your ad business,
0:44:43 you build new experiences.
0:44:46 Maybe you invent the new iPhone of AI.
0:44:48 Maybe there is no iPhone of AI. Maybe someone else does it, and you do an Android and just copy it.
0:44:55 So, fine, it's a new mobile, we'll just carry on; search is search,
0:45:00 AI is AI, we'll do the new thing, we'll make it a feature, we'll just carry on doing it.
0:45:05 For Meta, it feels like there are bigger questions on what this means for search,
0:45:10 or what it means for content and social and experience and recommendation,
0:45:13 which makes it all that more imperative that they have their own models, just as it is for Google.
0:45:18 For Amazon, okay, well, on the one side it's commodity infra and we'll sell it as commodity infra,
0:45:25 and on the other side, maybe...
0:45:29 Stepping back, if you're not a hyperscaler, if you're a web publisher,
0:45:33 a marketer, a brand, an advertiser, a media company,
0:45:38 you could make a list of questions, but you don't even know what the questions are right now.
0:45:43 What is this? What happens if I ask a chatbot a thing instead of asking Google?
0:45:48 Even if it's Google: from Google's point of view, well, I'll ask Google's chatbot, it's fine,
0:45:52 but as a marketer, what does that mean?
0:45:54 What happens if I ask for a recipe and the LLM just gives me the answer?
0:45:58 What does that mean if my business is having recipes?
0:46:02 Do you have a kind of split between, and this is also an Amazon question,
0:46:06 how does the purchasing decision happen?
0:46:08 How does the decision to buy a thing that I didn't know existed before happen?
0:46:13 What happens if I wave my phone at my living room and say, what should I buy?
0:46:17 Where does that take me, in ways that it wouldn't have taken me in the past?
0:46:21 so there’s
0:46:21 a lot
0:46:22 of questions
0:46:22 further
0:46:23 downstream
0:46:23 and that
0:46:24 goes
0:46:24 upstream
0:46:25 to meta
0:46:25 and to
0:46:26 stomach
0:46:26 stent for
0:46:26 Google
0:46:27 it’s a much
0:46:28 bigger question
0:46:29 in the long
0:46:29 term for
0:46:30 Amazon
0:46:31 do LLMs
0:46:32 mean that
0:46:32 Amazon can
0:46:33 finally do
0:46:34 really good
0:46:34 at scale
0:46:35 recommendation
0:46:36 and discovery
0:46:36 and suggestion
0:46:37 in ways that
0:46:37 it couldn’t
0:46:38 really do
0:46:38 in the past
0:46:40 because of
0:46:40 this kind
0:46:40 of pure
0:46:41 commodity
0:46:41 retailing
0:46:41 model
0:46:42 that it
0:46:42 has
0:46:44 Apple is sort of off on one side.
0:46:48 You know, interestingly, they produced this incredibly compelling vision of what Siri should be two years ago.
0:46:53 It just turned out that they couldn't make it.
0:46:56 Interestingly, nobody else could have made it either.
0:46:58 You go back and watch the Siri demo that they gave, and you think, okay,
0:47:02 so we've got multimodal, instantaneous, on-device, tool-using, agentic,
0:47:07 multi-platform e-commerce, in real time, with no prompt injection problems and zero error rates.
0:47:12 Well, that sounds good.
0:47:14 I mean, has anyone got that working? Like, no.
0:47:18 Google and OpenAI don't have that working.
0:47:20 I don't think Google or OpenAI could deliver the Siri demo that Apple gave two years ago.
0:47:25 I mean, they could probably do the demo,
0:47:26 but they couldn't, like, consistently, reliably make it work.
0:47:29 I mean, that demo, that product, isn't in Android today.
0:47:33 And Apple... I mean, Apple to me has the most kind of intellectually interesting question,
0:47:41 which is... so I saw Craig Federighi make this point, which is, like:
0:47:43 we don't have our own chatbot, fine; we also don't have YouTube or Uber.
0:47:50 Explain why that is different, which is a harder question to answer than it sounds like.
0:47:55 And of course, the answer is: if this actually fundamentally changes the nature of computing,
0:47:59 then it's a problem.
0:48:00 If it's just a service that you use, like Google, then that's not a problem.
0:48:04 Which is kind of the point about, you know, where does Siri go.
0:48:08 But the interesting counterexample here would be to think about what happened to Microsoft in the 2000s,
0:48:12 which is, the entire dev environment gets away from them,
0:48:15 and no one builds Windows apps after, like, 2001 or something.
0:48:18 But to use the internet, you need a PC. And what PC are you going to buy?
0:48:22 Well, like, Apple's not really a player at that time, and just getting back into the game,
0:48:28 Linux is obviously not an option for anyone,
0:48:35 so Windows sells an order of magnitude more PCs
0:48:40 as a result of this thing that Microsoft lost.
0:48:43 And then it takes until mobile that they lose the device as well as the development environment.
0:48:49 so here’s
0:48:49 this kind
0:48:49 of
0:48:49 question
0:48:50 is
0:48:50 if all
0:48:50 the new
0:48:50 stuff
0:48:51 is built
0:48:51 on AI
0:48:52 and I’m
0:48:52 accessing it
0:48:53 in an app
0:48:53 that I download
0:48:54 from the App Store
0:48:56 to what extent
0:48:56 is this a problem
0:48:57 for Apple
0:49:01 and what would
0:49:01 have to
0:49:02 you would need
0:49:03 a much more
0:49:04 fundamental shift
0:49:04 in what it was
0:49:05 that was happening
0:49:06 for that to be
0:49:06 a problem
0:49:06 for Apple
0:49:07 And even if you take, you know, not the full case, like, the rapture arrives
0:49:11 and we all just kind of go and sleep in pods like the guys in Up.
0:49:15 Not Up.
0:49:16 Yes, what is it, the one with the robot that's compacting the trash, which one is that?
0:49:19 WALL-E.
0:49:20 WALL-E, yeah. You know, the guys in the pods in that movie, maybe we'll be like that, in which case, fine.
0:49:27 But, like, there's a sort of a mid case, which is, like, the whole nature of software changes,
0:49:29 and there are no apps anymore, and you just go and ask the LLM a thing.
0:49:32 Fine. What is the device on which you ask the LLM a thing?
0:49:35 Well, it's probably going to have a nice big color screen,
0:49:38 and it's probably going to have, like, a one-day battery life,
0:49:40 probably needs a microphone, probably a good camera.
0:49:48 Kind of sounds like an iPhone.
0:49:52 Am I going to buy the one that's a tenth of the price and just use the LLM on it?
0:49:58 No, because I'll still want the good camera and the good screen and the good battery life.
0:50:02 So there's a bunch of kind of interesting strategic questions when you start poking away.
0:50:09 Well, what does this mean for Amazon? Those are completely different questions
0:50:13 to what does it mean for Google, or what does it mean for Apple,
0:50:16 what does it mean to Facebook, or what does it mean to Salesforce, what does it mean to Uber?
0:50:21 And then, right back to what we were saying at the beginning of this conversation,
0:50:24 what does this mean for Uber?
0:50:27 Well, their operations get X percent more efficient, and now the fraud detection works.
0:50:33 And, you know, okay, maybe there are autonomous cars, different conversation;
0:50:36 presume no autonomous cars, that's a whole other conversation.
0:50:39 Otherwise, as Uber, what does this change? Well, not a huge amount.
0:50:44 I want to sort of zoom out a little bit on this whole framing.
0:50:46 So you've been doing these presentations for a while now;
0:50:50 you've bumped them up to twice a year because so much is changing.
0:50:54 And one of the things you do in each presentation, you're famous for asking really great questions
0:50:59 and chronicling what the important questions are to be asking.
0:51:02 I’m curious
0:51:03 as you reflect
0:51:04 you know
0:51:05 maybe post
0:51:05 you know
0:51:06 chat GPT in
0:51:06 2022
0:51:08 or GPT 3
0:51:08 rather
0:51:10 the questions
0:51:10 you were asking
0:51:11 then
0:51:12 and you reflect
0:51:12 on to now
0:51:14 to what extent
0:51:15 do we have
0:51:15 some direction
0:51:16 on some of
0:51:16 those questions
0:51:17 or to what
0:51:17 extent
0:51:18 are they
0:51:18 the same
0:51:19 questions
0:51:20 or new
0:51:21 and different
0:51:22 questions
0:51:23 or what
0:51:23 what is
0:51:23 sort
0:51:23 of your
0:51:24 you know
0:51:24 if I
0:51:25 woke up
0:51:25 in a coma
0:51:27 after reading
0:51:27 your you know
0:51:29 your original
0:51:29 presentation
0:51:30 let’s say
0:51:30 the one
0:51:32 after GPT 3
0:51:32 launch
0:51:33 came out
0:51:34 and then
0:51:35 seeing this
0:51:35 one now
0:51:36 what were
0:51:36 the sort
0:51:36 of most
0:51:37 surprising
0:51:37 things
0:51:38 or things
0:51:38 that we
0:51:39 learned
0:51:40 that updated
0:51:40 those questions
0:51:42 So I think we have a lot of new questions this year.
0:51:48 So I feel like, you know, you could make a list of, as it might be, half a dozen questions in spring of '23:
0:51:55 like, open source, China, Nvidia, does scaling continue,
0:52:02 what happens to images, how long does OpenAI's lead remain.
0:52:07 And those questions didn't really change in '23 and '24.
0:52:11 And most of those questions are kind of still there; like, the Nvidia question hasn't really changed.
0:52:15 The answer on how many models will there be is, okay,
0:52:20 anybody who can spend a couple of billion dollars can have a frontier model,
0:52:25 and that was pretty obvious in early '23; it took a while for everyone to understand that.
0:52:31 And big models and small models: will we have small models running on devices?
0:52:34 No, because the capabilities keep moving too fast for the small models to shrink onto the device.
0:52:40 But those questions kind of didn't change for two and a half years.
0:52:43 I think we now have a bunch of more product strategy questions,
0:52:49 as you see real consumer adoption,
0:52:50 and OpenAI and Google building stuff in different directions, Amazon going in different directions,
0:52:57 Apple trying, and obviously failing, and then trying again to do stuff.
0:53:01 There's some sense of, like, there is something more going on in the industry
0:53:05 than just, well, let's just build another model and spend more money.
0:53:09 There's more questions and more decisions now.
0:53:12 there’s
0:53:12 also
0:53:13 more
0:53:13 questions
0:53:14 outside
0:53:15 of
0:53:15 tech
0:53:16 in
0:53:16 certainly
0:53:17 on like
0:53:17 the
0:53:17 retail
0:53:18 media
0:53:18 side
0:53:19 of
0:53:21 how do you
0:53:21 start
0:53:22 thinking
0:53:23 about
0:53:24 what you
0:53:25 would do
0:53:25 with this
0:53:26 And again, you know, the classic framing in my deck is:
0:53:28 step one is you make it a feature, and you absorb it, and you do the obvious stuff;
0:53:31 step two is you do new stuff;
0:53:33 step three is maybe someone will come and pull the whole industry inside out
0:53:36 and completely redefine the question.
0:53:38 And so you could kind of do, like, an "imagine if" here, of, like:
0:53:44 you know, you're a manager at a Walmart in the Bay Area or DC or whatever it is.
0:53:50 Step one is, find me that metric.
0:53:52 Step two is, build me a dashboard.
0:53:55 Step three is, it's Black Friday and I'm managing a Walmart outside of DC,
0:54:03 what should I be worried about?
0:54:05 And that might be the wrong one, but it's, like, you know...
0:54:09 Step one for Amazon is, you bought light bulbs, so here's...
0:54:12 you bought bubble wrap, so here's some packing tape.
0:54:15 But what Amazon should actually be doing is saying, hmm,
0:54:18 it's like this person is moving home, we'll show them a home insurance ad,
0:54:21 which is something that Amazon's correlation systems wouldn't get,
0:54:24 because they wouldn't have that in their purchasing data.
0:54:27 And we're still very much at the, like... we're still starting,
0:54:32 we're still on step one of that,
0:54:34 but thinking much more: what would the step two, step three be? What would new revenue be?
0:55:03 What kinds of content were predicated on Google routing that question to you,
0:55:08 and what kind of content isn't really that question?
0:55:13 Like, do I want a bolognese recipe, or do I want to hear Stanley Tucci talking about cooking in Italy?
0:55:24 Like, do I want that SKU, or do I want to work out which product I should buy?
0:55:29 Which is, Amazon is great at getting you the SKU, terrible at telling you which SKU.
0:55:35 Do I just want the slide deck, or do I want to spend a week talking to a bunch of partners from Bain
0:55:41 about how I could think about doing this?
0:55:43 Do I just want money, or do I want to work with A16Z's operating groups?
0:55:51 What is it that I'm doing here?
0:55:54 And I think the LLM thing is starting to crystallize that question in lots of different ways.
0:56:02 What am I actually trying to do here?
0:56:03 Do I just want a thing that a computer can now answer for me, or do I want something else that isn't that?
0:56:09 Because the LLMs can do a bunch of stuff that computers couldn't do before, right?
0:56:13 Is that thing that the computer couldn't do before my business, or am I actually doing something else?
0:56:20 We're about to figure out, in a much more granular way,
0:56:24 what the true job to be done is for many of these.
0:56:28 Yeah. And, you know, going back to the internet, there was the observation about newspapers,
0:56:32 that newspapers looked at the internet and they talked about, you know,
0:56:35 expertise and curation and journalism and everything else,
0:56:38 and didn't really say, well, we're a light manufacturing company
0:56:41 and a local distribution and trucking company,
0:56:43 and that was the bit that was the problem.
0:56:46 And until the internet arrived, that wasn't a conversation you thought about,
0:56:50 and then the internet suddenly makes that clear, and suddenly creates an unbundling that didn't exist before.
0:56:56 And so there will be those kinds of, like, you didn't realize you were that before,
0:57:04 until someone comes along with an LLM and says, I can use this to do this thing
0:57:07 that you didn't really realize was the basis of your defensibility or the basis of your profitability.
0:57:13 I mean, it's like the joke about US health insurance:
0:57:17 the basis of US health insurance profitability is making it really boring and difficult and time consuming.
0:57:22 That's where the profits come from. Maybe it isn't, I don't know,
0:57:25 but for the sake of argument, say that's your defensibility.
0:57:28 Well, an LLM removes boring, time-consuming, mind-numbing tasks.
0:57:35 So what industries are protected by having that, and they didn't realize that?
0:57:39 And, you know, you could have asked these questions about the internet in the mid-90s,
0:57:44 or about mobile a decade later,
0:57:47 and generally half of the questions you'd have asked would have been the wrong questions in hindsight.
0:57:51 I remember, as a baby analyst in 2000, everyone kept saying, what's the killer use case for 3G,
0:57:58 what's a good use case for 3G.
0:57:59 And it turned out that having the internet in your pocket everywhere was the use case for 3G,
0:58:03 but that wasn't the question that people were asking.
0:58:06 and I’m
0:58:06 sure that
0:58:07 will be
0:58:07 the thing
0:58:08 now
0:58:10 is
0:58:11 there’s so
0:58:11 much that
0:58:12 will happen
0:58:13 and get
0:58:13 built
0:58:14 where you
0:58:15 go and
0:58:15 you realize
0:58:16 oh
0:58:18 that’s
0:58:18 how you
0:58:19 would do
0:58:19 this
0:58:19 you can
0:58:20 turn it
0:58:20 into
0:58:21 that
0:58:23 and I’m
0:58:23 sure you’ve
0:58:23 had this
0:58:24 experience
0:58:24 seeing
0:58:24 entrepreneurs
0:58:25 you get
0:58:26 every now
0:58:26 and then
0:58:27 they come
0:58:28 in and they
0:58:28 pitch the
0:58:28 thing
0:58:28 and you’re
0:58:29 like
0:58:30 oh
0:58:31 okay
0:58:33 you can
0:58:33 turn it
0:58:34 into that
0:58:35 I didn’t
0:58:35 realize it
0:58:36 was that
0:58:37 100%
0:58:39 My last question: if we're talking two or three years from now, you're doing a presentation, you say,
0:58:45 oh, this is actually bigger than the internet, or maybe this is like computing,
0:58:51 what would need to be true? What would need to happen? What would evolve our thinking?
0:58:57 I mean, I kind of, you know, sort of come back to my point about, you know,
0:59:02 Jews and Christians, and the Messiah came and nothing happened.
0:59:10 We forget... I mean, there are maybe two very brief ways to think about this.
0:59:13 One of them is, I think we forget how enormous the iPhone was and how enormous the internet was,
0:59:19 and you can still find people in tech who claim that smartphones aren't a big deal.
0:59:25 And this was the basis of people complaining about me: like, this idiot,
0:59:28 he thinks generative AI is as big as those silly phone things. Come on.
0:59:35 I think another answer would be, like...
0:59:39 I don't want to get into the argument about, you know, the rate of increase in capability,
0:59:43 and benchmarks, and all that; you know, you can see lots of five-hour-long podcasts
0:59:47 of people talking about this stuff.
0:59:50 But the stuff we have now is not a replacement for an actual person,
0:59:58 outside of some very narrow and very tightly constrained guardrails,
1:00:02 which is why, you know, Demis's point, that it's absurd to say that we have PhD-level capabilities now.
1:00:13 What we would have to be seeing is something that would really shift our perception
1:00:19 of the capability of this stuff, so that it's actually a person,
1:00:26 as opposed to, it can kind of do these people-like things really well sometimes, but not other times.
1:00:33 And it's a very tough conceptual thing to think about,
1:00:40 because, I'm conscious, I'm not giving you a falsifiable answer,
1:00:43 but I'm not sure what a falsifiable answer would be to that.
1:00:47 When would you know whether this was AGI?
1:00:49 It's the Larry Tesler line: AI is whatever doesn't work yet.
1:00:52 As soon as people say it works, people say, well, that's just not AI, that's just software.
1:00:57 And it becomes a kind of slightly drunk philosophy grad student kind of conversation
1:01:07 as much as it is a technology conversation.
1:01:09 Like, have you ever considered, Erik, that maybe we're not either?
1:01:20 All I can say, to give a tangible answer to this question, is: what we have right now isn't that.
1:01:27 Will it grow to that? We don't know.
1:01:31 You may believe it will. I can't tell you that you're wrong. We'll just have to find out.
1:01:36 I think that's a good place to wrap.
1:01:38 The presentation is AI Eats the World, we'll link to it, it's fantastic.
1:01:42 Benedict, thanks so much for coming on the podcast to discuss it.
1:01:44 Sure, thanks a lot.
1:01:47 Thanks for listening to this episode of the a16z podcast.
1:01:51 If you liked this episode, be sure to like, comment, subscribe, leave us a rating or review,
1:01:56 and share it with your friends and family.
1:01:58 For more episodes, go to YouTube, Apple Podcasts, and Spotify.
1:02:02 Follow us on X @a16z and subscribe to our Substack at a16z.substack.com.
1:02:08 Thanks again for listening, and I'll see you in the next episode.
1:02:12 As a reminder, the content here is for informational purposes only,
1:02:15 should not be taken as legal, business, tax, or investment advice,
1:02:19 or be used to evaluate any investment or security,
1:02:21 and is not directed at any investors or potential investors in any a16z fund.
1:02:26 Please note that a16z and its affiliates may also maintain investments
1:02:29 in the companies discussed in this podcast.
1:02:31 For more details, including a link to our investments,
1:02:34 please see a16z.com/disclosures.

AI is reshaping the tech landscape, but a big question remains: is this just another platform shift, or something closer to electricity or computing in scale and impact? Some industries may be transformed. Others may barely feel it. Tech giants are racing to reorient their strategies, yet most people still struggle to find an everyday use case. That tension tells us something important about where we actually are.

In this episode, technology analyst and former a16z partner Benedict Evans joins General Partner Erik Torenberg to break down what is real, what is hype, and how much history can guide us. They explore bottlenecks in compute, the surprising products that still do not exist, and how companies like Google, Meta, Apple, Amazon, and OpenAI are positioning themselves.

Finally, they look ahead at what would need to happen for AI to one day be considered even more transformative than the internet.

Timestamps: 

0:00 – Introduction 
0:17 – Defining AI and Platform Shifts
1:50 – Patterns in Technology Adoption
6:04 – AI: Hype, Bubbles, and Uncertainty
13:25 – Winners, Losers, and Industry Impact
19:00 – AI Adoption: Use Cases and Bottlenecks
24:00 – Comparisons to Past Tech Waves
32:00 – The Role of Products and Workflows
40:00 – Consumer vs. Enterprise AI
46:00 – Competitive Landscape: Tech Giants & Startups
51:00 – Open Questions & The Future of AI

Resources:

Follow Benedict on LinkedIn: https://www.linkedin.com/in/benedictevans/

Stay Updated:

If you enjoyed this episode, be sure to like, subscribe, and share with your friends!

Find a16z on X: https://x.com/a16z

Find a16z on LinkedIn: https://www.linkedin.com/company/a16z

Listen to the a16z Podcast on Spotify: https://open.spotify.com/show/5bC65RDvs3oxnLyqqvkUYX

Listen to the a16z Podcast on Apple Podcasts: https://podcasts.apple.com/us/podcast/a16z-podcast/id842818711

Follow our host: https://x.com/eriktorenberg

Please note that the content here is for informational purposes only; should NOT be taken as legal, business, tax, or investment advice or be used to evaluate any investment or security; and is not directed at any investors or potential investors in any a16z fund. a16z and its affiliates may maintain investments in the companies discussed. For more details please see http://a16z.com/disclosures.

