AI transcript
0:00:03 Today, we’re doing something a little different.
0:00:05 We’re dropping an episode from Impact Theory,
0:00:07 a show hosted by Tom Bilyeu,
0:00:09 featuring a conversation with Ben Horowitz,
0:00:11 co-founder of Andreessen Horowitz.
0:00:13 Ben rarely does interviews like this,
0:00:16 and in this one, he goes deep on AI, on power,
0:00:18 on the future of work, and what it really means
0:00:20 to be a human in a world of intelligent machines.
0:00:24 He breaks down why AI is not life, not consciousness,
0:00:25 but something else entirely.
0:00:28 Why blockchain is critical to preserving trust and truth
0:00:29 in the age of deepfakes.
0:00:32 Why distribution may matter more than code,
0:00:34 and why history tells us this isn’t the end of jobs,
0:00:36 but the beginning of something new.
0:00:37 Let’s get into it.
0:00:42 This information is for educational purposes only
0:00:45 and is not a recommendation to buy, hold, or sell
0:00:47 any investment or financial product.
0:00:49 This podcast has been produced by a third party
0:00:51 and may include paid promotional advertisements,
0:00:53 other company references, and individuals
0:00:54 unaffiliated with A16Z.
0:00:57 Such advertisements, companies, and individuals
0:01:00 are not endorsed by AH Capital Management LLC,
0:01:02 A16Z, or any of its affiliates.
0:01:04 Information is from sources deemed reliable
0:01:05 on the date of publication,
0:01:08 but A16Z does not guarantee its accuracy.
0:01:14 Revolutions don’t always come with banners and protests.
0:01:18 Sometimes, the only shots fired are snippets of code.
0:01:20 This is one of those moments.
0:01:24 AI is the most disruptive force in history,
0:01:26 and it’s no longer a distant possibility.
0:01:28 It is here right now,
0:01:31 and it’s already changing the foundations of power
0:01:32 and the economy.
0:01:34 Few people have been as influential
0:01:36 in shaping the direction of AI
0:01:38 as mega-investor Ben Horowitz.
0:01:40 A pioneer in Silicon Valley,
0:01:42 he spent decades at the center
0:01:44 of every major technological disruption,
0:01:47 including standing up to the Biden administration’s
0:01:49 attempts to limit and control AI.
0:01:50 In today’s episode,
0:01:53 he lays out where AI is really taking us,
0:01:55 the forces that will define the next decade,
0:01:57 and how to position yourself
0:01:59 before it’s too late.
0:02:04 You are in an area making investments,
0:02:05 thinking about some of the things
0:02:07 that I think are the most consequential
0:02:08 in the world today
0:02:09 as it relates to innovation.
0:02:10 But along those lines,
0:02:13 you and Marc Andreessen are all in on AI,
0:02:15 but how do we make sure
0:02:16 that it benefits everyone
0:02:18 instead of making humans obsolete?
0:02:19 To begin with,
0:02:21 we have to just realize,
0:02:22 you know, what AI is
0:02:23 because I think that
0:02:24 because we called it
0:02:25 artificial intelligence,
0:02:26 you know,
0:02:29 our whole industry of technology
0:02:30 has a naming problem
0:02:31 in that we started,
0:02:32 you know,
0:02:34 by calling computer science,
0:02:35 computer science,
0:02:37 which everybody thought,
0:02:39 oh, that’s just like computers.
0:02:42 It’s like the science of a machine
0:02:44 as opposed to information theory
0:02:45 and what it really was.
0:02:47 And then in Web3 world,
0:02:48 which you’re familiar with,
0:02:49 we call it cryptocurrency,
0:02:51 which to normal people
0:02:52 means secret money.
0:02:54 But that’s not what it does.
0:02:55 That’s a good point.
0:02:56 And then I think
0:02:57 with artificial intelligence,
0:03:01 I think that’s also like a bad name
0:03:02 in a lot of ways
0:03:03 and that, you know,
0:03:04 look, the people
0:03:05 who work on in the field
0:03:08 call what they’re building models.
0:03:09 And I think that’s
0:03:11 a really accurate description
0:03:12 in the sense that
0:03:13 what we’re doing
0:03:15 is we’re kind of modeling
0:03:17 something we’ve always done,
0:03:18 which is we’re trying
0:03:19 to model the world
0:03:22 in a way that enables us
0:03:23 to predict it.
0:03:24 And then, you know,
0:03:25 we’ve built much more
0:03:26 sophisticated models
0:03:27 with this technology
0:03:29 than we could.
0:03:30 In the old days,
0:03:31 we had E equals MC squared,
0:03:33 which was like amazing,
0:03:35 but a relatively simple model.
0:03:38 Now we have models
0:03:39 with what,
0:03:41 like 600 billion variables
0:03:42 and this kind of thing.
0:03:43 And we can model like,
0:03:45 what’s the next word
0:03:46 that I should say
0:03:48 and that kind of thing.
0:03:51 So that’s amazing and powerful.
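For anyone who wants to see the “predict the next word” idea concretely, here is a minimal sketch; the Hugging Face transformers library, the small GPT-2 checkpoint, and the prompt are illustrative assumptions, not anything named in the conversation.

```python
# Minimal sketch of next-word prediction with a small causal language model.
# The library (transformers), model ("gpt2"), and prompt are illustrative choices.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

prompt = "We are trying to model the world in a way that enables us to"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits            # shape: (batch, sequence_length, vocab_size)

next_token_id = logits[0, -1].argmax().item()  # index of the single most likely next token
print(tokenizer.decode([next_token_id]))       # the model's best guess at the next word
```

Models with hundreds of billions of parameters do the same thing at a vastly larger scale than GPT-2’s roughly 124 million, but the interface is the same: context in, a distribution over the next token out.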
0:03:52 But I would say like,
0:03:53 we need to distinguish
0:03:54 the fact that
0:03:56 it’s a model
0:03:59 that is directed by us
0:04:00 to tell us things
0:04:01 about the world
0:04:02 and do things
0:04:03 on our behalf.
0:04:05 But it’s not a,
0:04:06 it’s not life.
0:04:08 It doesn’t have a free will
0:04:09 and these kinds of things.
0:04:11 So I think default,
0:04:13 you know,
0:04:13 we are the master
0:04:16 and it is the servant
0:04:18 as opposed to vice versa.
0:04:20 The question though
0:04:21 that you’re getting at,
0:04:21 which is, okay,
0:04:24 how do we not get obsoleted?
0:04:26 Like, why do we need us
0:04:27 if we’ve got these things
0:04:28 that can do all the jobs
0:04:29 that we currently do?
0:04:31 And I think, you know,
0:04:33 we’ve gone through that
0:04:34 in the past
0:04:36 and it’s been interesting, right?
0:04:39 So in I think 1750,
0:04:41 over 90% of the jobs
0:04:42 in the country
0:04:43 were agricultural.
0:04:45 And, you know,
0:04:46 there was a huge fight
0:04:47 and a group called
0:04:47 the Luddites
0:04:48 that fought the plow
0:04:49 and, you know,
0:04:50 some of these
0:04:53 newfangled inventions
0:04:54 that eventually,
0:04:55 by the way,
0:04:57 eliminated 97%
0:04:58 of the jobs
0:04:59 that were there.
0:05:01 But I think that
0:05:03 most people would say,
0:05:03 gee,
0:05:05 the life I have now
0:05:06 is better than
0:05:07 the life that I would have
0:05:08 had on the farm
0:05:10 where all I did
0:05:11 was farm
0:05:12 and nothing else
0:05:13 in life.
0:05:15 But, you know,
0:05:15 like,
0:05:16 and if you want to farm
0:05:16 still,
0:05:17 you can.
0:05:18 That is an option.
0:05:20 But most people
0:05:21 don’t take that option.
0:05:23 So the jobs
0:05:24 that we have now
0:05:25 will,
0:05:25 you know,
0:05:26 a lot of them
0:05:27 will go away
0:05:28 and will have,
0:05:29 but will likely
0:05:30 have new jobs.
0:05:30 I mean,
0:05:31 humans are pretty good
0:05:33 at figuring out
0:05:34 new things to do
0:05:36 and new things
0:05:36 to pursue
0:05:37 and so forth.
0:05:40 And, you know,
0:05:40 like including,
0:05:41 like,
0:05:42 going to Mars
0:05:42 and that kind of thing,
0:05:44 which obviously
0:05:45 isn’t a thing today
0:05:46 but could very well
0:05:47 be a thing tomorrow.
0:05:50 So I think that,
0:05:50 you know,
0:05:51 we have to stay creative
0:05:53 and keep dreaming
0:05:54 about, like,
0:05:55 a better future
0:05:56 and how to kind of
0:05:57 improve things for people.
0:05:57 But I think that,
0:05:58 you know,
0:05:58 particularly
0:06:01 for kind of the people
0:06:01 in the world
0:06:03 that are on the struggle bus
0:06:04 who are living
0:06:05 on a dollar a day
0:06:07 or, you know,
0:06:07 kind of subject
0:06:08 to all kinds of diseases
0:06:09 and so forth,
0:06:10 life is going to get,
0:06:10 like,
0:06:11 radically better for them.
0:06:13 When I look at the plow example
0:06:13 and the Luddites
0:06:14 fighting against it,
0:06:15 I think you’ll see
0:06:16 the same thing with AI.
0:06:17 You’re going to get people
0:06:19 that just completely reject it,
0:06:19 refuse to engage
0:06:21 in anything for sure.
0:06:24 But when I look at AI,
0:06:26 what I worry about
0:06:26 is that there will be
0:06:28 no refuge to go to,
0:06:30 meaning if you realize,
0:06:30 oh,
0:06:31 I can’t plow as well
0:06:33 as a plow
0:06:34 or a tractor
0:06:35 or now a combine,
0:06:36 there are still
0:06:37 a lot of other things
0:06:38 that technology
0:06:39 can’t do better than me.
0:06:40 Do you think
0:06:41 there’s an upper bound
0:06:42 to artificial intelligence,
0:06:43 bad name or not,
0:06:45 or do you think
0:06:46 that it keeps going
0:06:47 and it literally
0:06:49 becomes better than us
0:06:50 once it’s embodied
0:06:50 in robotics
0:06:52 at everything?
0:06:54 We are kind of limited
0:06:55 by the new ideas
0:06:56 that we have.
0:06:58 And artificial intelligence
0:06:58 is really,
0:06:59 by the way,
0:07:00 artificial human intelligence,
0:07:01 meaning,
0:07:03 right,
0:07:05 humans looked at the world,
0:07:06 humans figured out
0:07:07 what it was,
0:07:08 you know,
0:07:09 described it,
0:07:09 came up with these
0:07:11 concepts like trees
0:07:11 and,
0:07:12 you know,
0:07:14 air and all this stuff
0:07:16 that it’s not necessarily
0:07:16 real,
0:07:19 just how we decided
0:07:20 to structure the world.
0:07:21 And AI has learned
0:07:22 our structure.
0:07:23 Like,
0:07:24 they’ve learned language,
0:07:25 human language,
0:07:26 which is a version
0:07:27 of the universe
0:07:27 that is not
0:07:29 an accurate version
0:07:29 of the universe.
0:07:30 It’s just our version
0:07:31 of the universe.
0:07:32 But you’re going to
0:07:33 have to go deep on that.
0:07:34 I know my audience
0:07:35 is going to be like,
0:07:36 air seems pretty real
0:07:37 when you’re underwater.
0:07:38 What do you mean
0:07:39 that trees and air
0:07:40 are not necessarily real?
0:07:41 It’s,
0:07:41 look,
0:07:42 it’s a construction
0:07:43 that we made.
0:07:43 You know,
0:07:44 we decided
0:07:46 it is literally
0:07:48 the way humans
0:07:48 have interpreted
0:07:49 the world
0:07:50 in order for humans
0:07:51 to navigate it.
0:07:52 And,
0:07:53 you know,
0:07:55 as is language,
0:07:55 right?
0:07:56 Language isn’t
0:07:57 the universe
0:07:58 as it is.
0:07:59 Like,
0:08:00 if completely objective,
0:08:02 if you had an objective
0:08:02 look at,
0:08:03 you know,
0:08:05 the atoms and so forth
0:08:06 and how they were arranged
0:08:07 and whatnot,
0:08:09 you probably,
0:08:10 you know,
0:08:11 those descriptions
0:08:12 are lacking
0:08:13 in a lot of ways.
0:08:13 They’re not
0:08:14 completely accurate.
0:08:16 They don’t,
0:08:17 you know,
0:08:17 they certainly don’t
0:08:18 kind of predict
0:08:19 everything about
0:08:19 how the world works.
0:08:21 And so,
0:08:23 what machines have learned
0:08:23 or like what
0:08:25 artificial intelligence is
0:08:26 is an understanding
0:08:27 of our knowledge,
0:08:29 of the human knowledge.
0:08:31 So it’s taken
0:08:31 in our knowledge
0:08:32 of the universe
0:08:34 and then it is
0:08:36 kind of,
0:08:36 can refine that
0:08:37 and it can work on that
0:08:39 and it can derive things
0:08:40 from our knowledge,
0:08:42 our axiom set.
0:08:44 But it isn’t actually
0:08:45 observing the world
0:08:46 at this point
0:08:47 and figuring out
0:08:47 new stuff.
0:08:48 So,
0:08:48 you know,
0:08:49 at the very least,
0:08:50 you know,
0:08:52 humans still have to
0:08:55 discover new principles
0:08:56 of the universe
0:08:57 or kind of
0:08:57 interpret it
0:08:58 in a different way
0:08:59 or the machines
0:09:00 have to somehow
0:09:01 observe directly
0:09:02 the world
0:09:02 which they’re not
0:09:03 yet doing.
0:09:06 And so,
0:09:06 you know,
0:09:07 that’s a pretty big role
0:09:08 I would say
0:09:09 but then,
0:09:09 you know,
0:09:10 in addition,
0:09:11 you know,
0:09:11 we direct the world.
0:09:12 I think like Star Trek
0:09:13 is actually a pretty
0:09:14 good metaphor for that.
0:09:15 Like the Star Trek
0:09:16 computer was pretty
0:09:16 badass
0:09:18 but,
0:09:19 you know,
0:09:20 the people in Star Trek
0:09:21 were still like
0:09:22 flying around the universe
0:09:23 discovering new things
0:09:23 about it.
0:09:24 You know,
0:09:24 there was still
0:09:25 much to do
0:09:27 and I think that
0:09:27 it’s always
0:09:29 a little
0:09:31 kind of difficult
0:09:32 to figure out
0:09:33 what the new jobs
0:09:34 that get created
0:09:34 are.
0:09:35 And we’ve had
0:09:37 intelligence for a while,
0:09:37 right?
0:09:38 Like we’ve had
0:09:39 machines that could
0:09:40 do math way better
0:09:40 than us
0:09:42 and,
0:09:43 you know,
0:09:43 I mean,
0:09:44 I can remember
0:09:46 when I was in
0:09:47 junior high school
0:09:49 our junior high
0:09:49 put on a play
0:09:50 about like
0:09:51 how bad it was
0:09:51 that there were
0:09:52 calculators
0:09:52 because nobody
0:09:53 would know
0:09:53 how to do
0:09:54 arithmetic
0:09:54 and then
0:09:56 all the calculators
0:09:56 would break
0:09:57 and then we’d
0:09:57 be stuck.
0:09:58 We’d be trying
0:09:59 to like fly around
0:09:59 the universe
0:10:00 and rockets
0:10:01 but we wouldn’t
0:10:02 be able to do math
0:10:03 and the calculators
0:10:03 would be broken
0:10:04 and we’d be screwed.
0:10:06 So there is always
0:10:07 that fear
0:10:09 and,
0:10:09 you know,
0:10:10 we’ve had computers
0:10:11 that can play games
0:10:12 better than us.
0:10:12 We currently have
0:10:13 computers that can
0:10:14 drive better than us
0:10:15 and so forth.
0:10:16 So we have a lot
0:10:16 of intelligence
0:10:17 out there
0:10:19 but it hasn’t,
0:10:21 you know,
0:10:22 created like
0:10:23 this super dystopia,
0:10:24 you know,
0:10:25 in any degree.
0:10:25 It’s actually
0:10:26 made things better,
0:10:27 you know,
0:10:28 everywhere it’s appeared.
0:10:29 So I would expect
0:10:30 that to continue.
0:10:31 What do you think
0:10:32 though is the limiting
0:10:32 function?
0:10:34 So when I look
0:10:34 at AI,
0:10:36 I always say
0:10:37 unless we run
0:10:38 into an upper bound
0:10:39 where the computation
0:10:40 just can’t allow
0:10:41 the intelligence
0:10:42 to keep progressing,
0:10:45 it seems like
0:10:45 it will become
0:10:46 not only
0:10:47 generalized human
0:10:47 intelligence
0:10:48 and thusly be able
0:10:48 to do everything
0:10:49 that we can do,
0:10:50 it will become
0:10:52 embodied as robotics
0:10:53 and if,
0:10:54 I ran the math
0:10:54 on this once,
0:10:56 Einstein is roughly
0:10:58 2.4 times smarter
0:10:58 than someone
0:11:00 who is definitionally
0:11:00 a moron
0:11:02 and the gap
0:11:03 just between
0:11:04 those two
0:11:06 is so dramatic
0:11:07 the army won’t
0:11:08 even draft
0:11:09 somebody
0:11:10 that is a moron
0:11:11 at, you know,
0:11:11 whatever,
0:11:12 81 IQ
0:11:13 or whatever it is.
0:11:14 because it’s,
0:11:14 they create
0:11:15 more problems
0:11:16 than they solve
0:11:17 even just by being,
0:11:18 you know,
0:11:19 bullet fodder.
0:11:21 So do you think
0:11:22 there is something
0:11:23 that’s going to cause
0:11:25 that upper bound
0:11:27 or you have a belief
0:11:29 about the nature
0:11:30 of intelligence
0:11:31 that will keep
0:11:33 AI subservient
0:11:34 to us?
0:11:35 The smartest people
0:11:36 don’t rule the world,
0:11:37 you know,
0:11:38 Einstein wasn’t in charge
0:11:42 and, you know,
0:11:42 many of us
0:11:43 are like ruled
0:11:45 by our cats
0:11:47 and so,
0:11:48 like,
0:11:49 power and intelligence
0:11:50 don’t necessarily
0:11:51 go together,
0:11:51 particularly when
0:11:52 the intelligence
0:11:53 has no free will
0:11:54 or has no
0:11:55 desire to do
0:11:56 free will.
0:11:56 It doesn’t have will.
0:11:58 You know,
0:11:59 it is kind of a model
0:12:00 that’s computing things.
0:12:02 I think also,
0:12:02 you know,
0:12:03 the whole general
0:12:04 intelligence thing
0:12:06 is interesting
0:12:07 in that,
0:12:10 Waymo’s got a super
0:12:10 smart AI
0:12:11 that can drive a car
0:12:14 but that AI
0:12:14 that drives a car
0:12:16 doesn’t know English
0:12:18 and, you know,
0:12:19 isn’t, you know,
0:12:20 particularly good
0:12:21 at other tasks,
0:12:22 you know,
0:12:23 currently
0:12:24 and then the
0:12:26 ChatGPT
0:12:27 can’t drive a car.
0:12:30 And so that’s,
0:12:31 you know,
0:12:33 how much things
0:12:35 generalize particularly
0:12:36 and if you look
0:12:36 at, well,
0:12:37 why is that?
0:12:38 A lot of it
0:12:39 actually has to do
0:12:40 with the long tail
0:12:41 of human behavior
0:12:43 where humans,
0:12:45 you know,
0:12:46 the distribution
0:12:47 of human behavior
0:12:47 is,
0:12:49 it’s fractal,
0:12:50 it’s Mandelbrotian
0:12:51 or whatever,
0:12:53 it’s not evenly
0:12:54 distributed at all
0:12:56 and so,
0:12:58 you know,
0:12:58 an AI
0:13:00 that kind of
0:13:01 captures all that
0:13:02 turns out to be,
0:13:04 you know,
0:13:05 we’re not so much
0:13:06 on the track,
0:13:07 we’re more on the track
0:13:08 for kind of
0:13:10 really great reasoning
0:13:12 over kind of
0:13:12 a set of axioms
0:13:14 that we came up with,
0:13:15 you know,
0:13:15 in say math
0:13:16 or physics
0:13:18 but not so much
0:13:21 you know,
0:13:21 kind of general
0:13:23 human intelligence
0:13:24 which is,
0:13:25 you know,
0:13:26 being able to
0:13:28 navigate other humans
0:13:30 and the world
0:13:30 in a way
0:13:32 that is productive
0:13:33 for us
0:13:33 is kind of a,
0:13:34 it’s a little bit
0:13:35 of a different
0:13:36 dimension of things.
0:13:37 You know,
0:13:37 yeah,
0:13:38 you can compare
0:13:40 the math capabilities
0:13:41 or the go
0:13:43 playing capabilities
0:13:43 or the driving
0:13:44 capabilities
0:13:46 or the IQ test
0:13:47 capabilities
0:13:48 of a computer
0:13:49 but that,
0:13:51 that’s not really
0:13:52 a human.
0:13:53 I think a human
0:13:54 is kind of different
0:13:55 in a fairly
0:13:56 fundamental way.
0:13:58 So what we end up
0:13:58 doing,
0:13:59 I think is going
0:14:00 to be different
0:14:00 than what we’re
0:14:01 doing today
0:14:02 just like what
0:14:02 we’re doing today
0:14:03 is very different
0:14:04 than what we did
0:14:04 a hundred years ago.
0:14:06 But,
0:14:07 you know,
0:14:08 the not having
0:14:09 a need for us,
0:14:10 I think that,
0:14:10 you know,
0:14:12 these AIs
0:14:13 are tools for us
0:14:15 to basically
0:14:16 navigate the world
0:14:17 and help us
0:14:18 solve problems
0:14:19 and do things
0:14:20 like,
0:14:21 you know,
0:14:22 everything from
0:14:24 prevent pandemics
0:14:25 to deal with
0:14:27 climate change
0:14:28 to that sort of thing,
0:14:29 to not kill each other
0:14:30 driving cars,
0:14:32 which we do a lot of.
0:14:34 You know,
0:14:35 hopefully it doesn’t,
0:14:36 you know,
0:14:38 create more wars,
0:14:38 hopefully it creates
0:14:39 less wars,
0:14:40 but we’ll see.
0:14:41 What I know
0:14:43 about the human brain
0:14:44 maybe tricking me
0:14:46 into painting a vision
0:14:46 of the future
0:14:47 that isn’t going
0:14:47 to come true,
0:14:49 let me put words
0:14:49 in your mouth
0:14:50 and you tell me
0:14:52 if they fit appropriately.
0:14:53 What I hear you saying
0:14:54 is something akin
0:14:55 to the way
0:14:56 that we’re approaching
0:14:57 artificial intelligence
0:14:57 right now,
0:14:58 let’s round it
0:15:00 to large language models,
0:15:02 that is going
0:15:04 to hit an upper bound
0:15:05 where it’s not able
0:15:07 to have insights
0:15:07 that a human
0:15:08 won’t already have,
0:15:10 that they are trapped
0:15:11 inside of the box
0:15:11 that we have created,
0:15:12 what you’re calling
0:15:13 the axioms
0:15:14 by which we navigate
0:15:14 the world.
0:15:15 They get trapped
0:15:16 inside that box
0:15:18 and thusly will never
0:15:19 be able to look
0:15:19 at the world
0:15:20 and go,
0:15:21 I’m not going
0:15:21 to predict
0:15:22 the next frame,
0:15:22 I’m going to render
0:15:23 the next frame
0:15:24 based on what I know
0:15:25 about physics.
0:15:27 And so water reacts
0:15:28 this way
0:15:29 in an earthbound
0:15:29 gravity system
0:15:30 and so it’s going
0:15:31 to splash like this
0:15:32 and it understands
0:15:32 liquid dynamics,
0:15:33 et cetera,
0:15:33 et cetera.
0:15:36 So is that accurate?
0:15:37 Are you saying
0:15:37 that it is trapped
0:15:38 inside of our box
0:15:39 and will never have?
0:15:40 It hasn’t demonstrated
0:15:41 that capability yet
0:15:43 so it hasn’t
0:15:44 walked up to a rock
0:15:45 and said this is a rock,
0:15:46 right?
0:15:47 We labeled it a rock
0:15:49 because that’s our structure.
0:15:50 But a rock
0:15:51 isn’t probably
0:15:54 a more intelligent being
0:15:55 would have called it
0:15:55 something else
0:15:57 or maybe the rock
0:15:58 is irrelevant
0:16:00 to how you actually
0:16:01 can navigate
0:16:02 the world safely.
0:16:05 And kind of figuring
0:16:06 those things out
0:16:07 or kind of adapting
0:16:08 to them
0:16:08 is just not something
0:16:11 that, you know,
0:16:12 it’s trained on our
0:16:14 rendition of the universe
0:16:16 in our kind of
0:16:19 literally like the way
0:16:20 we have described it
0:16:22 using language
0:16:22 that we invented.
0:16:26 And so it is
0:16:29 constrained a bit
0:16:30 to that in nature
0:16:31 currently.
0:16:33 You know,
0:16:34 that doesn’t mean
0:16:34 it’s not like
0:16:36 a massively useful tool
0:16:37 and can do things
0:16:38 and by the way
0:16:39 can derive new rules
0:16:40 from the old rules
0:16:41 that we’ve given it
0:16:42 for sure.
0:16:46 But, you know,
0:16:46 we’ll,
0:16:49 like I think
0:16:49 it’s a bit of a jump
0:16:50 to go,
0:16:51 you know,
0:16:52 it’s going to replace
0:16:53 us entirely
0:16:56 when the whole
0:16:57 discovery process
0:16:58 is something that we do
0:16:59 that it doesn’t do yet.
0:17:01 Okay.
0:17:02 The way that the human
0:17:03 mind is architected
0:17:04 is you have
0:17:05 competing regions
0:17:06 of the brain
0:17:06 like if you cut
0:17:07 the corpus callosum
0:17:08 the part that connects
0:17:09 the left and the right
0:17:09 hemisphere
0:17:10 you can get
0:17:12 two distinct personalities
0:17:13 one that is atheist
0:17:14 for instance
0:17:15 and one that
0:17:17 believes deeply in God
0:17:17 and they’ll argue
0:17:18 back and forth.
0:17:18 I mean,
0:17:19 this is in the same
0:17:20 human brain.
0:17:22 So that tells me
0:17:23 that what you have
0:17:24 is basically
0:17:25 regions of the brain
0:17:26 that get good
0:17:26 at a thing
0:17:28 and then they end up
0:17:28 coming together
0:17:29 to collaborate
0:17:30 and that is
0:17:31 sort of human intelligence
0:17:31 and I’ve heard you
0:17:32 talk about
0:17:32 there’s something
0:17:34 like 200 computers
0:17:36 inside of a single car
0:17:38 so if we already know
0:17:38 that you can
0:17:39 daisy chain
0:17:40 all of these
0:17:40 like it’s
0:17:41 it’s a very deep
0:17:42 knowledge about one thing
0:17:44 but as you daisy chain
0:17:45 then the intelligence
0:17:47 gets what I’ll call
0:17:48 more generalized
0:17:49 you don’t see that
0:17:50 as a flywheel
0:17:51 that is going to
0:17:52 keep going.
0:17:53 You know,
0:17:54 what we can compute
0:17:55 will get
0:17:57 better and better
0:17:57 and better
0:18:00 but having said that
0:18:00 you know,
0:18:01 that doesn’t
0:18:02 say that
0:18:05 like humans
0:18:06 one,
0:18:07 you know,
0:18:08 humans
0:18:09 built the machines
0:18:10 plug them in
0:18:11 give them the batteries
0:18:12 all these kinds
0:18:13 of things
0:18:15 and
0:18:17 you know,
0:18:18 and they’ve been
0:18:19 created to
0:18:20 fulfill our purposes
0:18:21 so
0:18:22 you know,
0:18:23 what it means
0:18:23 to be a human
0:18:24 will probably
0:18:25 will change
0:18:26 like it has been
0:18:26 changing
0:18:28 and kind of
0:18:29 how humans
0:18:29 live their life
0:18:30 will change
0:18:31 but humans
0:18:32 still find things
0:18:32 to do
0:18:32 I mean,
0:18:33 like it’s kind
0:18:33 of like
0:18:34 you know,
0:18:35 like a cheetah
0:18:35 has been able
0:18:36 to run faster
0:18:36 than a human
0:18:37 forever
0:18:38 but we
0:18:38 never watch
0:18:39 cheetahs race
0:18:40 we only watch
0:18:40 humans race
0:18:41 each other
0:18:43 you know,
0:18:44 computers have
0:18:44 played chess
0:18:44 better than
0:18:45 humans for a
0:18:45 long time
0:18:46 but nobody
0:18:47 watches computers
0:18:47 play chess
0:18:48 anymore
0:18:48 they watch
0:18:49 humans play
0:18:49 humans
0:18:49 and chess
0:18:50 is more
0:18:50 popular than
0:18:51 it’s ever
0:18:51 been
0:18:52 and so I
0:18:53 think we
0:18:53 have like
0:18:53 a keen
0:18:54 interest
0:18:54 in each
0:18:54 other
0:18:55 and
0:18:57 how that’s
0:18:57 going to
0:18:57 work
0:18:57 and these
0:18:58 will be
0:18:58 kind of
0:18:58 tools
0:18:59 to enhance
0:19:00 that whole
0:19:00 experience
0:19:01 for us
0:19:01 but I
0:19:01 think
0:19:02 it’s
0:19:03 you know
0:19:06 like a
0:19:07 world of
0:19:07 just
0:19:08 machines
0:19:09 seems
0:19:10 like
0:19:10 that
0:19:10 seems
0:19:11 like
0:19:11 really
0:19:11 unlikely
0:19:12 so
0:19:13 you’ve
0:19:13 got
0:19:13 people
0:19:14 like
0:19:15 Elon Musk
0:19:16 Sam
0:19:16 Altman
0:19:17 who have
0:19:17 both
0:19:18 expressed
0:19:18 deep
0:19:19 concerns
0:19:19 about
0:19:21 how
0:19:22 AI
0:19:24 may
0:19:24 in fact
0:19:25 make us
0:19:25 obsolete
0:19:25 Elon
0:19:26 has likened
0:19:27 he’s
0:19:27 certainly
0:19:27 become
0:19:28 fatalistic
0:19:29 but he
0:19:30 gave a
0:19:31 rant
0:19:31 that I
0:19:31 absolutely
0:19:32 love
0:19:32 that is
0:19:33 AI
0:19:33 is a
0:19:33 demon
0:19:34 summoning
0:19:34 circle
0:19:35 and
0:19:35 you’re
0:19:35 calling
0:19:36 forward
0:19:36 this demon
0:19:36 that you
0:19:37 were just
0:19:37 convinced
0:19:38 you’re going
0:19:38 to be
0:19:38 able to
0:19:38 control
0:19:39 and he
0:19:39 certainly
0:19:40 is not
0:19:40 so sure
0:19:40 and at
0:19:41 one point
0:19:41 and again
0:19:42 I’m fully
0:19:42 aware
0:19:43 that he’s
0:19:43 on his
0:19:44 fatalist
0:19:44 arc
0:19:44 and he’s
0:19:45 just
0:19:45 moving
0:19:45 forward
0:19:45 and he’s
0:19:46 building
0:19:46 as fast
0:19:46 as he
0:19:46 can
0:19:48 it’s
0:19:48 interesting
0:19:49 that both
0:19:50 of them
0:19:50 despite
0:19:51 saying
0:19:51 these
0:19:52 things
0:19:52 are
0:19:52 building
0:19:53 AI
0:19:53 as
0:19:54 fast
0:19:54 as
0:19:54 like
0:19:55 they’re
0:19:55 literally
0:19:56 in a race
0:19:56 with each
0:19:56 other
0:19:57 to see
0:19:57 who can
0:19:57 build it
0:19:58 faster
0:20:01 that they’re
0:20:01 warning
0:20:02 about
0:20:03 what do
0:20:03 you take
0:20:04 away
0:20:04 from that
0:20:04 is it
0:20:04 just
0:20:05 regulatory
0:20:05 capture
0:20:06 on both
0:20:06 of their
0:20:07 parts
0:20:07 is it
0:20:08 is Elon
0:20:09 being
0:20:09 sincere
0:20:10 not that I
0:20:10 need you
0:20:10 to mind
0:20:10 read him
0:20:11 but like
0:20:11 what do
0:20:12 you take
0:20:12 away
0:20:13 from
0:20:14 the fact
0:20:14 that
0:20:14 they’ve
0:20:15 both
0:20:15 warned
0:20:15 against
0:20:16 it
0:20:16 and
0:20:16 they’re
0:20:16 both
0:20:17 deploying
0:20:18 it
0:20:18 as
0:20:18 fast
0:20:18 as
0:20:18 they
0:20:18 can
0:20:19 yeah
0:20:20 it seems
0:20:20 fairly
0:20:21 contradictory
0:20:23 look I
0:20:24 think there
0:20:24 is
0:20:25 like I
0:20:26 won’t
0:20:26 question
0:20:27 either
0:20:27 their
0:20:28 sincerity
0:20:28 at some
0:20:28 degree
0:20:29 but I
0:20:29 do
0:20:29 think
0:20:29 there
0:20:30 are
0:20:32 many
0:20:32 reasons
0:20:33 to warn
0:20:33 about
0:20:33 it
0:20:33 but like
0:20:34 I also
0:20:34 think
0:20:35 that
0:20:36 you know
0:20:37 any
0:20:37 kind
0:20:38 of new
0:20:38 super
0:20:38 powerful
0:20:39 technology
0:20:40 you know
0:20:41 in a way
0:20:41 they’re right
0:20:42 to kind
0:20:42 of warn
0:20:42 about
0:20:42 like
0:20:43 okay
0:20:44 this
0:20:45 thing
0:20:46 if we
0:20:48 you know
0:20:48 if we
0:20:48 don’t
0:20:49 think
0:20:49 about
0:20:49 some
0:20:49 of
0:20:49 the
0:20:50 implications
0:20:50 of
0:20:50 it
0:20:51 could
0:20:51 get
0:20:51 dangerous
0:20:52 and I
0:20:52 think
0:20:52 that’s
0:20:52 a good
0:20:53 thing
0:20:53 like
0:20:53 every
0:20:53 technology
0:20:54 we’ve
0:20:54 ever
0:20:54 had
0:20:54 from
0:20:55 fire
0:20:55 to
0:20:57 you know
0:20:58 from
0:20:58 fire
0:20:59 to
0:20:59 automobiles
0:20:59 to
0:21:00 nuclear
0:21:00 to
0:21:01 AI
0:21:02 has
0:21:02 got
0:21:03 the
0:21:03 internet
0:21:05 has
0:21:05 got
0:21:05 downsides
0:21:06 to
0:21:06 it
0:21:06 they
0:21:06 all
0:21:06 have
0:21:07 downsides
0:21:08 and
0:21:08 the
0:21:09 more
0:21:09 powerful
0:21:09 the
0:21:10 more
0:21:10 kind
0:21:10 of
0:21:12 you know
0:21:12 kind
0:21:12 of
0:21:12 intriguing
0:21:13 the
0:21:13 downside
0:21:13 and
0:21:13 you know
0:21:14 maybe
0:21:14 like
0:21:15 you know
0:21:15 without
0:21:16 the
0:21:16 internet
0:21:16 we probably
0:21:17 would have
0:21:17 never gotten
0:21:18 to AI
0:21:20 and so
0:21:22 maybe that was the downside of the internet
0:21:24 that it led to AI or something like that you could argue
0:21:26 but I think
0:21:27 generally
0:21:33 we would take every technology we’ve invented and keep it because you know net net they’ve
0:21:38 been positive for humanity and for the world, and that’s generally
0:21:40 why I think they’re building it so fast, because I think they know that
0:21:47 all right so anybody with a 17 year old right now is thinking oh my how where do I point
0:21:53 my kid what do I tell them to go study that’s future proof what can we learn about the way you guys are investing at
0:22:01 Andreessen Horowitz that would give somebody an inclination of what you think a 17 year old should be focused on now
0:22:10 yeah you know it’s really interesting I think one of the things we’re seeing in the kind of smartest young people that come out is
0:22:18 they spend a lot of time with AI learning everything they possibly can so I think you want to get very good at like high
0:22:19 curiosity
0:22:21 and
0:22:23 and then learning you know you have
0:22:26 available to you all of human knowledge
0:22:28 in something that will talk to you
0:22:33 and that’s you know that’s an incredible opportunity and I think that
0:22:36 anything you want to do in the world to make the world better
0:22:40 you now have the tools as an individual to do that in a way that
0:22:44 you know if you look at kind of what Thomas Edison had to do
0:22:47 in creating GE and like what that took and
0:22:51 and so forth you know it was a way higher bar to have an impact
0:22:53 whereas now I think
0:22:55 you know you can
0:22:57 very quickly
0:23:02 you know build something or do something that you know just pick a problem
0:23:07 it’s not like you know sometimes in this AI conversation the thing that we ignore is like
0:23:10 well what are the problems that we have in the world
0:23:10 well they
0:23:13 well we still have
0:23:14 cancer and diabetes
0:23:16 and sickle cell
0:23:17 and
0:23:18 and every disease
0:23:19 and we still have
0:23:20 the threat of pandemics
0:23:21 and we still have
0:23:22 climate change
0:23:23 and we still have
0:23:25 you know lots of people who are
0:23:26 starving to death
0:23:27 and we still have malaria
0:23:27 and
0:23:30 and so like pick a problem you want to solve
0:23:32 and now
0:23:34 you know
0:23:36 you have a huge helping hand in doing that
0:23:38 that nobody in the history of
0:23:40 the planet has ever had before
0:23:40 so
0:23:42 I think there’s
0:23:44 really great opportunities along those lines
0:23:47 so that
0:23:47 that would be my
0:23:48 you know
0:23:50 my
0:23:51 best advice
0:23:53 I think is to
0:23:55 to get really good with that
0:23:55 and
0:23:58 and look I think a lot of the things that we’ve learned
0:24:01 or that have been valuable skills traditionally
0:24:02 are going to change
0:24:03 so you really
0:24:04 you know again
0:24:06 want to be able to learn how to do anything
0:24:08 and
0:24:09 and I think that’s probably
0:24:11 going to be key
0:24:13 when I look at
0:24:13 the
0:24:14 things you were just talking about
0:24:15 that feels right
0:24:17 for people that have
0:24:17 the
0:24:18 the inclination
0:24:19 that have the
0:24:20 cognitive horsepower
0:24:22 to go and say
0:24:23 okay I’m going to leverage AI
0:24:25 to extend my capabilities
0:24:26 to tackle the biggest problems in the world
0:24:28 certainly right now in this moment
0:24:28 that
0:24:29 that is the
0:24:30 thrilling reality
0:24:31 that people should focus on
0:24:32 but then
0:24:34 I contrast that with
0:24:36 the deaths of despair
0:24:38 among
0:24:39 largely
0:24:40 young men
0:24:43 we have this problem
0:24:43 in
0:24:44 call it
0:24:45 middle America
0:24:46 where
0:24:47 manufacturing jobs
0:24:48 have gone away
0:24:48 so for that
0:24:49 normal
0:24:50 just sort of everyday person
0:24:51 I want to have a trade
0:24:52 I want to go out into the world
0:24:53 and get something done
0:24:55 is AI going to be
0:24:56 useful to them
0:24:58 or are they going to
0:24:59 get replaced by
0:24:59 robotics
0:25:01 the truth of it is
0:25:02 is there’s only one
0:25:03 robot supply chain in the world
0:25:04 and that’s in China
0:25:05 and
0:25:07 you know
0:25:07 so
0:25:07 like
0:25:08 we all need a robot
0:25:09 supply chain
0:25:11 we need to
0:25:11 manufacture that
0:25:13 so I think there’s going to be
0:25:13 like a real
0:25:16 manufacturing opportunity
0:25:18 coming up
0:25:19 and it’ll be
0:25:20 a different kind of
0:25:20 manufacturing
0:25:22 certainly more will be
0:25:23 automated and so forth
0:25:24 but there will be a lot of
0:25:26 things to learn
0:25:26 in that field
0:25:28 that I think will be
0:25:29 you know
0:25:30 super interesting
0:25:31 and
0:25:33 and you know
0:25:34 likely very very good job
0:25:35 so I
0:25:36 ironically
0:25:37 I would say
0:25:38 like going into
0:25:39 manufacturing now
0:25:40 as a young man
0:25:41 and trying to
0:25:42 you know
0:25:43 kind of figure out
0:25:44 what that is
0:25:45 and get engaged in it
0:25:46 will probably lead to
0:25:47 you know
0:25:48 quite a
0:25:49 good career
0:25:50 you know
0:25:50 maybe in
0:25:52 creating
0:25:53 factories
0:25:53 which have become
0:25:55 like insanely valuable
0:25:56 and kind of
0:25:57 strategic
0:25:58 to the national
0:25:59 interests as well
0:26:00 that makes sense
0:26:01 so again
0:26:02 at the level of
0:26:03 the guy smart enough
0:26:04 to build the facility
0:26:04 yes
0:26:05 and I recently saw
0:26:06 a video of the
0:26:07 grocery store
0:26:08 of the future
0:26:08 where it is
0:26:10 a huge grid
0:26:11 inside of a
0:26:12 giant facility
0:26:13 and there’s just
0:26:14 like these
0:26:15 bots that
0:26:15 look kind of like
0:26:16 small shopping carts
0:26:17 and they’re just
0:26:18 grid patterning
0:26:19 across all the items
0:26:20 snatching up
0:26:20 whatever you order
0:26:22 so you order online
0:26:22 these things
0:26:23 grab all that stuff
0:26:24 and then they send
0:26:25 it off to you
0:26:26 so for the person
0:26:27 that’s savvy enough
0:26:28 to build that facility
0:26:29 yes
0:26:30 tremendous
0:26:31 but
0:26:33 what I think
0:26:33 I hear you saying
0:26:34 and correct me
0:26:34 if I’m wrong
0:26:35 is that
0:26:36 okay there are
0:26:37 two opportunities here
0:26:37 the opportunity
0:26:38 one is
0:26:39 if you’re the kind
0:26:39 of person
0:26:40 that can leverage
0:26:40 AI to build
0:26:41 that facility
0:26:42 massive opportunity
0:26:44 if you’re the kind
0:26:44 of person
0:26:44 that would traditionally
0:26:45 work at that
0:26:45 factory
0:26:46 something new
0:26:47 is coming
0:26:47 we know that
0:26:48 because looking
0:26:48 back at history
0:26:49 all these technologies
0:26:50 unleash things
0:26:51 we can’t yet see
0:26:52 and so I have faith
0:26:53 in the
0:26:54 we can’t yet see it
0:26:54 but it is coming
0:26:55 yeah
0:26:56 no for sure
0:26:56 look I mean
0:26:58 you know what
0:26:58 like the biggest
0:27:00 in-demand job
0:27:00 in the world
0:27:01 is right now
0:27:02 data labelers
0:27:04 and
0:27:05 like data labeling
0:27:06 wasn’t a job
0:27:08 not long ago
0:27:09 but if you talk
0:27:10 I haven’t even heard of this
0:27:11 what is data labeling
0:27:12 yeah
0:27:13 so it’s what
0:27:13 Alex said
0:27:14 it’s what Scale
0:27:15 AI does
0:27:16 you know
0:27:16 they pay
0:27:18 armies and armies
0:27:19 and armies of people
0:27:20 to label data
0:27:21 so say hey
0:27:23 this is a plant
0:27:24 or this is
0:27:24 you know
0:27:24 a fig
0:27:25 or whatever it is
0:27:26 for the AI
0:27:28 to then understand it
0:27:29 and then you know
0:27:30 now with the
0:27:32 you know
0:27:33 with the kind of
0:27:34 reinforcement learning
0:27:35 coming back
0:27:36 into play
0:27:37 you know
0:27:38 labeling
0:27:38 you know
0:27:39 that kind of
0:27:40 supervised learning
0:27:41 is still like
0:27:42 very very very important
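To make “labeling” concrete, here is a toy sketch of the records human labelers produce and of a model learning from them; the sentences, labels, and use of scikit-learn are illustrative assumptions, not a description of Scale AI’s actual pipeline.

```python
# Toy sketch of supervised learning from human-labeled data.
# The examples, labels, and classifier choice are made up for illustration.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Records like the ones labelers produce: a raw example plus a human-assigned label.
texts = [
    "a leafy green plant in a clay pot",
    "a ripe purple fig on a branch",
    "a leafy houseplant by the window",
    "sliced figs on a wooden board",
]
labels = ["plant", "fig", "plant", "fig"]

# The model only "knows" plant vs. fig because humans supplied those labels.
classifier = make_pipeline(CountVectorizer(), MultinomialNB())
classifier.fit(texts, labels)

print(classifier.predict(["a bowl of fresh figs"]))  # -> ['fig']
```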
0:27:44 and I think that
0:27:46 you know
0:27:47 right now
0:27:48 like he’s got
0:27:50 unlimited hiring demand
0:27:51 which is
0:27:51 you know
0:27:52 ironic
0:27:53 for Scale AI
0:27:54 to have
0:27:55 unlimited need
0:27:56 for humans
0:27:57 and I think
0:27:58 you know
0:27:59 in manufacturing
0:28:00 there are going to be
0:28:01 jobs like that
0:28:02 and there will be
0:28:02 the kind of physical
0:28:05 well, when you go
0:28:07 into these robot
0:28:08 like the software
0:28:09 companies that are
0:28:10 doing robotics
0:28:12 they have people
0:28:14 managing the robots
0:28:15 right
0:28:15 like they’re training
0:28:16 the robots
0:28:17 humans train robots
0:28:19 to do all kinds
0:28:20 of things
0:28:21 and it turns out
0:28:21 that like
0:28:23 folding clothes
0:28:25 doesn’t necessarily
0:28:27 generalize
0:28:28 to making eggs
0:28:30 they’re like
0:28:31 super different
0:28:32 for robots
0:28:34 and so you need
0:28:35 you know
0:28:36 these robots
0:28:36 trained in all
0:28:37 these kinds
0:28:37 of fields
0:28:38 and so forth
0:28:38 so I think
0:28:38 there’s
0:28:39 you know
0:28:40 there’s a whole
0:28:40 new class
0:28:41 of jobs
0:28:42 that are
0:28:42 a little bit
0:28:43 hard to
0:28:44 anticipate
0:28:46 you know
0:28:47 in advance
0:28:48 but I think
0:28:49 at least
0:28:49 for the next
0:28:50 10 years
0:28:50 I think
0:28:51 the number
0:28:52 of new
0:28:52 jobs
0:28:53 related
0:28:54 to making
0:28:54 these machines
0:28:55 smarter
0:28:56 is going
0:28:57 to increase
0:28:57 a lot
0:28:59 and then
0:29:00 after that
0:29:00 you know
0:29:00 like
0:29:02 I think
0:29:02 there will
0:29:03 be
0:29:04 there just
0:29:05 tend to be
0:29:05 like
0:29:06 throughout
0:29:07 history
0:29:08 so many
0:29:09 needs
0:29:09 for new
0:29:09 things
0:29:10 that we
0:29:10 never
0:29:11 anticipated
0:29:11 like well
0:29:12 I mean
0:29:12 you know
0:29:12 one of my
0:29:13 favorite
0:29:13 examples
0:29:13 is
0:29:14 okay
0:29:15 computers
0:29:17 are going
0:29:18 to kill
0:29:20 the typesetting
0:29:20 business
0:29:21 and they did
0:29:22 everybody knew
0:29:22 that
0:29:23 like that
0:29:23 was coming
0:29:24 nobody
0:29:26 nobody
0:29:26 said oh
0:29:26 and then
0:29:27 there’s
0:29:27 going to be
0:29:28 5 million
0:29:28 graphic design
0:29:29 jobs
0:29:29 that come
0:29:30 out of
0:29:30 the PC
0:29:31 like
0:29:32 nobody
0:29:33 not a
0:29:33 person
0:29:34 predicted
0:29:34 that
0:29:35 so
0:29:35 it’s
0:29:35 really
0:29:36 easy
0:29:36 to figure
0:29:36 out
0:29:36 which
0:29:37 jobs
0:29:37 are
0:29:37 going
0:29:37 to go
0:29:37 away
0:29:38 it’s
0:29:38 much
0:29:39 more
0:29:39 difficult
0:29:39 to
0:29:39 kind
0:29:40 of
0:29:40 figure
0:29:40 out
0:29:40 which
0:29:41 jobs
0:29:41 are
0:29:41 going
0:29:41 to
0:29:41 come
0:29:42 but
0:29:43 like
0:29:43 if
0:29:43 you
0:29:43 look
0:29:43 at
0:29:43 the
0:29:44 history
0:29:44 of
0:29:44 automation
0:29:46 which
0:29:46 has
0:29:46 kind
0:29:47 of
0:29:47 automated
0:29:47 away
0:29:48 everything
0:29:48 we
0:29:48 did
0:29:48 100
0:29:49 years
0:29:49 ago
0:29:50 there’s
0:29:54 and so
0:29:54 you go
0:29:56 okay
0:29:57 and then
0:29:57 you know
0:29:57 like
0:29:58 some of
0:29:58 the
0:29:58 employment
0:29:59 will be
0:30:00 much
0:30:00 more
0:30:01 I think
0:30:01 enjoyable
0:30:02 than the
0:30:02 old
0:30:02 employment
0:30:03 as well
0:30:04 as it
0:30:04 has been
0:30:05 you know
0:30:05 over time
0:30:06 and you
0:30:07 always talk
0:30:07 about
0:30:07 manufacturing
0:30:08 jobs
0:30:08 going
0:30:08 away
0:30:08 but
0:30:09 the
0:30:09 manufacturing
0:30:09 jobs
0:30:10 that
0:30:10 have
0:30:10 gone
0:30:10 away
0:30:10 have
0:30:11 been
0:30:11 the
0:30:11 most
0:30:11 mind
0:30:12 numbing
0:30:12 so
0:30:13 I
0:30:13 think
0:30:13 things
0:30:14 evolve
0:30:14 in
0:30:14 very
0:30:15 very
0:30:15 unpredictable
0:30:16 ways
0:30:17 and
0:30:18 you know
0:30:18 like
0:30:18 I
0:30:18 think
0:30:19 the
0:30:19 hope
0:30:19 is
0:30:19 that
0:30:20 you
0:30:20 know
0:30:20 the
0:30:20 world
0:30:21 just
0:30:21 gets
0:30:22 much
0:30:22 better
0:30:23 but
0:30:23 I’m
0:30:23 not
0:30:24 I’m
0:30:24 not
0:30:25 so
0:30:26 worried
0:30:26 about
0:30:27 kind
0:30:27 of
0:30:28 anticipating
0:30:28 all the
0:30:28 horror
0:30:29 that’s
0:30:29 going to
0:30:29 come
0:30:29 I mean
0:30:29 I think
0:30:30 the main
0:30:30 reason
0:30:30 we’re
0:30:31 making
0:30:31 these
0:30:31 things
0:30:31 is
0:30:32 you
0:30:32 know
0:30:33 the
0:30:33 ways
0:30:34 that
0:30:34 they’re
0:30:34 making
0:30:34 life
0:30:35 better
0:30:36 and
0:30:36 you know
0:30:37 just like
0:30:37 we finally
0:30:38 figured out
0:30:38 a way
0:30:39 for
0:30:39 everybody
0:30:40 like
0:30:40 we
0:30:40 already
0:30:41 have
0:30:42 in
0:30:42 our
0:30:43 hands
0:30:44 everybody
0:30:44 can
0:30:44 get
0:30:45 a
0:30:45 great
0:30:45 education
0:30:46 like
0:30:47 that
0:30:47 whole
0:30:49 inequality
0:30:50 of
0:30:50 access
0:30:50 to
0:30:51 education
0:30:51 is
0:30:52 like
0:30:52 literally
0:30:53 gone
0:30:53 right
0:30:53 now
0:30:54 which
0:30:55 is
0:30:56 pretty
0:30:56 amazing
0:30:57 I mean
0:30:57 it’s
0:30:57 certainly
0:30:58 huge
0:30:59 yeah
0:30:59 nothing
0:30:59 that I
0:31:00 ever
0:31:00 thought
0:31:00 I’d
0:31:00 see
0:31:01 so
0:31:02 hopefully
0:31:04 things
0:31:04 go well
0:31:05 the great
0:31:05 irony
0:31:05 it’s
0:31:06 so
0:31:06 crazy
0:31:07 I
0:31:07 don’t
0:31:07 think
0:31:08 anybody
0:31:08 anybody
0:31:09 saw that
0:31:09 coming
0:31:09 it was
0:31:09 always
0:31:10 going
0:31:10 to
0:31:10 be
0:31:10 it’s
0:31:10 going
0:31:10 to
0:31:10 go
0:31:10 for
0:31:10 the
0:31:11 drivers
0:31:11 it’s
0:31:11 going
0:31:11 to
0:31:12 go
0:31:12 for
0:31:12 all
0:31:12 those
0:31:12 hard
0:31:13 difficult
0:31:13 repetitive
0:31:14 tasks
0:31:14 yeah
0:31:14 it’s
0:31:15 been
0:31:15 very
0:31:16 fascinating
0:31:16 to
0:31:16 see
0:31:16 what
0:31:17 actually
0:31:17 is
0:31:17 in
0:31:18 danger
0:31:18 like
0:31:19 super
0:31:19 creative
0:31:20 jobs
0:31:20 very
0:31:21 much
0:31:21 in
0:31:21 danger
0:31:23 but
0:31:23 yeah
0:31:23 as
0:31:23 you
0:31:23 get
0:31:24 down
0:31:24 I
0:31:24 mean
0:31:24 look
0:31:24 I
0:31:25 think
0:31:25 I
0:31:25 believe
0:31:26 way
0:31:26 more
0:31:26 strongly
0:31:27 than
0:31:27 you
0:31:27 do
0:31:27 that
0:31:28 robots
0:31:28 are
0:31:28 just
0:31:28 going
0:31:28 to
0:31:28 get
0:31:29 better
0:31:29 and
0:31:29 better
0:31:29 and
0:31:29 better
0:31:30 and
0:31:30 better
0:31:31 but
0:31:32 that
0:31:32 could
0:31:32 be
0:31:32 that
0:31:33 I’m
0:31:33 not
0:31:33 as
0:31:33 close
0:31:33 to
0:31:33 the
0:31:34 problem
0:31:34 as
0:31:34 you
0:31:34 are
0:31:35 speaking
0:31:35 of
0:31:35 which
0:31:36 how
0:31:36 the
0:31:37 insights
0:31:37 that
0:31:37 you’ve
0:31:38 had
0:31:38 into
0:31:39 AI
0:31:39 how
0:31:39 are
0:31:39 they
0:31:40 informing
0:31:40 the
0:31:41 investments
0:31:41 that
0:31:41 you
0:31:41 guys
0:31:42 make
0:31:42 the
0:31:43 theory
0:31:43 is
0:31:43 there’s
0:31:44 this
0:31:44 one
0:31:44 like
0:31:44 super
0:31:45 intelligent
0:31:45 big
0:31:46 brain
0:31:46 that’s
0:31:46 going to
0:31:46 do
0:31:47 everything
0:31:47 the
0:31:48 reality
0:31:48 on the
0:31:48 ground
0:31:49 is
0:31:49 even
0:31:50 with
0:31:50 the
0:31:50 state
0:31:50 of
0:31:50 the
0:31:50 art
0:31:51 models
0:31:51 they’re
0:31:51 all
0:31:52 kind
0:31:52 of
0:31:52 good
0:31:53 at
0:31:53 slightly
0:31:53 different
0:31:54 things
0:31:54 right
0:31:54 like
0:31:54 you know
0:31:55 Anthropic
0:31:55 is
0:31:55 like
0:31:55 really
0:31:56 good
0:31:56 at
0:31:56 code
0:31:57 and
0:31:58 Grok
0:31:58 is
0:31:58 really
0:31:59 good
0:31:59 at
0:31:59 like
0:31:59 real
0:31:59 time
0:32:00 data
0:32:00 because
0:32:00 they’ve
0:32:00 got
0:32:00 the
0:32:01 Twitter
0:32:01 stuff
0:32:01 and
0:32:02 then
0:32:02 you
0:32:02 know
0:32:03 OpenAI
0:32:03 has
0:32:04 gotten
0:32:04 like
0:32:04 very
0:32:04 very
0:32:05 good
0:32:05 at
0:32:05 reasoning
0:32:07 so
0:32:09 with
0:32:09 all
0:32:09 of
0:32:10 them
0:32:10 are
0:32:10 doing
0:32:11 AGI
0:32:11 but
0:32:12 then
0:32:12 they’re
0:32:12 all
0:32:12 good
0:32:12 at
0:32:13 different
0:32:13 stuff
0:32:13 which
0:32:14 is
0:32:14 you know
0:32:15 from
0:32:15 an
0:32:15 investing
0:32:16 standpoint
0:32:16 it’s
0:32:16 very
0:32:16 good
0:32:17 to
0:32:17 know
0:32:17 that
0:32:19 because
0:32:20 if
0:32:20 something
0:32:20 like
0:32:20 that’s
0:32:21 not
0:32:21 winner
0:32:21 take
0:32:21 all
0:32:22 that’s
0:32:22 very
0:32:23 that
0:32:23 becomes
0:32:23 like
0:32:24 super
0:32:24 interesting
0:32:25 it
0:32:25 also
0:32:26 is
0:32:26 interesting
0:32:27 for
0:32:28 what
0:32:28 it
0:32:28 means
0:32:28 at
0:32:28 the
0:32:29 application
0:32:29 layer
0:32:29 because
0:32:30 if
0:32:30 the
0:32:32 infrastructure
0:32:32 products
0:32:33 aren’t
0:32:33 winner
0:32:34 take
0:32:34 all
0:32:34 and
0:32:34 then
0:32:35 the
0:32:35 other
0:32:35 thing
0:32:35 about
0:32:35 the
0:32:36 infrastructure
0:32:36 products
0:32:37 that’s
0:32:37 interesting
0:32:37 is
0:32:38 that
0:32:38 they’re
0:32:39 not
0:32:39 particularly
0:32:40 sticky
0:32:41 in the
0:32:42 way
0:32:42 that
0:32:42 kind
0:32:43 of
0:32:44 Microsoft
0:32:44 Windows
0:32:45 was
0:32:45 very
0:32:45 sticky
0:32:46 right
0:32:46 it was
0:32:46 sticky
0:32:47 you build
0:32:47 an application
0:32:48 on Windows
0:32:49 it doesn’t run
0:32:49 on other
0:32:50 stuff
0:32:50 you’ve got
0:32:51 to do a lot
0:32:51 of work
0:32:52 to move
0:32:52 it to
0:32:52 something
0:32:52 else
0:32:53 so you
0:32:53 get this
0:32:53 network
0:32:55 effect
0:32:55 with
0:32:56 developers
0:32:57 then you
0:32:57 go
0:32:58 okay
0:32:58 well
0:32:59 how does
0:32:59 that work
0:32:59 with
0:33:00 state-of-the-art
0:33:00 models
0:33:01 well people
0:33:01 build
0:33:01 applications
0:33:02 on these
0:33:02 things
0:33:03 but
0:33:03 guess
0:33:04 what
0:33:04 like
0:33:05 to move
0:33:05 your
0:33:05 application
0:33:05 to
0:33:06 DeepSeek
0:33:06 you didn’t
0:33:07 have to
0:33:08 they just
0:33:09 literally
0:33:09 took the
0:33:09 OpenAI
0:33:11 Python
0:33:11 API
0:33:12 and like
0:33:12 it runs
0:33:13 on DeepSeek
0:33:13 now
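That lack of stickiness is easy to picture in code: many providers expose an OpenAI-compatible interface, so switching back ends can amount to changing a base URL and a model name. Here is a minimal sketch using the OpenAI Python SDK; the DeepSeek URL and model name are assumptions for illustration, so check the provider’s documentation for the actual values.

```python
# Sketch of swapping model providers behind an OpenAI-compatible API.
# The base_url and model name are assumptions for illustration; consult the
# provider's documentation for the real values.
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_API_KEY",
    base_url="https://api.deepseek.com",  # often the only line (plus the model name) that changes
)

response = client.chat.completions.create(
    model="deepseek-chat",
    messages=[{"role": "user", "content": "Explain in one sentence what a moat is in business."}],
)
print(response.choices[0].message.content)
```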
0:33:16 so
0:33:17 you know
0:33:17 that
0:33:17 that kind
0:33:18 of thing
0:33:19 really impacts
0:33:20 you know
0:33:20 how you
0:33:20 think about
0:33:21 investing
0:33:21 and like
0:33:22 what is
0:33:23 the value
0:33:23 of having
0:33:24 lead in
0:33:24 application
0:33:25 and then
0:33:25 you know
0:33:26 where is
0:33:26 the moat
0:33:26 going to
0:33:27 come from
0:33:28 and of
0:33:28 course
0:33:28 AI is
0:33:29 also
0:33:29 getting
0:33:29 like
0:33:30 the one
0:33:30 thing
0:33:30 it is
0:33:30 getting
0:33:31 amazingly
0:33:32 good at
0:33:32 is
0:33:32 writing
0:33:33 code
0:33:34 and
0:33:34 so
0:33:35 then
0:33:35 you know
0:33:35 how much
0:33:36 of a
0:33:36 lead
0:33:36 do you
0:33:37 have
0:33:38 in
0:33:38 the
0:33:38 code
0:33:39 itself
0:33:39 versus
0:33:41 you know
0:33:42 kind of
0:33:42 the other
0:33:42 traditional
0:33:43 things
0:33:43 you know
0:33:43 when I
0:33:44 started
0:33:44 in the
0:33:44 industry
0:33:45 the
0:33:46 sales
0:33:46 people
0:33:46 were
0:33:46 in
0:33:47 charge
0:33:48 they
0:33:48 were
0:33:48 kind
0:33:49 of
0:33:49 like
0:33:49 the
0:33:49 big
0:33:50 there’s
0:33:50 a
0:33:50 great
0:33:52 TV
0:33:52 show
0:33:52 called
0:33:53 Halt
0:33:53 and Catch
0:33:53 Fire
0:33:54 and
0:33:55 if you
0:33:55 watch it
0:33:56 like the
0:33:56 thing
0:33:57 that’s
0:33:57 really
0:33:58 stunning
0:33:58 if
0:33:58 you’re
0:33:59 you know
0:33:59 kind
0:33:59 of
0:34:00 coming
0:34:00 from
0:34:01 the
0:34:02 2010s
0:34:03 2020s
0:34:04 world
0:34:04 is
0:34:05 why
0:34:05 are
0:34:05 the
0:34:05 sales
0:34:06 people
0:34:06 so
0:34:06 powerful
0:34:08 but
0:34:09 they
0:34:09 were
0:34:09 the
0:34:09 most
0:34:09 powerful
0:34:10 in
0:34:10 those
0:34:10 days
0:34:11 and
0:34:11 it
0:34:11 was
0:34:12 because
0:34:12 you know
0:34:12 distribution
0:34:13 was the
0:34:14 most
0:34:14 difficult
0:34:14 thing
0:34:16 and
0:34:16 I
0:34:17 you know
0:34:17 I think
0:34:18 distribution
0:34:18 is going
0:34:19 to get
0:34:19 very
0:34:19 very
0:34:20 important
0:34:20 again
0:34:23 because
0:34:24 maintaining
0:34:24 a
0:34:25 technological
0:34:25 lead
0:34:25 is
0:34:26 a lot
0:34:26 harder
0:34:27 you know
0:34:28 when the
0:34:28 machine
0:34:29 is writing
0:34:29 the code
0:34:30 and writing
0:34:30 it very
0:34:30 fast
0:34:32 although
0:34:32 it’s not
0:34:33 all the way
0:34:34 where it
0:34:34 can build
0:34:35 like super
0:34:35 complex
0:34:36 systems
0:34:37 but there’s
0:34:37 you know
0:34:38 a bunch
0:34:39 of things
0:34:40 out now
0:34:40 you know
0:34:40 Replit’s
0:34:41 got a
0:34:41 great
0:34:42 product
0:34:42 for it
0:34:43 there’s
0:34:43 a company
0:34:43 called
0:34:44 Lovable
0:34:44 that’s got
0:34:44 one out
0:34:45 in Sweden
0:34:46 that just
0:34:46 builds you
0:34:47 an app
0:34:47 like if
0:34:47 you need
0:34:48 an app
0:34:48 for something
0:34:49 just say
0:34:49 build me
0:34:50 this app
0:34:51 and there
0:34:51 it is
0:34:52 yeah
0:34:53 another thing
0:34:53 in Cursor is
0:34:54 that you
0:34:54 can select
0:34:56 what model
0:34:56 you want
0:34:57 to use
0:34:58 for whatever
0:34:58 thing that
0:34:58 you’re about
0:34:59 to generate
0:34:59 so the
0:35:00 ability to
0:35:00 go oh
0:35:01 I want
0:35:01 10 of
0:35:01 these
0:35:02 things
0:35:02 I’m gonna
0:35:02 use this
0:35:03 one for
0:35:03 this kind
0:35:03 of code
0:35:04 this one
0:35:04 for that
0:35:04 kind
0:35:04 of code
0:35:05 it’s
0:35:06 really
0:35:06 fascinating
0:35:08 now the
0:35:08 Biden
0:35:08 administration
0:35:09 was super
0:35:09 hostile
0:35:10 towards tech
0:35:12 when you
0:35:13 look at
0:35:14 what’s going
0:35:14 on now
0:35:15 with the
0:35:15 changes
0:35:16 in
0:35:16 regulatory
0:35:18 what do
0:35:19 you think
0:35:19 about the
0:35:20 race
0:35:21 between
0:35:21 us
0:35:21 and
0:35:21 China
0:35:22 were we
0:35:22 headed
0:35:23 down a
0:35:23 dark
0:35:23 path
0:35:24 where
0:35:24 if
0:35:25 that
0:35:26 administration
0:35:26 had
0:35:27 stayed
0:35:27 with that
0:35:27 like we’re
0:35:28 going to have
0:35:28 one or two
0:35:29 companies
0:35:29 we’re going to
0:35:30 control them
0:35:30 that’s going
0:35:30 to be that
0:35:32 is it possible
0:35:32 we could have
0:35:33 lost that race
0:35:34 is that race
0:35:35 a figment
0:35:35 of my
0:35:36 imagination
0:35:36 is that
0:35:36 real
0:35:38 I think
0:35:38 there’s
0:35:40 multiple layers
0:35:41 to the
0:35:43 AI race
0:35:43 with China
0:35:44 and then
0:35:45 you know
0:35:45 the Biden
0:35:46 administration
0:35:47 was
0:35:48 kind of
0:35:48 hostile
0:35:49 in many
0:35:50 ways
0:35:50 but all
0:35:51 for kind
0:35:51 of a
0:35:51 central
0:35:52 reason
0:35:52 I think
0:35:54 so
0:35:55 you know
0:35:56 in AI
0:35:56 in particular
0:35:58 you know
0:35:58 when we
0:35:59 met
0:36:01 and I
0:36:01 should be
0:36:02 very specific
0:36:03 we did
0:36:03 meet with
0:36:04 Jake Sullivan
0:36:04 but he was
0:36:05 very good
0:36:05 about it
0:36:05 we met
0:36:06 with Gina
0:36:06 Raimondo
0:36:07 she was
0:36:07 very good
0:36:08 about it
0:36:09 but we
0:36:10 met with
0:36:10 the kind
0:36:11 of White
0:36:11 House
0:36:13 and
0:36:13 their
0:36:14 you know
0:36:15 their position
0:36:15 was
0:36:16 super
0:36:18 kind of
0:36:18 I would
0:36:18 say
0:36:19 ill-informed
0:36:20 so
0:36:21 they
0:36:21 basically
0:36:22 were
0:36:23 they
0:36:24 walked in
0:36:24 with this
0:36:25 idea
0:36:26 that like
0:36:26 we’ve got
0:36:27 a three-year
0:36:28 lead on
0:36:28 China
0:36:30 and
0:36:30 we have
0:36:31 to protect
0:36:31 that lead
0:36:33 and
0:36:34 there’s
0:36:34 no
0:36:36 and
0:36:36 therefore
0:36:37 we need
0:36:37 to shut
0:36:38 down
0:36:38 open
0:36:38 source
0:36:39 and
0:36:39 that
0:36:39 doesn’t
0:36:40 matter
0:36:41 to
0:36:41 you
0:36:42 guys
0:36:42 and
0:36:42 startups
0:36:42 because
0:36:43 startups
0:36:44 can’t
0:36:44 participate
0:36:45 in AI
0:36:45 anyway
0:36:47 because
0:36:47 they don’t
0:36:48 have enough
0:36:48 money
0:36:50 and
0:36:50 the only
0:36:51 companies
0:36:51 that are
0:36:52 going to
0:36:52 do
0:36:52 AI
0:36:53 are
0:36:54 going to
0:36:54 be
0:36:54 kind
0:36:54 of
0:36:55 ironically
0:36:56 the
0:36:57 two
0:36:57 startups
0:36:58 Anthropic
0:36:58 and OpenAI
0:36:59 that are
0:36:59 out
0:37:00 and then
0:37:00 the big
0:37:00 companies
0:37:00 Google
0:37:01 and so
0:37:01 forth
0:37:01 and
0:37:02 Microsoft
0:37:03 and so
0:37:03 we can
0:37:04 put a
0:37:05 huge
0:37:05 regulatory
0:37:06 barrier
0:37:06 on them
0:37:06 because
0:37:07 they have
0:37:07 the money
0:37:07 and the
0:37:08 people
0:37:08 to deal
0:37:08 with it
0:37:09 and then
0:37:10 that’ll
0:37:10 be
0:37:11 and you
0:37:11 know
0:37:12 in their
0:37:12 minds
0:37:12 I think
0:37:13 they actually
0:37:13 believe
0:37:14 that
0:37:15 that would
0:37:15 be how
0:37:16 we would
0:37:16 win
0:37:18 but of course
0:37:18 you know
0:37:19 in retrospect
0:37:20 that makes
0:37:20 no sense
0:37:21 and it
0:37:21 kind of
0:37:23 it damages
0:37:23 you know
0:37:23 if you look
0:37:24 at China
0:37:24 and what
0:37:24 China’s
0:37:25 great at
0:37:25 then this
0:37:26 goes to the
0:37:26 next thing
0:37:27 so there’s
0:37:27 how good
0:37:28 is your
0:37:28 AI
0:37:28 and then
0:37:30 how well
0:37:30 is it
0:37:31 integrated
0:37:31 into
0:37:33 your
0:37:33 military
0:37:34 and the way
0:37:34 the government
0:37:34 works
0:37:35 and so
0:37:35 forth
0:37:36 and I
0:37:36 think
0:37:36 that
0:37:37 China
0:37:37 being a
0:37:38 top-down
0:37:39 society
0:37:40 their strength
0:37:40 is
0:37:42 you know
0:37:43 that whatever
0:37:44 AI they have
0:37:44 they’re going
0:37:45 to integrate
0:37:46 it.
0:37:46 Already all the
0:37:46 companies are
0:37:47 highly integrated
0:37:47 into the
0:37:48 government
0:37:49 so you know
0:37:50 they’re going
0:37:50 to be able
0:37:50 to deploy
0:37:51 that and
0:37:52 and we’re
0:37:52 going to
0:37:52 see it
0:37:53 in action
0:37:55 with their
0:37:55 military
0:37:56 very fast
0:37:57 I think
0:37:57 that the
0:37:58 advantage
0:37:58 of the
0:37:59 US
0:37:59 is like
0:38:00 we’re not
0:38:01 a top-down
0:38:01 society
0:38:02 we’re like
0:38:02 a wild
0:38:03 messy
0:38:03 society
0:38:04 but it
0:38:05 means that
0:38:05 all of our
0:38:06 smart people
0:38:07 can participate
0:38:08 in the field
0:38:10 and look
0:38:11 there’s more
0:38:11 to AI
0:38:12 than just
0:38:12 the big
0:38:12 models
0:38:13 as you said
0:38:13 like
0:38:14 you know
0:38:15 how important
0:38:15 is Cursor
0:38:16 it’s really
0:38:17 important
0:38:18 if you’re
0:38:18 building stuff
0:38:19 so like
0:38:20 oh you want
0:38:20 to go
0:38:21 build
0:38:23 you know
0:38:24 the next
0:38:25 whatever
0:38:26 thing that
0:38:27 the CIA
0:38:27 needs
0:38:27 or the
0:38:28 NSA
0:38:28 needs
0:38:28 or this
0:38:29 and that
0:38:30 like
0:38:30 you’re
0:38:30 building
0:38:30 that
0:38:30 with
0:38:31 Cursor
0:38:31 you’re
0:38:31 using
0:38:31 a state
0:38:32 of the
0:38:32 art
0:38:32 model
0:38:33 but like
0:38:33 if you
0:38:33 had
0:38:34 eliminated
0:38:34 if you
0:38:34 know
0:38:35 if
0:38:35 the
0:38:35 Biden
0:38:36 White
0:38:36 House
0:38:36 had
0:38:36 gotten
0:38:36 their
0:38:36 way
0:38:37 they’d
0:38:37 eliminate
0:38:37 things
0:38:38 like
0:38:38 Cursor
0:38:38 they’d
0:38:38 eliminate
0:38:39 startups
0:38:39 being
0:38:40 able
0:38:40 to do
0:38:41 anything
0:38:41 in AI
0:38:43 and so
0:38:44 the advantages
0:38:45 that we have
0:38:45 that we
0:38:45 don’t just
0:38:46 have a model
0:38:47 we’ve got
0:38:47 all this
0:38:47 other stuff
0:38:48 that goes
0:38:48 with it
0:38:50 and then
0:38:50 we have
0:38:51 new ideas
0:38:51 on models
0:38:53 with new
0:38:53 algorithms
0:38:53 and this
0:38:54 and that
0:38:55 and that’s
0:38:55 what the
0:38:55 U.S.
0:38:56 is great
0:38:56 at
0:38:57 And what China is great at: by the way, they're very good at math. Their people are good at math, and AI is math, so their models are good.
0:39:07 They also have a data advantage on us, where they have access to the Chinese internet, and they have access to copyrighted material, which they do not treat with the same deference that we do in the U.S.
0:39:19 And so they're able to get to, you know, if you use DeepSeek, you go, wow, DeepSeek really is a great writer compared to a lot of the U.S. models. Why is that? Well, they train on a bigger data set than we do.
0:39:33 And that's amazing.
0:39:34 So really, I think what we want is to have world-class, first-class AI in the U.S.
0:39:42 And I think of it less as, is it ahead of China, is it slightly ahead of China. What we've seen with our own state-of-the-art models is that the leads are very shallow,
0:39:57 and I think that'll continue as long as we're able and allowed to build AI.
0:40:00 And then economically, what you'd like is to have a vibrant AI ecosystem coming out of the U.S., so other countries who aren't state-of-the-art with this stuff adopt our technology,
0:40:14 and we continue to be strong economically, as opposed to everything going to China.
0:40:19 And that was a big, big risk with the Biden administration, I think.
0:40:24 What they were doing on AI was tough, I would say; what they were doing on fintech and crypto was even tougher, in that they were just trying to get rid of the industry in its entirety.
0:40:38 With AI, I would say they were extremely arrogant about what they thought their ability was to predict the future.
0:40:50 You know, Marc and I were in there, and our job is to predict it; this is our job, to invest in the future, to predict the future.
0:40:57 And they were saying things that were so arrogant that we would never even think to say them, even if we thought them,
0:41:04 because we know that we don't know the future like that. It's just unknowable; there are too many moving parts, and these things are really complicated.
0:41:15 All right, well, speaking of the future, fully accepting that it is very opaque and very difficult to see, what would you say is the most controversial view that you hold about the future?
0:41:26 If we don't get to world class in crypto... AI really has the potential to wreck society.
0:41:37 And what I mean by that is, if you think about what is obviously, clearly going to happen in an AI world: one, we're not going to be able to tell the difference between a human and a robot.
0:41:49 Two, we're not going to know what's real or fake.
0:41:53 Three, the level of security attacks on big central data repositories is going to get so good that everybody's data is going to be out there, and there is no safe haven for a consumer.
0:42:09 And then finally, for these agents and these bots to actually be useful, they actually need to be able to use money, and pay for stuff, and get paid for stuff.
0:42:22 And if you think about all those problems, those are problems that are by far best solved by blockchain technology.
0:42:30 So one, we absolutely need a public key infrastructure such that every citizen has their own wallet, with their own data, with their own information.
0:42:43 And if you need to get credit, or prove you're a citizen, or whatever, you can do that with a zero-knowledge proof; you don't have to hand over your social security number, your bank account information, all this kind of thing, because the AI will get it.
0:43:00 So you really need your own keys and your own data, and there can't be these gigantic, massive honeypots of information that people can go after.
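To make the idea of proving something without handing over the underlying data concrete, here is a minimal Python sketch of a Schnorr-style proof of key ownership: a wallet convinces a verifier that it controls the private key behind a public identity without ever revealing that key. The tiny group parameters, function names, and interactive flow are illustrative assumptions, not any specific wallet or identity system.

```python
import random

# Toy group parameters (small numbers for illustration only; real systems use
# large primes or elliptic curves).
p = 2039          # prime modulus
q = 1019          # prime order of the subgroup, since p = 2*q + 1
g = 4             # generator of the order-q subgroup: 2^((p-1)//q) mod p

def keygen():
    """The citizen's wallet key: a private x and a public y = g^x mod p."""
    x = random.randrange(1, q)
    y = pow(g, x, p)
    return x, y

def commit():
    """Prover commits to a random nonce before seeing the challenge."""
    r = random.randrange(1, q)
    t = pow(g, r, p)
    return r, t

def respond(x, r, c):
    """Prover answers the challenge without revealing x."""
    return (r + c * x) % q

def verify(y, t, c, s):
    """Verifier checks g^s == t * y^c (mod p)."""
    return pow(g, s, p) == (t * pow(y, c, p)) % p

# Demo: prove control of the wallet key without disclosing it.
x, y = keygen()                    # y is public, x never leaves the wallet
r, t = commit()
c = random.randrange(1, q)         # verifier's random challenge
s = respond(x, r, c)
print("proof accepted:", verify(y, t, c, s))   # True
```

Real zero-knowledge credential systems prove richer statements (age, citizenship, creditworthiness) over much larger groups, but the shape is the same: a commitment, a challenge, and a response that reveals nothing beyond the claim being proved.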
0:43:10 I think that with deepfakes, if you think about it, okay, we're going to have to be able to whitelist things; we're going to have to be able to say what's real.
0:43:19 But who keeps track of what's true then? Is it the government? Please, Jesus. Everybody trust Trump now? Everybody trusted Biden?
0:43:29 Is it going to be Google? We trust those guys?
0:43:31 Or is it going to be the game-theoretic, mathematical properties of the blockchain that can hold that?
0:43:39 And so I think it's essential that we regenerate our blockchain and crypto development in the US and get very serious about it.
0:43:49 If the government were to do something, I think it should be to start to require these information distribution networks, these social networks, to have a way to verifiably prove you're human, to prove where a piece of data came from, and so forth.
0:44:06 And I think we have to have banks start accepting zero-knowledge proofs, and have that just be the way the world works.
0:44:17 We need a network architecture that is up to the challenge of these superintelligent agents that are running around.
0:44:26 We were talking before we started rolling that you guys have an office in DC, and part of what you do is advise on that.
0:44:33 What do the infrastructure changes need to look like? What are a small handful of things that you guys are really pushing to see the government adopt, to modernize the way the whole bureaucracy works?
0:44:46 Yeah, so there's a few things. One of them is, because blockchain technology involves money, we do need regulation; it's not like we don't need any regulation.
0:45:00 And there are very specific things that we're working with the administration to make sure are done in a way that creates a great environment for everybody.
0:45:11 What are you guys hoping will get blocked out, for instance? Is that what you're about to cover?
0:45:16 Yeah, I mean, the first thing you need is electronic money, in the form of stablecoins, so actual currency.
0:45:29 But we need that to not collapse; it's very bad if one of those collapses, because then the whole trust in the system breaks down and so forth.
0:45:39 Well, why do we need this kind of money, this kind of internet-native money?
0:45:44 Well, I'll give you an example. We have a company called Daylight Energy, and what they do is... so we're going to run into a big energy problem with AI that I think most people listening to this know about,
0:45:58 where AI consumes a massive amount of energy, much more than Bitcoin ever did, by the way, which everybody was all up in arms about,
0:46:07 so much so that you can't really even get it out of the power grid.
0:46:10 And I think Trump has been smart about this, saying, hey, you probably need to build power next to your data center, because we can't be giving it to you from the central tank.
0:46:22 But beyond that, individuals now have Tesla solar panels and Powerwalls and these kinds of things,
0:46:33 and when you have one of those, you sometimes have more energy than you need and sometimes have less.
0:46:37 And wouldn't it be great if there was a nice system that figured out who needed energy and who had energy, and you could just trade,
0:46:48 and there was some kind of contract that said, okay, this is what you pay during peak, this is what you pay at different periods?
0:46:55 That contract is probably best done in the form of a smart contract. But a Powerwall is not a human, so it doesn't have a credit card, it can't get a credit card, it doesn't have a bank account, it doesn't have a social security number.
0:47:07 But it can trade crypto; it can trade stablecoins.
0:47:11 And so we need that kind of currency to facilitate all these automated agreements and automated transfers of wealth between entities, in order to solve these big problems that we have, like energy.
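As a rough illustration of the kind of automated agreement being described, here is a minimal Python sketch of two battery devices settling an energy trade out of their own stablecoin balances at peak or off-peak prices. The device names, tariff numbers, and in-memory ledger are made up for illustration; a real version of this agreement would live on-chain as a smart contract.

```python
from dataclasses import dataclass

# Hypothetical tariff (stablecoin units per kWh).
PEAK_PRICE = 0.30
OFF_PEAK_PRICE = 0.12

@dataclass
class Device:
    """A battery/solar device with its own wallet balance (no bank account needed)."""
    name: str
    surplus_kwh: float      # positive = energy to sell, negative = energy needed
    balance: float = 0.0    # stablecoin balance held by the device's wallet

def settle_trade(seller: Device, buyer: Device, kwh: float, peak: bool) -> None:
    """Move energy credit one way and stablecoins the other, per the agreed tariff."""
    price = PEAK_PRICE if peak else OFF_PEAK_PRICE
    cost = round(kwh * price, 6)
    seller.surplus_kwh -= kwh
    buyer.surplus_kwh += kwh
    buyer.balance -= cost
    seller.balance += cost

# Usage: one Powerwall with surplus sells 5 kWh to a neighbor during peak hours.
a = Device("powerwall_A", surplus_kwh=8.0, balance=0.0)
b = Device("powerwall_B", surplus_kwh=-5.0, balance=10.0)
settle_trade(a, b, kwh=5.0, peak=True)
print(a, b)   # A earns 1.5 stablecoins; B's 5 kWh deficit is covered
```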
0:47:31 And so we need a stablecoin bill that says, okay, look, we need these currencies to be backed one-for-one with US dollars, or whatever it is, so that we can have a system that works and is trusted.
0:47:47 Now, there's this really interesting side benefit to that, which is, if you look at treasury auctions lately, the demand for dollars is not good.
0:47:58 A lot of that is that the two biggest lenders to the US have been China and Japan, and China has backed off a lot and Japan has backed off somewhat, so the demand for dollars has gone down.
0:48:13 We've done things to also dampen demand, like when we sanctioned Russia and seized the assets of the Russian central bank, there were other countries, other entities, that had money there, and their money got frozen and they couldn't access it.
0:48:28 That makes people more wary of holding everything in dollars.
0:48:32 So we've done a lot to dampen that, which of course has fueled inflation; in the same way that increasing supply fuels inflation, killing demand fuels inflation.
0:48:44 So here we would have this new major source of demand for dollars, and then the dollars would be much more useful, because you can use them online as well, and machines can use them, and so forth.
0:48:55 And sorry, really fast, for people that are trying to track that: the reason that would increase the demand for dollars is that the stablecoin would be backed one-for-one with debt? Is that the idea?
0:49:09 Well, yeah, you would basically have to have a dollar for every stablecoin; you'd basically hold treasuries,
0:49:19 so that if somebody wanted to redeem their stablecoins, they could.
0:49:23 And that way, it's kind of the equivalent of the gold standard in the old days, when dollars were trying to get credible: we would need dollars to be the gold standard for the stablecoin.
0:49:37 And probably we should never back off of that; maybe we should never have backed off of gold, but it's easier when it's dollars, because we did kind of start to run out of gold a bit.
0:49:50 So that's one thing.
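Here is a minimal sketch of the one-for-one backing idea: tokens are only minted against dollars actually received and burned when redeemed, so reserves always cover the circulating supply. The class and numbers are illustrative, not any particular issuer or the text of a stablecoin bill.

```python
class ReserveBackedStablecoin:
    """Toy model of a fully reserved stablecoin: every token in circulation is
    matched by one dollar of reserves (cash or treasuries), so redemptions
    always clear."""

    def __init__(self):
        self.reserves_usd = 0.0   # dollars / short-term treasuries held
        self.supply = 0.0         # tokens in circulation

    def mint(self, usd_in: float) -> float:
        """Issue tokens only against dollars actually received."""
        self.reserves_usd += usd_in
        self.supply += usd_in
        return usd_in                      # tokens issued 1:1

    def redeem(self, tokens: float) -> float:
        """Burn tokens and pay out dollars from reserves."""
        if tokens > self.supply:
            raise ValueError("cannot redeem more than circulating supply")
        self.supply -= tokens
        self.reserves_usd -= tokens
        return tokens                      # dollars returned 1:1

    def fully_backed(self) -> bool:
        return self.reserves_usd >= self.supply

coin = ReserveBackedStablecoin()
coin.mint(1_000_000)          # issuer receives $1M, issues 1M tokens
coin.redeem(250_000)          # a holder redeems 250k tokens for $250k
print(coin.supply, coin.reserves_usd, coin.fully_backed())  # 750000.0 750000.0 True
```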
0:49:53 Then secondly, there's a bill that went through the House known as the market structure bill; it was technically called FIT21.
0:50:03 That's very important, whether it's exactly that or some form of that,
0:50:08 because when you talk about tokens, which are this kind of instrument that's very, very important in the blockchain world, it's the way that this amazing network of computers gets paid for.
0:50:25 Who pays the people for running the computers? Well, that's paid in the form of these tokens.
0:50:31 But these tokens, which can be created on a blockchain, can be many things.
0:50:41 You can create a token that's a collectible. You can create a token that is a digital property right, that links to some piece of real estate or a piece of art or so forth.
0:50:55 A token can be a Pokemon card; a token could be a coupon; a token could be a security that represents a stock; it could be a dollar.
0:51:07 So which one is it? That is a very important set of rules that doesn't exist.
0:51:12 And this is one of the most insidious things that the Biden administration did: they basically said, well, everything is a security, everything's a stock,
0:51:22 or some thing with asymmetric information, which basically undermines the whole power of the technology.
0:51:31 And so it was basically a scheme for them to get rid of the industry, but it was a very, very dark, cynical way of legislating things, and they would make these fake claims about scams and so forth.
0:51:51 But the market structure bill is very, very important in that way.
0:51:54 And by the way, another thing that was in the original market structure bill which is important is, look, there are also scams; we call it the casino.
0:52:07 Like, I can create some coin, like the Hawk Tuah girl did, right? She creates a coin, she kind of lies about her holdings and says she's going to hold them, but then sells them after people buy it, in a short time period, and so forth.
0:52:27 And part of the problem is there are no rules around that.
0:52:31 But the bill that passed the House said, you can create a token, but if you hold it, you can't trade it for four years.
0:52:40 That takes a lot of the ability to scam out of it, and kind of forces people to do things that are real utilities.
0:52:48 Or if it's a collectible, if it is the Hawk Tuah collectible, it's got to be a real collectible, where you don't just rug the users of it right away. And so that, you know, that's...
0:53:02 Right away. It's okay to do it later, but...
0:53:05 Well, but, you know, in four years it is what it is, right?
0:53:08 Yeah. No, no, I'm just giving you a hard time. I just know how that's going to sound to people.
0:53:13 Yeah, yeah. Thank you.
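As a sketch of the lockup mechanic described above, here is a tiny Python check that a creator's tokens cannot be traded until four years after issuance. The rule is paraphrased from the conversation, not quoted from FIT21, and the dates and function name are hypothetical.

```python
from datetime import datetime, timedelta

# Four-year creator lockup, paraphrased from the discussion above.
# (Approximate: 4 * 365 days ignores leap days.)
LOCKUP = timedelta(days=4 * 365)

def can_insider_trade(issued_at: datetime, now: datetime) -> bool:
    """True once the creator's four-year lockup has elapsed."""
    return now >= issued_at + LOCKUP

issued = datetime(2024, 6, 1)
print(can_insider_trade(issued, datetime(2025, 6, 1)))   # False, still locked
print(can_insider_trade(issued, datetime(2028, 6, 2)))   # True, lockup elapsed
```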
0:53:15 But these kinds of things, I think, are going to be really important to making the whole industry work, and so we're working on that, trying to make it safe for everybody.
0:53:26 But as I said, it's just such a critical technology.
0:53:34 Yes, it's just going to be, like, a very problematic... it's cyberpunk.
0:53:45 Yeah, it's a high-technology, difficult society. Yeah, yeah.
0:53:52 It was shocking to me, the level of backlash that the blockchain, Web3 community got.
0:54:01 What do you think drives that? Is it just the perception that it was only scams and there's nothing real? What was that all about?
0:54:09 So there were multiple factors. The first one is the one that hits all new technology: oh, it's a toy, it doesn't do anything new, the old way of doing things is better.
0:54:24 We saw that with social networking; we actually saw that with the internet. I think Paul Krugman famously said the internet would never have more economic impact than a fax machine, and so forth.
0:54:34 So that's just a normal thing that happens with new technologies as they start out.
0:54:57 And so if you look at even the iPhone, it was a bad phone. It had a horrible keyboard; if you compared it to anything, it wasn't very powerful; it had a little itty-bitty screen.
0:55:12 But it had a feature that was pretty awesome, which is you could put it in your pocket, and it had a GPS in it and a camera in it.
0:55:21 And so now you could build Instagram, you could build Uber, which you could not build with the PC and still can't build with the PC.
0:55:28 And so that was enough, and then eventually it started to add the other features, and it's an awfully powerful computer these days.
0:55:36 If you look at blockchain, it's slower, it's more complicated to program; there's a lot of issues with it. But it's got a...
0:55:51 That code says there's only 21 million bitcoin, and you can absolutely count on that, in a way that you can't trust Google, you can't trust Facebook to say, oh, these are our privacy rules; you can't trust that at all.
0:56:10 You can't trust the US government to say they're not going to print any more money; that's for sure.
0:56:15 And so it's something you can count on, and you don't have to trust a company, you don't have to trust a lawyer; you just have to trust the game-theoretic, mathematical properties of the blockchain.
0:56:32 And that's amazing.
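The 21 million figure is worth checking by hand: it is not a promise anyone has to keep, it falls out of Bitcoin's published issuance schedule, where the block reward starts at 50 BTC and halves every 210,000 blocks. A short Python calculation reproduces the cap, which comes out just under 21 million.

```python
# Block rewards start at 50 BTC and halve every 210,000 blocks; subsidies are
# tracked in satoshis (1 BTC = 100,000,000 satoshis) and round down each era,
# so the total supply is a finite sum you can add up directly.
def total_bitcoin_supply() -> float:
    satoshis = 0
    subsidy = 50 * 100_000_000          # initial block reward in satoshis
    while subsidy > 0:
        satoshis += 210_000 * subsidy   # every era lasts 210,000 blocks
        subsidy //= 2                   # the halving: integer division rounds down
    return satoshis / 100_000_000

print(total_bitcoin_supply())           # 20999999.9769, just under 21 million
```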
0:56:33 So now you can program property rights and money and law and all these kinds of things that you could never do before.
0:56:42 And so I think that's hard for normal people to understand, who aren't deep in it.
0:56:52 And then I think the next wave was, look, it was a very odd thing with the Biden administration, because, I think it's come out now, he wasn't really the president; he wasn't really making any decisions.
0:57:09 You couldn't even get a meeting with him if you were in his cabinet, and in terms of domestic policy, that was run by Elizabeth Warren.
0:57:17 And then the second confusing thing is, Elizabeth Warren is always calling people fascists,
0:57:22 but her whole push with fintech and crypto was to make sure that she could kick people out of the banking system who are political enemies.
0:57:33 And so in order to do that, you have to outlaw new forms of financial technology, because those would be back doors or side doors or parallels to the G-SIBs and the banking system, which she could comprehensively kick people out of, and I think this is coming out now.
0:57:52 And when it's a full top-down hierarchy and you can use private companies to enforce your will, that is the way fascism works.
0:58:01 And then the way she does it is she sells this fake story about how it's funding terror, and it turns out USAID was funding the terrorist groups, but that's a different story,
0:58:13 that it's doing all these nefarious things, which was just a very unfair portrayal.
0:58:19 And so then the whole industry got this reputation as scammy and this and that and the other.
0:58:24 And then of course we had Sam Bankman-Fried, who didn't do us any favors.
0:58:30 And this is another issue, though, with what Elizabeth Warren did: she blocked all legislation, and so the criminals were running free, and the people doing things that should have been legal were getting terrorized by the government.
0:58:46 When they should have been looking at FTX, they were looking at Coinbase, which was a totally compliant public company, begging for feedback: tell us what you want us to do.
0:58:57 Exactly. Yeah, that whole thing was crazy.
0:59:01 So given that AI is putting us on a collision course with, I don't know who's real, I don't know what's fake, do you think that blockchain is about to have its day, like in the next 12 to 24 months?
0:59:13 Or is this still something so embedded deep in the infrastructure that it's going to take a long time to really have its I-told-you-so moment?
0:59:21 You know, I think it's within 24 months, for sure.
0:59:25 I mean, in the last wave of blockchain, there were real technological limitations that I think were slowing it down from getting broader adoption:
0:59:42 very obvious usability challenges, the fees were really high, the blockchains were slow, so there were just a lot of use cases that you just couldn't do on them.
0:59:53 I think that's changing very, very fast. The chains are much faster, the layer two stuff makes them very fast and cheap, and people are doing a lot on usability, for wallets and these kinds of things.
1:00:12 So I think we're getting pretty close. And then I...
1:00:19 Worldcoin, to me, the difference between that thing being very broadly adopted and where it is now, where I think half the people in Buenos Aires use it daily, so it's widely adopted where it's been legal,
1:00:35 I think that if they are able to get integrated into some of the big social platforms, then everybody needs proof of human.
1:00:47 It would make the experience online so much better if you knew who was human and who was not, and right now you can't tell at all.
1:00:59 And that problem is going to get worse, and the solution is really here, so I think it's going to start to take off.
1:01:06 You only need one or two big use cases to start getting the whole infrastructure deployed, and once the infrastructure is deployed, I think we'll certainly rely on it.
1:01:20 And if you look at the curve of people who have active wallets and the curve of internet adoption, they're actually pretty similar.
1:01:32 it’s
1:01:32 about
1:01:32 I think
1:01:33 blockchain
1:01:33 is
1:01:33 growing
1:01:34 a little
1:01:34 faster
1:01:34 than
1:01:34 the
1:01:35 internet
1:01:35 did
1:01:36 initially
1:01:37 and
1:01:37 so
1:01:37 I
1:01:37 think
1:01:38 we’ll
1:01:38 get
1:01:38 to
1:01:39 a
1:01:39 place
1:01:39 where
1:01:40 certainly
1:01:40 everybody
1:01:40 in
1:01:40 the
1:01:41 US
1:01:41 will
1:01:41 be
1:01:42 on
1:01:42 it
1:01:44 which
1:01:44 by
1:01:44 the
1:01:44 way
1:01:45 could
1:01:45 be
1:01:46 great
1:01:46 from
1:01:46 a
1:01:46 government
1:01:47 standpoint
1:01:47 you
1:01:47 know
1:01:48 Elon
1:01:48 has
1:01:48 talked
1:01:49 about
1:01:49 putting
1:01:49 all
1:01:49 the
1:01:50 government
1:01:50 payments
1:01:50 on
1:01:50 the
1:01:51 blockchain
1:01:51 which
1:01:51 I
1:01:51 think
1:01:52 would
1:01:52 be
1:01:52 really
1:01:53 good
1:01:53 for
1:01:53 transparency
1:01:55 We'd never get into this weird situation we have now, where half the country wants to tear down all the government services and half of them want to keep them, because nobody knows what the spending is. That would be great.
1:02:08 But beyond that, if you think about, well, why is there so much waste and fraud?
1:02:14 Well, part of it is, I get taxed, I give my money to the IRS, the IRS gives it to Congress, they do whatever they do with it, and so forth. And how does it get to the people who need it?
1:02:30 That's a very lossy process, and we don't even know who they are.
1:02:35 One of the things we found out during COVID is the government's not very good at sending people money. It's good at taking money; it's not good at sending them money. Right.
1:02:44 We lost like 400 billion dollars trying to give people stimulus. Ridiculous. Yeah, crazy. Ridiculous.
1:02:48 If everybody in the US had an address on the blockchain, you could just tell me, okay, here's 10,000 people who need money, please send them $5,000 each.
1:03:02 Well, probably that's too much money for me, but something like that, whatever my tax bill is, or whatever that portion of wealth redistribution is. That would be 100 percent, zero loss.
1:03:13 By the way, I'd feel a lot better about it, because I'd know I'd be helping people, and look, maybe somebody would even go, hey, this is great, thank you.
1:03:24 Maybe we wouldn't have this crazy class warfare, because everybody would go, hey, we're all integrated: I'm helping you, you're helping me.
1:03:31 And then, if you had that, you'd fix the whole democracy integrity problem, because everybody could vote off that address.
1:03:39 And by the way, everybody would have an address, because everybody would want the money. So what bigger incentive to register to vote than, in order to get money, you have to have an address, which registers you to vote?
1:03:53 And so that kind of thing, I think, could get us to just a much higher trust in our own institutions.
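As a sketch of the "send it straight to everyone's address" idea, here is a toy Python ledger where a disbursement is just a batch of direct balance credits, so every dollar sent is a dollar received. Addresses and amounts are invented for illustration; a real system would post these transfers to a public blockchain rather than a dictionary.

```python
# Toy ledger mapping on-chain addresses to balances.
ledger = {"addr_alice": 0.0, "addr_bob": 0.0, "addr_carol": 0.0}

def disburse(ledger: dict, recipients: list[str], amount: float) -> float:
    """Credit each recipient directly; returns the total actually delivered."""
    delivered = 0.0
    for addr in recipients:
        ledger[addr] += amount
        delivered += amount
    return delivered

total = disburse(ledger, ["addr_alice", "addr_bob", "addr_carol"], 5_000)
print(total, ledger)   # 15000.0 delivered, with nothing lost along the way
```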
1:04:01 Hmm. Yeah. Wow. Speaking of that, I wanted to absolutely scream into the abyss when I heard that people couldn't retire from the government faster than the elevator could lower their records down into a mine. I was like, what is happening?
1:04:19 What do you take away from, why does Elon want to do this? Why is he sleeping in hallways? Why is he doing this?
1:04:28 Is it just to get government contracts, and it's nefarious in the way that so many people think it is, or is there something positive there? What's the game?
1:04:36 I think there's a couple of different things. One, the strong thing, is he truly believes that America's the best country in the world; you know, he is an immigrant, and that it's not guaranteed to stay that way, and we have been in danger of losing it.
1:04:59 And so the most important thing for him to do, in order for his companies to be relevant, in order for going to Mars to be relevant, in order for anything he wants to do in life to be relevant, is we've got to stabilize the U.S. government. I think that's the main thing driving him.
1:05:12 So then you say, well, how did he get to that conclusion, that the whole country is in jeopardy? And it was a pretty interesting thing to watch,
1:05:25 because right in 2021, I think he was a Democrat, and he was certainly pretty apolitical.
1:05:32 And I was actually in a chat group with him when he got the idea, or posed the question, should he buy Twitter.
1:05:44 And a lot of it stemmed from, it started with, the U.S. government just harassing him, which was a very odd thing, right?
1:05:54 I mean, I think you could very well argue he was our most productive citizen. He was our entire space program. He advanced the state of electric cars by 20 years; he's still, like, 95 percent, or something like that, of the electric cars sold in the U.S.
1:06:12 He's done the things with Neuralink to help people who have been paralyzed use their arms and legs, and this kind of thing.
1:06:22 So he's a really remarkable person to want to pick on. But what happened was, because he got this PR for being very wealthy, the Biden administration targeted him.
1:06:39 And again, they're fascists, so it's really a power struggle, always, with the fascists and anybody who looks like they're becoming powerful.
1:06:48 And some of the things they did, one of the ones that's talked about a lot, was they sued him; the Biden Department of Justice sued him for discriminating against refugees.
1:07:03 But he had a contract with the U.S. Department of Defense that required him to only hire U.S. citizens, so he was breaking the law either way.
1:07:11 And they never dropped the lawsuit, even after it was pointed out, even after it was pointed out by Congress. So it was clear harassment.
1:07:20 And I think his conclusion from that was, ironically, we're losing the democracy; we're going into this very, very strange world where the incentives are all upside down.
1:07:39 And the way Elon thinks is, it's up to him to save it. And so he got extremely involved,
1:07:47 and then I think the more involved he got, the more he both realized that a lot of the things really were dangerous, and then secondly, that he personally would be somebody who would know how to fix it.
1:08:02 And you go, well, why the hell would Elon know how to fix the government, and all this; this is the thing that everybody's saying now.
1:08:10 And it's funny, because I told this to Andreessen years ago, because I'm a big fan of Isaac Newton.
1:08:20 We always talked about, who is Elon like? What entrepreneur comes to mind? And it really wasn't... maybe Thomas Edison, but not really.
1:08:31 But Isaac Newton, whoa, was really the one that I always thought he was most like.
1:08:38 You know, because it's like, okay, who can build rockets and cars and this and that and the other?
1:08:43 But the reason I thought he was like Isaac Newton was at the end of Isaac Newton's life, when I think he was in his late 60s, maybe 67, 68.
1:08:54 And for those of you who don't know Isaac Newton: he figured out how the entire world works and wrote it down in a book called the Principia Mathematica, which is probably the most amazing work in the history of science.
1:09:06 And he did it entirely by himself; he didn't even talk to anybody at the time he wrote it. I think he was trying to figure out what God was, or something like that.
1:09:19 But, as you do. So he gets to be in his late 60s, and the Bank of England has a crisis, which is causing a huge crisis for the whole country,
1:09:30 which is, there's a giant counterfeiting problem, so the currency is going to be undermined and England's going to basically go bankrupt.
1:09:39 And they had no idea what to do about it, so they call Isaac Newton, because he's the smartest man in the world; of course you're going to call him.
1:09:46 So Isaac Newton, 67-year-old hermit physicist, goes in, and he says, okay, I can help with the problem; make me CEO of the mint. So they make him CEO of the mint, kind of head of DOGE, whatever.
1:10:02 And he reorganizes the mint in like a week, and then fixes the technology in a month, and completely makes it impossible to counterfeit.
1:10:11 Then he becomes a private eye and goes into all the pubs where the counterfeiters are and arrests all of them.
1:10:19 Then he learns the law, becomes the prosecutor, prosecutes all the counterfeiters, and has a 100 percent conviction record.
1:10:28 And that, by the way, that's Elon.
1:10:30 So if you...
1:10:33 I didn't know that part of his story.
1:10:35 Oh yeah, yeah. So it's an amazing thing.
1:10:39 And if you look at Elon and DOGE, to me the most remarkable thing about DOGE is how he's done it.
1:10:48 If you or I were to say, okay, let's go in and get the waste and fraud out of the government, we would, like, audit the departments, or this and that and the other, and so forth.
1:11:02 No, no, no; that's not how he does it. It's: how do the checks go out?
1:11:10 Like, how is the system designed? When does the money leave the building? And then, oh, it all comes out of one system; let me have access to that system and I'll look at all the payments.
1:11:22 I’m
1:11:22 not
1:11:22 asking
1:11:22 anybody
1:11:23 what
1:11:23 they’re
1:11:23 spending
1:11:24 I’m
1:11:24 looking
1:11:25 at
1:11:25 what
1:11:25 they’re
1:11:25 spending
1:11:26 like
1:11:26 I’m
1:11:26 getting
1:11:26 to
1:11:26 ground
1:11:27 truth
1:11:27 and
1:11:28 then
1:11:28 I’m
1:11:28 going
1:11:28 to
1:11:28 work
1:11:28 my
1:11:29 way
1:11:29 backwards
1:11:29 from
1:11:30 there
1:11:30 and
1:11:30 he’s
1:11:31 probably
1:11:32 and
1:11:32 you know
1:11:33 so
1:11:34 not only
1:11:34 is he
1:11:35 not
1:11:35 unqualified
1:11:36 he’s
1:11:36 maybe the
1:11:37 only person
1:11:37 qualified
1:11:38 to figure
1:11:38 out
1:11:38 like
1:11:39 how
1:11:46 you know
1:11:46 he’s
1:11:47 just
1:11:47 a
1:11:47 very
1:11:48 unique
1:11:48 individual
1:11:51 He's also a troll. He also likes upsetting people. I get all that.
1:11:57 But what he brings to the table is pretty interesting, I would say.
1:12:01 I would say very extraordinary.
1:12:03 You have also written about another extraordinary historical figure from the Haitian revolution, a guy named Toussaint Louverture.
1:12:16 Tell us about him, because there's something about this moment, about being a master strategist, about using what you have, being creative, that feels like it's very apropos to this moment.
1:12:31 What made his story special?
1:12:33 Yeah, so Toussaint was another one of these characters in history.
1:12:37 There are certain people, I call them once-in-every-400-years type people, where you just don't see them that often.
1:12:45 So it turns out, in the history of humanity, there's been one successful slave revolt that ended in an independent state.
1:12:57 Which, you know, if you think about the history of slavery, which goes back thousands of years, really from the beginning of written history, we've had slavery.
1:13:07 So it's a pretty old construct, and there's a lot of motivation to have a revolt if you're a slave.
1:13:17 But why only one successful one? It turns out it's really hard to generate an effective revolt if you're slaves,
1:13:28 because slave culture is difficult, because you don't own anything, right?
1:13:33 If you don't have any sense of owning anything, you don't own your own will.
1:13:40 Like, you are at the pleasure of whoever's running things, so long-term thinking doesn't make sense.
1:13:50 And, like, why plan for next week? It doesn't matter what you plan; it's not yours.
1:13:58 So everything's going to be very short-term, and short-termism is difficult in a military context,
1:14:04 because in order to have an effective military, there needs to be trust, right?
1:14:12 You have to be able to trust people to execute the order. Like, I give an order,
1:14:16 it's kind of like the Byzantine generals problem, to go back to crypto,
1:14:21 where I have to trust that you're going to do the order, and you have to trust that I'm giving the correct order.
1:14:28 But trust is a long-term idea, because it comes from, okay, I'm going to do something for you today because I trust that down the line you'll do something for me.
1:14:35 That doesn't really exist in...
1:14:47 to running a successful revolution.
1:14:49 And then if you look at Haiti at the time, you had the French army, the British army, and the Spanish army all in there fighting for it.
1:15:00 So the strongest, most well-developed militaries of the era were all in that region, all very interested in the sugar, which was quite valuable at the time.
1:15:13 So how in the world would you ever get out of that?
1:15:17 And it turned out he was probably the great cultural genius of, you know, maybe all of history, but certainly the last several hundred years.
1:15:30 And he was able to, because he was a person who, although he was born a slave, was very, very integrated into European culture, because he was so smart.
1:15:42 And so the person who ran the plantation kind of took him to all the diplomatic meetings and so forth,
1:15:51 and he got very involved and kind of mastered European culture, so to speak, and the different subtleties around it.
1:16:00 And he started adopting those things and applying them to his leadership.
1:16:05 And then, furthermore, he incorporated Europeans into the slave army.
1:16:10 So he would defeat the Spanish, capture some guys, and rather than kill them, he incorporated the best leaders into his army.
1:16:21 And he built this very advanced hybrid fighting system, where they used a lot of the guerrilla techniques that he had brought over from Africa,
1:16:31 and then he combined that with some of the more regimented, disciplined strategies of the Europeans.
1:16:44 And in building all that, he ended up building this massive army, and defeated Napoleon and everyone else.
1:16:52 It was just quite a remarkable story about how he figured everything out from first principles, in a way that was very much like Elon in that sense.
1:17:05 One of the things I heard you talk about that I thought was so ingenious was he would basically use song and sound as, like, encrypted language.
1:17:18 It's really... yeah, so that was, like, a very cool thing.
1:17:22 So, right, remember that this is in the days before telephony or the internet or any of these things. It's pre-Alexander Graham Bell and all that kind of thing.
1:17:32 And so the Europeans were literally on, like, notes, carrier pigeons, guys running back and forth, and so forth.
1:17:43 And so, as a result, you kind of needed the army together in one place just so you could communicate the order.
1:17:52 Toussaint basically had these drummers and these songs, which he could put on top of, like, a hill, and they could be very, very loud.
1:18:04 And then he would separate his army into, like, six or seven groups,
1:18:09 but in the song would be embedded the order of when to attack, when to retreat, and all these kinds of things.
1:18:17 So he had this super advanced wide-area communication system that nobody else had, and that was a big advantage for him.
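To make that signaling idea concrete, here is a minimal sketch in Python. It is purely illustrative and everything in it is invented for illustration (the song names, orders, and codebook are not from the episode): the broadcast is public and loud, but only groups holding the codebook can turn it into an order.

# Illustrative sketch only: orders embedded in a public broadcast signal.
# The song names, orders, and codebook below are invented for illustration.

CODEBOOK = {
    "song_sunrise": "attack at dawn",
    "song_river": "retreat to the hills",
    "song_market": "hold position",
}

def broadcast(song: str) -> str:
    """The drummers play one song, loudly, from the hilltop; everyone hears it."""
    return song

def decode(signal: str, codebook: dict[str, str]) -> str:
    """Groups holding the codebook recover the order; everyone else just hears music."""
    return codebook.get(signal, "just a song")

if __name__ == "__main__":
    groups = ["group_1", "group_2", "group_3"]
    signal = broadcast("song_sunrise")
    for group in groups:
        print(f"{group} hears the signal and reads: {decode(signal, CODEBOOK)}")

The design point is the same as in the story: the channel itself is wide open, and the advantage comes entirely from who holds the codebook.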
1:18:28 Yeah, the reason that comes up for me now is we have all these new technologies coming online,
1:18:33 and the person that's going to be able to get outside that box and see something new and fresh is going to be able to use this in totally different ways.
1:18:41 And while in the final analysis I think you and I see it very differently in terms of AI's ability to ultimately gobble up what humans can do,
1:18:50 right now AI is this incredible tool that, as an entrepreneur, for me has been ridiculously exciting:
1:19:00 one, to see how much farther each of my employees can push their own abilities by using AI,
1:19:05 and then it does not take much to prognosticate out, you know, 12 to 18 months, to understand where the tools are going to be and how much more they're going to let you do,
1:19:13 because we're largely an entertainment company, so for us to look at that, and just the revolutionary changes...
1:19:21 But you can't be trapped inside the old way of thinking. You've got to, like you said, build up from first principles.
1:19:27 Yeah, it's a new creative canvas. I think that's a really great way of thinking about it,
1:19:31 in that it's like, well, is your creativity going to be used on, you know, frame-by-frame editing of a video,
1:19:45 or will it be thinking of incredible new things you can do in a video ad that you could never do before, and have the AI do that for you?
1:19:54 And so it's a little bit of a readjustment of where you put your creative energy and the things that are possible and so forth,
1:20:03 and I think, you know, we're really seeing that across the board.
1:20:10 Is this going to mean, you know, you don't have human investors anymore?
1:20:15 And it's actually been, like, totally the opposite.
1:20:18 Instead of painstakingly collecting all the data needed to put the investment memo together, it just does that for you,
1:20:29 and then you're just thinking about, okay, what are the really compelling things about this?
1:20:35 Or rather than trying to track every entrepreneur and great engineer in our database, the AI is just tracking all those people and letting you know, hey, that guy just updated his LinkedIn profile, or...
1:20:54 It's, like, a much more fun part of the game.
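As a rough sketch of that kind of workflow, here is a minimal Python example. The data source, the names, and the profile_updated_at field are all hypothetical and not a description of any real system: a watcher compares the people you track against the last state you saw and surfaces only the changes.

# Illustrative sketch only: surface tracked people whose profiles changed.
# The founders, dates, and fields below are hypothetical.

from dataclasses import dataclass

@dataclass
class Founder:
    name: str
    profile_updated_at: str  # ISO date of the last profile change (hypothetical field)

def fetch_tracked_founders() -> list[Founder]:
    """Stand-in for whatever source actually holds the people you track."""
    return [
        Founder("Ada Example", "2025-06-12"),
        Founder("Grace Example", "2025-05-20"),
    ]

def new_updates(founders: list[Founder], last_seen: dict[str, str]) -> list[Founder]:
    """Return only the founders whose profile changed since we last looked."""
    return [f for f in founders if last_seen.get(f.name) != f.profile_updated_at]

if __name__ == "__main__":
    last_seen = {"Ada Example": "2025-05-01", "Grace Example": "2025-05-20"}
    for founder in new_updates(fetch_tracked_founders(), last_seen):
        print(f"Heads up: {founder.name} just updated their profile.")

The only design decision that matters here is diffing against the last state you saw, so a person only ever looks at what is new.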
1:20:58 And so, you know, look, I would say the best predictor of how things are going to go is more like what's happening now
1:21:08 than the most dystopian view of it that we can possibly think of, which I think is where a lot of people go.
1:21:17 And like I said, I think some of that's the name, you know, artificial intelligence.
1:21:21 We hate everything artificial, so why do we name it artificial?
1:21:28 That's too true.
1:21:29 Ben, I've enjoyed every minute of this. Where can people keep up with you?
1:21:33 Yeah, well, I am bhorowitz on X, and, you know, that's probably the best thing. We're a16z.com.
1:21:45 Hope you enjoyed it.
1:21:46 And that was great fun.
1:21:47 Good fun catching up.
1:21:48 It was indeed.
1:21:49 And then you also have multiple books that people can read that are extraordinarily well respected in the field, so also thank you for those.
1:21:58 Absolutely.
1:22:00 Awesome, thanks.
1:22:00 All right, well, thank you, brother. I appreciate it.
1:22:02 All right, everybody, if you have not already, be sure to subscribe, and until next time, my friends, be legendary. Take care. Peace.
1:22:20 We've got more great conversations coming your way. See you next time.

This week on the a16z Podcast, we’re sharing a feed drop from Impact Theory with Tom Bilyeu, featuring a wide-ranging conversation with a16z cofounder Ben Horowitz.

Artificial intelligence isn’t just a tool — it’s a tectonic shift. In this episode, Ben joins Tom to break down what AI really is (and isn’t), where it’s taking us, and why it matters. They dive into the historical parallels, the looming policy battles, and how innovation cycles have always created — not destroyed — opportunity.

From the future of work and education to the global AI race and the role of blockchain in preserving trust, Ben shares hard-won insights from decades at the forefront of technological disruption. It’s a masterclass in long-term thinking for anyone building, investing, or navigating what’s coming next.

Resources: 

Listen to more episodes of Impact Theory with Tom Bilyeu: https://link.chtbl.com/impacttheory

Watch full conversations on YouTube: youtube.com/tombilyeu
Follow Tom on Instagram: @tombilyeu

Learn more about Impact Theory: impacttheory.com

Timecodes: 

00:00 Introduction to Impact Theory with Ben Horowitz

01:12 The Disruptive Power of AI

02:01 Understanding AI and Its Implications

04:19 The Future of Jobs in an AI-Driven World

06:52 Human Intelligence vs. Artificial Intelligence

10:31 The Role of AI in Society

21:41 AI and the Future of Work

35:07 The AI Race: US vs. China

41:25 The Importance of Blockchain in an AI World

44:26 Government Regulation and Blockchain

45:16 The Need for Stablecoins

45:45 Energy Challenges and AI

49:53 Market Structure Bill and Token Regulation

53:51 Blockchain’s Trust and Adoption

01:04:17 Elon Musk’s Government Involvement

01:12:03 Historical Figures and Modern Parallels

01:18:41 AI and Creativity in Business

01:21:29 Conclusion and Final Thoughts

Stay Updated: 

Let us know what you think: https://ratethispodcast.com/a16z

Find a16z on Twitter: https://twitter.com/a16z

Find a16z on LinkedIn: https://www.linkedin.com/company/a16z

Subscribe on your favorite podcast app: https://a16z.simplecast.com/

Follow our host: https://x.com/eriktorenberg

Please note that the content here is for informational purposes only; should NOT be taken as legal, business, tax, or investment advice or be used to evaluate any investment or security; and is not directed at any investors or potential investors in any a16z fund. a16z and its affiliates may maintain investments in the companies discussed. For more details please see a16z.com/disclosures
