Steven Sinofsky & Balaji Srinivasan on the Future of AI, Tech, & the Global World Order

AI transcript
0:00:01 For four years, it was just a desert.
0:00:03 The state blocked IPOs.
0:00:05 They’re blocking M&As.
0:00:07 It was just an all-out anti-tech assault.
0:00:10 DC is a zero-sum game.
0:00:11 There’s something positive out there,
0:00:15 so it has to accrue to the DC power base.
0:00:18 Figma managed to make its way through that.
0:00:21 Absolutely no thanks to the state attacking it.
0:00:23 And then Lina Khan decided to take a victory lap on this,
0:00:25 which was, as I said, it’s like the assassin
0:00:28 congratulating themselves for helping to elect Trump.
0:00:32 There’s been a wave of M&A chaos lately.
0:00:34 Meta and Scale, Windsurf and Google,
0:00:36 and a lot of it points to something bigger,
0:00:39 how regulation, capital, and innovation
0:00:40 are colliding in 2025.
0:00:43 In today’s episode, I brought on Stephen Sinofsky
0:00:46 and Balaji Srinivasan to break it all down.
0:00:48 We get into how deal-making is changing
0:00:51 from acqui-hires to what Balaji calls acqui-fires,
0:00:53 plus the deeper power struggle
0:00:54 between the state and the network,
0:00:57 and what it all means for AI, startups,
0:00:58 and the future of tech.
0:00:59 Let’s get into it.
0:01:04 As a reminder, the content here
0:01:05 is for informational purposes only.
0:01:07 Should not be taken as legal, business, tax,
0:01:08 or investment advice,
0:01:11 or be used to evaluate any investment or security,
0:01:13 and is not directed at any investors
0:01:15 or potential investors in any A16Z fund.
0:01:18 Please note that A16Z and its affiliates
0:01:19 may also maintain investments
0:01:21 in the companies discussed in this podcast.
0:01:24 For more details, including a link to our investments,
0:01:28 please see a16z.com forward slash disclosures.
0:01:34 There’s been a lot going on in the world of M&A
0:01:36 in the last month, et cetera.
0:01:38 There’s been the Meta and Scale deal.
0:01:40 There’s been the Windsurf saga.
0:01:43 There’s been the Lina Khan discourse surrounding Figma.
0:01:45 And I wanted to bring both Balaji and Steven together
0:01:48 to kind of reflect and discuss at a higher level
0:01:51 how we make sense of what’s happening in M&A land.
0:01:53 And Balaji, I know you’ve had some thoughts recently,
0:01:54 so I thought I’d let you open.
0:01:57 Yeah, so there’s actually like three, I think,
0:01:59 separate issues that are all kind of interrelated.
0:02:01 And they’re all kind of related to how
0:02:03 U.S. capital markets are becoming tougher,
0:02:05 but the internet capital markets are opening up.
0:02:10 Those are the Windsurf, Scale, and Character deals, and so on,
0:02:12 this new kind of deal structure,
0:02:13 then the Figma IPO,
0:02:16 and then finally the new Genius Act.
0:02:19 And to briefly summarize,
0:02:21 since SarbOx,
0:02:23 so Sarbanes-Oxley in the early 2000s,
0:02:25 which was passed in the wake of Enron,
0:02:27 with the intent to stop Enrons,
0:02:29 but actually it stopped IPOs.
0:02:32 So the number of public companies has just declined,
0:02:33 the number of IPOs has declined,
0:02:35 and tech companies started going private for longer.
0:02:39 And then now with the FTC antitrust harassment
0:02:41 of the last several years,
0:02:43 that also started to cut off the M&A window.
0:02:44 So for four years,
0:02:45 it was just a desert.
0:02:48 And the DOJ interfered, for example,
0:02:49 with JetBlue’s acquisition of Spirit.
0:02:50 Spirit went bust.
0:02:52 Roblox had issues.
0:02:54 A bunch of companies silently died.
0:02:54 Go ahead.
0:02:55 Roomba, Roomba.
0:02:58 I mean, killing a robot vacuum company.
0:02:59 Yes, they said,
0:02:59 Roblox is fine.
0:03:00 Sorry, Roomba.
0:03:00 You’re right.
0:03:04 So first, the state blocked IPOs.
0:03:06 And so we had to go private for longer
0:03:08 and build a whole PE kind of model.
0:03:10 And then they’re blocking M&As.
0:03:14 And so essentially, that just caused,
0:03:14 I mean, among other things,
0:03:16 there’s also the assault on AI
0:03:17 with limiting the number of flops.
0:03:19 There’s the assault on crypto
0:03:21 with the SEC essentially doing lawfare
0:03:22 against the whole space
0:03:23 and debanking companies.
0:03:26 It was just an all-out anti-tech assault.
0:03:30 And then the amazing thing was Figma managed to make its way through that
0:03:32 and actually get to IPO.
0:03:36 Absolutely no thanks to the state attacking it.
0:03:39 And then Lina Khan decided to take a victory lap on this,
0:03:40 which was, as I said,
0:03:42 it’s like the assassin congratulating themselves
0:03:43 for helping to elect Trump.
0:03:46 It was quite a remarkable statement,
0:03:47 but it really gets to the heart
0:03:51 of sort of the way that DC is a zero-sum game.
0:03:53 And so there’s something positive out there,
0:03:57 so it has to accrue to the DC power base
0:04:01 because otherwise there’s not enough positive left over.
0:04:03 And so it was sort of this absorbing
0:04:06 the one glimmer of hope that’s out there
0:04:10 and ignoring the long, long tail of carnage
0:04:11 that they’ve recently caused.
0:04:13 That’s right.
0:04:14 And basically the thing is
0:04:16 they take credit for the good things
0:04:17 and the bad things,
0:04:19 oh, well, that company must have sucked anyway
0:04:20 or something like that, right?
0:04:22 And actually the DC thing,
0:04:24 I’m not sure this is apocryphal,
0:04:25 but I remember I think Gates said something
0:04:28 in the late 90s or early 2000s,
0:04:30 actually before the whole antitrust thing,
0:04:31 which you guys actually had to go through.
0:04:32 You had to go through the whole thing.
0:04:35 Something like he wanted nothing to do with Washington at all.
0:04:36 He just wanted to code.
0:04:38 And they said, well, that’s fine.
0:04:40 But some politician said,
0:04:41 but we’re just going to hold hearings on you
0:04:43 and then you’re going to have to donate to us.
0:04:44 Do you remember that?
0:04:45 I remember somebody wrote that up.
0:04:46 Go ahead.
0:04:47 I think that there’s just, I mean,
0:04:50 part of what you said brought up two things for me.
0:04:51 One is that, yeah, I mean,
0:04:53 like the whole thing about computing
0:04:58 is that except for the IBM antitrust case,
0:05:00 which started in the late 1960s
0:05:02 and pretty much lasted through when I was in high school,
0:05:04 on the whole,
0:05:06 computing just kind of arose
0:05:09 without like government regulation,
0:05:10 without government oversight.
0:05:12 and even with the internet,
0:05:13 with the government actually
0:05:14 just funding it.
0:05:17 And so it’s this very weird thing
0:05:20 if you’re in the business of governing and regulating
0:05:23 that this giant thing that swallowed the economy
0:05:25 happened without you.
0:05:27 And also to be fair,
0:05:30 the entire software industry in a sense happened that way.
0:05:32 You never needed to be licensed to be a software engineer.
0:05:35 There was never approval to sell software.
0:05:39 I mean, the most approval that the government got involved in
0:05:42 was that it used to prevent mail-order laptops
0:05:46 and desktop computers because they emitted radio interference.
0:05:47 And so they needed FCC.
0:05:48 Is that right?
0:05:50 Yeah, they needed FCC approval
0:05:53 in order to finally ship computers to home,
0:05:55 which is something that a guy named Dan’l Lewin,
0:05:57 who ran the Computer History Museum,
0:06:02 he really cracked that working for Apple in the early 1980s.
0:06:04 And that’s how Macintosh made it to campuses
0:06:07 was that they figured out how to work around the FCC.
0:06:10 And then Michael Dell had to do the same thing in the PC world.
0:06:14 So you have this whole industry
0:06:16 that swallowed the economy, basically,
0:06:18 happened without any hearings.
0:06:21 I mean, there were just no hearings even.
0:06:23 And so it’s remarkable when you think about it.
0:06:25 Yeah.
0:06:27 So my framework on that, actually,
0:06:28 and not that I use this for everything,
0:06:29 but I think it’s a useful framework,
0:06:31 is network versus state.
0:06:34 And like the network, the internet,
0:06:36 and the state, you know, the regulations,
0:06:37 and the government,
0:06:39 also informal things that are aligned with the state.
0:06:41 Because network’s intangible,
0:06:44 the scale of the internet
0:06:46 and how quickly it grew
0:06:48 is still something that even today,
0:06:49 you know, Orwell had the same,
0:06:50 which is like,
0:06:51 it takes an enormous effort
0:06:52 to see what’s in front of one’s own face, right?
0:06:53 And what’s in front of one’s own face?
0:06:54 Well, it’s a phone.
0:06:55 It’s a screen, right?
0:06:57 The internet is simply
0:06:58 the most popular thing in the world,
0:06:59 perhaps in human history.
0:07:00 It’s completely ubiquitous.
0:07:02 It’s upstream of AI.
0:07:03 It’s upstream of phones.
0:07:04 It’s upstream of drones.
0:07:06 It’s upstream of everything on social media
0:07:06 and so on and so forth.
0:07:07 It’s upstream of the election.
0:07:09 Twitter elected Trump,
0:07:10 and then Twitter deplatformed Trump,
0:07:11 and then X elected Trump.
0:07:14 Like, the internet’s upstream of everything,
0:07:15 and yet, because it’s invisible,
0:07:19 we don’t think of it as a primary actor yet,
0:07:21 you know, which is, I think,
0:07:21 the whole other point.
0:07:22 The other thing is,
0:07:24 I think one of the most remarkable things is
0:07:25 that future historians
0:07:26 when writing about this era
0:07:27 will say something like,
0:07:29 you need a license to cut hair,
0:07:32 and you need a license to do this,
0:07:33 and you need a license to do that.
0:07:35 But you didn’t need a license to own a computer,
0:07:37 the most powerful device ever created.
0:07:39 Right?
0:07:40 Thank God.
0:07:41 What an oversight.
0:07:44 You know, you raise a really good point,
0:07:45 and I, look,
0:07:48 I’m not going to try to be the other side,
0:07:50 but I can sort of defend that side,
0:07:52 which is, it is true,
0:07:54 you needed a license to apply makeup in a salon
0:07:55 or give a massage,
0:07:57 but, you know,
0:07:59 not to write software to change the world.
0:08:02 And I think it’s so interesting
0:08:05 to think about the mindset of the regulator,
0:08:06 because, of course,
0:08:08 all of the antitrust laws,
0:08:09 if you go back,
0:08:10 whether you start with the Sherman Act
0:08:11 or the Clayton Act,
0:08:14 they were based on these very tangible,
0:08:16 distribution-constrained,
0:08:19 resource-constrained world
0:08:20 where, like,
0:08:23 well, you can’t own the railroad tracks,
0:08:24 the railroad cars,
0:08:27 and the coal that ships on all of them
0:08:29 because that’s, like,
0:08:31 this vertical integration thing.
0:08:32 And, like,
0:08:34 the PC industry just,
0:08:36 it didn’t have any of those constraints.
0:08:37 And the computing industry,
0:08:40 only for that brief time with IBM,
0:08:42 where it cost tens of millions of dollars,
0:08:45 and IBM actually chose to only lease them,
0:08:46 not to sell them,
0:08:46 which, of course,
0:08:48 it makes a ton of sense
0:08:49 in the world of technology
0:08:51 because owning a depreciating asset
0:08:53 that doesn’t matter in a year
0:08:54 is actually a bad idea.
0:08:55 And so,
0:08:57 but all of those laws
0:08:58 came about for that.
0:08:58 I mean,
0:09:00 even these crazy elements of it,
0:09:01 like when they do
0:09:02 merger and acquisition analysis
0:09:04 and they try to understand market share,
0:09:06 which is not in the law.
0:09:07 There’s nowhere in the law
0:09:08 that it says this,
0:09:08 but they go
0:09:09 and they hire these people
0:09:10 and they compute,
0:09:11 like,
0:09:12 the HHI index,
0:09:14 which is this way of,
0:09:16 they take all the players in a market.
0:09:17 So right away,
0:09:18 you presume that the market
0:09:19 is well-defined
0:09:21 and has N players in it.
0:09:23 And then they take each one’s share
0:09:25 meaning you can actually measure it.
0:09:27 And then they square each share
0:09:28 and add those all up
0:09:31 and decide if that’s,
0:09:32 like,
0:09:33 above or below
0:09:34 some concentration threshold.
0:09:36 And then they go,
0:09:36 oh,
0:09:36 monopoly.
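For reference, the index being described is the Herfindahl-Hirschman Index, which is just the sum of squared market shares. Here is a minimal illustrative sketch, not any agency's actual tooling; the regulatory cutoffs have shifted between guideline revisions, so the threshold is left abstract.

```python
def hhi(shares):
    """Herfindahl-Hirschman Index: the sum of squared market shares.

    Shares are percentages (0-100), so the index runs from near 0
    (many tiny players) up to 10,000 (a single firm with 100%).
    """
    return sum(s * s for s in shares)

# Four equal players: 4 * 25^2
print(hhi([25, 25, 25, 25]))                     # 2500

# One dominant player plus a fringe of three small ones
print(round(hhi([80, 20 / 3, 20 / 3, 20 / 3])))  # 6533
```

The arithmetic is trivial; as the discussion points out, everything hinges on which products you decide belong in the `shares` list in the first place, which is exactly the market-definition problem.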
0:09:39 This has been the challenge
0:09:40 in computing forever.
0:09:41 Like,
0:09:42 even just take,
0:09:42 like,
0:09:43 what is the market
0:09:44 for word processors?
0:09:45 Is it,
0:09:45 like,
0:09:47 the thing called a word processor?
0:09:48 Is it everywhere you can type?
0:09:49 Is it only if you print it?
0:09:51 And you very quickly
0:09:52 can’t figure out,
0:09:53 like,
0:09:54 what’s the share of email?
0:09:55 Is it the client?
0:09:57 Is it the server?
0:09:58 Does it depend on features?
0:10:00 Is it mainframe hosted,
0:10:01 minicomputer hosted,
0:10:02 PC hosted,
0:10:04 or now cloud?
0:10:05 And so all of these things,
0:10:07 Steve Jobs put up a slide
0:10:09 at the iPhone launch
0:10:10 of mobile phone share.
0:10:13 But he very deliberately chose
0:10:13 to measure it
0:10:15 by the manufacturer of phones
0:10:17 so that it diminished
0:10:19 the share of Windows phone.
0:10:21 But he also could have
0:10:22 just done it
0:10:23 by operating system,
0:10:23 which would have made
0:10:25 a completely different chart.
0:10:27 But what was the right way
0:10:27 to define it?
0:10:29 When Microsoft was going
0:10:30 through its antitrust case,
0:10:31 we had 100% share
0:10:32 of the Windows market.
0:10:33 Well,
0:10:34 duh.
0:10:36 And this is what you get into,
0:10:36 like,
0:10:37 when you look at a deal like,
0:10:38 not even to pick on it,
0:10:39 but take
0:10:40 robot vacuum cleaners.
0:10:42 Do you count Optimus
0:10:43 when you tally
0:10:44 the share of robot
0:10:45 vacuum cleaners,
0:10:46 or is it only
0:10:47 the spinning ones
0:10:48 that cats sit on top
0:10:49 of the dock
0:10:50 in your living room corner?
0:10:51 And they’ve always
0:10:52 struggled with this,
0:10:54 but they don’t admit it.
0:10:55 And so they apply
0:10:56 this sort of econometrics
0:10:58 to the whole thing
0:10:59 that make it seem like
0:11:00 it’s this perfectly
0:11:01 well-defined,
0:11:03 well-reasoned thing.
0:11:04 And even to this day,
0:11:05 I don’t think people,
0:11:06 they would say
0:11:07 it would be incorrect
0:11:09 to define the phone market
0:11:11 as iOS versus Android,
0:11:12 because China would have
0:11:13 something to say
0:11:13 about that.
0:11:14 Android itself,
0:11:15 it’s just not
0:11:17 the same product
0:11:18 across them.
0:11:19 And do you count?
0:11:19 Yeah, actually,
0:11:21 I’ll give three reactions to that
0:11:21 because I think
0:11:22 there’s a bunch of things.
0:11:23 I just want to shoot at that.
0:11:23 Yeah, yeah.
0:11:25 So the first is
0:11:27 on like definition of market,
0:11:28 it’s actually Eric Schmidt
0:11:28 also talked to us,
0:11:29 but basically,
0:11:30 for example,
0:11:31 when the iPhone came out,
0:11:32 people didn’t think of it
0:11:33 as a competitor
0:11:34 in terms of being a camera,
0:11:35 but it was one
0:11:36 of the most popular cameras
0:11:37 because it was ubiquitous
0:11:39 and it was essentially
0:11:40 zero incremental cost
0:11:41 and it was internet connected
0:11:42 and it was programmable.
0:11:43 And so even though
0:11:43 the image quality
0:11:44 is very poor,
0:11:45 and the number of pixels
0:11:46 was a little relative
0:11:47 to a DLSR or whatever,
0:11:49 it was a very popular camera,
0:11:50 but it wasn’t thought of
0:11:51 as a camera
0:11:53 as a primary axis.
0:11:54 And often,
0:11:55 the Christensen framework,
0:11:55 you know,
0:11:56 the late,
0:11:57 great Clay Christensen,
0:11:59 this whole disruptive
0:11:59 innovation concept
0:12:00 is something comes in
0:12:01 and it’s not really
0:12:02 recognizable
0:12:04 as a peer
0:12:05 to the existing products
0:12:06 in the marketplace,
0:12:06 but it’s better
0:12:07 on some critical axis
0:12:08 and it gains adoption
0:12:10 in that way.
0:12:10 And then eventually
0:12:11 it’s a substitute,
0:12:12 but it takes a while
0:12:13 for people to even
0:12:14 acknowledge it’s coming in there.
0:12:15 And then you also
0:12:16 have a fuzzy set
0:12:17 sort of thing
0:12:17 where,
0:12:18 for example,
0:12:18 Google and Apple
0:12:19 compete on operating systems
0:12:20 and Google and Facebook
0:12:22 compete on ads
0:12:23 and Facebook and Apple
0:12:25 compete on headsets
0:12:26 and so on and so forth.
0:12:27 And so there’s lots
0:12:28 of fuzzy set overlap
0:12:28 kinds of things.
0:12:30 And then another point
0:12:31 is the entire concept of,
0:12:32 oh, look at all the,
0:12:33 Benny Devin said
0:12:33 this good saying,
0:12:35 which is think about
0:12:35 all the tech monopolies,
0:12:36 there’s so many of them.
0:12:37 Ha ha.
0:12:38 Right?
0:12:38 Yeah, yeah.
0:12:40 Which is really,
0:12:41 and the issue is
0:12:41 they try to use
0:12:42 these formulas
0:12:44 as sometimes
0:12:45 a substitute for judgment
0:12:46 and sometimes
0:12:47 a mask for animus,
0:12:47 right?
0:12:48 Where really,
0:12:49 you know,
0:12:50 Elizabeth Warren was saying
0:12:50 she wants to build
0:12:51 an anti-crypto army,
0:12:52 but really she wanted
0:12:53 to build an anti-tech army.
0:12:54 And that’s just like
0:12:56 a particularly explicit thing
0:12:57 where it’s like
0:12:58 she’s in a tribe
0:12:59 that’s against our tribe.
0:13:01 And as you said,
0:13:03 put yourself in their shoes.
0:13:04 Like first order
0:13:05 as people of the network
0:13:06 versus them
0:13:07 as people of the state.
0:13:08 People of the state,
0:13:09 it took me a while
0:13:10 to understand that,
0:13:11 but fundamentally
0:13:11 what they want
0:13:12 more than anything else
0:13:13 is to get
0:13:14 a piece of the state
0:13:15 to get a baton,
0:13:17 to be able to be
0:13:18 like assemblymen of this
0:13:20 or undersecretary of that,
0:13:21 to have a piece of the state,
0:13:21 that baton,
0:13:23 and they can allocate capital
0:13:24 and start doing things
0:13:25 for the good of the world
0:13:26 and making you do this
0:13:27 and forcing you to do that.
0:13:28 And it’s all about
0:13:29 coercion,
0:13:30 power over other status
0:13:32 and the use of force
0:13:33 implicitly or explicitly.
0:13:35 Whereas our framework
0:13:37 is the total opposite.
0:13:37 We don’t want anyone
0:13:38 to have power over us.
0:13:39 We’re not asking
0:13:40 to tell anybody what to do.
0:13:41 We didn’t consent
0:13:43 to being in our organization.
0:13:43 We just want like
0:13:44 a bare domain name
0:13:46 like reddit.com,
0:13:47 like a field
0:13:48 that we can build up
0:13:48 on ourselves
0:13:49 and no one tells us
0:13:50 what to do.
0:13:50 You know,
0:13:51 maybe an investment
0:13:51 or something.
0:13:52 Okay, maybe it’s a board seat
0:13:52 or something.
0:13:53 But in general,
0:13:54 you’re just able
0:13:55 to build on your own.
0:13:57 And these two kinds
0:13:58 of things can coexist
0:13:59 for a long time
0:13:59 so long as we’re just
0:14:01 typing and doing math
0:14:02 and they’re off regulating
0:14:03 or bombing countries
0:14:04 or whatever it is.
0:14:06 But as the network
0:14:07 grew and grew and grew
0:14:08 and became,
0:14:09 got to tens of,
0:14:09 this is funny,
0:14:09 Eric, right?
0:14:11 Yeah, as the network
0:14:12 grew to tens of millions
0:14:13 and hundreds of millions
0:14:14 and eventually billions
0:14:15 of people,
0:14:16 we were like,
0:14:16 you know,
0:14:17 this thing just grew
0:14:18 to state level.
0:14:19 And now,
0:14:20 these two things
0:14:21 started a conflict
0:14:22 because we felt
0:14:23 legitimately
0:14:23 that we’re the CEOs,
0:14:24 the founders
0:14:25 of these companies.
0:14:26 We built them
0:14:27 from scratch.
0:14:27 We should be able
0:14:28 to have authority
0:14:29 on what happens
0:14:29 on these networks.
0:14:30 And these guys
0:14:31 started to see,
0:14:32 wait a second,
0:14:33 their authority,
0:14:34 whatever it is on paper,
0:14:35 is actually being
0:14:36 limited in practice
0:14:36 because,
0:14:37 for example,
0:14:37 let’s say you’re a
0:14:38 taxi medallion regulator.
0:14:40 The actual regulator
0:14:41 of taxis
0:14:42 is Uber or Lyft
0:14:42 because they have
0:14:43 real-time tracking,
0:14:44 they have star ratings
0:14:45 on both sides,
0:14:47 or the FCC
0:14:48 or something like that
0:14:49 who issued licenses.
0:14:49 Well,
0:14:50 the actual regulator
0:14:50 of speech
0:14:51 or what have you
0:14:51 is then YouTube,
0:14:52 Facebook.
0:14:53 So our expansion,
0:14:54 our peaceful,
0:14:56 visible internet expansion
0:14:58 started implicitly
0:14:59 taking market share
0:14:59 away from them
0:15:00 in terms of
0:15:01 regulatory power.
0:15:01 So the empire
0:15:02 struck back,
0:15:03 they attacked us hard.
0:15:05 And I can actually,
0:15:06 if I squint,
0:15:07 I can actually
0:15:08 also understand
0:15:10 how we would
0:15:10 think like them
0:15:11 and then vice versa,
0:15:12 right?
0:15:13 So how would
0:15:14 they think
0:15:14 they don’t want
0:15:15 others to have
0:15:15 power of them?
0:15:15 Well,
0:15:16 they don’t like
0:15:16 the fact the network
0:15:17 is getting so big
0:15:18 it’s able to
0:15:19 dwarf them or whatever.
0:15:20 How would
0:15:21 we want to have
0:15:21 power over something?
0:15:22 Well,
0:15:23 within our organizations,
0:15:24 we want to be able
0:15:25 to flip a switch
0:15:26 and make something happen.
0:15:27 And if there’s
0:15:27 a resistance to that,
0:15:28 that’s a huge pain
0:15:30 and we want to
0:15:30 be able to make
0:15:30 that happen.
0:15:31 We’ll reorganize
0:15:32 the company
0:15:32 or something
0:15:34 so that it can be
0:15:35 better and more
0:15:35 functional.
0:15:37 There’s another way
0:15:37 of actually,
0:15:38 have you guys
0:15:39 ever seen
0:15:39 The Political Compass?
0:15:40 Yeah, yeah.
0:15:41 Yeah.
0:15:42 So it’s
0:15:43 top left,
0:15:43 authoritarian left,
0:15:44 authoritarian right,
0:15:45 libertarian left,
0:15:45 libertarian right.
0:15:47 And so if you
0:15:47 roughly,
0:15:48 roughly,
0:15:48 roughly sit
0:15:49 in the lower
0:15:49 right corner,
0:15:51 adjacents are often
0:15:51 allied,
0:15:52 but diagonals
0:15:52 don’t get along.
0:15:54 So a libertarian right,
0:15:55 I can sometimes
0:15:55 understand the
0:15:56 nationalists and I
0:15:57 can understand
0:15:57 the libertarian left,
0:15:58 but the
0:15:59 libertarian quadrant
0:16:00 always seemed foreign
0:16:00 to me until
0:16:02 I ran a large
0:16:03 tech platform.
0:16:04 You know why?
0:16:05 And of course,
0:16:06 you probably have,
0:16:07 because Steve,
0:16:08 you probably had
0:16:08 this experience as well,
0:16:09 but the libertarian
0:16:11 right framework is
0:16:12 everybody has consent
0:16:13 and you pay them
0:16:14 to do something
0:16:15 and it’s a market-based
0:16:16 process and that works
0:16:16 for many things.
0:16:17 But let’s say you’re
0:16:18 running a giant tech
0:16:19 platform with hundreds
0:16:19 of millions or even
0:16:20 billions of people
0:16:21 and you want to
0:16:22 change some parameters
0:16:23 that you cannot put
0:16:24 that up for auction
0:16:26 or discussion with
0:16:27 everybody on everything,
0:16:27 right?
0:16:28 Instead,
0:16:29 you’re going to
0:16:30 just flip a switch
0:16:30 and they’re all
0:16:31 going to be basically
0:16:32 opted into this
0:16:33 and they’re not
0:16:33 going to get paid
0:16:34 and they’re just
0:16:34 going to do,
0:16:34 right?
0:16:35 Because otherwise,
0:16:36 there would literally
0:16:37 be no way you
0:16:38 could possibly have
0:16:39 like,
0:16:40 you know,
0:16:41 the complexity
0:16:41 of these systems,
0:16:42 every algorithm
0:16:43 change,
0:16:44 every update,
0:16:44 every this,
0:16:45 every that cannot
0:16:47 be something that’s
0:16:47 like,
0:16:48 you know,
0:16:49 you have five
0:16:50 million parameter
0:16:51 settings in any,
0:16:52 you know,
0:16:52 even Chrome,
0:16:53 as complicated as it
0:16:53 is,
0:16:54 it sets like,
0:16:54 you know,
0:16:55 a zillion of the
0:16:56 settings for defaults,
0:16:56 right?
0:16:57 So you have to pick
0:16:58 defaults and for
0:16:59 the most part,
0:17:00 the state,
0:17:01 the platform will
0:17:02 actually know better
0:17:02 than the people
0:17:03 because it’s got all
0:17:03 these analytics and
0:17:04 so on and so forth.
0:17:05 So you can put
0:17:06 yourself into the
0:17:07 Elizabeth Warren headspace
0:17:08 if you think of
0:17:09 yourself as a system
0:17:10 administrator of a
0:17:10 platform where you
0:17:11 have lawful authority
0:17:12 over it and you
0:17:13 built it from scratch
0:17:13 and so on and so
0:17:13 forth.
0:17:14 I think the
0:17:15 difference then boils
0:17:16 down to the
0:17:17 competence law,
0:17:17 right?
0:17:18 So the thing is,
0:17:19 these platforms are
0:17:19 at least competing
0:17:20 with other platforms
0:17:22 and if Apple or
0:17:23 Microsoft or Google
0:17:24 makes too bad
0:17:24 a decision,
0:17:25 then the people on
0:17:26 those platforms
0:17:26 have exit,
0:17:27 they have choice,
0:17:27 they can move
0:17:28 between platforms
0:17:29 so there’s ultimately
0:17:29 a market constraint.
0:17:30 But the actual
0:17:32 monopoly is at the
0:17:33 DC level where there
0:17:34 wasn’t a practical
0:17:35 switching option
0:17:36 between things so
0:17:37 they could just mess
0:17:37 up the platform
0:17:38 all the time
0:17:39 and us being the
0:17:40 apps on that
0:17:40 platform,
0:17:40 if you think of the
0:17:41 state as a
0:17:41 platform and
0:17:43 companies as the
0:17:43 apps on that
0:17:43 platform,
0:17:44 we would not have
0:17:45 that much choice.
0:17:46 Let me pause there
0:17:46 and get your thoughts.
0:17:47 Well, I think
0:17:48 that’s a fantastic
0:17:49 observation.
0:17:49 I mean, one way to
0:17:51 think about that is
0:17:52 you, you know, and
0:17:53 we probably won’t even
0:17:54 agree on some of
0:17:55 this, but if you look
0:17:57 at what the European
0:17:58 Union has done with
0:17:59 app stores, you
0:18:01 actually see that
0:18:02 dynamic playing out
0:18:03 precisely, which is,
0:18:04 you know, you have
0:18:06 Apple who basically
0:18:07 said, look, we want
0:18:08 to build a platform
0:18:09 that we built PC
0:18:10 platforms.
0:18:10 We understand
0:18:11 what it was like
0:18:11 to build a Mac.
0:18:13 We know how security
0:18:14 violations happen.
0:18:14 We know how privacy
0:18:15 violations happen.
0:18:17 We know how quality
0:18:18 degrades over time
0:18:19 due to software and
0:18:20 third parties and
0:18:21 apps.
0:18:22 We know kernel mode
0:18:22 versus user mode.
0:18:23 We know all of this
0:18:24 stuff.
0:18:25 So when we built the
0:18:26 iPhone and the
0:18:27 platform for the
0:18:28 iPhone, including
0:18:29 the App Store, we
0:18:30 actually were like we
0:18:31 whiteboarded out and
0:18:32 we deliberately said,
0:18:33 well, we’re going to
0:18:34 constrain the API so
0:18:35 that apps can’t steal
0:18:36 information from other
0:18:36 apps.
0:18:37 We’re going to be
0:18:38 secure.
0:18:39 And so we’re not
0:18:39 going to run a bunch
0:18:40 of stuff in
0:18:40 system mode.
0:18:41 There’s not going to
0:18:41 be third party
0:18:42 drivers.
0:18:43 So there’s no kernel
0:18:43 mode, all these
0:18:44 things.
0:18:45 And then the European
0:18:46 Union comes along
0:18:48 after the success of
0:18:50 that and then says,
0:18:51 you know, good idea,
0:18:53 but we actually want
0:18:55 to return the phones
0:18:56 to being PCs again.
0:18:58 So now here’s our
0:18:59 Digital Markets Act,
0:19:01 which basically says
0:19:02 phones have to be
0:19:03 back like PCs.
0:19:05 And Apple’s like,
0:19:07 time-out, time-out.
0:19:08 You do understand.
0:19:10 We literally
0:19:12 set out to be more
0:19:12 secure and to be
0:19:13 private.
0:19:14 You, the GDPR
0:19:15 people, we actually
0:19:16 wanted to solve this
0:19:17 problem on the phone
0:19:19 and we wanted to be
0:19:20 secure.
0:19:20 And they’re like, you
0:19:21 know, you’re right.
0:19:22 So we’re going to put in
0:19:23 a thing that says, and
0:19:24 vendors should be allowed
0:19:25 to be secure.
0:19:27 And you’re like, well,
0:19:28 what does that mean?
0:19:29 And they’re like, well,
0:19:30 you have to actually go
0:19:30 figure that out.
0:19:32 But just know that as
0:19:32 the regulators, we’re
0:19:35 demanding that you be as
0:19:35 open and free as the PC,
0:19:37 and also secure.
0:19:38 And you’re like, we
0:19:39 did that once already.
0:19:41 That is precisely what we
0:19:42 went and did.
0:19:42 That’s why we locked it down.
0:19:44 And so you get in these
0:19:45 kind of crazy loops.
0:19:47 And it’s actually not
0:19:49 unlike the loop that the
0:19:50 media had with the
0:19:52 internet, which was, you
0:19:54 know, we believe in
0:19:56 curation and editorial and
0:19:58 control and own the
0:19:58 distribution.
0:19:59 And the internet’s like,
0:20:00 well, we have a way of
0:20:01 doing distribution and
0:20:03 we have a different view on
0:20:04 curation and control.
0:20:05 And then you get in
0:20:06 these loops where then
0:20:07 the media decide, well,
0:20:08 we like the distribution
0:20:10 that comes, but now we
0:20:10 want to constrain the
0:20:12 distribution, but we like
0:20:13 the distribution, but we
0:20:15 want to editorialize the
0:20:15 distribution.
0:20:16 And then the regulators
0:20:17 come in.
0:20:20 And so that diagonal on
0:20:21 the political compass is
0:20:23 really just, I’m actually
0:20:24 walking in your shoes
0:20:25 right now and I realize
0:20:26 what it is that you did,
0:20:28 but I want all of that,
0:20:29 and I don’t want the
0:20:30 tyranny of the OR.
0:20:31 I actually just want
0:20:33 security and openness.
0:20:35 They just, I think a
0:20:36 big part of it actually
0:20:38 honestly boils down to
0:20:39 the fact that they are
0:20:41 simply not numerical and
0:20:42 they’re like AI agents.
0:20:44 AI agents were helpful
0:20:45 because they allowed me
0:20:47 to model like a, you
0:20:48 know, I think all three
0:20:49 of us are actually fairly
0:20:51 verbal people, right?
0:20:52 So we can write and so
0:20:53 and so forth.
0:20:55 But we also have the
0:20:56 system two thinking, you
0:20:57 know, in Kahneman’s, you
0:20:58 know, phrase or what have
0:21:00 you, where you can just
0:21:00 go head down, you can
0:21:01 program, you can do the
0:21:02 math, the numbers actually
0:21:03 have to add up, right?
0:21:04 You have to do the
0:21:04 spreadsheets, you know,
0:21:05 that has to be an
0:21:06 American thing.
0:21:08 And one of the things
0:21:10 that, you know, I realized
0:21:12 is a good chunk of the
0:21:14 people who are in the
0:21:15 American state, not the
0:21:15 Chinese state, that’s a
0:21:16 different thing, we could
0:21:17 talk about that.
0:21:18 But the American state, a
0:21:19 good chunk of them are
0:21:20 those who are selected
0:21:22 for having verbal and not
0:21:23 numerical slash mathematical
0:21:23 ability.
0:21:26 As distinct from, let’s say,
0:21:27 in the 50s or something
0:21:28 like that, where in the
0:21:30 50s, because marginal tax
0:21:31 rates were at 90% in
0:21:32 America, they were 100%
0:21:33 in Soviet Russia, you
0:21:34 know, go to jail, do not
0:21:35 pass go, do not collect
0:21:36 $200 or 200 rubles.
0:21:38 They were at, they’re at
0:21:39 90%, though, in FDR’s
0:21:40 America.
0:21:41 So, you know, there’s a
0:21:42 book by William White
0:21:43 called The Organization
0:21:44 Man, where it was this
0:21:45 extremely centralized
0:21:46 environment, very corporate,
0:21:47 you know, you couldn’t
0:21:48 really found a company or
0:21:48 anything like that.
0:21:49 It was very hard for
0:21:52 Shockley and Fairchild and
0:21:53 so on to do what they
0:21:54 did.
0:21:57 But at that time, the
0:21:59 Elons of the world would
0:22:00 have worked at NASA or run
0:22:02 NASA, and the Patrick
0:22:03 Collisons would have
0:22:04 probably run the Federal
0:22:05 Trade Commission and so on.
0:22:06 And whatever was written
0:22:08 down legally, they would
0:22:09 just call each other and
0:22:11 make it work, right?
0:22:13 You’d have a bunch of CEO
0:22:14 level people, founder
0:22:15 level people who, because they
0:22:15 couldn’t have founded
0:22:16 companies, were
0:22:17 channeled into the
0:22:17 government to make it work.
0:22:18 Similarly, in the
0:22:19 Soviet Union, when you
0:22:20 couldn’t, those guys
0:22:20 couldn’t do
0:22:21 entrepreneurship, they put
0:22:22 all their energies in
0:22:23 just pure math and
0:22:23 science.
0:22:24 And that’s why there’s
0:22:25 some amazing Soviet
0:22:26 mathematicians and
0:22:28 physicists and so on, if
0:22:28 you’re familiar with
0:22:30 that, because that was
0:22:31 an area where, okay,
0:22:32 you know, that kind of
0:22:34 technical mindset could
0:22:35 at least do something,
0:22:35 you know?
0:22:38 Anyway, so, because
0:22:39 they’re selected for
0:22:40 this, one example of
0:22:42 this, you know, do
0:22:42 you remember this
0:22:44 Lorena Sanchez or
0:22:45 Lorena Gonzalez who
0:22:47 told Michael Solana
0:22:47 that he was a
0:22:49 billionaire, right?
0:22:50 Or how Bernie
0:22:51 Sanders is like,
0:22:52 millionaires and
0:22:52 billionaires, right?
0:22:53 Which is like confusing
0:22:54 meters and
0:22:55 kilometers, right?
0:22:56 Because actually, it’s
0:22:57 like a thousand X
0:22:57 difference.
0:23:00 Or like, you know, when
0:23:00 Brian Williams and
0:23:01 Mara Gay of the
0:23:02 NYT editorial board
0:23:03 said that like
0:23:05 Bloomberg could give
0:23:06 his fortune and
0:23:07 divide it and give a
0:23:08 million dollars to
0:23:09 everybody, right?
0:23:11 So, so that’s like three
0:23:12 examples where I really
0:23:13 start to think they
0:23:14 don’t, they think like
0:23:16 billion means like big
0:23:16 number, you know, like
0:23:18 a primitive tribe will
0:23:18 have numbers for like
0:23:19 one, two, and many,
0:23:20 right?
0:23:23 So like, they just don’t
0:23:25 understand, like, 1e9, you
0:23:26 know, like the difference
0:23:27 between a billion and a
0:23:28 million or the difference
0:23:29 between, for example,
0:23:31 someone who has a billion
0:23:32 dollars liquid, someone who
0:23:33 has a billion dollars net
0:23:34 worth, a billion dollar
0:23:35 fund, a billion dollar
0:23:36 valuation.
0:23:40 And this leads to, like,
0:23:41 they literally can’t do it.
0:23:42 It’s not just that they can’t do
0:23:43 machine learning and they
0:23:44 can’t do gradient descent;
0:23:45 they can’t, like,
0:23:47 divide. Or, like, you
0:23:49 know, and to have some
0:23:52 sympathy for them, if, if
0:23:53 I was to say, what’s the
0:23:53 difference between a
0:23:54 picofarad and a
0:23:55 microfarad, right?
0:23:56 Unless you’ve done
0:23:58 something with hardware,
0:23:59 you know, like what’s a
0:24:00 lot of capacitance and a
0:24:01 little capacitance, like
0:24:02 there’s a scale there for
0:24:03 capacitance or inductance or
0:24:04 something like that, that
0:24:06 unless you’ve actually done
0:24:06 electrical engineering, you
0:24:07 wouldn’t have an intuition
0:24:07 for it.
0:24:08 So, but they just have no
0:24:10 intuition for scales of
0:24:11 money beyond their personal
0:24:12 experience, like a thousand
0:24:14 bucks, beyond something that’s
0:24:15 in their bank account or
0:24:16 checking account, they have
0:24:17 no, they just don’t know
0:24:18 what’s above that because
0:24:18 they haven’t run
0:24:19 organizations or done
0:24:20 investments.
0:24:21 Like a billion might as well
0:24:22 be a trillion, might as well
0:24:23 be a quadrillion.
0:24:24 Why don’t we bubble it up a
0:24:25 little bit and talk a little
0:24:27 bit more about, about this
0:24:28 M&A stuff?
0:24:29 Because I, I just, I’m
0:24:31 completely, I’m completely
0:24:34 fascinated by, by let’s
0:24:36 just take it in general, not
0:24:37 about Figma, but just this,
0:24:39 this kind of crazy
0:24:40 revisionist thing that goes
0:24:42 on with, with M&A.
0:24:44 Like for me, like one
0:24:45 of the big things is you
0:24:46 have to just start from the
0:24:48 premise that when a giant
0:24:51 corporation does M&A, it’s
0:24:52 literally like a speculative
0:24:54 investment that has-
0:24:55 It’s a power law, but for
0:24:56 M&A, exactly.
0:24:57 I love that you said that.
0:24:58 Right, that has a power law
0:24:58 return.
0:25:00 Like there’s only two
0:25:02 truisms about, about
0:25:03 corporate M&A.
0:25:05 One is, it’s literally,
0:25:07 provably, a net destroyer of
0:25:07 value.
0:25:09 Like, no matter
0:25:11 how many studies get done at
0:25:14 HBS or at MIT Sloan, I, I
0:25:16 literally went and when I was
0:25:17 teaching at HBS, I spent
0:25:19 hours in the library and
0:25:21 pulled all these like papers
0:25:22 with math and calculus and
0:25:24 stuff in them that were
0:25:26 against M&A because I found
0:25:28 M&A at Microsoft very, very
0:25:29 difficult to pull off because
0:25:31 Bill was mandating this
0:25:33 constant synergy and, and, and
0:25:35 synchronicity across
0:25:37 products and M&A was like a
0:25:39 huge perturbation to that
0:25:40 whole system.
0:25:41 With the exception of like
0:25:42 something like PowerPoint,
0:25:43 which was a huge one.
0:25:44 I’ll get to PowerPoint because
0:25:45 that’s near and dear to me.
0:25:48 But there, all the
0:25:49 business literature on M&A is
0:25:50 that it’s a destroyer of value.
0:25:52 So you would think that the
0:25:54 regulators would be out there
0:25:56 against M&A, not because of
0:25:58 the success it has, but
0:25:59 because of the failure.
0:26:00 Like they should be out
0:26:00 there.
0:26:01 The regulators should be
0:26:03 saying, hey, companies, you
0:26:04 shouldn’t buy companies
0:26:05 because you just destroy
0:26:06 them, but you never, ever
0:26:07 hear them doing that.
0:26:09 And no company sets out to
0:26:10 destroy value.
0:26:11 And so, unlike
0:26:12 Balaji,
0:26:13 I actually brought my
0:26:14 visual aid this time, which
0:26:14 we’ll put up on the
0:26:14 screen.
0:26:15 Oh, wow.
0:26:16 But this is, this is like
0:26:17 what the New York Times,
0:26:19 your favorite NYT, what
0:26:22 they said when Google
0:26:23 acquired YouTube.
0:26:25 And so the headline on the
0:26:26 front page of the New York
0:26:29 Times is “Dot-Com Boom Is
0:26:30 Echoed in the Deal for
0:26:32 YouTube,” followed by like
0:26:35 five hundred words about
0:26:37 copyright infringement and
0:26:39 they’re overpaid and it’s
0:26:40 just five guys and kitten
0:26:41 videos.
0:26:42 And it’s got like pull
0:26:43 quotes from all these
0:26:45 people explaining what a
0:26:46 disaster it’s going to be.
0:26:47 So then you.
0:26:47 All right.
0:26:49 Well, I love that because, can
0:26:52 you put this one on screen?
0:26:54 So Instagram, you know,
0:26:56 Instagram was something
0:26:58 where at the time Jon
0:26:59 Stewart said, because
0:27:00 Instagram had no revenue
0:27:01 right at the time Facebook
0:27:01 bought it.
0:27:03 Little square pictures, too,
0:27:04 like retro square pictures
0:27:05 with filters.
0:27:06 Yeah, yeah, yeah, exactly.
0:27:08 And Jon
0:27:09 Stewart was like, Instagram?
0:27:11 The only thing that would be
0:27:11 worth a billion
0:27:12 dollars would be something
0:27:14 that instantly gets me a
0:27:16 gram of coke, you know,
0:27:16 or something like that.
0:27:16 Right.
0:27:18 And so the thing is
0:27:19 Instagram today is being
0:27:22 retconned as this evil
0:27:23 obvious move for the evil
0:27:23 monopolist.
0:27:25 At the time, Zuck was
0:27:26 supposedly an idiot for doing
0:27:28 it, because first is it had
0:27:29 raised at a five hundred
0:27:30 million dollar valuation the day
0:27:31 before Zuck offered a
0:27:31 billion.
0:27:32 So double the valuation.
0:27:34 Second, that was about 25%
0:27:35 of Facebook’s four billion in
0:27:36 cash on hand.
0:27:37 Third, it was like weeks
0:27:38 before the Facebook IPO.
0:27:39 Fourth, the board wasn’t
0:27:39 consulted.
0:27:40 Fifth, Instagram had no
0:27:41 revenue.
0:27:43 So the balls to do that
0:27:44 when you’re going into an IPO
0:27:45 and need to reassure the
0:27:46 market that, oh, you know,
0:27:48 this boy genius CEO needs,
0:27:49 you know, all these idiots
0:27:50 to say, oh, adult
0:27:52 supervision is needed, blah,
0:27:52 blah, blah.
0:27:52 Right.
0:27:55 So, you know, why would
0:27:57 Facebook pay one billion
0:27:57 for a company with no
0:27:58 revenue?
0:27:59 With Facebook’s public
0:28:00 offering only a few weeks
0:28:01 away, spontaneous and
0:28:02 improvisational business
0:28:04 moves become more curious,
0:28:06 you know, like, you know,
0:28:07 or Hacker News, this is not
0:28:08 going to be one of the best
0:28:09 tech acquisitions of the next
0:28:09 decade.
0:28:10 Instagram is a photo
0:28:12 service in a sea of other
0:28:13 photo services.
0:28:14 Can someone please tell me
0:28:15 how Instagram’s actual
0:28:16 content is worth anything
0:28:17 seems like mostly a huge
0:28:18 waste of cash.
0:28:20 And then Alex Wilhelm,
0:28:22 at a market cap of 950
0:28:22 million, the New York Times
0:28:24 is worth, quote, less than
0:28:24 Instagram.
0:28:26 True.
0:28:26 True.
0:28:28 That one I’ll give them.
0:28:29 That’s true.
0:28:32 But I think it’s also like
0:28:35 you have to actually read the
0:28:37 reasons why people thought
0:28:39 because the reasons end up
0:28:40 being these very pedestrian
0:28:43 sort of non-math, unable to see
0:28:46 exponential growth, like, not
0:28:46 strategic.
0:28:48 They just, they always fall back
0:28:49 on something.
0:28:50 Like, in fact, you know,
0:28:51 Instagram had no revenue.
0:28:53 YouTube had no revenue, but
0:28:56 also it was like this morass of
0:28:57 copyright violations.
0:28:59 And how will Google ever figure
0:28:59 that out?
0:29:02 And no one ever wrote about
0:29:02 the potential.
0:29:05 And, of course, the potential.
0:29:05 Well, yeah.
0:29:07 And go ahead.
0:29:07 I see what you’re saying.
0:29:10 The potential is the part that
0:29:11 it’s the venture bet that the
0:29:12 company is making.
0:29:14 And the interesting thing is
0:29:16 big companies make the wrong
0:29:18 potential bet 90% of the time.
0:29:20 Like, they generally always think,
0:29:22 you know, like they’ll do like a
0:29:24 HP Autonomy acquisition, which is a
0:29:26 very famous went-off-the-rails
0:29:28 acquisition in the enterprise software
0:29:30 world, where HP just thought,
0:29:31 well, this is sort of this
0:29:34 Nth tier player in enterprise
0:29:35 search and information
0:29:37 retrieval, but we’ll buy them
0:29:38 and we’ll put our magical
0:29:39 sales force and platform
0:29:40 strength behind it.
0:29:42 And that will fix everything,
0:29:44 which is like 90% of the M&A
0:29:45 that happens.
0:29:47 The big company just assumes
0:29:48 that whatever it’s strong at,
0:29:50 it will just wave that dust over
0:29:53 this failing business and it
0:29:54 will make it great.
0:29:55 And I think.
0:29:56 Yes, that’s right.
0:29:57 And I think
0:29:59 it’s always, like, this venture
0:30:00 aspect of it that’s missing in
0:30:03 this retcon, that they
0:30:04 should have stopped it or that
0:30:05 they should go back and stop it
0:30:07 because it got made into a
0:30:10 success against all of the
0:30:11 conventional wisdom at the
0:30:12 time.
0:30:14 It just completely blows my mind
0:30:17 that that’s the framework for
0:30:20 for evaluating an M&A that that
0:30:21 people would would use.
0:30:23 I mean, you know, like nobody’s
0:30:25 going to go back and retroactively
0:30:26 consider whether, you know,
0:30:28 Roomba really would have been
0:30:30 better off being bought by Amazon,
0:30:32 even though it should have been.
0:30:33 You know what it is?
0:30:34 Everybody wants a piece of
0:30:35 reward.
0:30:36 No one wants a piece of risk.
0:30:36 Right.
0:30:39 So that when the state goes and
0:30:41 blocks these acquisitions, they
0:30:43 they assume no downside risk.
0:30:45 They’re not like taking, you know,
0:30:46 hey, like like, like, for example,
0:30:49 Adobe had to pay a billion to
0:30:49 Figma for the breakup.
0:30:50 Right.
0:30:52 Like the breakup fee or what
0:30:53 have you, you know, the fact that
0:30:53 the deal didn’t go through.
0:30:55 So the state just goes and meddles.
0:30:57 And then these people had the
0:30:58 chutzpah to take a victory lap,
0:31:00 you know, it was really just, you
0:31:02 know, stolen valor.
0:31:04 It’s like Sto-Lina Valor.
0:31:05 Right.
0:31:07 Lina Khan, stolen valor.
0:31:07 Right.
0:31:10 And genuinely, like, Dylan Field
0:31:12 and the Figma team are the heroes of this.
0:31:13 But just to talk about that for a
0:31:16 second, like with an M&A, you made a
0:31:17 bunch of good points, Stephen.
0:31:18 I want to kind of add to that.
0:31:20 First is absolutely there’s a power
0:31:21 law for M&A, just like there’s a
0:31:22 power law for startups.
0:31:24 And the best M&A you do can
0:31:25 completely transform your company.
0:31:27 A lot of them essentially fail.
0:31:27 And sometimes it’s a little bit
0:31:28 unpredictable.
0:31:32 Number two, I think, is in general,
0:31:33 a company usually needs, in my view,
0:31:35 and you may disagree, it needs to be
0:31:38 about 100x the size of the small
0:31:39 company in order to acquire them.
0:31:42 And the reason is, if it’s even only
0:31:44 10x the size, I can only
0:31:46 think of one deal where it was
0:31:47 about 10x the size and it worked, and
0:31:49 that was Illumina’s acquisition of
0:31:51 Solexa, where it was like a must-win
0:31:53 in the genome sequencing
0:31:55 space, where there was like a
0:31:59 really important technology that
0:32:01 became like the basis for everything
0:32:02 Illumina did for the next decade.
0:32:04 And the entire executive team was
0:32:06 bought in on it, because 10% of
0:32:08 your cap table, 10% of your equity
0:32:09 is like a huge amount.
0:32:10 It’s basically more than you’re going
0:32:12 to spend for the whole year, maybe
0:32:13 for multiple years at a time.
0:32:15 And so, a 10%
0:32:17 bite is massive.
0:32:19 It has to really be
0:32:20 1%, and that’s still a whole
0:32:21 integration effort.
0:32:22 That’s number two.
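That rough 100x heuristic can be sketched in a couple of lines. The function name and example figures here are illustrative assumptions, not a formal rule:

```python
# Sketch of the rough deal-size heuristic described above: an acquirer
# ~100x the target's size spends ~1% of its value, which is digestible;
# at ~10x, the deal is ~10% of the acquirer, a bet-the-company move.
# The helper name and the example figures are illustrative assumptions.

def deal_fraction(acquirer_value: float, deal_price: float) -> float:
    """Fraction of the acquirer's value that the deal represents."""
    return deal_price / acquirer_value

# 100x case: $100B acquirer buying a $1B company.
print(deal_fraction(100e9, 1e9))  # 0.01 -> 1% of the acquirer

# 10x case: $10B acquirer buying a $1B company.
print(deal_fraction(10e9, 1e9))   # 0.1 -> 10%, a huge bite
```

By this yardstick, the Instagram deal cited earlier ($1B against roughly $4B of cash) was an unusually large bite for Facebook to take.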
0:32:24 And the reason I say that is lots of
0:32:25 founders at various stages will be
0:32:27 like, oh, I’m acquiring another
0:32:27 startup.
0:32:29 And I’m like, startup deals never
0:32:30 work in general, because neither of
0:32:31 them have money.
0:32:33 Maybe one of them can shut down and
0:32:33 join the other one.
0:32:34 That sometimes works once in a
0:32:35 while, but in general, they don’t
0:32:36 work.
0:32:39 That’s why M doesn’t work, but A
0:32:39 works.
0:32:41 Like, merger usually doesn’t work,
0:32:41 but acquisition works.
0:32:44 Well, AOL and Time Warner and
0:32:44 stuff.
0:32:45 What a huge disaster that was.
0:32:47 Yeah, exactly.
0:32:48 Again, once in a while, it’s
0:32:49 something like, you know, Steve
0:32:51 Jobs, Pixar, and all the ones that
0:32:53 work are sui generis, where it’s
0:32:55 like, they really acquired some
0:32:56 amazing founder as part of that.
0:32:57 It then leads the company or
0:32:58 something.
0:33:01 The third thing about M&A, as you
0:33:04 again said, is that the smart big
0:33:06 company values them on the basis of
0:33:08 the big company’s distribution, right?
0:33:10 And it’s this product times that
0:33:11 distribution is something.
0:33:14 However, the dumb big company just
0:33:15 thinks they can just roll up
0:33:17 anything and sell it, and that just
0:33:18 doesn’t work, right?
0:33:23 I think one of the huge, I mean,
0:33:25 the responses, by the way, on this
0:33:26 Figma thing, and then let’s get to
0:33:28 actually Windsurf, if I want to talk
0:33:29 about that as well, and then also
0:33:30 the genius hack.
0:33:32 The response on the Figma thing, I
0:33:32 think, fell into one of three
0:33:33 categories.
0:33:37 The first was the Elizabeth Warren
0:33:39 School, which is just anti-crypto
0:33:41 army, anti-tech army, they hate tech
0:33:42 guys.
0:33:44 And I actually like that because that’s
0:33:46 just like pure tribal animus.
0:33:49 Okay, meet me on the 50-yard line.
0:33:50 You bring your guys, we bring our
0:33:52 guys, let’s, you know, win in the
0:33:53 battle for the ideas.
0:33:54 Let’s go, right?
0:33:55 I actually prefer that because that’s
0:33:58 like explicit conflict, and it’s just,
0:34:00 you know, tribe versus tribe, you
0:34:00 know, got the war paint on.
0:34:01 Okay.
0:34:05 Then you’ve got the, like, well-meaning
0:34:09 maybe, but, you know, I often can’t
0:34:10 tell if they’re trying to kill us or
0:34:12 they’re just actually arsonists or
0:34:13 what have you, right?
0:34:15 Which is, oh, we’re going to have
0:34:18 more startups if we allow them to
0:34:20 become big and not be eaten by these
0:34:22 other, you know, companies or what have
0:34:22 you.
0:34:28 And I struggle for the analogy or what
0:34:32 have you, but it’s like, I don’t know,
0:34:34 you can’t hire somebody until you
0:34:36 interview 20 people because then
0:34:37 they’ll be like the best of 20 people
0:34:39 and we’re going to let you hire the
0:34:40 best of 20 people.
0:34:43 What it does is, first of all, you
0:34:44 know, you shouldn’t be interfering in
0:34:44 that choice.
0:34:47 Second is that if you cut off the
0:34:50 flow of M&As, obviously, most
0:34:52 companies aren’t, you know, either
0:34:53 good enough or it’s really tough to
0:34:54 make it all the way to IPO.
0:34:56 Like Oculus, for example, back 10
0:34:58 years ago, they were burning a lot of
0:34:58 cash.
0:35:00 They probably couldn’t have made it to
0:35:00 IPO.
0:35:02 A lot of companies are like that.
0:35:03 They’re burning cash where they’re
0:35:04 valuable to a big, deep pocketed
0:35:06 acquirer and there’s like maybe one of
0:35:08 five guys who could buy them, 10
0:35:09 guys, 20 guys, whatever the number is,
0:35:11 but they really can’t operate as a
0:35:11 standalone company.
0:35:13 They got to prove concept enough,
0:35:13 right?
0:35:15 So in that circumstance, you know,
0:35:17 airlines are often like this where
0:35:18 there’s a ton of fixed costs that go
0:35:20 into it and, you know, like mergers
0:35:21 make sense because they have routes
0:35:22 and stuff like that, the JetBlue
0:35:23 spare one, right?
0:35:26 So when they block those deals, they
0:35:29 are actually destroying value, right?
0:35:32 And moreover, one of the biggest
0:35:33 issues, and this is related to
0:35:35 regulation in general, is they think
0:35:37 of it as, oh, this is punishing the
0:35:38 big tech companies.
0:35:40 It’s actually, even though it’s an
0:35:41 annoyance to them in the short run,
0:35:43 in the medium to longer, it makes
0:35:45 them stronger because if the big
0:35:48 companies can’t buy, well,
0:35:49 first of all, they figure out other
0:35:50 things, so we’ll get to these
0:35:51 complex deal structures.
0:35:53 But second is, that means less money
0:35:54 for startups because when a big
0:35:55 company makes a big acquisition,
0:35:58 that’s a big surrender because it
0:36:00 means a big company couldn’t have
0:36:00 built it themselves.
0:36:02 Like Google had Google Video, but it
0:36:03 had to buy YouTube for 1.6 bill,
0:36:05 which certainly caused, I’m sure,
0:36:07 some churning internally by the
0:36:08 Google Video guys, right?
0:36:10 And most of the time, at big
0:36:11 companies, there’s some faction
0:36:13 inside who’s like, we could build it
0:36:14 ourselves or, you know, no, no, we’re
0:36:16 paying too much or something like that.
0:36:18 So it’s often a surrender for a big
0:36:19 company to do this.
0:36:20 It’s not what they wanted to do.
0:36:21 They didn’t want to pay a billion
0:36:23 dollars or whatever for this.
0:36:24 And then that surrender money, it
0:36:25 goes and excites everybody.
0:36:27 They’re like, let’s make a million
0:36:28 Instagrams when you see a big billion
0:36:29 dollars for the Instagram
0:36:29 acquisition.
0:36:31 And then you get Snapchat, and then
0:36:33 you get TikTok, and you get all
0:36:33 these other competitors.
0:36:35 For Facebook, it’s like throwing
0:36:37 fertilizer on, you know, things to
0:36:39 spring up 1,000 competitors that, you
0:36:40 know, all start attacking you, right?
0:36:42 And so the actual way of regulating
0:36:44 big companies is with 1,000 startup
0:36:47 piranhas, not by this regulation.
0:36:47 Let me pause here.
0:36:48 I think there’s more.
0:36:50 No, that’s a fantastic observation
0:36:52 because you always remember that in a
0:36:55 big company, whenever something new
0:36:58 that’s adjacent pops up, the immediate
0:37:00 reaction is, okay, we’re selling this
0:37:02 giant blob of stuff in software.
0:37:04 We’re selling this giant blob of
0:37:06 software, and this thing is adjacent
0:37:06 to it.
0:37:08 So our blob needs to have that thing.
0:37:11 And that’s what immediately gets the
0:37:13 antitrust regulators, oh, but that’s
0:37:18 expansion by leverage or by tying or
0:37:18 something like that.
0:37:20 Tying, oh my God.
0:37:21 But what does tying mean?
0:37:22 Tying means you’re tying peanut butter
0:37:24 and jelly together in a sandwich.
0:37:25 Tying is like business strategy 101.
0:37:27 Now, 10 years later, the truth can be
0:37:28 told, I’ll tell you.
0:37:33 And so you have all these meetings at a
0:37:35 big company, which is, well, should we
0:37:36 make it or should we buy it?
0:37:38 And it’s just make versus buy.
0:37:39 And that’s the conversation that you’re
0:37:40 going to have.
0:37:42 And of course, this is the funny part
0:37:42 about-
0:37:43 Big companies find it hard to make,
0:37:44 though.
0:37:44 But go ahead.
0:37:46 But this is the funny thing, because, you
0:37:48 know, if the regulators get involved, then
0:37:50 they get all the memos and all the emails.
0:37:52 And it turns out inside the company, half
0:37:54 people said we could make it and half
0:37:55 people said we could buy it.
0:37:58 And all the people said we have to have
0:37:58 this thing.
0:38:01 And they said it with varying levels of
0:38:01 hysteria.
0:38:04 And, like, it’s like, we have to have this,
0:38:06 this is going to put us out of business, or, well,
0:38:07 this would be really nice.
0:38:08 and we have some customers on the
0:38:09 periphery asking for it.
0:38:12 So which one of those enters the discovery
0:38:13 for for the regulators?
0:38:16 The hysterical person who’s probably the
0:38:20 person on point losing a deal in sales or
0:38:23 the engineer that just is is mesmerized by
0:38:25 the exciting implementation of something.
0:38:26 And right.
0:38:29 And like, that’s exactly what they
0:38:29 say.
0:38:30 But then they still go back and have the
0:38:31 make versus buy.
0:38:34 They always make one of two choices.
0:38:36 They almost never really just try to make
0:38:36 it.
0:38:39 But if they have to, because the one that
0:38:40 they want, there’s only one.
0:38:41 They can’t buy it or whatever.
0:38:44 It’s very, very hard to succeed on
0:38:46 the make versus buy when you go to choose
0:38:46 make.
0:38:49 So then you go by and there’s a fork in
0:38:51 the road about two thirds of the time.
0:38:54 The big company says, wow, the leader is
0:38:56 really expensive, but we have our magic
0:38:58 distribution beans.
0:38:59 So we’re going to pick the
0:39:01 number two or number three.
0:39:03 That’s like way, way cheaper and get in a
0:39:04 fire sale.
0:39:06 And of course, that never, ever, ever
0:39:07 works like Microsoft.
0:39:08 When is that ever?
0:39:09 When is that ever?
0:39:10 I’m actually trying to think.
0:39:11 It doesn’t.
0:39:14 But like, you have Google bought
0:39:15 Motorola.
0:39:16 DoubleClick.
0:39:17 Google bought DoubleClick.
0:39:18 And then, was it aQuantive?
0:39:20 Microsoft bought aQuantive for five
0:39:21 billion dollars.
0:39:24 You had Sprint merging with Nextel, which
0:39:25 was like two number fours.
0:39:27 If that was a possibility, you have
0:39:28 Microsoft and Nokia.
0:39:29 I made a giant long list.
0:39:31 You had everybody in the phone business
0:39:32 that needed everybody in the chip
0:39:34 business that needed modems.
0:39:36 And so then they went and everybody
0:39:38 bought these like number two or three
0:39:41 modem makers like Nvidia almost got
0:39:44 taken over by a PE raider because it
0:39:45 bought a modem company.
0:39:48 And that was a signal to the market that
0:39:51 it was completely confused about gaming
0:39:51 graphics.
0:39:52 What do you have?
0:39:54 Why would you compete with Qualcomm as a
0:39:55 gaming graphics company?
0:39:58 And this just goes on and on and on with
0:39:59 with with that kind of thing.
0:40:02 But then you still get to the point where
0:40:04 you want to buy something, which is still a
0:40:05 power law return.
0:40:07 But again, one of the things that doesn’t
0:40:10 get taken into account is that venture
0:40:14 investments or acquisitions by a company can
0:40:17 actually be really transformative to the big
0:40:19 company, which is a thing that the regulators
0:40:22 don’t really see because they see the world as
0:40:24 a static fixed pie.
0:40:27 So they think like once a company is like IBM
0:40:29 and owns mainframes or is Microsoft and owns
0:40:32 Windows, well, that’s just what it should do.
0:40:34 It should then just make Windows forever and
0:40:36 just be the Windows company.
0:40:38 This is where tech people are very, very
0:40:41 different because they just assume tech has this
0:40:44 finite, you know, sell by date and that the tech is
0:40:47 just not going to be all that useful down the road.
0:40:50 So so, you know, you have to reinvent yourself.
0:40:53 And, you know, like it’s not like people in
0:40:56 1985 thought Apple was going to be a phone company.
0:41:00 And like, that whole mindset, it just
0:41:00 sort of escapes people.
0:41:03 Like here’s an example of an acquisition that was
0:41:05 hugely transformative for Microsoft that nobody
0:41:09 knows about today, which was in the throes of the
0:41:12 rise of the Internet in 1996 or so.
0:41:16 We bought a company called FrontPage, which was
0:41:18 basically a word processor for the Web.
0:41:22 And it was a way to design a whole Web site and
0:41:24 to also do something that nobody else did, which was
0:41:27 you could edit on a PC, push a button and those
0:41:28 things would end up on the Internet.
0:41:29 Like, that was a thing.
0:41:32 At the time, it was actually pretty good.
0:41:33 It was super cool.
0:41:37 But what it did to Microsoft was, so, you
0:41:39 know, I was in Office, we did the deal, but then
0:41:42 we had a fight with the Internet Explorer team who
0:41:44 thought they should do the deal, but they didn’t want
0:41:46 to do the deal until they saw us wanting to do the
0:41:48 deal, which is a whole how things work in a big
0:41:49 company.
0:41:52 And so we ended up, first, we had to solve the
0:41:54 bidding war within our company.
0:41:56 And then we had to get on the phone with Mark and do
0:41:59 the bidding war against Netscape for this company
0:42:00 in Boston.
0:42:03 But the thing that it did was it galvanized
0:42:05 Microsoft to say, you know, what’s really important on
0:42:09 the Internet is editing and nobody was really solving the
0:42:13 way to edit this very finite thing called HTML and publish
0:42:15 it to an Apache Web server.
0:42:19 And so we finally figured out that like editing on the
0:42:23 Internet was not going to be like Word and it was going to be a
0:42:26 different kind of tool that involves script that involved
0:42:26 programming.
0:42:28 And that’s what led to a series of things.
0:42:32 But the people we brought in were experts in the Internet and
0:42:35 editing on the Internet, which we just didn’t have.
0:42:38 And although the product never materialized as a big Microsoft
0:42:43 thing, the people infused that DNA into the company that enabled
0:42:47 Microsoft to go and figure out how to do editing in a browser, which
0:42:49 turned out to be incredibly important.
0:42:53 And that kind of thing is transformative, but it also
0:42:55 transformed the whole industry.
0:42:59 Like where would we be today had we not figured out these dynamic
0:43:03 websites and the way you could edit web in the browser and stuff like
0:43:06 that wouldn’t have happened because we were just not innovating there.
0:43:12 And I feel like that whole thing is missing even from the Figma, which again,
0:43:16 the specifics of Figma aren’t really super important, but it was a whole new
0:43:22 innovative category of how to do tooling, which then gets to AI and tooling.
0:43:26 So I think that’s a good way to get us to Windsurf and everybody else.
0:43:26 Okay.
0:43:26 Yes.
0:43:33 In general, one of the things that’s been happening is that
0:43:36 people can only remember maybe a name.
0:43:37 It’s like a fleeting kind of thing.
0:43:38 They’re like, well, Lina Khan is gone.
0:43:40 So therefore nothing has changed.
0:43:41 U.S.
0:43:44 v. Google is a giant antitrust case still going.
0:43:45 FTC v. Meta is still going.
0:43:48 All the antitrust stuff is also not just the U.S.
0:43:50 All these other countries that ganged up on this Figma thing.
0:43:54 There’s just every lawyer at all of these companies, like their number one priority
0:43:58 is do not get us into some, you know, antitrust situation.
0:44:03 So because of that, a lot of the big companies have been forced to innovate on deal
0:44:05 structures and do things that are new.
0:44:10 Scale, character, inflection, adept, covariant are all, and Windsurf.
0:44:11 They’re all very similar, right?
0:44:18 Where essentially they are, they’re not all the same, but, you know, we have, so there’s
0:44:23 a typical acquisition where you have an acquirer, let’s call it Google, and it buys a company.
0:44:27 And what it does when it buys a company, there’s a process which, you know, most people watching
0:44:28 the show will know.
0:44:31 But if they don’t, there’s something called the capitalization table, which says who owns
0:44:31 what shares.
0:44:34 And there’s something associated with it called the liquidation waterfall that says
0:44:36 who gets what money when.
0:44:42 And so, you know, for example, if there’s debt providers, how they get paid and who gets paid
0:44:45 in the middle, who gets paid at the top, the preference stack, who’s at the bottom, common
0:44:46 holders, and so on and so forth, right?
0:44:51 So, the capitalization table and liquidation waterfall get a very well-defined process
0:44:54 for who gets paid when you just buy the whole company, right?
0:44:54 Eat the whole thing.
0:44:55 Okay.
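The cap-table and waterfall mechanics described here can be sketched as a toy calculation. Real waterfalls layer on seniority tranches, participating preferred, and caps; the figures and function below are purely illustrative assumptions:

```python
# Toy liquidation waterfall: debt is paid first, then preferred
# holders up to their liquidation preference, then common holders
# split whatever remains. Purely illustrative; real deals add
# seniority tiers, participation rights, and caps on top of this.

def waterfall(proceeds: int, debt: int, preference: int):
    """Return (to_debt, to_preferred, to_common) from sale proceeds."""
    to_debt = min(proceeds, debt)
    remaining = proceeds - to_debt
    to_preferred = min(remaining, preference)
    to_common = remaining - to_preferred
    return to_debt, to_preferred, to_common

# A $1B sale with $100M of debt and a $400M preference stack:
print(waterfall(1_000_000_000, 100_000_000, 400_000_000))
# (100000000, 400000000, 500000000) -> common splits the last $500M

# A $300M sale: debt and preferred eat everything, common gets zero.
print(waterfall(300_000_000, 100_000_000, 400_000_000))
# (100000000, 200000000, 0)
```

The second case shows why who sits where in the stack matters so much: below a certain sale price, common holders receive nothing at all.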
0:44:58 Then you’ve got something which is like an acqui-hire.
0:45:02 An acqui-hire is something where, and I’m just describing these basic things just to set
0:45:04 the context for what this Windsurf thing was.
0:45:11 So, an acqui-hire is something where the acquired company doesn’t really get any money.
0:45:13 It’s not usually done through the liquidation waterfall.
0:45:18 Instead, the company just shuts down, but there is a press release that says it was
0:45:23 acquired, and then the team goes and gets jobs at the new company.
0:45:27 Maybe there’s some cash that’s given to the investors, but essentially, it’s way better
0:45:32 to at least get an acqui-hire than to have a total go-to-zero moment because you get the
0:45:34 status, if not the money, right?
0:45:36 That’s one way of thinking about it, right?
0:45:42 And now we get to the third thing, which is what these deal structures have, where they
0:45:46 have an acqui-hire component, but they also have what I’m calling the acqui-fire, okay?
0:45:53 So, what is the acqui-fire? Well, the acqui-hire component is, in these six deals, Scale,
0:46:00 Character, Inflection, Adept, Covariant, and Windsurf, the big company basically bought the
0:46:06 top AI researchers and engineers out of the smaller company and paid a huge sum for that.
0:46:09 But then, it wasn’t actually buying the company.
0:46:14 The company was left as a shell or as actually an existing entity.
0:46:18 And then, let’s say, for example, in the case of Windsurf, you had 40 people go to Google and
0:46:20 about 200 people were left behind.
0:46:26 But, there was a huge chunk of money that was left in the bank account of the left-behind
0:46:26 entity.
0:46:33 And, it’s usually set up, and it was in the case of Windsurf, in such a way that the money
0:46:36 that was left in the company is what they would have received through the liquidation waterfall,
0:46:36 right?
0:46:41 And so, the point is that in an acqui-hire, you get the status, but not the money.
0:46:44 In an acqui-fire, you get the money, but not the status, okay?
0:46:47 So, that leads to the whole Windsurf drama.
0:46:52 In the other five acquisitions, you know, you’ve seen The Dark Knight, you know Bane?
0:46:55 He’s like, we need one of us to be in the wreckage, brother, right?
0:47:05 So, whoever was in the left-behind, you know, company, right, was somebody who was well-behaved
0:47:08 enough to basically be like, okay, salute, I’m going to go kind of down with the vehicle,
0:47:12 I’m going to dividend the money out, you know, deal with it silently, and so on and so forth,
0:47:12 right?
0:47:18 It was, you know, like, there’s a line of succession for the presidency, and it’s like
0:47:22 president, vice president, like, I think it’s like the Speaker of the House, blah, blah, blah,
0:47:26 and you get to like number 37, it’s like the Secretary of Interior or something like that.
0:47:28 It’s Kiefer Sutherland, the Secretary of Housing and Urban Development.
0:47:30 Kiefer Sutherland, right, right, exactly, right?
0:47:36 So, the Kiefer Sutherland is the designated successor if the entire leadership structure is
0:47:39 decapitated or, in this case, acquired, right?
0:47:44 If they’re all raptured, right, that is the person who’s behind who’s now the president.
0:47:50 Now, one of the things I think we need to do in our contracts is we need to have, you know,
0:47:51 what comes with a key man provision?
0:47:52 Yeah, yeah.
0:48:00 We need to have a non-key man provision, which is, this is the designated executive who stays behind
0:48:04 in the event of an acqui-fire-like thing, and we can decide how to describe it.
0:48:09 It’s not an acquisition, because if there’s an acquisition, then you have all the FTC, blah, blah, blah, blah, blah, stuff, right?
0:48:17 But in the event of an acqui-fire, this non-key man stays behind, and he gets maybe a little more money than he otherwise would,
0:48:23 or a lot more money, whatever, it’s negotiated, because he’s not getting the status of being acquired, okay?
0:48:30 But he executes an orderly shutdown of the company, doesn’t have any drama, and just dividends out the money, okay?
0:48:35 The issue is that this deal structure was new enough that the other five times it went fairly well,
0:48:39 but in this context, what had happened, and just to give you some details that I’m aware of,
0:48:45 first, most of the Windsurf employees had just been hired in the last few months, because they were all sales guys.
0:48:52 Second, Google, when acquiring the company, didn’t want to acquire these sales guys because Google has its own sales team.
0:48:53 Google just wanted the engineers, right?
0:49:02 Third, Google put $100 million plus in the bank account of Windsurf, where the intent was to dividend it out.
0:49:07 But the guys who were left behind didn’t understand what was happening because they’re sales guys that just think about money or whatever.
0:49:14 And the problem was, it was so constrained in terms of what could be said about what was going on.
0:49:16 Since it’s not an acquisition, guys, right?
0:49:25 Since they couldn’t say anything about what was actually happening, the people who were doing the deal couldn’t communicate clearly about what was happening.
0:49:28 So it just looked like, oh, my God, the founders left, and they left everybody in the lurch.
0:49:30 Oh, they broke the social contract.
0:49:31 And so that’s not actually what happened at all.
0:49:37 What happened was the FTC and others had made acquisitions so difficult that they had to do this other structure.
0:49:43 And it resulted in the people left behind not getting the hint about this.
0:49:44 That’s one interpretation.
0:49:48 The other interpretation is the people left behind got money but not status.
0:50:00 So, after all, if you put yourselves in their position, like, normally in an acquisition, Google might have acquired a 250-person company, and they might have kept 40 people, and the other 200 people, they said, hey, we’re not acquiring it.
0:50:07 But those people would have had a face-saving thing, and they would have had a line on their CV saying, my company was bought by Google.
0:50:10 You know, I decided to do something else afterwards.
0:50:15 And you know what, that’s actually a common thing, because it has to be both parties have to agree.
0:50:20 Both the big company and small guy, you know, have to agree, hey, I want to still work at Google rather than do another startup.
0:50:21 And it’s very common.
0:50:23 Everybody has a broad, warm halo.
0:50:26 Your exact exit number isn’t published online.
0:50:28 You know, whether you’ve got an offer letter isn’t published online.
0:50:35 So everybody who is acquired has a junction point where they can choose to go to the big company or not, and they have the status and the money, right?
0:50:42 So the issue with the FTC interference in that is it broke the status part of the transaction, where the guys left behind didn’t get status.
0:50:43 So, but they did have money.
0:50:46 So what do they do?
0:50:52 Rationally for them, they negotiated a second acquisition with Cognition, where they got the status of being acquired.
0:50:59 Now, the issue is Cognition’s like 60 people, and acquiring the 200 people of Windsurf, that gets back to our earlier point.
0:51:02 Usually a company can’t buy something that it’s not 10x greater than.
0:51:05 So it’ll be very challenging for Cognition, I think.
0:51:08 By the way, I have nothing against Cognition, nothing against Windsurf, nothing against any of the people here.
0:51:09 I wish everybody the best.
0:51:11 Cognition’s an awesome company.
0:51:12 Windsurf’s awesome.
0:51:13 Varun is awesome.
0:51:16 Nothing bad to say about anybody.
0:51:18 Just describing the incentives, right?
0:51:30 So Cognition will find it challenging, I think, to integrate those 200 people, and it’ll be also challenging for them to lay anybody off because they said, oh, we brought everybody on.
0:51:37 I think on the Windsurf side, basically, like Varun’s side, he’s muzzled, so he can’t say anything.
0:51:45 And in general, my view is usually the guy who’s getting pummeled on social media and can’t speak is usually not as bad a guy as it’s made out to be.
0:51:47 He just literally can’t defend himself, right?
0:51:52 But to defend him, this deal is essentially the same as the other five deals.
0:52:00 The difference is the people left behind, you know, didn’t want to play the Kiefer Sutherland role or what have you just because, you know, for our reason, which is their prerogative.
0:52:04 The way we solve it in the future is a non-key man clause.
0:52:09 And there’s somebody who’s maybe paid more to shut down the company, turn off the lights, because that does suck.
0:52:10 I grant that that sucks.
0:52:13 And I understand why their egos are wounded and so on and so forth.
0:52:20 But ultimately, the person to blame, one of the other things that happens here is in something like this, the last person with a face is the one who’s blamed, right?
0:52:30 Because Varun has a face, but Google doesn’t, and the FTC doesn’t, and then the general anti-tech, anti-trust, US versus Google, FTC versus meta kind of stuff doesn’t, right?
0:52:34 So the last guy with a face is blamed, but the faceless stuff isn’t.
0:52:36 You know, it’s almost like Bastiat’s seen and unseen, right?
0:52:44 But blame the FTC, blame Lina Khan, and there’s one thing, which is someone asked, well, why isn’t the current administration reversing this?
0:52:50 And the answer is the current administration, for totally different reasons, I think they have a legitimate bone to pick with big tech because of the censorship and so on and so forth.
0:52:56 But as a consequence of that, many of the cases that were started have been continued, right?
0:52:58 So it’s not like this thing just completely went away.
0:53:05 The new administration is friendly to little tech, mostly, but unfriendly to big tech and continuing those cases.
0:53:09 And then this is the big tech, little tech interaction effect that’s going on there.
0:53:10 All right, that’s a lot I just said.
0:53:11 Let me pause there.
0:53:11 There’s more I can say.
0:53:16 Well, those are super, I mean, very tough stories to hear.
0:53:18 And two things really jump out at me.
0:53:23 One is just purely on the shaping of the landscape and what’s going on.
0:53:25 I think these are extremely important.
0:53:26 Let’s just call them deals.
0:53:41 And the reason they’re extremely important deals is because, I would say with near certainty, or in general, but for me personally, we’re undergoing a platform shift now with AI.
0:53:44 We don’t know, I can’t say who the winner is.
0:53:58 I don’t want to say it’s not really important, but there is a shift in where the nexus of the broad tech ecosystem energy is going to be from mobile and cloud to AI.
0:54:04 Now, whether or not that’s a complete break or the same players move to that transition, don’t know.
0:54:16 But what that means is, first and foremost, the most exciting things that are going to be going on in the near term are in the tooling to enable the platform.
0:54:24 And that’s especially true because the way that the AI and platform shift is happening is there’s just a lot of players.
0:54:36 And there’s a lot of people and it reminds me a great deal of the consolidation of the PC operating system world, which was there were dozens of PC operating systems in 1980.
0:54:52 And when IBM came out with the PC and Microsoft came out with DOS, part of that consolidation was due to the implementation of the BASIC programming language that had already gained strength across many of the platforms.
0:55:07 But this consolidation, where on boot-up there was BASIC, and then this proliferation of tooling that appeared on DOS, because Microsoft invested irrationally in tooling, IBM invested irrationally in tooling, far more than there were any independent toolmakers.
0:55:17 And I think what’s happening in AI right now, when you look at all of the energy around coding, is that this is really building the tooling for the AI era.
0:55:26 And so it’s going to be an irrational investment because tooling itself is never a really huge business because you have to have it.
0:55:34 And so it’s sort of this, well, if you have to have it, then there’s going to be, there are going to be many alternatives and there are going to be some low price ones, some high price ones.
0:55:39 But the people that want to have the predominant platform will invest irrationally in tooling.
0:55:45 And so that’s why you’re getting these deals that don’t look rational because there’s just a bunch of tooling.
0:56:05 The second thing is, it’s really important to put this in perspective if you’re one of those people who think that this is kind of gross or hacking the rules in some way, which is antitrust law was itself designed, if you go back to the Sherman Act, it was this very vague, it was barely three pages of legislation.
0:56:09 And it was really designed to attack one specific thing.
0:56:14 And the word trust in that context just meant contract.
0:56:28 And so what was happening is between the railroads and manufacturing and resources and stuff, the way that interstate commerce happened, a company in one geography would sign a contract with a company, a provider or a vertical partner in another geography.
0:56:32 And the interstate commerce laws had not yet really been established.
0:56:40 And so it was sort of this free-for-all of like these exclusive contracts by geography, by resource type, by train tracks.
0:56:46 And it was locking out whole parts of the country from the availability of those things.
0:56:53 So this antitrust became break up these vertically integrated or these horizontally constrained entities.
0:56:59 And then when the Clayton Antitrust Act came along, it said, oh, you know, the real problem is pricing and tying.
0:57:06 And so then all the laws became about how much you can charge, can you have exclusive deals, not exclusive deals.
0:57:15 And at each step, well, you know, the first time, well, then Delaware came along and started being really favorable to companies that were doing business in multiple states.
0:57:22 And so you ended up with this sort of, and I’m, I don’t want to get criticized by, by legal historians or business historians or whatever.
0:57:24 Like, I’m not playing fast and loose.
0:57:27 I’m trying to be abstract about what took place over 30 years.
0:57:34 But then, you know, then when pricing came along, well, businesses just started to develop all of these different ways of dealing with pricing.
0:57:40 Like, like one of the most common things people know is like, if you buy a lot of something, you get a better price.
0:57:44 But if you actually read the Clayton Act, like that doesn’t appear to be legal.
0:57:53 And so then it took a whole bunch of court cases to just make this very basic premise, which is the more you buy, the better price you get, because I like good customers.
0:57:58 Or if you commit to not buying my competitors products, we’ll give you a good price.
0:58:00 And the Clayton Act was like, you cannot do that.
0:58:04 And you’re like, but that seems to be a fairly reasonable constraint.
0:58:10 Like, you’re not going to buy it from me and play that off my competitor and I’ll be nice to you.
0:58:11 And so all of these things.
0:58:24 And so at each juncture in the evolution of regulatory oversight, like the next step of it was what could be viewed as a hack to the systems that got put in place, which generates this animosity with regulators.
0:58:29 And of course, you can go back, banking is a classic one where like checking accounts didn’t have interest.
0:58:35 And so someone clever with software invented this notion that you have a checking account and a savings account.
0:58:38 And your savings account has all your money in it.
0:58:42 And the minute you write a check, we move money from your savings account to your checking account.
0:58:43 So it stops earning interest.
0:58:45 We pay the check and you’re covered.
0:58:47 And that was called a now account.
0:58:52 And that innovation allowed you to have interest on a checking account, which turned out to be a really, a really big thing.
0:59:03 When MCI came out with like, we want you to use our deregulated long distance, but we want you to, you know, like only get a really good price when you call 10 friends and family.
0:59:05 We’ll give you a really good price.
0:59:21 So everybody around the country signed up for MCI when phones were deregulated and had to make a list of all of their friends, which of course turned out to be this massively great marketing tool, because then they would take that list you gave them, give you a discount for calling those 10 people, and then hit those 10 people up to be part of friends and family.
0:59:30 But that was just like a software innovation that completely worked around this idea that the price of long distance should be the same for everyone everywhere.
0:59:34 And then AT&T did it with free minutes up to unlimited long distance calling.
0:59:40 And so what’s happening now is just like, okay, the regulations have been fixed for a long time.
0:59:43 We want to invest irrationally in platforms.
0:59:45 You’re making this part of it very difficult.
0:59:48 So we’re just going to go figure out an innovative way.
0:59:53 And so we have to be careful because, of course, they are going to circle back and make this difficult in some way.
0:59:58 And that’s the cycle that you get in with regulatory oversight.
1:00:09 And, you know, you could be like, I was always the guy who used to stand up and fight about this, like, this feels like the guy in basketball who decided that when there’s seven seconds left, you should intentionally foul someone.
1:00:18 I always thought that is like the most unsportsmanlike thing because I’m like, they didn’t invent fouls to like be executed on purpose.
1:00:20 They did it so you wouldn’t poke the other guy’s eyes out.
1:00:23 But it became part of the strategy.
1:00:28 And that is like the ultimate American capitalism is exploiting the rules that way.
1:00:32 And it’s also Silicon Valley.
1:00:43 So I think that what happens is the following is that you start out in a totally honorable, I think, fairly honorable, you know, capitalistic way.
1:00:51 And then what happens is when the government attacks you enough, then sometimes the companies that survive that get a taste for the one ring.
1:00:52 Yeah, yeah.
1:00:52 Right.
1:00:55 And they’re like, OK, well, you know what?
1:00:57 We just built this huge lobbying team to defend ourselves.
1:01:00 What if we go on offense, right?
1:01:03 And they’re kind of corrupted by it in a certain way.
1:01:14 And the one issue, there’s like, I think, you know, in chemistry, if you think about like reaction kinetics, sometimes you can have a bunch of time constants where you have this reaction, this reaction, this reaction.
1:01:18 They’re all going and you have to actually do the math to figure out which one goes first, you know.
1:01:23 And so I think there’s several things that are all hitting at the same time in this space that I’ll just give them a quick succession.
1:01:28 The first is these big companies are now getting the taste.
1:01:30 They’re forced to.
1:01:38 They wouldn’t actually want to consider this in the first place, but of, you know, getting like rather than buying the cow, they’re getting the milk for free, right?
1:01:40 Decapitation rather than acquisition, right?
1:01:44 Now that they know that that’s a thing, actually, it’s faster than an acquisition.
1:01:46 Just leave the money in the car.
1:01:47 You know, it’s almost like a deal.
1:01:49 You’re just buying something from somebody.
1:01:56 It’s closer to just like a big, big purchase order almost than it is to an acquisition with all the complexities that are involved in that.
1:01:59 So now they’re like, oh, I can do that faster and less overhead.
1:02:01 Let me have five of those, right?
1:02:01 Yeah, yeah, yeah.
1:02:07 So that’s like one thing that’s happening where you’re giving big companies now a taste of this.
1:02:15 And it’s like, you know, so we’ll have to figure out our deal terms to account for that as something that counts as an exit, but doesn’t count as an exit.
1:02:16 You know, we’ll figure it out.
1:02:16 Okay.
1:02:25 The second thing is AI is making it so that you can do more with less, more with fewer people, right?
1:02:32 So this will be a more common thing where there’s an internal, you know, amplified intelligence rather than artificial intelligence.
1:02:41 You’re going to have a stratification within every company and between companies where the top people will become more and more and more valuable because they can just do so much more, so much more quickly, right?
1:02:52 And the third thing is, you know, one thing people say about AI that I actually didn’t agree with then and actually I don’t agree with now is this is the worst it’ll ever be.
1:02:54 You know, that’s what they used to say, right?
1:03:04 But I remember with Napster, Napster was actually the best it ever was, and then all the copyright lawsuits and attacks made it worse and worse over time.
1:03:06 Google Books was amazing.
1:03:14 And then all these copyright lawsuits gelled it enough so that you could get like some little snippet preview and then you couldn’t see the whole thing, right?
1:03:29 And so it’s quite possible, I would even say probable, that the combination of all the copyright lawsuits, these are desperate lawsuits, by the way, desperate attacks by all these journalists and, you know, authors, writers, et cetera, who hate AI.
1:03:32 And I understand why they hate it, but they just hate it.
1:03:33 So they just want to kill the thing.
1:03:36 And, you know, they’ll say it, they’ll say, are you an AI supporter?
1:03:38 With like venom in their voice.
1:03:40 It’s like, have you not heard that one?
1:03:42 Yeah, go ahead.
1:03:44 I’ll follow up, I promise.
1:03:46 I won’t let you just get away with that.
1:03:46 Okay, okay.
1:03:51 Yeah, so it’s like, because it’s like, you know, they’d say like, are you a Trump supporter?
1:03:52 Are you an AI supporter?
1:04:00 You know, and some, there was some company, there’s a few that, it’s similar to actually like when Discord tried to roll out crypto.
1:04:02 People were like, you’re doing crypto?
1:04:04 You know, people got super, super mad, right?
1:04:11 And that is a building thing of an anti-AI, anti-crypto, anti-tech, and it’s setting fire to the Waymos.
1:04:14 It’s a real thing that we should not just watch out for.
1:04:18 I think it’s going to become actually the future political axis between futurism and primitivism.
1:04:20 That’s going to be the new left, right, after the whole thing finishes rotating.
1:04:27 But, so the issue is that those attacks from a copyright standpoint, and also the energy constraints,
1:04:31 because data center build-outs are going to just start hitting, you know, spare energy constraints.
1:04:37 And the fact that the Chinese models are open, and they’re actually pretty good,
1:04:41 and they’re distributing them quickly, and China’s, do you see my post on AI overproduction a few months ago?
1:04:43 That’s happening now.
1:04:46 You’ve got Kimi, you’ve got Qwen, you’ve got DeepSeek.
1:04:50 These are good models, and they’re open, I shouldn’t say fully open source,
1:04:54 because they’re open coefficients, but not open source,
1:04:57 because they haven’t released the full source code to build them,
1:04:59 and all the complexity that involves, and so on and so forth.
1:05:01 But they are open coefficients.
1:05:07 And so the combination of those three things means it’s quite possible.
1:05:12 And then the fourth is, you know, I already saw something where there’s some, you know,
1:05:14 government restriction on using hosted DeepSeek.
1:05:17 Okay, I can understand that hosted DeepSeek is going to China.
1:05:21 But I wouldn’t be surprised to see something where it all combines such that,
1:05:24 A, USAI companies are hit with copyright lawsuits.
1:05:27 B, they’re blocked by the lack of energy.
1:05:29 C, the Chinese open models are out there.
1:05:33 And D, U.S. regulations prohibit people from using the Chinese open models,
1:05:39 so that actually that lead in AI is actually lost, and it becomes harder to do AI in the U.S.
1:05:43 And it’s similar to what happened with crypto, where crypto had to decentralize outside the U.S.
1:05:46 in the 2020 and 2024 range.
1:05:50 I don’t think that’s the intent, but I can see those storm clouds coming.
1:05:56 And the one other thing I’d say is, because AI does it middle-to-middle, not end-to-end,
1:06:03 it doesn’t do everything, but it does do a lot of, there’s a lot of, you know, bureaucratic jobs,
1:06:09 you know, jobs that lawyers do, that doctors do, that teachers do, professors, artists, journalists.
1:06:14 This is going after, like, the blue base, really going after them.
1:06:21 And so, you know, doing all of this AI in San Francisco and publicly making millions or even billions of dollars
1:06:24 and being demographically different with all these immigrants
1:06:29 and being very publicly rich and recognizable in the blue state,
1:06:31 in the blue city, in the blue state of the union,
1:06:37 to me is not a good long-term recipe for peace and prosperity, right?
1:06:40 It results in accumulating too much capital too publicly,
1:06:45 and then you just start to see some very, very nasty things happening.
1:06:50 So because of all that, I think I am bullish on decentralized AI,
1:06:53 but I’m not so sure how centralized American AI is going to do.
1:06:56 I think this is a good way to close.
1:07:00 I’m going to do the impossible thing with Balaji,
1:07:03 which is I’m going to try to get the last word in and let Eric just say thank you very much.
1:07:04 Go, go, go.
1:07:06 No, like, we opened up a lot of topics,
1:07:11 and I would encourage comments and dialogue out on X
1:07:14 for where we should take the next part of this because we should keep going.
1:07:17 But, like, I want to say broadly and deeply,
1:07:21 I agree on the biggest issue we all face right now
1:07:32 in the technology sector of the economy is the risk to the AI innovation trajectory in the U.S.
1:07:36 And you can look at that from a technology perspective, a regulatory perspective,
1:07:42 a business practices perspective, an immigration perspective, like a research funding.
1:07:48 Any way you want to look at it, there are arrows aimed at, from various perspectives,
1:07:55 from preventing it, when the right answer is we need, we just need to let the market work.
1:07:58 The market for talent, the market for technology, the market for people.
1:08:03 There’s just a very strong market that can really, really work.
1:08:10 Because from the Clay Christensen perspective, what China is trying to do is commoditizing our strength.
1:08:17 And so the release of a bunch of pure open source, open weight models coming from China
1:08:24 is specifically designed to go after a rigid or complacent American view of AI,
1:08:31 which is, you know, cloud-hosted, by a few big players, you know, closed source.
1:08:37 You know, that, that, so China is just doing, and I can look at this very emotionally and personally,
1:08:41 which is this is Google releasing Google Docs for free.
1:08:43 Yeah, yeah, yeah, exactly.
1:08:48 And I’m running Microsoft Office, and Google is just like, we’re never going to make money from this.
1:08:52 And here we are in 2025, they still don’t make any money from it.
1:08:58 But they, they, and what Microsoft had to end up relying on is the worst part of the business,
1:09:03 which is just enterprise distribution lock-in as your core part of your business,
1:09:07 not innovation, not moving forward, which bums me out.
1:09:12 I mean, they make money from G Suite now, hosted G Suite is starting to get expensive.
1:09:15 So they are, it gets expensive, but relative to what you guys are making, yeah.
1:09:21 It’s like, it’s like, the profits from Office are still the profits of Microsoft with Windows and stuff.
1:09:24 But the other angle is like, copyright, I’m going to come at it from a different angle,
1:09:26 and we should maybe think about talking about this.
1:09:32 Because I, I think people are, the people are rightfully panicked about the U.S. position,
1:09:35 and then point to one of those slings and arrows being copyright,
1:09:38 which of course has no issue in China at all.
1:09:41 Like, they have no problem with copyrights, you know, ask the pharma industry.
1:09:46 I lived in China, I worked on copyright, I know exactly what they’re doing.
1:09:52 But the truth is that copyright also created the technology industry in the world.
1:10:00 And it was Microsoft and Intel with intellectual property and copyright that, that, and Apple, that enabled the industry.
1:10:05 And so we have to look at it a little bit more critically and not think of just about the, you know,
1:10:10 the starving novelist in, in Brooklyn who is frustrated by being used as training data.
1:10:12 There’s, there’s a lot more.
1:10:13 There’s a lot more depth.
1:10:13 Of course, of course.
1:10:16 There’s a lot of depth to, to the, to the copyright issue.
1:10:21 Finally, I, I do think that there is a, to wrap up just the M&A side,
1:10:27 the, the recent wave of deals is going to get looked at with scrutiny.
1:10:34 And, and the, the truth is, is that something will change in what’s permitted in, in deal structures in terms of oversight,
1:10:41 because I, I think they’re too big to get ignored by regulators in a tech industry that they’re no longer just going to ignore.
1:10:49 But I also think that there’s a lot of opportunity to, to be, to have much more clarity in deal structure to,
1:10:56 maybe it’s a great idea that Balaji raised, like to have, you know, designated survivors as part of corporate governance.
1:11:02 I mean, there’s a lot of interesting things you can think of to make that kind of outcome something that’s thought about.
1:11:08 Because of course today, you know, people who take on money from very late stage private equity investors or corporate venture,
1:11:13 they know the terms and conditions that you have to have in those deals to attract that money.
1:11:19 And so in the same way, if you know the kinds of things that might happen, you structure your cap table,
1:11:23 you structure your corporate governance to facilitate that or prevent it.
1:11:28 And right now, and then it becomes part of the business practice and then it becomes formalized.
1:11:35 And it’s less likely to be something that could just be stopped by sort of an arbitrary ruling by an appellate court in the eighth district
1:11:42 who doesn’t like a deal that happened to a local company, you know, which I think is where we end up with the risk right now
1:11:48 is that you’ll, we’ll just, it’ll be arbitrary and nothing is worse for anybody than arbitrary.
1:11:56 But I feel like we had this very long arc of discussion that was super interesting in terms of M&A and where we’re heading
1:11:57 and look to where to pick it up.
1:11:58 Great.
1:12:05 If Lina Khan a few years ago, when she was sort of, you know, if she was asking for advice or perspective,
1:12:10 is your view, hey, let the markets work because M&A helps, you know, everybody from big companies to small companies
1:12:14 to the second ecosystem to the consumer, or how should we think about antitrust?
1:12:20 Well, I’ll go first, but we should wrap up with Balaji for sure, which is just, the truth is,
1:12:28 because M&A will almost certainly fail, unless they want to come out on the regulatory side of like
1:12:33 defending against the potential for failure, they really can’t come out on the,
1:12:40 we predict that this one will be successful, because that just, it’s statistically not a supportable
1:12:44 public policy approach to the, to the action.
1:12:46 And the markets are much, much better.
1:12:50 Otherwise, they’re just basically instituting rent control on, on investing,
1:12:53 which is definitely not going to be the right way.
1:12:55 Yeah.
1:12:59 Sorry, I was just, uh, I got cut off there for a second.
1:13:02 Um, can you hear that over there?
1:13:03 Yeah.
1:13:05 Oh, is it lightning, you said?
1:13:06 Lightning.
1:13:07 Okay.
1:13:09 Not a tsunami, not an earthquake.
1:13:10 We’re good, right?
1:13:18 So, yeah, so, Eric, to your question, I would say, uh, we have to actually think more deeply in the
1:13:24 following sense, which is, if you model, you know, the public sector as a platform and the private
1:13:29 sector as the apps on that platform, sometimes an app gets big enough that you just have to actually
1:13:31 build a platform or become the platform, right?
1:13:35 Like, you know, Google was search, and then it grew and grew and grew and actually had to build its own
1:13:36 platform, right?
1:13:41 And, like, essentially Google, you know, with Chrome, it kind of became its own thing, as Steve
1:13:42 is aware, built those things.
1:13:50 And so, what we have to do is we have to stop being reactive to Lina Khan or, like, Scott Wiener on the
1:13:51 AI bill or things like that.
1:13:54 And we have to be proactive in the following way.
1:13:58 A, for every space that we’re in, we figure out what is the ideal set of laws.
1:14:05 B, we write model legislation for all 50 states and all 190 sovereign countries.
1:14:08 And AI can help with this, but obviously it’ll just give you a first draft.
1:14:09 It’ll get you on base.
1:14:15 C, we build a sales team that goes down and knocks on the doors of those 50 states and
1:14:16 190 countries.
1:14:19 And, of course, there’s subdivisions, there’s cities and counties and all kinds of stuff, both
1:14:20 within and outside the US.
1:14:23 Next, we actually find politicians.
1:14:27 And, of course, you can rank, before you go and knock on the doors, you can rank that list
1:14:30 by those that are the most pro-tech, the most amenable to tech, right?
1:14:36 For example, Jared Polis in Colorado is friendly to, like, accepting Bitcoin for payments there.
1:14:42 Or you have, you know, somebody who’s posted about AI and they clearly, you know, they’re
1:14:43 conversing with it.
1:14:48 And often you’ll find some state senator or, you know, some governor, like, obviously, like,
1:14:54 there’s Bukele, who, before he became who he is today, was a very pro-tech, you know, person
1:14:55 in government in El Salvador.
1:14:59 And so we identify all the pro-tech politicians around the world.
1:15:05 And in particular, in this process, small states are the friends of little tech, because
1:15:07 they’re the ones who don’t take anything for granted.
1:15:10 They want to build their economy, and so on and so forth.
1:15:17 And so we go to them, we say, here’s a draft of a bill, and then here’s 10 CEOs or 10 founders
1:15:22 or 10 investors or whatever, 50, representing X billion dollars in AUM or Y billion dollars
1:15:24 in revenue or some combined thing.
1:15:29 And if you pass this legislation, then we will invest in your country.
1:15:29 Right?
1:15:32 Because then that is now unlocked, right?
1:15:34 Now we can build speed of physics, not permits, right?
1:15:36 I should write an article on this, Elon Salvador.
1:15:38 Okay?
1:15:45 Elon Salvador is what it sounds like, which is the tie-up where in an American time zone,
1:15:49 Elon gets some space where he can build speed of physics, not permits, right?
1:15:55 When all 20th century barriers go away, you keep the common sense stuff like, you know,
1:15:57 thou shalt not kill, you know, assault, murder, blah, blah, blah, whatever.
1:16:00 All those, you don’t have to sunset every law, obviously, right?
1:16:02 There’s some laws that are just eternal laws.
1:16:04 But lots of 20th century regulations are just very stupid.
1:16:09 You know, as Ira Pave said, you know, after the internet, you have to kind of go back and
1:16:11 look at a lot of laws and see if they still make sense, right?
1:16:15 Permitting laws, this law is, you know, do they still make sense when you can build in
1:16:16 different ways with robots or other things?
1:16:23 So the answer is, I don’t think it would be a micro answer, Eric, which is, it wouldn’t
1:16:25 just be like, you know, advising an economy or a different economy.
1:16:29 It’s a macro answer of like, go between countries.
1:16:30 Essentially, you know what it is?
1:16:35 Rather than, here’s a flip, rather than say, oh, you know, how do we let them decide
1:16:37 whether we’re a monopoly or not?
1:16:40 Assume the US government was a monopoly, the federal government, and how do we build competition
1:16:41 to that?
1:16:42 Right?
1:16:44 How do we build jurisdictional competition?
1:16:45 How do we build choice?
1:16:50 Because 96% of the world is not American and, you know, 50% is non-blue even within
1:16:50 the US.
1:16:51 You’ve got lots of jurisdictional choice.
1:16:54 So how do we do antitrust on that?
1:16:58 Maybe we’ll wrap on that big idea.
1:16:59 Apologies, Stephen.
1:17:00 This has been a fantastic conversation.
1:17:00 Thanks so much.
1:17:01 Thank you.
1:17:06 Thanks for listening to the A16Z podcast.
1:17:11 If you enjoyed the episode, let us know by leaving a review at ratethispodcast.com slash
1:17:11 A16Z.
1:17:14 We’ve got more great conversations coming your way.
1:17:15 See you next time.

There’s been a wave of M&A deals lately – Meta and Scale, Windsurf and Google – and a lot of it points to something bigger: how regulation, capital, and innovation are colliding in 2025.

In this episode, Erik Torenberg brings together Steven Sinofsky, former Microsoft executive, and Balaji Srinivasan, founder of the Network School and author of The Network State, to break it all down.

From acquihires to “acquifires,” from FTC crackdowns to the deeper battle between the state and the network, this is a sharp conversation on the future of tech and power.

 

Resources

Find Balaji on X: https://x.com/balajis

Find Steven on X: https://x.com/stevesi

Learn more about The Network State: https://thenetworkstate.com

Learn more about The Network School: https://ns.com

 

Stay Updated: 

Let us know what you think: https://ratethispodcast.com/a16z

Find a16z on Twitter: https://twitter.com/a16z

Find a16z on LinkedIn: https://www.linkedin.com/company/a16z

Subscribe on your favorite podcast app: https://a16z.simplecast.com/

Follow our host: https://x.com/eriktorenberg

Please note that the content here is for informational purposes only; should NOT be taken as legal, business, tax, or investment advice or be used to evaluate any investment or security; and is not directed at any investors or potential investors in any a16z fund. a16z and its affiliates may maintain investments in the companies discussed. For more details please see a16z.com/disclosures.
