How AI Is Changing Warfare with Brian Schimpf, CEO of Anduril

AI transcript
0:00:06 The American defense industry is the largest in the world at nearly $1 trillion,
0:00:11 accounting for about 40% of military spending around the world,
0:00:15 and also arguably impacting every person on Earth.
0:00:18 Now this sector has also phase shifted throughout the decades,
0:00:20 including the consolidation of primes,
0:00:24 shrinking from over 50 to fewer than 10 large primes
0:00:27 receiving a majority of defense dollars.
0:00:30 Those are companies like Lockheed, Raytheon, or Boeing.
0:00:33 But there are some new companies in town,
0:00:37 trying to disrupt how defense is done through new hardware and software.
0:00:42 One of those is Anduril, a company that just announced Arsenal One,
0:00:45 a billion dollar factory in Columbus, Ohio,
0:00:48 expected to create 4,000 jobs in the region.
0:00:50 Now in today’s episode,
0:00:52 Anduril co-founder and CEO, Brian Schimpf,
0:00:57 sits down with a16z Growth general partner, David George.
0:01:00 Together they discuss how Anduril got its first product off the ground,
0:01:03 competing with some of the largest companies in the world
0:01:08 and navigating the US government’s complex procurement processes.
0:01:12 They also discuss how AI changes the modern battlefield.
0:01:13 Being able to pull out that signal
0:01:17 from this overwhelming amount of information that exists.
0:01:20 Plus, what most people get wrong about these technologies.
0:01:25 It is unethical to not apply these technologies to these problems.
0:01:27 And how we shape up to the competition.
0:01:31 They are running hundreds of tests a year of hypersonic weapons.
0:01:33 Right, the US is running like four.
0:01:35 Now if you do like this episode,
0:01:38 it comes straight from our AI Revolution series.
0:01:41 So if you missed previous episodes of that series,
0:01:43 with guests like AMD CEO Lisa Su,
0:01:46 Anthropic co-founder Dario Amodei,
0:01:50 or the founders of companies like Databricks, Waymo, Figma, and more,
0:01:54 head on over to a16z.com/AIRevolution.
0:01:58 All right, let’s get started.
0:02:01 As a reminder, the content here is for informational purposes only.
0:02:04 Should not be taken as legal, business, tax, or investment advice.
0:02:07 Or be used to evaluate any investment or security.
0:02:11 And is not directed at any investors or potential investors in any a16z fund.
0:02:14 Please note that a16z and its affiliates
0:02:17 may also maintain investments in the companies discussed in this podcast.
0:02:20 For more details, including a link to our investments,
0:02:23 please see a16z.com/disclosures.
0:02:28 (Music)
0:02:39 Let’s jump right in. What is Anduril? Tell us what you do.
0:02:42 All right, so we were founded in 2017. We’re about seven years in.
0:02:46 The basic idea was we thought there was a better way to make defense technology.
0:02:51 So number one, the tech for the next 20 or 30 years was going to be primarily,
0:02:56 how do you just have more cheap autonomous systems on the battlefield,
0:02:59 just more sensors, just more information flowing in.
0:03:01 That seemed like it had to be true.
0:03:04 So we invested in the core software platform we call Lattice
0:03:06 that enables us to make sense of all these things.
0:03:10 We have built a variety of autonomous products that we fielded over the last seven years,
0:03:12 just an outrageous pace.
0:03:16 And we’re really working on all aspects of national security and defense.
0:03:20 And how did you guys get on to national defense as the place to go spend your time?
0:03:22 I mean, I know your background, but maybe you can share that.
0:03:25 Yeah. So I was at Palantir for about 10 years.
0:03:29 I had been working on a variety of government programs and then several of the co-founders.
0:03:33 So Trae Stephens was also at Palantir; Matt Grimm, our COO, was at Palantir.
0:03:34 We’re all really good friends.
0:03:39 And we’ve been talking about doing this idea of there needs to be a next generation defense company.
0:03:44 And then Trae and Palmer met through the VC world and Palmer was just getting out of Oculus
0:03:46 and he was like, “It’s the same thing I want to do.”
0:03:49 And so we decided to kick this off together.
0:03:53 But for me, working in defense, it was just obvious the degree to which there was a problem.
0:03:58 You work in this space, the tech is old, it is not moving fast, it is very lethargic.
0:04:00 There are relatively few competitors at this point.
0:04:02 It just felt very ripe to do something different.
0:04:07 And it’s the sort of thing that once you get into it, the people who are actually serving
0:04:10 just have this patriotic motivation to solve the problem.
0:04:14 It’s just a very, very motivating problem to work on.
0:04:16 How did you land on the first product?
0:04:19 So the first product we worked on was what we call Sentry.
0:04:20 It’s for border security.
0:04:22 And this was a Palmer idea.
0:04:24 He believed that tech could actually solve this.
0:04:27 So we have these automated cameras with radars.
0:04:31 We can monitor the border for miles away from these cameras.
0:04:36 And he was like, “This is something we can solve super fast with technology.”
0:04:42 And it really kind of hit what has ended up being a very good pattern for us is find an urgent problem
0:04:45 that actually has a real tech solution.
0:04:49 That we can apply the cutting edge technology to.
0:04:53 So early on in 2017, computer vision was just starting to work.
0:04:55 It wasn’t even really embedded GPUs yet.
0:05:01 We were literally taking desktop GPUs and liquid cooling them to get these things to work in a hot desert under solar power.
0:05:04 But we were able to go and get a prototype up in about three months
0:05:09 and then move into like a pilot in about six months and then full scale in about two and a half years.
0:05:10 So really, really quick timeline.
0:05:15 But it kind of fit this problem set of we had a technical insight of how you can do this better.
0:05:18 And there was urgency to solve the problem.
0:05:20 They actually wanted to make a dent in this.
0:05:22 Alright, I’m going to ask you a lot more about that stuff.
0:05:26 But one of the things that people say to me all the time, and you hear it in speeches and all this stuff,
0:05:28 like AI is going to change the nature of warfare.
0:05:29 Yeah.
0:05:33 On the one hand, the major breakthrough that we just had, the way everyone interacts with it,
0:05:35 is like a chat bot and an LLM.
0:05:36 It’s pretty cool.
0:05:37 It’s amazing.
0:05:38 It’s awesome.
0:05:39 I use it for everything.
0:05:47 But what are the implications of this new wave of AI, generative AI, on modern warfare in the physical sense?
0:05:48 You know the software side.
0:05:49 Let’s talk about that.
0:05:54 So when I think about where AI is going to drive the most value for warfare,
0:06:00 it is dealing with the scale problem, which is really the amount of information: the number of sensors,
0:06:03 the sheer volume of systems that are going to be fielded is going to go through the roof.
0:06:05 So this is like lattice.
0:06:06 Maybe start even there.
0:06:08 Everything has a sensor.
0:06:09 That’s right.
0:06:11 So what do people do in the DoD?
0:06:15 There’s a lot of things they do, but what’s the primary war fighting function?
0:06:18 They are trying to find where the adversaries are.
0:06:19 Yes.
0:06:21 They need to then deploy effects against them.
0:06:23 That can be a strike.
0:06:25 That can be deterring them by a show of force.
0:06:28 That can be jamming and non-kinetic things.
0:06:31 And they’ve got to then assess, did that actually work?
0:06:32 Find them.
0:06:35 You’ve got to engage and you’ve got to assess.
0:06:36 It’s pretty straightforward.
0:06:39 That is the primary thing that the military does.
0:06:40 And so, okay, what do you need to do that?
0:06:42 You need a ton of sensors.
0:06:47 You need a ton of information on what is going on with an adversary who is constantly trying to hide from you and deceive you.
0:06:52 So just huge amounts of information to make that problem as intractable as possible for them to be able to hide
0:06:55 or when they are deceiving, you can figure it out.
0:06:58 Hard to deceive in every single phenomenology of sensing.
0:07:00 This technology exists, right?
0:07:01 The sensors exist.
0:07:02 The sensors all exist.
0:07:03 The sensors are deployed, right?
0:07:06 They’re going to get better, they’re going to be cheaper, and you’re going to be able to field more of them.
0:07:10 But a lot of the limit of why we can’t do more is what the hell are you going to do with the data?
0:07:11 Processing capabilities, yeah.
0:07:13 Processing, but also just operationally.
0:07:20 So, okay, now I say I had a perfect AI system that could tell me where every ship, aircraft, and soldier was in the world.
0:07:21 What are you going to do with that?
0:07:22 Now I know everything.
0:07:24 That is overwhelming, right?
0:07:28 And so then being able to sift through that information to, well, okay, they’re maneuvering here.
0:07:29 What does that imply?
0:07:31 Is this an aggressive action?
0:07:32 Is it outside their norms?
0:07:35 Is this different than we’ve seen in the past?
0:07:41 Being able to pull out that signal from just this overwhelming amount of information that exists.
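A minimal sketch of the kind of filter being described, flagging only the tracks that deviate from historical norms so a human sees signal instead of raw volume. Everything below (names, fields, thresholds) is a hypothetical illustration, not Anduril’s actual Lattice code:

```python
from dataclasses import dataclass
from statistics import mean, stdev

@dataclass
class Track:
    track_id: str
    speed_knots: float  # current observed speed for this track

def is_anomalous(track: Track, historical_speeds: list[float],
                 z_threshold: float = 3.0) -> bool:
    """Flag a track whose speed deviates sharply from its historical norm.

    A z-score against past observations is the simplest possible baseline;
    a real system would model position, formations, emissions, and more.
    """
    if len(historical_speeds) < 2:
        return False  # not enough history to define "normal"
    mu, sigma = mean(historical_speeds), stdev(historical_speeds)
    if sigma == 0:
        return track.speed_knots != mu
    return abs(track.speed_knots - mu) / sigma > z_threshold

# Thousands of tracks come in; only the outliers reach a human operator.
history = [12.0, 13.1, 11.8, 12.6, 12.2]
tracks = [Track("ship-01", 31.0), Track("ship-02", 12.5)]
alerts = [t for t in tracks if is_anomalous(t, history)]  # only ship-01
```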
0:07:43 And then the other side, you got to act, right?
0:07:46 So now I’ve got to actually be able to carry out these missions.
0:07:47 Yeah.
0:07:51 So this is where on the autonomy side, it really comes in, which is, okay, I want to send fighter pilots out.
0:07:54 So the way they do like a predator drone today is like a guy with a joystick.
0:07:55 Yeah.
0:07:56 We’ve all seen that Ukraine, Russia.
0:07:57 Yeah, exactly.
0:07:59 It’s all like manually piloted.
0:08:01 But that doesn’t really scale.
0:08:05 And that presents a lot of limitations: communications, jamming, all these things.
0:08:11 So I want to be able to task a team of drones to go out and say, hey, go in this area and find any ships and tell me where they are.
0:08:12 I just want it to be that simple.
0:08:14 They just need to figure out their own route.
0:08:17 And if I lose some of them, they rebalance, they just go out and handle it.
0:08:18 They’re running target recognition.
0:08:20 They can pop back whatever is relevant.
0:08:26 That is where I think the autonomy side really comes in, which is I can just drive scale into the number of systems I can operate in the environment.
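As a toy illustration of that rebalancing idea: assign search sectors greedily to whichever surviving drone is closest, and simply re-run the assignment when a drone drops out. The structure is a hypothetical sketch, not any real tasking API:

```python
Point = tuple[float, float]

def assign_sectors(drones: dict[str, Point],
                   sectors: list[Point]) -> dict[str, list[Point]]:
    """Give each search sector to the nearest surviving drone.

    drones maps a drone id to its (x, y) position; sectors are (x, y)
    centroids of the areas to cover. Losing a drone just means calling
    this again without it, and the team re-covers the whole area.
    """
    plan: dict[str, list[Point]] = {d: [] for d in drones}
    for sector in sectors:
        nearest = min(drones, key=lambda d: (drones[d][0] - sector[0]) ** 2
                                            + (drones[d][1] - sector[1]) ** 2)
        plan[nearest].append(sector)
    return plan

team: dict[str, Point] = {"uav-1": (0, 0), "uav-2": (10, 0), "uav-3": (5, 8)}
sectors: list[Point] = [(1, 1), (9, 1), (5, 9), (4, 4)]
plan = assign_sectors(team, sectors)

team.pop("uav-2")                     # one drone is lost mid-mission
plan = assign_sectors(team, sectors)  # the survivors rebalance automatically
```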
0:08:33 The promise of AI in a lot of ways in the long run with this is just the ability to scale the types of operations I’m doing, the amount of information I have.
0:08:40 And if done very well, it will put humans into a place of sort of better decision making, right?
0:08:45 Instead of being like inundated by a volume of data and then all of our capacity goes to these mechanical tasks.
0:08:53 We can have humans with much better context, much better understanding, historical understanding of what this means, what the implications of different choices are.
0:08:54 Yeah.
0:08:56 Those are all things that AI can enable over time.
0:08:58 Ideally better decision making.
0:09:00 Ideally, because today we are wildly imperfect decision makers.
0:09:04 We’re working with both limited information and imperfect judgment.
0:09:05 That’s right.
0:09:06 I guess, right?
0:09:07 Yeah.
0:09:13 And so the more you can have AI augmentation for these things and synthesis and like clarity, that is where the promise of this is.
0:09:19 And so the U.S. posture on this is very much, we want to have humans accountable for what happens in war.
0:09:20 Yes.
0:09:21 That is how it should be, right?
0:09:25 The military commander that employs a weapon is accountable for the impact of those weapons.
0:09:26 Yeah.
0:09:27 That is correct.
0:09:29 I think that is the system we should have.
0:09:34 And so then nobody is talking about having full-blown AI that is going to decide who lives and dies.
0:09:38 That is a crazy version that nobody wants to have.
0:09:44 Well, I think it’s also far-fetched in the sense that it presumes some sort of objective function that isn’t driven by us.
0:09:45 Exactly.
0:09:46 This is my conversation with everybody.
0:09:47 Yeah.
0:09:49 Oh, my God, what about when the AI goes Terminator on us?
0:09:51 I’m like, it’s a tool for humans.
0:09:53 It doesn’t have an objective function.
0:09:55 That’s a leap that is not on the scientific roadmap today.
0:09:56 That’s right.
0:09:58 So, like, why would that be the case in warfare?
0:09:59 That’s right.
0:10:02 And so I think the reality for these things is it’s going to be human augmentation.
0:10:07 It is going to be enabling humans to operate at a much larger scale with much higher precision on these things.
0:10:09 And that is the opportunity with it.
0:10:10 Yeah.
0:10:15 And so to me, it is unethical to not apply these technologies to these problems.
0:10:21 Our view has always been, if we’re the best technologists on these problems or we can get the best technologists to it,
0:10:26 giving the best tools on these absolutely critical decisions that are extremely material,
0:10:28 that seems like probably a good thing.
0:10:33 And engaging in the question of how can you use this technology responsibly and ethically is incredibly important.
0:10:34 Yeah.
0:10:39 Is it more humane to have a fighter pilot in the way of danger, or to have an autonomous system
0:10:40 That’s right.
0:10:41 Piloting in a conflict?
0:10:42 That’s right.
0:10:43 And by the way, I have friends who are fighter pilots.
0:10:44 I love fighter pilots.
0:10:45 Yeah.
0:10:47 But the technology has advanced significantly.
0:10:52 And you can make the argument that it is more humane not to put them in the line of fire in the way of danger, right?
0:10:53 Yeah.
0:10:54 We’re not going to want to put US troops at risk.
0:10:55 Yes.
0:11:04 And I think there’s the deterrence factor of the US saying, I have this capability and I’ve reduced my political cost of engaging on these things.
0:11:06 It’s actually a pretty good deterrent as well.
0:11:07 Yeah.
0:11:09 I’m not putting US troops or I can give this to allies.
0:11:10 Yes.
0:11:11 And they can defend themselves.
0:11:12 Yep.
0:11:13 Keep us out of the fight.
0:11:14 Yeah.
0:11:15 Keep our troops out of the fight.
0:11:16 Keep the troops out of the fight.
0:11:17 And it changes the calculus quite a bit.
0:11:25 And so I think that actually in a lot of ways if done well has a significant kind of stabilizing impact and a deterrent impact.
0:11:29 It just is harder to use force to get your political ends.
0:11:30 Yes.
0:11:31 Exactly.
0:11:32 And I think that can be a very positive thing.
0:11:33 Yeah, exactly.
0:11:34 Yeah.
0:11:39 I keep coming back to deterrence and we need to find a way to create a sense of urgency for the sake of deterrence.
0:11:40 Yeah.
0:11:41 Not for the sake of going to war.
0:11:48 And so it feels like that’s universally like people, people we talk to, I feel like that’s universally known and hopefully we can make some progress.
0:11:49 Yeah.
0:11:50 I think people largely agree.
0:11:52 Well, Vladimir Putin was very convincing of this.
0:11:53 Yes.
0:12:00 Like it turns out invading Ukraine was probably the single biggest shift I’ve seen in terms of people recognizing that, look, there are still bad actors in the world.
0:12:04 They will use force to get their political ends if they think it will work.
0:12:05 Yeah.
0:12:07 If the cost is worth it, they’re going to do it.
0:12:09 And I don’t think there’s any reason to believe that’s going to stop.
0:12:11 It’s been true for tens of thousands of years.
0:12:16 Do you think the future of war first, you said AI is an augmentation for humans?
0:12:17 Yep.
0:12:21 How fully automated do you think a conflict can become, say in the next 10 years?
0:12:31 Look, I think the mechanics of, okay, there’s this airfield and you want to go surveil it and take it.
0:12:32 You’re going to do some strike.
0:12:33 You’re going to do some surveillance.
0:12:34 You’re going to do all these things.
0:12:37 There will be a large degree of automation in that, right?
0:12:43 Like I can just say, hey, send this team of drones out in these waves to go conduct this operation,
0:12:45 find things that pop up that are a threat.
0:12:48 Tell me if you pop up to the human to say engage or not.
0:12:49 It goes, right?
0:12:51 Like you can move at a much faster pace.
0:12:53 I think a lot of the things that were starting to happen in Ukraine,
0:12:56 a lot of the great work Palatier did was on things like this,
0:12:59 where it was like the targeting process of going from satellite imagery through to,
0:13:05 hey, this looks like a tank through to an approval of, is this a legitimate military target or not?
0:13:06 It was streamlined and compressed.
0:13:07 Much faster workflow.
0:13:08 Much, much faster.
0:13:11 So I think those things will happen very, very quickly.
0:13:12 Like very, very quickly.
0:13:17 Then, okay, now it turns into a matter of policy and degree and scope.
0:13:22 That is a thing that I think we’re just going to have to figure out as we work through it with the military.
0:13:24 So what we think about from the technology side is,
0:13:28 okay, I don’t want to design anything that precludes more advanced forms of this over time,
0:13:30 architect it correctly.
0:13:35 But the crawl phase is just getting a lot of the basics automated, very mechanical things.
0:13:36 Make it very predictable.
0:13:37 Don’t have any surprises.
0:13:39 And then you can add more sophistication.
0:13:43 As you build trust, the AI advances, these things get more sophisticated over time.
0:13:48 And one of the best examples is on the defensive side, where it’s ripe for AI.
0:13:50 So we do a lot of work on counter drone systems.
0:13:52 This is one of the areas we’re partnering with OpenAI on.
0:13:57 And it’s looking at this question of if you have multiple drones flying at you
0:14:02 and you have minutes to respond before the strike happens on you.
0:14:03 How do you make an optimal decision?
0:14:04 How do you make an optimal decision?
0:14:07 When you are panicked, you are nervous and your life is at risk.
0:14:12 Is that a person manually sitting there making those decisions today?
0:14:16 Yeah, it’s often three because they have a separate radar from a camera
0:14:19 or separate from the guy pulling the trigger on the weapon systems.
0:14:22 And so then the coordination costs can be significant.
0:14:24 So you can automate a lot of this.
0:14:26 And then the other problem with this is then, as we’ve seen in Ukraine,
0:14:30 every single unit, every single soldier is now at risk of drones.
0:14:35 So this has to proliferate out from being a specialty that you do in an operation center
0:14:37 now to every vehicle in the field.
0:14:40 In the field, everyone has to have this capability.
0:14:45 You need the ability to have these systems just process all that sensor data
0:14:49 automatically, fuse it together, tell you viable options for countering this
0:14:51 and tell you what’s a threat and what’s not a threat.
0:14:53 Like these are the types of things you need to be able to do,
0:14:56 respond with intelligence suggestions,
0:14:59 and then have the system just automatically carry it out from there.
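A highly simplified sketch of that pipeline: fuse hits from separate sensors into threat tracks, propose ranked countermeasures, and gate any engagement on explicit human approval. Every name and score below is hypothetical, purely to show the human-on-the-loop shape:

```python
from dataclasses import dataclass

@dataclass
class SensorHit:
    source: str                    # "radar", "camera", "rf", ...
    position: tuple[float, float]
    confidence: float              # 0..1

@dataclass
class ThreatTrack:
    position: tuple[float, float]
    confidence: float

def fuse(hits: list[SensorHit], radius_m: float = 50.0) -> list[ThreatTrack]:
    """Merge hits from different sensors that fall within radius_m of each other."""
    tracks: list[ThreatTrack] = []
    for hit in sorted(hits, key=lambda h: -h.confidence):
        for track in tracks:
            dist = ((track.position[0] - hit.position[0]) ** 2
                    + (track.position[1] - hit.position[1]) ** 2) ** 0.5
            if dist < radius_m:
                track.confidence = max(track.confidence, hit.confidence)
                break
        else:
            tracks.append(ThreatTrack(hit.position, hit.confidence))
    return tracks

def propose_options(track: ThreatTrack) -> list[str]:
    """Rank countermeasures; a real system would weigh range, inventory, collateral."""
    return ["jam", "intercept"] if track.confidence > 0.8 else ["monitor", "jam"]

def engage(track: ThreatTrack, option: str, human_approved: bool) -> str:
    # The human-on-the-loop gate: nothing is carried out without approval.
    if not human_approved:
        return "holding: awaiting operator decision"
    return f"executing {option} on track at {track.position}"

hits = [SensorHit("radar", (120.0, 40.0), 0.7), SensorHit("camera", (118.0, 42.0), 0.9)]
for track in fuse(hits):
    options = propose_options(track)  # the system suggests, ranked
    print(engage(track, options[0], human_approved=True))
```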
0:15:01 Yeah, these are the types of problems we’re working on.
0:15:05 And the defensive side is just you need it, right?
0:15:09 There’s no choice because the timelines are too short and the urgency is too high.
0:15:17 And it’s a very straightforward area to understand where technology can really improve the problem.
0:15:23 Yeah, it’s like the highest stakes version of decisioning that autonomous driving cars are doing today,
0:15:25 but with way more sensor information.
0:15:27 Yeah, it’s not a road.
0:15:31 Yes, with an adversary who’s constantly trying to fool you and deceive you.
0:15:33 And yes, it’s very, very hard.
0:15:36 So that’s one of the big parts of the partnership with OpenAI.
0:15:37 Yeah, yeah.
0:15:38 So they’ve been great.
0:15:43 And I think Sam especially has been very clear that he supports our warfighters
0:15:48 and he cares about having the best minds in AI working on national security
0:15:51 and who better exists to work through these hard problems.
0:15:52 Yeah.
0:15:55 And so I was just incredibly proud of them for coming out in favor of this
0:15:56 and saying they’re going to work on this.
0:15:57 They’re going to do it responsibly.
0:15:58 They’re going to do it ethically.
0:16:02 But this is an important problem that the best people should be working on.
0:16:07 The defense industry is notoriously difficult for startups to navigate.
0:16:11 So how did you guys actually get traction in the first place?
0:16:16 And do you think that’s going to change in the future?
0:16:18 Do you think it will continue to be hard?
0:16:20 Do you think the primes will continue to have a stranglehold?
0:16:21 I’d love your take on that.
0:16:22 It is very hard.
0:16:27 And I think we built a lot of the right technology and the right business model
0:16:30 of investing in things that we believe need to exist.
0:16:32 I think we’re picking a lot of the right problems to go after,
0:16:34 but probably more than anything,
0:16:37 I think we understood the nature of what it took to sell, right?
0:16:40 And the congressional relationships, the Pentagon relationships,
0:16:44 the military relationships, like all of this that you need to be able to say,
0:16:45 “Hey, we have the right tech.
0:16:46 You can trust us.
0:16:47 We can scale.
0:16:49 We can actually solve these problems for you.”
0:16:55 Proving that it works and then like catalyzing all of these really complex processes around it.
0:17:00 I think the other part that we’ve done quite well is we’re just finding ways
0:17:03 to find those early adopters and we understand those playbooks.
0:17:04 Who’s going to move quick?
0:17:08 How do you just build that momentum and advocacy in the government to make this go?
0:17:10 Look, it’s like more bureaucratic in certain ways.
0:17:13 Is it much worse than selling to a bank or an oil and gas company?
0:17:17 It’s, I don’t know, maybe 30% worse, but like probably not 5x worse.
0:17:18 Yeah.
0:17:22 And I think the reality is it’s like enterprise sales are actually very hard.
0:17:25 Especially the ones with long sales cycles and massive commitments.
0:17:26 That’s right.
0:17:28 These are large capital investments customers making.
0:17:29 That is a slow sales cycle.
0:17:30 That is how it works.
0:17:31 Yeah.
0:17:35 And so I think there’s like a lot of complaining and frustration.
0:17:36 It’s okay.
0:17:38 Well, also being bad at business means you’re bad at business.
0:17:40 If you don’t understand your customer, you’re going to lose.
0:17:41 Yeah.
0:17:42 That’s how it works.
0:17:44 So do I think the government needs to be a better buyer of these things?
0:17:49 Do I think they need to like take better strategies that’ll get them more what they want?
0:17:50 Absolutely.
0:17:55 They’re taking observably bad strategies to get to the outcome they actually want.
0:17:58 Do I think it’s necessary to change for us to be successful?
0:17:59 Not really.
0:18:01 We’re just going to play the game that they present.
0:18:02 Okay.
0:18:04 I want to talk about the observably bad strategies.
0:18:05 Yeah.
0:18:06 What are the observably bad strategies?
0:18:07 Right.
0:18:08 And then what are the good ones?
0:18:13 And maybe also wrap it into this idea that how do you actually convince the government
0:18:15 that your ideas are the right ideas?
0:18:16 Yeah.
0:18:23 So take: should you go spend money building a whole new generation of F-35s with human pilots
0:18:27 or a whole new generation of aircraft carriers, or should you do something different?
0:18:29 And how do you actually get your points across to them?
0:18:30 Okay.
0:18:34 So they’re sort of like, how do they contract and buy and what’s been going wrong there?
0:18:38 And then it’s what’s the right composition of even if you could buy perfectly well,
0:18:39 what should you be buying?
0:18:40 What should you buy?
0:18:41 Yeah.
0:18:42 Two different questions.
0:18:43 Yeah.
0:18:46 So the typical government contracts are done in what’s called cost plus fixed fee.
0:18:51 And this actually came out of World War II when we were retooling industry to work on
0:18:52 national security problems.
0:18:55 We’re just like, we’re going to cover all your costs and we’ll give you a fixed profit percentage
0:18:56 on top.
0:18:58 And so the incentives here are sort of obvious.
0:19:00 If it’s more expensive, you get more profit.
0:19:02 If it is less reliable, you get more profit, right?
0:19:03 The longer it takes.
0:19:04 Yeah.
0:19:07 The longer it takes, the less reliable it is, the more complicated it is.
0:19:11 There’s no point of incentive in there to actually drive down costs.
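The incentive problem is easy to see with toy numbers. A hypothetical sketch following the fixed-percentage description above (illustrative rates, not actual contract terms):

```python
def cost_plus_profit(actual_cost: float, fee_rate: float = 0.10) -> float:
    """Cost-plus: the government reimburses all costs and pays a percentage
    fee on top, so every dollar of overrun *increases* the contractor's fee."""
    return actual_cost * fee_rate

def fixed_price_profit(agreed_price: float, actual_cost: float) -> float:
    """Fixed price: the contractor keeps savings and eats overruns."""
    return agreed_price - actual_cost

# A program estimated at $1B that overruns to $1.5B:
print(cost_plus_profit(1.5e9))            # $150M fee, $50M more than on budget
print(fixed_price_profit(1.0e9, 1.5e9))   # -$500M, the overrun hits the contractor
```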
0:19:12 And you see this play out, right?
0:19:16 It’s like the companies have gotten so used to this where you look at even something like
0:19:20 Starliner where, you know, I think SpaceX had a third the amount of money that Boeing
0:19:23 was given to make Starliner work.
0:19:27 And SpaceX did it on time probably faster than they even predicted.
0:19:30 Did it probably incredibly profitably and it worked.
0:19:33 And so I think these incentives that don’t hold you accountable are actually bad for
0:19:34 your company.
0:19:36 It just makes you a worse company.
0:19:40 But do people in the government realize that it’s bad for the country?
0:19:41 I think they are frustrated.
0:19:44 I think they understand that this is not really working.
0:19:50 So you look at like F-35 as an example of one of these programs, it took 25 years to
0:19:53 get it from initial concept to fielding.
0:19:57 There’s this awesome chart which is like shows how long it takes to get commercial aircraft
0:20:00 or autos from kickoff to fielding.
0:20:03 And it’s been like flat to slightly better for all of those things, like on the order
0:20:05 of two to three years.
0:20:08 The military aircraft side just went linearly straight up.
0:20:12 If these things are taking longer and longer and longer, there’s an amazing quote that
0:20:15 if you extrapolate this out, this is in the 90s, this guy made this quote.
0:20:20 If you extrapolate this out by like 2046, the US government will be able to afford one
0:20:25 airplane that the Air Force and Navy share, and the Marine Corps gets it
0:20:26 every other day of the week.
0:20:27 It better be a good airplane.
0:20:28 Yeah.
0:20:29 These things are just crazy.
0:20:31 And so I think they recognize that this is not working, right?
0:20:32 This is broken.
0:20:34 Now the other part of this, they haven’t had a lot of alternatives.
0:20:40 So you have a relatively small cartel of these companies who sort of all say we won’t do
0:20:41 fixed price programs anymore.
0:20:44 They won’t do things on a fixed cost basis.
0:20:46 So okay, if you’re the government and you’re a buyer, what are you going to do?
0:20:47 Yeah, of course.
0:20:48 You don’t have a lot of choice here.
0:20:52 And there’s been a lot of problems with trying to get this model right.
0:20:55 Now, there’s some things that can work a lot better.
0:21:01 I think SpaceX really proved this where they literally built a reusable rocket that you
0:21:03 catch with chopsticks commercially.
0:21:07 I think we could solve these things, guys.
0:21:08 What thing can’t we build?
0:21:09 I think we could build an airplane.
0:21:11 I think we could build it.
0:21:16 So there’s not really a question anymore that this is some magical thing that
0:21:18 only they can do, like they’re the only people that can do it.
0:21:19 Not anymore.
0:21:20 And you guys with autonomous fighter jets.
0:21:21 Exactly.
0:21:22 Yeah.
0:21:23 Like it’s proven now that the future can be built.
0:21:24 Yeah.
0:21:25 Yeah.
0:21:26 Exactly.
0:21:27 And so I think the alternatives are there now.
0:21:28 And then models that can work a lot better.
0:21:33 It’s like one of my crazier examples: a new missile takes about 12 years to go from
0:21:35 concept through to fielding, like 12 years.
0:21:36 That’s insane.
0:21:37 It’s insane.
0:21:38 And so, okay, if you’re in that world…
0:21:40 But how fast is the technology evolving?
0:21:41 Oh, like…
0:21:44 This is like 12 years from now, like what we’ll be able to do.
0:21:45 Right.
0:21:46 Exactly.
0:21:47 And then we’ll still be on the previous system.
0:21:48 No, there’s even crazier examples.
0:21:52 Like the Columbia-class nuclear submarine is going into service in 2035, and its expected
0:21:54 lifetime runs through 2085.
0:21:57 So how good would we have been in 1960 at guessing where we are today?
0:21:58 It’s like…
0:21:59 So unclear.
0:22:00 So unclear.
0:22:01 Yeah, we had technology to go to the moon.
0:22:02 Yeah.
0:22:03 Exactly.
0:22:04 Like the quality of a phone.
0:22:05 Yeah.
0:22:06 That was like the computing power of a phone.
0:22:07 Yeah.
0:22:11 And so, these timelines just get longer and longer, and this is a death spiral, these things.
0:22:14 Contrast that cycle of development with China.
0:22:15 Do they take 12 years?
0:22:17 How does their tech stack up to ours?
0:22:22 The single best stat for this is they are running hundreds of tests a year of hypersonic
0:22:23 weapons.
0:22:24 Right.
0:22:25 The U.S. is running like four.
0:22:26 Right.
0:22:29 Anyone who’s worked in technology understands the compounding value of iterating on these
0:22:33 things, and it is just so undervalued.
0:22:34 Why is that the case?
0:22:39 Look, the U.S., all these tests are very expensive, very complicated.
0:22:42 There’s so much build up because every test has to go well, because we do relatively
0:22:43 few tests.
0:22:46 So then it increases the risk and the duration that you prep for these tests and increase
0:22:47 the cost.
0:22:48 It’s cycle times are long.
0:22:49 Yeah.
0:22:50 And you’re just in this vicious negative cycle.
0:22:53 Like anyone who’s worked in software understands this, like the old school way of releasing
0:22:54 software.
0:22:56 If you did a yearly release, you try to shove everything you can into that.
0:22:59 The risk goes through the roof, quality is a disaster.
0:23:06 Pace has an insane quality of its own, in just how quickly you can learn and how
0:23:09 much you can actually reduce costs on these things.
0:23:14 And so they’re just much more willing to test and iterate in a way that the U.S. is not
0:23:15 right now.
0:23:18 And so I think that is like long-term, the biggest thing I worry about for the U.S. is
0:23:22 the pace, the pace of iteration, pace of iteration on these things.
0:23:27 It probably is the single biggest determining factor of how successful you’re going to be
0:23:29 over a 20 to 30 year period.
0:23:30 How do we create a sense of urgency?
0:23:31 Yeah.
0:23:35 Like you look at that retooling, we had a two year period of lend lease and the amount
0:23:38 of GDP that was spent on lend lease, the time was through the roof.
0:23:41 And we weren’t at war then for others people.
0:23:46 So we had a two year head start to recondition U.S. industry around this before we even entered
0:23:47 into a conflict.
0:23:48 And that’s about how long it took.
0:23:53 And look at Russia in particular: about the same duration, about two years to retool their industry around
0:23:58 defense production, and they are now out-producing all of NATO on munitions.
0:23:59 Russia.
0:24:00 Yeah.
0:24:01 Russia.
0:24:02 Well, I believe it.
0:24:04 And we’ve sanctioned them to hell and they’re still doing it, right?
0:24:05 Well, they still have gas.
0:24:07 They still have plenty of gas.
0:24:11 And so it’s quite tricky to think you’re going to reconstitute this in a single day.
0:24:13 I think the department has a lot of urgency on it.
0:24:15 One of the areas where we see it is showing up with weapons.
0:24:20 So when you look at these wargaming sort of scenarios, all these wargames are sort of
0:24:22 questionable in their own ways.
0:24:27 But pretty consistently, the stockpile of key U.S. munitions is exhausted in about eight
0:24:28 days.
0:24:30 And that’s usually problematic.
0:24:37 And that is because we have gone down this path of thinking that we’ll be able to have
0:24:40 this Gulf War strategy of concluding a conflict in two or three days and that’s how we’re
0:24:41 going to fight our wars.
0:24:42 And it’s just not true.
0:24:43 Right?
0:24:45 It’s like any of these high intensity conflicts.
0:24:47 Well, for any adversary that matters.
0:24:48 That’s right.
0:24:49 It’s not even close.
0:24:51 We’ve got to be prepared to sustain these protracted conflicts.
0:24:54 And that in and of itself is probably one of the best deterrent factors we can have.
0:24:55 Exactly.
0:24:56 It’s like, we will not stop.
0:24:57 We will not back down.
0:24:59 We will have the capacity to withstand anything.
0:25:00 Right?
0:25:03 That is a message we need to send to our adversaries worldwide.
0:25:06 We have critical gaps on a lot of the kind of constituent parts of the supply chain.
0:25:08 This is a national security issue.
0:25:11 So I think there is a feeling that this is a problem.
0:25:13 I don’t think anyone thinks everything’s going great.
0:25:15 Now the question is, what are the strategies to find a way out?
0:25:16 Right?
0:25:19 I don’t think there’s any debate that we’re on our back foot in terms of the capacity
0:25:22 we need, the mass we need, the type systems we need.
0:25:24 Now like, how do you get out of it?
0:25:25 That’s a much harder question.
0:25:31 And do it in a way that is going to work with Congress, is affordable, is actually something
0:25:32 we can sustain.
0:25:36 The path we’re on is probably more incremental than revolutionary, I would say, with the
0:25:40 US government, where companies like us are going to come in and win incremental new programs
0:25:41 and solve these different problems.
0:25:42 Yes.
0:25:43 We’ll be more innovative.
0:25:44 We’ll be more innovative.
0:25:45 I think that flywheel is really starting to go.
0:25:47 We still have a volume issue.
0:25:48 But it is a major volume issue.
0:25:53 And I think on the weapons production side, look, the only solve out of this is to actually
0:25:56 tap into the commercial and industrial supply chains that exist.
0:25:57 We’re pretty good at building cars.
0:25:58 We’re pretty good at building electronics.
0:25:59 Certainly the components that go into it.
0:26:00 The components for sure.
0:26:01 We’ve been building a lot of components.
0:26:02 Like we can do this stuff.
0:26:03 Yeah.
0:26:06 And you could design your systems in a way that take advantage of those commercial supply
0:26:07 chains.
0:26:09 Like one example we have is maybe like a low cost cruise missile.
0:26:10 Yeah.
0:26:11 It’s very cool.
0:26:12 Several hundred mile range.
0:26:16 And we made the exterior fuselage in this process that’s used for making acrylic bathtubs.
0:26:17 It’s this hot press process.
0:26:19 It’s like we’re making the gas tank.
0:26:20 This is a mad scientist thing.
0:26:21 It’s incredible.
0:26:22 It’s awesome.
0:26:25 And the fuel tank is made with the same rotomolding process you use for, like, making
0:26:26 plastic toys.
0:26:27 And it works great.
0:26:28 Yeah.
0:26:30 There’s a huge supply base that’s available to do these things.
0:26:35 And contrast that with most of these traditional weapons where it’s like overly bespoke components,
0:26:40 we’ve got to get the dude that knows how to solder this one thing out of retirement, and the supply
0:26:44 chains are super deep, like four year lead times on these weapons.
0:26:46 Like it’s really, really bad once you get into it.
0:26:50 Like I saw this thing, the defense primes were like, we need to change the federal acquisition
0:26:55 rules so that we can stockpile four year lead time parts for like a four year lead time
0:26:56 part.
0:26:58 Like, what are we even doing?
0:27:00 The world has changed in four years.
0:27:01 Yeah.
0:27:02 Like, who knows what will be happening then?
0:27:04 And so I think there’s a problem, but then the government doesn’t help or they don’t
0:27:06 allow them to change the components.
0:27:08 There’s no incentive to change the components.
0:27:09 Well, so this is the problem.
0:27:10 There’s no urgency.
0:27:11 It goes back to there’s no urgency.
0:27:12 Exactly.
0:27:14 And so look, I think a lot of the traditional players are like patriots and they really
0:27:18 care, but it’s like they’re in a system that doesn’t encourage them or support them.
0:27:21 I kind of boil it down to like two key things.
0:27:23 One is meaningful redirection of resources.
0:27:29 So like right now the amount of money that’s actually spent on capabilities, like the types
0:27:33 of things working on it somewhere between 0.1 and 0.2% of the defense budget.
0:27:34 That seems pretty loud.
0:27:40 Even if we got to 2%, 2% we are like in a wildly different world in terms of what you
0:27:42 can do with that type of money.
0:27:44 You’re making like a VC sounding pitch.
0:27:50 Yeah, if I could even, if I could just get 1%, but that’s actually very helpful context
0:27:51 all kidding aside.
0:27:53 This is a crazy small number.
0:27:57 It’s a crazy small number and even the small numbers are pretty big, but you really need
0:27:58 to up this.
0:28:03 So like number one is make the hard choices to drive redirection of resources into the
0:28:06 technologies that are actually going to be what you need, right?
0:28:10 Where they’re so stuck with these legacy costs.
0:28:15 Number two is every company in the world gets this, which is you need to empower good people
0:28:20 to run hard at a problem and put all the things that they need to do it and all the approvals
0:28:23 and all of that under their command to just get to yes.
0:28:24 Yes.
0:28:25 Yes.
0:28:26 It’s very simple, right?
0:28:29 That’s how every company operates and that is how you’re successful.
0:28:31 Just empower good leaders to get results.
0:28:32 Yeah.
0:28:33 Hold them accountable.
0:28:36 It is the opposite of how it works in the Pentagon where every time something has gone
0:28:41 wrong, a new process and a new office has been added to check the homework and say no
0:28:43 and they saw all progress out.
0:28:46 And so I think there’s relatively simple things that can be done with some combination of
0:28:52 congressional action and executive action to flip that on its head, say, nope, these
0:28:57 program offices are fully empowered to field their capabilities and they’re just accountable
0:29:00 to senior leaders on the risk of trade offs.
0:29:01 Yeah.
0:29:02 And that’s it.
0:29:05 And you give them a budget, give them a target, and they have to understand
0:29:06 the risk.
0:29:10 They have to do all this, but they’re going to make informed choices on risk and cost
0:29:11 and schedule and performance trade offs.
0:29:12 Yeah.
0:29:13 That’s their job.
0:29:14 That’s what we’re hiring them to do.
0:29:18 If we create really empowered people to actually field stuff, you will get amazing results
0:29:20 because there are really good people in the government.
0:29:24 It’s just there are 10 times as many people who say no as there are to people who are
0:29:25 accountable for doing it.
0:29:26 Oh, that’s fascinating.
0:29:28 10 times more people who hang around to say no than to say yes.
0:29:29 That’s right.
0:29:32 Could you do just like a project warp speed for defense?
0:29:37 I know that’s like, that implies something short term, like it’s like a one-time catch-up
0:29:38 or something.
0:29:40 Yeah, this is probably needs to be just like a permanent shift.
0:29:41 I think you have to do both.
0:29:42 Right.
0:29:45 So you’ve got to say, look, yeah, we need a warp speed for autonomous systems or weapons.
0:29:46 We need that.
0:29:47 Right.
0:29:48 That’s a no brainer that we need to have.
0:29:53 And in doing that, you can tease out which of those things you cut where everything
0:29:55 worked out fine.
0:29:58 And you just don’t need to do them again.
0:30:02 And then in parallel, you do the painful and slow process of just whacking back all
0:30:04 these like bureaucratic things that exist.
0:30:07 I think you got to do something right and use that as a template.
0:30:12 And so these sort of like things that prove you can be successful, do more of them, go
0:30:16 at bigger scale, while also cutting back all the nonsense on things that just don’t need
0:30:17 to exist anymore.
0:30:19 They made sense at the time.
0:30:23 Now let’s revert, walk back and reset where we actually need to be for where we are.
0:30:24 Like tech has changed.
0:30:25 The pace has changed.
0:30:26 Reflect that in your process.
0:30:31 It seems even before the stuff we were just talking about in 2019, when you guys started
0:30:37 the company in 2017, starting a company in defense was extremely unpopular.
0:30:41 And when you talk about what do you need to succeed as a startup, there’s so many things,
0:30:46 but capital talent, relationships with customers, like all of those things are way, way harder
0:30:52 or were way, way harder in defense in 2017, and in fact, like radioactive for some in
0:30:53 2017.
0:30:57 A lot of the engineers and such were just, like, religiously opposed.
0:31:02 Now it seems that there’s this whole new burgeoning interest in defense startups and we have an
0:31:06 American Dynamism Fund and lots of people are interested.
0:31:07 How did that happen?
0:31:10 Because it seemed to happen a little bit before Ukraine too.
0:31:12 Started to shift just before Ukraine.
0:31:13 Yeah.
0:31:14 What was the cause of that?
0:31:19 So yeah, when we started, I mean, the number of VCs who gave us like ethics interviews or
0:31:27 just said no or look, like my crass take is that Silicon Valley is like quite mimetic.
0:31:29 The VC world as well.
0:31:36 And once the mainline funds like you guys, Founders Fund, General Catalyst all came out
0:31:41 and said, we’re doing this and like our valuation was high enough, then everyone was like, then
0:31:42 they got it.
0:31:43 Chase, chase.
0:31:44 Yeah.
0:31:46 I think that was like step one was it was sort of normalized.
0:31:47 Yeah.
0:31:50 And stream VC funds were saying, no, we’re doing this is important.
0:31:53 I know Mark put out post on it at the time.
0:31:58 And so I think that was like the snowball then of, okay, this is succeeding.
0:31:59 It’s actually okay.
0:32:01 Everyone’s been told it’s okay.
0:32:05 And then there was this catalyzing event around Ukraine.
0:32:09 And then I think on the why so many defense tech startups, it’s like, look, this stuff
0:32:11 is, I think it’s very important to work.
0:32:14 It’s also as an engineer, just some of the hardest and most interesting process you’re
0:32:15 going to work in.
0:32:16 Yeah.
0:32:23 When engineers grew up looking at Skunk Works and seeing the SR-71 Blackbird, all these
0:32:28 wild things that the US was able to pull off, that was your inspiration growing up as an
0:32:29 engineer.
0:32:30 Yeah.
0:32:31 Like this stuff is iconic.
0:32:32 People want to work on these things.
0:32:36 And so I think it just really mobilized people who really cared about this.
0:32:40 And then you have a ton of vets leaving the military who just want to solve problems
0:32:41 that they encountered.
0:32:42 Yeah.
0:32:44 And so you just have a ton of interest in working on it now, a ton of
0:32:49 capital because they’ve seen our success, they know it can be done, and then just the
0:32:52 social normalization of the whole thing really flipped the narrative.
0:32:53 Yeah.
0:32:59 And I would say the evolution of the sort of primitives for technology has actually advanced
0:33:01 the opportunity big time, right?
0:33:05 So like a lot of the dollars that would go to something like an aircraft carrier, which
0:33:11 is untouchable for a startup, should go to smaller form factor, attritable, fully autonomous
0:33:12 equipment.
0:33:13 You’re 100% right.
0:33:17 And a big part of our strategy on this has been like, we are leaning into everywhere
0:33:19 where there’s commercial investment.
0:33:24 And so many of the things that historically have been like defense exclusive are no longer
0:33:25 the case.
0:33:26 Totally.
0:33:28 One of the examples of this is we built this electronic warfare system.
0:33:29 It’s really cool.
0:33:31 It’s a jammer; it senses and jams radio signals.
0:33:35 If we did that five years ago, 10 years ago, you would have custom tape out chips.
0:33:36 It’s hundreds of millions.
0:33:37 Yeah.
0:33:38 And it’s a huge thing.
0:33:40 So only government funded things did it.
0:33:42 It was on a really slow cycle.
0:33:47 Well, now with all the 5G tech, this is like the performance of these things is through
0:33:48 the roof.
0:33:49 You just take commercial parts.
0:33:50 Yeah.
0:33:53 And then just being the fastest to integrate and understand how to utilize these technologies
0:33:54 becomes the advantage.
0:33:55 Same with AI.
0:33:57 It was like, we don’t do AI model research.
0:33:58 Yep.
0:33:59 We don’t need to.
0:34:00 Yeah.
0:34:01 We just take the best things that are there.
0:34:02 The best models.
0:34:03 Yeah, exactly.
0:34:04 So riding these tech waves has been a huge part of it.
0:34:08 And that is the macro shift that occurred that the department hasn’t reconciled yet,
0:34:13 which is like the innovation is much more coming from the commercial world.
0:34:15 So it becomes being the best adopter.
0:34:18 It is no longer these 10-year tech road maps of the department controls.
0:34:19 Yes, exactly.
0:34:21 It is a totally different world we’re living in.
0:34:25 And so I think, yeah, the macro piece of why a company like us can succeed: major technology
0:34:30 shifts around where the innovation is coming from, huge geopolitical shifts.
0:34:31 Yes.
0:34:35 And then the consolidation of the existing industrial base with the bad incentives has
0:34:37 led to an erosion of capacity.
0:34:41 And so you combine all these things together and you’re like, the conditions were sort
0:34:44 of set for us to be successful on.
0:34:45 Yes.
0:34:46 Yeah.
0:34:47 I don’t think we could have done it five years later.
0:34:48 It would be too late.
0:34:49 Five years earlier?
0:34:50 Probably would have been too early.
0:34:51 It wouldn’t have worked.
0:34:52 Yeah.
0:34:53 I think we were in this like two to three year window where we could ride all those waves
0:34:54 correctly.
0:34:55 Yeah.
0:34:56 Brian, it’s so fun to be with you.
0:34:57 Thanks a ton for spending the time.
0:34:58 Yeah.
0:35:01 Thank you for what you’re building, as your investor, but more importantly, for all of
0:35:02 America.
0:35:02 Thank you.
0:35:22 [MUSIC]

How is AI reshaping modern warfare? 

Speaking with a16z Growth General Partner David George, Anduril cofounder and CEO Brian Schimpf discusses how AI helps humans make better strategic decisions by sorting through the enormous amount of data collected from modern battlefields. Schimpf also discusses navigating the US government’s complex procurement processes, using commercial technologies to kickstart their own product development, and the growing opportunities for startups in defense. Throughout, Brian offers a deep dive into the intersection of technology, geopolitics, and the future of defense.

This episode is part of our AI Revolution series, where we explore how industry leaders are leveraging generative AI to steer innovation and navigate the next major platform shift. Discover more insights and content from the AI Revolution series at a16z.com/AIRevolution.

 

Resources: 

Find Brian on X: https://x.com/schimpfbrian

Find David on X: https://x.com/davidgeorge83

 

Stay Updated: 

Let us know what you think: https://ratethispodcast.com/a16z

Find a16z on Twitter: https://twitter.com/a16z

Find a16z on LinkedIn: https://www.linkedin.com/company/a16z

Subscribe on your favorite podcast app: https://a16z.simplecast.com/

Follow our host: https://twitter.com/stephsmithio

Please note that the content here is for informational purposes only; should NOT be taken as legal, business, tax, or investment advice or be used to evaluate any investment or security; and is not directed at any investors or potential investors in any a16z fund. a16z and its affiliates may maintain investments in the companies discussed. For more details please see a16z.com/disclosures.
