The Future of Drone Warfare

AI transcript
0:00:04 The industrial capacity of China is fearsome.
0:00:08 Being able to deploy highly autonomous AI-driven drones at scale
0:00:10 is still a domain that we can win in.
0:00:14 I think the technologies that matter most to the future of war
0:00:17 are right there in front of us.
0:00:20 I think a modern conflict becomes basically like a software writing fight.
0:00:22 It will be the pace of deployment
0:00:27 that is the make or break for militaries around the world.
0:00:30 The game theory here is just as simple and obvious as it can be.
0:00:35 Unless we find a way for industry and government to work together,
0:00:39 we will find ourselves in a very tough situation.
0:00:43 Will Durant once said, quote,
0:00:46 war is one of the constants of history, unquote.
0:00:49 And while the presence of war has not changed,
0:00:51 the way it’s conducted has.
0:00:55 It is technology, whether steel, gunpowder, radio, GPS,
0:00:59 or nuclear weapons, which have defined conflicts over the eras.
0:01:02 And while the images of tanks or machine guns
0:01:04 dominate the visuals we have of war,
0:01:07 these are not the decisive technologies of the future.
0:01:09 And here’s the thing.
0:01:11 The future of warfare isn’t coming.
0:01:12 It’s already here.
0:01:16 It’s fought in the skies over Ukraine, Israel, and beyond
0:01:18 by AI-empowered drones.
0:01:22 These drones have become a crucial weapon of war with asymmetric capability,
0:01:26 where a handful of drones costing hundreds or thousands of dollars
0:01:30 can disable equipment like tanks or aircraft that cost orders of magnitude more.
0:01:33 We’re literally off by like three orders of magnitude.
0:01:37 Within a few short years, drones have gone from a reconnaissance tool
0:01:40 to one that ensures aerial and battlefield dominance.
0:01:43 The army with more drones has a decisive advantage.
0:01:48 So, with China dominating about 80% of the global drone market,
0:01:52 what does that say about our national security in the growing power competition?
0:01:54 Why have we fallen so far behind?
0:01:56 And what will it take to build up our domestic drone industry?
0:02:00 Plus, where should autonomy play a role in military decision-making,
0:02:03 when lives are literally on the line?
0:02:07 In today’s episode, recorded live at our third annual American Dynamism Summit
0:02:09 in the heart of Washington, D.C.,
0:02:13 A16Z’s Senior National Security Advisor, Matt Cronin,
0:02:16 sits down with two people who have been thinking about these questions
0:02:21 and building solutions dating all the way back to when this industry was full of hobbyists.
0:02:24 That is Ryan Tseng, co-founder and CEO of Shield AI,
0:02:28 and Adam Bry, co-founder and CEO of Skydio.
0:02:33 Shield has been building intelligent systems like AI-powered fighter pilots
0:02:35 and drones since 2015,
0:02:40 while Skydio has been manufacturing drones since 2014,
0:02:44 and today is the largest U.S. drone manufacturer by volume.
0:02:47 So who will command the skies in the years to come?
0:02:49 Listen in to find out.
0:02:54 As a reminder, the content here is for informational purposes only,
0:02:57 should not be taken as legal, business, tax, or investment advice,
0:03:00 or be used to evaluate any investment or security,
0:03:02 and is not directed at any investors or potential investors
0:03:04 in any A16Z fund.
0:03:07 Please note that A16Z and its affiliates
0:03:10 may also maintain investments in the companies discussed in this podcast.
0:03:13 For more details, including a link to our investments,
0:03:15 please see A16Z.com slash disclosures.
0:03:26 We’re here to chat about drones, autonomy, and great power conflict.
0:03:31 Now, both of you have started incredibly successful and innovative drone-focused companies.
0:03:35 At the same time, both of you started when the industry was truly nascent.
0:03:40 It was seen as a field for hobbyists rather than something that could actually shape the future of warfare.
0:03:46 So what led you to be interested in this field and to spend so much of your time,
0:03:50 blood, sweat, and tears, on this now incredibly important industry?
0:03:53 I had started and sold a company to Qualcomm.
0:03:59 And I’ve always been somebody that has had an intense passion to compete, fight, win at whatever I was doing.
0:04:02 And in my last year at Qualcomm, having been there for about four years,
0:04:05 I didn’t have that fire in the belly that I had had throughout my life.
0:04:09 And so I started thinking about what it was that would really motivate me,
0:04:11 not for the next five years, but for the next 50.
0:04:15 And I decided that if I could find the intersection of three things,
0:04:18 a noble mission, the chance to work with extraordinary people,
0:04:21 and a chance to define the possible, I’d have that fire in the belly.
0:04:26 And my brother went on to become a Navy SEAL, so a totally different track in life.
0:04:29 And he was getting ready to go to business school.
0:04:32 I encouraged him to think about what he wanted to do in life.
0:04:35 And he came to me with the idea of bringing the best of what was going on
0:04:39 in the autonomous driving sector to the mission of protecting service members and civilians.
0:04:45 He felt like if somebody had done that, it would have brought a lot of his friends home safely to their families.
0:04:50 And he felt like it would be a pillar for the future of American military dominance.
0:04:52 I thought it was an extraordinary mission.
0:04:54 I told you earlier, I thought it was a stupid business.
0:05:00 And so I wished him luck and suggested that he come up with a better business to make mission impact.
0:05:03 But SEALs are very persistent people.
0:05:04 My brother is no exception.
0:05:09 And to make a long story short, I started to spend more time with him learning about the challenges.
0:05:13 And I was just shocked by the scope of the problem and how little was being done about it.
0:05:19 And I’ve been proud and humbled every day for the last 10 years to have an opportunity to contribute to a mission that I think is so important.
0:05:19 Incredible.
0:05:21 Well, I’m glad SEALs are persistent in general.
0:05:25 And I’m especially glad that your brother, the SEAL, dragged you along in this important mission.
0:05:26 Yes.
0:05:26 What about you?
0:05:29 Yes, we come at this from very different places.
0:05:42 First, I think that Shield AI deserves just enormous credit for 10-plus years ago recognizing the need and the opportunity for what we now think of as defense tech at a time when that really did not exist.
0:05:44 And people thought they were crazy for even trying.
0:05:47 Candidly, I don’t think we get that same kind of credit.
0:05:49 So I grew up flying radio-controlled airplanes.
0:05:51 I’ve basically been doing drone stuff my whole life.
0:06:03 And that led me to be a grad student at MIT in the late 2000s, early 2010s, when you could basically take radio-controlled airplanes and put computers and sensors on them and write software to get them to do smart stuff.
0:06:10 So I really became obsessed with trying to build AI systems that could fly better than people could and wasn’t thinking so much about applications at the time.
0:06:18 But in 2013, 2014, my lab mate and I started to look out and see there were interesting things starting to happen with small light quadcopters.
0:06:23 And we felt like the applications and implications of the technology could be enormous.
0:06:27 But needing to have an expert pilot there flying it was just sort of a fundamental restriction.
0:06:35 So the big bet that we made when we started Skydio was that AI and autonomy built into a small light quadcopter was going to be very powerful for a wide range of industries.
0:06:40 And the government and enterprise applications were always part of the vision.
0:06:47 But we explicitly decided to start with a consumer product because we thought that something light, integrated, and easy to use would be a really good platform for this other stuff.
0:06:52 And for me personally, when I was in grad school, I was wrestling with, you know, I love this technology.
0:06:54 I wonder if I want to keep working on it.
0:06:56 Am I going to have to go work for a defense contractor?
0:07:02 I deeply believe in like the mission of the U.S. military, but the idea of working at a traditional defense contractor was very unappealing to me.
0:07:05 And so we started with a consumer product.
0:07:11 And it’s reflective of kind of the trajectory of the space that very quickly, you know, like 2018, 2019 timeframe.
0:07:19 And I think to the credit of the U.S. military, they realized that these small light civilian quadcopters had enormous value on the battlefield.
0:07:30 And the technology overlap from consumer to enterprise to military is so tight that in the span of a year, basically, we won our program of record, the Army Short Range Reconnaissance Program.
0:07:34 And I think it was officially announced in 2021, but it really started in 2018, 2019.
0:07:38 And that was the beginning for us of expanding to serve a much broader set of markets.
0:07:40 Adam was also part of my early story.
0:07:43 I don’t know if you know this, Adam, but I read all of your papers.
0:07:44 We’ve learned a lot since then.
0:07:46 Who’s the foundation?
0:07:46 Yeah, yeah.
0:07:50 No, I mean, look, a lot of the ideas that we use at Skydio came from the research community.
0:07:53 Some of the research that we had done, some that other folks had done.
0:08:05 So one of the things I think has most shocked many and fascinated military strategists is how drones have reshaped the nature of warfare in the past decade, particularly in the past three years.
0:08:08 There’s just been huge proliferation, right?
0:08:11 Far more mass being brought to the battlefield via drones.
0:08:19 And it’s enabling much more distributed, decentralized, and lethal force structures, right?
0:08:29 Anybody running around in a truck can pop out with a drone that might go 1,000 nautical miles or a collection of drones that might go 1,000 nautical miles and hit who knows what far off into the distance.
0:08:36 There used to be much more concentration of forces, dependence on large, exquisite assets to deliver capabilities.
0:08:40 So it’s been just a complete transformation.
0:08:44 And the world has seen a lot of that unfold in the conflict in Ukraine.
0:08:50 I think one of the major questions is how can the United States and her allies adopt those lessons?
0:08:53 Because the things that we’re doing have a lot of merit.
0:08:56 It has been a force structure that’s dominated the last several decades.
0:09:01 But our adversaries have spent a long time thinking about how to counter what we’re doing.
0:09:12 And I think that it’s going to be important for us to think about how we can embrace 100 times more systems to empower our service members to be lethal and effective, to come home safely to their families.
0:09:15 And I think drones, in many ways, are the future of war.
0:09:16 Absolutely.
0:09:30 And Adam, one of the things that particularly shocked those observing the Ukraine conflict in particular is that you would see, as writers have been reporting, a commercial quadcopter, like a Skydio in some instances, go and take out an exquisite system like a tank.
0:09:35 Two or three of those packed with sufficient munitions could cripple a large armored vehicle.
0:09:41 So how have you seen that sort of shift impact how the military looks at dual-use commercial drones?
0:09:46 To be honest, I think that’s a question that the U.S. military has not fully digested yet.
0:09:52 And one of the realities of drones is that it just creates this massive asymmetry, right?
0:09:57 A system that costs a few thousand dollars can take out a system that costs a few million dollars.
0:10:01 And I don’t think we’ve fully grappled with that yet, to be honest.
0:10:04 I think that we’re seeing evidence of this in Ukraine.
0:10:14 And the Ukrainians, largely out of necessity, and really just incredibly impressive ingenuity, they have a broad array of ground robots or drones, air drones, sea drones.
0:10:16 They’re building them at an incredible rate.
0:10:17 They’re iterating very quickly.
0:10:19 And it’s very scrappy.
0:10:23 The U.S. military may not be subject to the same kind of constraints that the Ukrainians are.
0:10:29 But I think that we need to understand that the possibility for that asymmetry is real.
0:10:37 Like, the possibility of building very low-cost systems that are very capable and capable of delivering strikes or capable of maintaining surveillance is very real.
0:10:40 And our adversaries are likely to take advantage of it.
0:10:43 That’s a journey that we still need to go on to some extent.
0:10:46 There’s still quite a bit of inertia and momentum in our military,
0:10:49 I don’t know what your experience has been, towards, like, larger, more traditional, exquisite systems.
0:10:51 I think that is starting to pivot.
0:10:54 But that’s a real question for us for the future.
0:10:58 There’s a phrase used in the military context often.
0:11:00 It’s that quantity has a quality all its own.
0:11:08 And there is an incredible disparity between what our chief adversary, the People’s Republic of China, can produce in a month in drones and what we can,
0:11:12 whether it’s formal military-style drones or dual-use commercial drones.
0:11:22 And given that scale, I think a lot of people are wondering what that means for deterring a future conflict, God forbid one in the Taiwan Strait or elsewhere.
0:11:33 Adam, would you mind just talking about the sort of differences in scale, just roughly speaking, between the production capabilities of both countries and why that is, particularly on a regulatory basis?
0:11:35 So I mentioned I grew up flying radio-controlled airplanes.
0:11:39 So basically all radio-controlled airplanes were made in China in the 90s and 2000s.
0:11:41 And nobody was thinking too hard about that.
0:11:42 They still are, right?
0:11:43 They probably still are, yeah.
0:11:47 And, you know, that didn’t seem like a national security issue at the time.
0:11:55 But if you think about what a drone is, it’s basically like the combination of radio-controlled airplane-type stuff, motors, and consumer electronics.
0:11:57 A lot of the same stuff that goes into a phone.
0:12:01 And, you know, as a country, we basically outsource manufacturing to China.
0:12:06 And I think that that was a series of policy decisions, expediency on the part of business.
0:12:19 That was a mistake from a military standpoint because one of the major themes is that the gap between kind of civilian technology and consumer technology and military technology is closing in many of these domains.
0:12:31 The general manufacturing capacity in China for low-cost, capable compute systems, which are really now becoming robotic systems, not just drones, but other kinds of robots, is substantial.
0:12:32 It’s a whole ecosystem.
0:12:35 And it’s not about cost either at this point.
0:12:46 It’s really about, like, technical expertise, built capacity in terms of all the different things that it takes to, like, mold and machine and build PCBs, the circuit boards, and place components on them.
0:12:49 So I don’t think this is something that we can solve overnight.
0:12:59 The ultimate thing, and it’s TBD on whether this is attainable, I always say, like, wherever they’re building iPhones, they’re going to have a really good ecosystem for building drones and other kinds of electronics.
0:13:06 And so the real prize is, can we bring that level of scaled manufacturing back to the U.S.?
0:13:09 I don’t know if we can, to be honest, but I think it’s worth a shot.
0:13:12 And from the outset, as a company, we’ve been manufacturing our drones in the U.S.
0:13:18 And I would say that when we started doing that in 2016, it felt like we were swimming upstream into, like, a fast-flowing river.
0:13:26 I would say that, like, we’re maybe starting to get some, like, signs of tailwinds, especially with the new administration, which, you know, is cause for optimism.
0:13:31 But I think this is one that we just can’t give up on because robots are going to become more and more important.
0:13:39 They’re going to be using the same kind of ingredients from consumer electronics and other kinds of, like, relatively low-cost systems.
0:13:41 Cars are going in this direction as well.
0:13:45 Cars are starting to look more like laptops and phones in terms of the components that are in them.
0:13:49 This is a combination of industry and policy and all of us working together.
0:13:51 And I think there’s a cause for optimism over the last couple of years.
0:13:55 There’s still a lot of work to do, but I don’t think it’s an insurmountable hill for us.
0:13:58 I’ve got a, I guess, good news, bad news take.
0:14:00 I think the bad news, you sort of led with it.
0:14:04 The industrial capacity of China is fearsome.
0:14:10 And try as we might, I think that’s going to be a very difficult thing to close out in one year and even a decade.
0:14:16 And I’m an optimist and always like to believe that there’s a way, but it is a substantial gap.
0:14:23 And I think we’re sort of approaching, to use a rocket term, a Max-Q moment from a national security perspective where we’re undergoing a forced transformation.
0:14:27 Huge technology changes are afoot.
0:14:29 Things like AI are coming into play.
0:14:36 We’ve got an adversary that’s become extremely wealthy, huge industrial capacity, also investing massively in their military capabilities.
0:14:39 And so the question then becomes, what’s the right play?
0:14:49 And I think that history has shown, World War I, World War II, Vietnam, tremendous amounts of mass were brought to the battlefield.
0:15:00 And in some cases, despite tremendous amounts of mass being brought to sections of the battlefield, it just turned into a grinding war of attrition without a lot of movement on either side.
0:15:02 And we start to see some of that in the Russia-Ukraine conflict as well.
0:15:07 Tremendous amounts of mass being brought to the battlefield, but front lines that are extremely difficult to move.
0:15:12 And I think a reason for that is there’s a difference between mass and effect.
0:15:20 And simply, the world is big, and targets are relatively small compared to the scale of the world.
0:15:25 And it turns out that just throwing a bunch of mass downrange, it can still be pretty hard to hit the things that matter.
0:15:33 And where the United States has dominated over the last couple decades is the software prowess, the AI, the autonomy.
0:15:45 And so I think that if we can combine the fantastic work that’s going to re-industrialize the United States to build more mass, to build more capability, we have to do that.
0:16:01 But if we can combine it with the software and autonomy capabilities, if we can close, like, the OODA loop so that we can push software updates at a moment’s notice and make every ounce of charge, every minute of flight time, maximally effective, I think that’s how we can compete.
0:16:06 There’s a second wave that hasn’t really broken yet, which is really AI and autonomy.
0:16:10 The vast majority of what’s happening in Ukraine is still basically one-to-one.
0:16:17 You’ve got these FPV pilots who are expert operators who are flying the thing, or they’re flying drones with very limited, pretty simple mission.
0:16:20 Go to these coordinates and deliver a strike.
0:16:31 And I think that we are going to see over the next decade another fundamental change as rather than having these drones animated by an operator on the ground or by, like, relatively simple algorithms on board,
0:16:35 they become animated by really advanced autonomy, and they can communicate with each other.
0:16:42 The implications of that, I think, are probably going to be even more significant than the first step to just having these things be unmanned.
0:16:47 And I do think that’s an area where, as a country, it plays more to our strengths.
0:16:51 And you can’t forget about the hardware and the manufacturing capacity.
0:16:52 You need to be able to do that stuff.
0:16:57 But winning on the AI front, I think, is even more important.
0:17:01 So to ensure we dominate, to ensure we win, Vice President Vance,
0:17:05 just spoke a few moments ago at the American Dynamism Summit, and he said,
0:17:11 our goal is to essentially remake the economy, to fix that mistake that you guys were referencing earlier,
0:17:15 that we just offshored all of our manufacturing and we can’t make things anymore.
0:17:21 And there are leading members in Congress committed to that, and also committed to defense procurement reform.
0:17:27 So if you had an opportunity to sit down with anyone, the leaders in the executive branch, leaders in the legislative branch,
0:17:31 or anyone who might be listening right now, those leaders or their staffers,
0:17:37 and could fix just one or two real pivot points, so that if those were resolved we could do so much more,
0:17:41 whether for drones in particular, manufacturing generally, or unleashing AI:
0:17:43 What sort of recommendations would you give?
0:17:47 Number one, I would ask that they continue to do what they’ve been doing,
0:17:52 which is reinforce their belief and investments in the incredible people that sign up to serve.
0:17:58 The one thing that I think we’ve got absolutely right is we have brilliant people that are brave.
0:18:03 They believe in the values of this country, and they just go and do extraordinary things.
0:18:05 And it’s been a privilege of mine to be able to meet them.
0:18:10 My brother drew me into that universe, and I just think it is such a gift to the nation
0:18:12 that we have these people that are willing to do what they do.
0:18:17 Now, we strive to contribute to make sure they can be as effective as possible
0:18:19 to fight, to win, and to come home safely to their families,
0:18:22 and to provide them the best possible tools.
0:18:24 And look, what do I know?
0:18:27 The administration has hard jobs, so let me put that disclaimer up front.
0:18:31 I think the technologies that matter most to the future of war,
0:18:34 that matter most to the future of this country,
0:18:38 and all of our allies around the world are right there in front of us.
0:18:46 And I think the security challenge of our time is whether or not we can mobilize the bureaucracy
0:18:50 to go reach out and pick up what’s on the table,
0:18:54 which everybody already knows is one of the most important capabilities for the future,
0:18:59 and that is, like, taking and making real the large-scale deployment
0:19:02 and operationalization of autonomy technologies.
0:19:07 If you look in the kind of space that we play, which are small, light quadcopters
0:19:09 that weigh a few pounds that are soldier-carried,
0:19:13 the Ukrainians today are using these things at the rate of millions per year,
0:19:15 literally millions per year.
0:19:18 It’s the primary method through which they’re delivering strikes and surveilling the battlefield.
0:19:24 The U.S. military, I think, to their credit, has programs generally pointed in this direction,
0:19:28 but those programs are operating at the scale of thousands, like single-digit thousands.
0:19:32 So we’re literally off by three orders of magnitude, I would argue,
0:19:38 relative to what evidence suggests the modern battlefield demands.
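A quick back-of-the-envelope check on the gap being described here, using only the round figures mentioned above (roughly a million small drones per year on the Ukrainian side versus single-digit thousands in the U.S. programs):

$$
\frac{\sim 1{,}000{,}000\ \text{drones per year}}{\sim 1{,}000\ \text{drones per year}} \approx 10^{3},
\qquad \log_{10}\!\left(10^{3}\right) = 3\ \text{orders of magnitude.}
$$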
0:19:40 There are similar trends in Israel as well.
0:19:44 I mean, Israel has rapidly adopted these class of systems at substantial scale.
0:19:47 And this is an area, to the point about re-industrialization,
0:19:51 where military purchasing power makes a massive difference.
0:19:58 The consumer-civilian quadcopter markets are measured in the scale of single-digit billions.
0:20:01 That’s, like, comparable, I would argue, to what the military should be spending in the space.
0:20:04 They’re not spending anywhere close to that today for this class of system.
0:20:11 And so I think that’s a pretty obvious lever that has a bunch of benefits from a national security perspective.
0:20:15 I mean, one, you’re equipping our soldiers with modern, relevant technology.
0:20:19 But two, the purchasing power that the military can bring to bear there is significant enough
0:20:24 to actually move the needle from an industrial-based standpoint for this class of technology
0:20:25 that serves other markets.
0:20:28 Not just us, but, like, the companies in our space serve public safety
0:20:32 and critical infrastructure inspection, and there’s huge technology overlap
0:20:35 between the products that you use to do that
0:20:37 and the products that folks use on the battlefield.
0:20:40 I think if you just went down all the quantities of everything
0:20:42 and just added a zero behind all of them,
0:20:44 and then, of course, there’s a cost challenge,
0:20:46 and we have to figure out how to do that.
0:20:48 Yeah, I mean, I think there’s some things that you can delete,
0:20:49 and deleting those things is also painful.
0:20:51 Well, maybe not a zero.
0:20:52 Sometimes a zero in front of something.
0:20:58 Yeah, yeah, no, I think that’s the trade-off that needs to be made.
0:21:02 Buying legacy exquisite systems
0:21:04 that cost hundreds of millions of dollars,
0:21:05 in some cases billions of dollars,
0:21:08 there’s a trade-off between one or two of something
0:21:11 or 10,000 or 100,000 or a million drones.
0:21:13 And if you just look into the future,
0:21:15 which of those things is going to be more powerful?
0:21:18 I think, like, large quantities of AI-driven drones is,
0:21:20 it’s not the only thing that you need,
0:21:22 but I think in many situations is the right answer.
0:21:23 Well, let’s dive into it a little bit more.
0:21:26 So, as you both noted a few moments ago,
0:21:29 the U.S. military, starting around the 1980s, 1990s,
0:21:33 really went all in on highly expensive, exquisite systems,
0:21:36 small numbers, and that worked fine for us.
0:21:38 But our now near-peer adversary,
0:21:42 China, developed an asymmetric military designed to counter that.
0:21:45 So, they have a carrier-killer missile.
0:21:47 So, for a cost of, say, 100 million,
0:21:50 they take out tens of billions, if not more, on our side,
0:21:52 plus the horrible loss of life.
0:21:55 So, there’s been a move within the DoD
0:21:57 to counter the counter,
0:21:59 with programs like the Replicator initiative.
0:22:02 Let’s presume even if we move those decimal points,
0:22:03 so that all of a sudden,
0:22:05 the older systems have less procurement,
0:22:10 and then the newer systems have more dollars allotted to procure.
0:22:13 China, some would argue,
0:22:14 still has certain advantages,
0:22:17 perhaps because they’ve had more time working on it,
0:22:18 perhaps because they’re more subsidized.
0:22:20 One of them, people would argue,
0:22:22 and I welcome you to say that this is wrong,
0:22:24 would be in the area of swarm technology.
0:22:25 So, you can see the light shows
0:22:28 over Shenzhen and other cities
0:22:30 for the Chinese New Year,
0:22:32 where it was just extraordinary shows
0:22:33 with thousands of drones
0:22:36 that broke the Guinness World Record again this year.
0:22:39 Do we have that level of sophistication,
0:22:40 and why or why not?
0:22:42 And what can we do to make sure
0:22:43 that we not only achieve parity if we have not,
0:22:45 but achieve superiority in that space?
0:22:48 So, this is an area that we’re quite focused on.
0:22:50 I mean, swarm is sort of one of these terms
0:22:52 that can mean a lot of different things, right?
0:22:53 And it’s really like,
0:22:55 what tasks are you trying to accomplish?
0:22:57 So, when you see the drone light shows,
0:22:59 they’re 100% relying on GPS,
0:23:01 so they’re using GPS to figure out where they are
0:23:03 and to position themselves precisely,
0:23:07 and they’re 100% reliant on a comms link
0:23:08 between all the drones all the time.
0:23:10 And both of those technologies
0:23:13 are basically irrelevant on the modern battlefield.
0:23:15 It’s very easy to jam GPS,
0:23:17 and comms is always contested.
0:23:19 And so, I think that those are representative
0:23:21 of their ability to build a bunch of drones
0:23:22 and get them into the air,
0:23:24 which is certainly part of the equation,
0:23:26 but from a core technology standpoint,
0:23:29 they’re less relevant for the things
0:23:32 that I think would be impactful on the battlefield.
0:23:34 And we have the great joy of competing
0:23:36 against the leading Chinese drone company, DJI,
0:23:39 in civilian markets that are unregulated,
0:23:40 where customers can buy everything.
0:23:42 And our biggest advantage competing against them
0:23:44 is AI and autonomy capability.
0:23:45 The stuff that we’ve been able to build into our drones
0:23:49 is far more advanced than what DJI has been able to do
0:23:50 in terms of being able to sense the environment
0:23:52 in real time, respond to it,
0:23:54 automate complex tasks and missions.
0:23:56 And the advantage that we have,
0:23:59 I think, is reflective of our strengths as a country.
0:24:02 We were talking about building on academic research.
0:24:04 The technology of Skydio is reflective
0:24:05 of a lot of smart investments
0:24:08 that the government has made over the years.
0:24:10 So, I think that being able to deploy
0:24:12 highly autonomous AI-driven drones at scale
0:24:15 is still a domain that we can win in,
0:24:17 and, in my view, is still up for grabs
0:24:19 and is something that we’re quite focused on as a company.
0:24:22 Right, and your company has also invested heavily
0:24:23 in autonomy.
0:24:25 Yeah, so our investments on autonomy,
0:24:28 kind of going back to one of my earlier statements,
0:24:29 is sort of predicated on this belief
0:24:31 that to make mass effective,
0:24:33 it has to be intelligent, right?
0:24:34 The difference between
0:24:37 grinding, entrenched situations
0:24:39 and being able to assert dominance in a space
0:24:43 is whether you can find, fix, and finish targets at scale.
0:24:46 So, we just think that’s fundamentally important.
0:24:47 And the first 10 years of our journey
0:24:48 was defined by,
0:24:51 let’s strive to make the world’s best AI pilot
0:24:53 and climb the aviation food chain,
0:24:56 which culminated in us doing some work on F-16s
0:24:58 that has circulated around the internet.
0:25:00 When we thought about the next 10 years,
0:25:01 the question was,
0:25:03 is the future really about Shield AI
0:25:04 building the world’s best AI pilot,
0:25:06 or is it about making a contribution
0:25:07 to the industrial base
0:25:10 so that everybody that’s building these systems
0:25:11 across America,
0:25:14 building sort of incredibly sophisticated machines,
0:25:16 striving to do it at larger scale,
0:25:19 can now deploy the best possible AI pilot
0:25:20 for their vehicles
0:25:22 for the customer’s missions.
0:25:24 And we think our best and highest contribution
0:25:26 is to enable the industrial base
0:25:29 to fast forward the large-scale deployment
0:25:32 of the world’s best AI pilots.
0:25:34 In Ukraine, you see drones come up,
0:25:35 drones come down,
0:25:36 oftentimes DJI drones,
0:25:38 in a matter of minutes or seconds,
0:25:39 many times failing in the mission.
0:25:42 How have you thought through
0:25:44 how your drones can operate
0:25:45 in a battlefield
0:25:46 where there is a high degree
0:25:47 of electronic warfare?
0:25:48 So, you do not have access to GPS,
0:25:50 you don’t have access to reliable signals
0:25:51 to the controller and operators.
0:25:53 When we started the company,
0:25:55 we made a huge bet on computer vision
0:25:58 as the right technology for AI and autonomy.
0:25:59 And this was back in 2014
0:26:01 when it was like much less clear
0:26:02 than it is today.
0:26:05 So, just sort of natively,
0:26:06 our drones have a bunch of cameras,
0:26:07 they look out and see the world,
0:26:09 they use that to figure out where they are
0:26:09 and how they’re moving
0:26:12 and what’s interesting and important around them.
0:26:13 We come from a place
0:26:15 of not being reliant on GPS,
0:26:16 having more of an ability
0:26:17 to do onboard things.
0:26:19 Having said that, candidly,
0:26:20 like our first round of drones
0:26:22 in Ukraine basically failed.
0:26:25 And we had built the system
0:26:28 largely informed by the U.S. Army’s requirements
0:26:30 for what they thought was important
0:26:31 for a quadcopter in this space.
0:26:32 And electronic warfare
0:26:34 was just nowhere on that list.
0:26:36 And so, from a radio standpoint
0:26:38 and from a navigation standpoint,
0:26:40 our first generation system
0:26:42 was just not set up for success.
0:26:44 And it was a painful process for us.
0:26:46 So, I’ve been to Ukraine twice myself.
0:26:48 We had a bunch of folks on our team
0:26:49 spend a bunch of time there
0:26:50 and we learned a bunch of hard lessons
0:26:51 about what it takes
0:26:53 to really operate in this environment.
0:26:56 And that really started to drive our development.
0:26:58 And more so, honestly,
0:27:00 than the requirements coming from the U.S. military,
0:27:02 we made an explicit decision as a company
0:27:05 that we think this is the real world situation
0:27:07 that matters both from an immediate impact standpoint,
0:27:09 but also whether or not they realize it,
0:27:11 this is what everybody else is going to need as well.
0:27:13 And so, there was a process for us
0:27:14 that really took a couple of years
0:27:18 of adapting the sort of native vision AI primitives
0:27:20 to work in this environment,
0:27:21 which we’ve now gotten to
0:27:23 with pretty phenomenal results
0:27:24 where the drone is incredibly resilient
0:27:26 and capable from both a comm standpoint
0:27:29 and from a GPS-denied navigation standpoint
0:27:31 in extremely harsh environments.
0:27:35 But it’s a real technology hurdle to get across.
0:27:36 And we’re now seeing this.
0:27:38 The bet that we made is in many ways paying off.
0:27:41 We’re seeing this with other militaries around the world.
0:27:42 They’re starting to come around
0:27:43 that this is the kind of thing that matters
0:27:45 and our systems are performing.
0:27:46 Now, electronic warfare testing
0:27:48 is becoming part of the evaluation protocol
0:27:49 for a lot of purchases.
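As a rough illustration of the GPS-denied navigation idea Adam describes, here is a minimal, hypothetical sketch, not Skydio's or Shield AI's actual software: the drone keeps its own position estimate by integrating onboard motion estimates, the kind of quantity a vision system might derive from its cameras, so no GPS fix is needed at any point. The velocity numbers are made up for illustration.

```python
from dataclasses import dataclass

@dataclass
class Position:
    x: float = 0.0  # meters east of the takeoff point
    y: float = 0.0  # meters north of the takeoff point

def dead_reckon(pos: Position, vx: float, vy: float, dt: float) -> Position:
    """Advance the position estimate by integrating an onboard velocity estimate."""
    return Position(pos.x + vx * dt, pos.y + vy * dt)

if __name__ == "__main__":
    pos = Position()
    # Hypothetical per-frame velocity estimates (m/s), standing in for what a
    # vision-based system would compute from camera imagery (e.g. feature tracking).
    velocity_estimates = [(2.0, 0.0), (2.0, 0.5), (1.8, 0.5), (1.5, 0.2)]
    for vx, vy in velocity_estimates:
        pos = dead_reckon(pos, vx, vy, dt=0.1)  # 10 Hz update rate
    print(f"Estimated position with no GPS: ({pos.x:.2f} m, {pos.y:.2f} m)")
```

In a real system the motion estimates come from visual (and inertial) odometry and their errors accumulate over time, which is part of why this is genuinely hard in long missions and harsh environments; the sketch only shows the basic structure that lets a drone keep navigating when GPS is jammed.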
0:27:51 I think Adam hit the nail on the head.
0:27:53 The effectiveness of your systems in an EW environment
0:27:56 is the difference between whether they’re relevant or not.
0:27:58 And a lot of people will say
0:28:00 that our next generation product
0:28:01 is going to work in an EW environment.
0:28:03 But I think that’s hard to say
0:28:05 unless you’re actually doing it.
0:28:06 I mean, that’s a pretty tall claim.
0:28:08 Another element of effectiveness
0:28:10 in these complex battlefields,
0:28:12 I think it’s just the adaptability
0:28:15 of your capabilities of your software.
0:28:17 So, anecdote from Ukraine,
0:28:19 to solve for this environment,
0:28:20 which is constantly changing,
0:28:21 we were going to go out.
0:28:22 We have a product called the V-BAT.
0:28:23 It’s a plane.
0:28:25 It’s 12 feet tall, 12-foot wingspan.
0:28:27 It can fly for about 12 hours.
0:28:30 We got sort of the brief from the Ukrainians
0:28:31 and the situation we expected.
0:28:32 Here’s the situation.
0:28:34 You guys are going to take off.
0:28:35 You’re going to go this way,
0:28:37 maybe 70, 80, 90 nautical miles.
0:28:39 You’ll have GPS on the ground.
0:28:40 As soon as you get to 200 feet,
0:28:41 the jammer’s going to hit you.
0:28:43 And you have to have no GPS
0:28:44 from that point forward.
0:28:46 So, the engineering team’s like,
0:28:46 great, got it, good.
0:28:48 We’ll get the fix on the ground.
0:28:49 We’ll take off.
0:28:50 We’ll go take care of business.
0:28:50 It’s going to be sick.
0:28:52 We go out there,
0:28:53 and at three feet,
0:28:55 the airplane loses GPS.
0:28:57 And so, the Ukrainians
0:28:58 are constantly jamming too.
0:28:59 The Russians jam, the Ukrainians jam,
0:29:00 because if the Ukrainians stop jamming,
0:29:02 they find themselves at risk.
0:29:02 Even if you’re launching
0:29:03 on the friendly side,
0:29:04 there’s intense jamming
0:29:05 at three feet off the ground.
0:29:08 So, the airplane takes off
0:29:09 and it just starts flying
0:29:10 the other direction.
0:29:11 I don’t know if you’ve seen
0:29:12 the movie Interstellar,
0:29:13 where he’s going through the cornfield,
0:29:14 the dude has a laptop,
0:29:15 and he brings down the drone.
0:29:18 Our team and the Ukrainians
0:29:19 just took off in a truck
0:29:21 and chased it for about two hours
0:29:22 and found it orbiting
0:29:23 over a cornfield
0:29:25 about, I think,
0:29:26 80 kilometers away.
0:29:29 And in the span of 24 hours,
0:29:30 the engineering team
0:29:31 re-architected the stack
0:29:33 to not use GPS
0:29:34 at any point in the mission.
0:29:36 We validated it
0:29:37 at our Texas facilities,
0:29:38 and then we pushed it forward,
0:29:39 and 24 hours later,
0:29:40 the team took off
0:29:41 and then conducted
0:29:42 a demonstration
0:29:43 that was written about
0:29:44 in the Wall Street Journal,
0:29:46 where the outcome of it
0:29:46 was ultimately
0:29:47 a Russian SA-11
0:29:48 getting found,
0:29:49 getting fixed,
0:29:49 and finished
0:29:50 by a HIMARS.
0:29:52 I tell that anecdote
0:29:53 because it will be
0:29:55 the pace of deployment
0:29:56 that is the make
0:29:57 or break
0:29:58 for militaries
0:29:59 around the world.
0:30:00 If we want to make
0:30:01 a software change
0:30:02 in some of our programs,
0:30:03 it can take up to a year,
0:30:04 right?
0:30:05 It is considered
0:30:07 a big deal
0:30:08 to change the software.
0:30:10 When we took
0:30:11 that aircraft forward
0:30:11 to Ukraine,
0:30:12 we wrote new software
0:30:13 in 24 hours,
0:30:14 we pushed it,
0:30:15 and then we needed
0:30:16 24 hours
0:30:17 to properly plan
0:30:18 the operation.
0:30:19 And unless
0:30:21 we find a way
0:30:21 for industry
0:30:22 and government
0:30:23 to work together
0:30:24 to deploy
0:30:25 software capabilities
0:30:26 at that pace,
0:30:27 we will find ourselves
0:30:29 in a very tough
0:30:30 situation.
0:30:31 In a world
0:30:32 where you’re using
0:30:33 autonomous systems,
0:30:34 everything is basically
0:30:35 software-defined,
0:30:35 right?
0:30:36 The entire behavior
0:30:37 capability of the system
0:30:37 is software-defined.
0:30:39 Electronic warfare
0:30:40 is also software-defined.
0:30:41 The behavior
0:30:41 of the jammers
0:30:42 and everything
0:30:42 is coming largely
0:30:43 through software.
0:30:44 And so in many ways,
0:30:45 I think a modern conflict,
0:30:46 and you see this in Ukraine,
0:30:47 becomes basically
0:30:48 like a software writing fight.
0:30:49 And the speed
0:30:50 at which you can write it
0:30:51 and deploy it
0:30:51 really matters.
0:30:52 And I have no choice
0:30:53 but to be hopeful
0:30:54 and optimistic here
0:30:54 because we’ve had
0:30:55 the same experience
0:30:56 where sometimes
0:30:57 it’s taken us
0:30:57 two years
0:30:59 to push new software
0:31:01 into deployed systems.
0:31:02 I would like to think
0:31:02 that if we were
0:31:03 actually in a conflict,
0:31:04 that would just evaporate
0:31:05 and we could do it
0:31:06 in a day,
0:31:07 maybe that’s overly
0:31:08 optimistic and hopeful.
0:31:09 But I think that’s one
0:31:09 where the status quo
0:31:10 is unacceptable.
0:31:12 Two exceptional stories.
0:31:14 So we’re discussing autonomy.
0:31:16 And one of the concerns
0:31:18 either of you may hear
0:31:19 from time to time
0:31:20 is that,
0:31:20 well,
0:31:21 if you have autonomous drones,
0:31:22 that means humans
0:31:23 are not in the loop.
0:31:24 And that means
0:31:25 we’re ceding our authority
0:31:26 to essentially software.
0:31:28 Is that an accurate assessment?
0:31:30 What does autonomy mean
0:31:31 in terms of humans
0:31:32 actually having control
0:31:34 over the end state
0:31:35 of what the drones
0:31:35 are doing?
0:31:37 And how do we best
0:31:38 configure the military
0:31:39 and also civil society
0:31:40 to a future
0:31:42 where autonomous drones
0:31:43 are not only available
0:31:43 but cheap,
0:31:44 widely adopted?
0:31:46 The first thing to say,
0:31:47 and this is really,
0:31:48 I think,
0:31:48 the important backdrop
0:31:49 for all of this
0:31:51 is that this is
0:31:52 terrible stuff, right?
0:31:52 I mean,
0:31:53 we’re talking about
0:31:53 weapon systems
0:31:54 that kill people
0:31:56 and create immense suffering
0:31:57 and that is the reality
0:31:58 of war.
0:31:59 The real goal
0:32:00 for everything
0:32:02 is to act as a deterrent
0:32:04 and to make conflict
0:32:04 less likely
0:32:05 and to make it such
0:32:07 that if conflict does happen,
0:32:08 you can be maximally targeted
0:32:08 and precise
0:32:10 and minimize human suffering.
0:32:11 I think that’s just
0:32:12 sort of like
0:32:13 an important backdrop
0:32:13 for all of this
0:32:14 when we’re thinking
0:32:15 about what we do
0:32:16 and the kinds of systems
0:32:17 that we’re building.
0:32:18 Now,
0:32:19 I think there’s a bunch
0:32:20 of legitimate concerns
0:32:21 about what does it mean
0:32:22 to have these AI-driven robots
0:32:23 and how automated
0:32:24 are they going to be
0:32:25 and how much authority
0:32:26 are we going to delegate to them?
0:32:27 I think the thing
0:32:28 that we have to also
0:32:29 keep in mind
0:32:29 is unfortunately
0:32:31 the game theory here
0:32:32 is just as simple
0:32:33 and obvious as it can be,
0:32:33 right?
0:32:34 Nobody thinks
0:32:35 nuclear weapons
0:32:36 are good for humanity
0:32:38 on an individual level.
0:32:39 Deploying a nuclear weapon
0:32:40 is like a miserable,
0:32:41 terrible, terrible thing,
0:32:43 but the only world
0:32:44 worse than one
0:32:44 where like you
0:32:45 and your adversary
0:32:46 have nuclear weapons
0:32:47 is one where only
0:32:48 your adversary does,
0:32:48 right?
0:32:49 So I think
0:32:50 one of our strengths
0:32:51 as a country
0:32:52 is our values
0:32:52 and the way
0:32:53 that we try
0:32:54 to conduct conflict
0:32:56 with a high ethical standard
0:32:57 and I think
0:32:57 that it’s important
0:32:58 to maintain that.
0:32:58 One of the things
0:32:59 that I’ve been impressed by
0:33:01 is really the level
0:33:01 of sophistication
0:33:02 within the military
0:33:03 on these issues.
0:33:03 I mean,
0:33:04 there’s people
0:33:05 whose job it is
0:33:06 to think deeply
0:33:07 about the implications
0:33:08 of different kinds
0:33:08 of weapon systems
0:33:09 and how authority
0:33:10 is delegated.
0:33:11 There are very robust
0:33:12 controls in place
0:33:13 for how the military
0:33:14 thinks about these systems.
0:33:15 Those things are evolving
0:33:17 as the technology evolves,
0:33:18 but this is something
0:33:19 I’ve thought quite a bit
0:33:19 personally about.
0:33:20 The more that I’ve thought
0:33:20 about it,
0:33:21 the more time I’ve spent
0:33:21 with the military,
0:33:23 the U.S. military in particular,
0:33:24 the more comfortable
0:33:24 I’ve gotten
0:33:26 that this is a robust organization
0:33:27 that cares about
0:33:27 doing the right thing,
0:33:28 is thinking deeply
0:33:29 about the implications
0:33:30 of technology.
0:33:32 And my general view,
0:33:33 which I think is shared
0:33:34 by the U.S. military
0:33:35 policy and doctrine,
0:33:36 is that ultimately
0:33:36 human judgment
0:33:37 is really important.
0:33:39 A human exercising judgment
0:33:41 in how force should be used
0:33:42 is super important.
0:33:43 But the other thing
0:33:44 that people have to understand
0:33:45 is that the status quo
0:33:46 is not great.
0:33:46 Oftentimes,
0:33:47 your choice,
0:33:48 if you want to take out
0:33:48 a target,
0:33:49 is dropping a 500
0:33:51 or 2,000-pound bomb,
0:33:52 which is going to cause
0:33:53 widespread destruction
0:33:55 and a lot of collateral damage.
0:33:56 And so,
0:33:57 an AI system
0:33:58 that might be
0:33:59 using autonomy
0:34:01 to a pretty intense degree
0:34:01 to figure out
0:34:02 where something is
0:34:03 and what to do about it
0:34:04 is probably better
0:34:05 than dropping
0:34:07 a 2,000-pound bomb
0:34:07 and blowing up
0:34:08 a whole city block.
0:34:09 And so,
0:34:10 I think you’ve always
0:34:10 got to be thinking about
0:34:11 what’s the status quo
0:34:13 and can we use AI
0:34:14 and autonomy
0:34:15 to better,
0:34:16 more precisely,
0:34:17 accomplish the thing
0:34:18 that we care about
0:34:19 while inducing
0:34:20 as little human suffering
0:34:21 as possible.
0:34:23 Maybe the good
0:34:25 and true news
0:34:25 right now
0:34:26 is that
0:34:27 I think human-machine teams
0:34:28 are far more effective
0:34:29 than machine-only teams
0:34:30 right now
0:34:31 and for the next
0:34:32 several years
0:34:33 that’ll continue
0:34:33 to be true.
0:34:34 And maybe it’s true
0:34:35 for longer than that.
0:34:36 And the frameworks
0:34:37 that we have in place,
0:34:39 people are in the loop
0:34:40 on the decisions,
0:34:41 make a tremendous amount
0:34:41 of sense.
0:34:43 I think Adam brings out
0:34:43 an excellent point.
0:34:44 The game theory
0:34:45 is one that
0:34:46 if somebody finds
0:34:47 that a machine-only team
0:34:49 is the most effective team
0:34:50 in certain missions
0:34:50 and circumstances,
0:34:52 the question is
0:34:52 why and when
0:34:53 would that be used
0:34:54 and what do you do
0:34:54 about it?
0:34:55 And how do you make sure
0:34:56 that you’re ready for it?
0:34:57 And so I don’t think
0:34:58 you can live in a world
0:34:59 where you have blinders.
0:35:00 I was speaking
0:35:01 to a senior military leader
0:35:02 that had a nice framing,
0:35:03 which is his expectation
0:35:04 is the more defensive
0:35:06 you end up being,
0:35:07 the more likely you are
0:35:07 to turn things over
0:35:08 to machine control
0:35:10 to get the very fast reactions
0:35:11 and the dominance
0:35:11 that you need
0:35:12 to come out
0:35:13 of that situation.
0:35:14 And so a great,
0:35:15 very practical example
0:35:15 of that today
0:35:17 is the phalanx gun system
0:35:18 that protects ships, right?
0:35:19 If you come into
0:35:20 that thing’s weapon
0:35:20 engagement zone
0:35:21 and it’s turned on,
0:35:22 it will kill it, right?
0:35:23 And so you can think
0:35:24 about that
0:35:25 on a larger scale.
0:35:26 If a force is pressing
0:35:27 on another force,
0:35:28 they find themselves
0:35:29 in a defensive situation
0:35:30 and they flip the switch
0:35:31 and they go full auto,
0:35:33 what does that mean, right?
0:35:34 And how does that get
0:35:35 put back in the box?
0:35:35 And tactically,
0:35:36 what does that mean
0:35:36 for your forces?
0:35:37 Were they trained
0:35:38 to face something
0:35:39 that had that level
0:35:40 of capability
0:35:41 and that level
0:35:41 of discretion,
0:35:42 that level of speed?
0:35:43 And then how does it
0:35:44 play forward from there?
0:35:45 And so I think
0:35:45 that there are a lot
0:35:46 of hard questions.
0:35:47 I think that like
0:35:48 the convenient answer,
0:35:48 the easy answer
0:35:49 is that humans
0:35:50 are always going
0:35:50 to be on the loop.
0:35:51 It’s going to be fine.
0:35:52 Don’t worry about it.
0:35:53 But I think that
0:35:53 the world is a little bit
0:35:54 more complicated than that.
0:35:56 I don’t have answers for it
0:35:57 other than
0:35:58 a strong conviction
0:36:00 that America needs to lead.
0:36:01 So no matter
0:36:02 what you believe
0:36:03 about any of this,
0:36:04 whether you have conviction
0:36:05 on one side or the other
0:36:06 or you’re just uncertain
0:36:07 what the future
0:36:08 is going to be
0:36:08 and I think there’s
0:36:09 a lot of uncertainty
0:36:10 about how the future
0:36:11 will play out,
0:36:12 American leadership
0:36:12 is the answer.
0:36:14 At the conceptual level,
0:36:15 a lot of these things
0:36:16 are less new
0:36:17 than they seem.
0:36:18 So like dropping a bomb
0:36:19 in World War II,
0:36:19 like once that thing
0:36:20 leaves the bomber,
0:36:21 it’s autonomous.
0:36:22 It’s pretty dumb autonomy,
0:36:23 but human judgment
0:36:24 is over, right?
0:36:25 That thing is falling
0:36:26 and it’s guided
0:36:26 by gravity
0:36:27 and wind
0:36:27 and physics
0:36:28 and other things
0:36:30 and at some point
0:36:30 in that trajectory,
0:36:32 if it starts heading
0:36:32 towards the wrong place
0:36:33 or you realize
0:36:34 that it was the wrong target,
0:36:34 there’s nothing
0:36:35 you can do about it.
0:36:36 We are used
0:36:37 to relinquishing control
0:36:39 over the end outcome
0:36:40 and usually that results
0:36:41 in much less precision
0:36:42 and much less ability
0:36:44 to actually accomplish
0:36:44 the thing
0:36:45 that you care about
0:36:47 and AI fundamentally
0:36:48 changes that equation,
0:36:49 but I don’t think
0:36:50 that the concept
0:36:50 of the human
0:36:51 relinquishing control
0:36:53 over the ultimate
0:36:54 thing that happens
0:36:55 is actually new.
0:36:57 Both of you
0:36:57 questioned the premise
0:36:59 that this is a new thing
0:37:00 that’s never happened before.
0:37:01 It is not the case.
0:37:01 And second,
0:37:02 both of you
0:37:02 zoomed out
0:37:03 and said,
0:37:04 listen,
0:37:05 we are not
0:37:06 at the end of history.
0:37:07 Fukuyama was wrong.
0:37:08 There are rival systems
0:37:09 of government,
0:37:10 totalitarian states
0:37:11 and free states
0:37:13 and we have to make sure,
0:37:14 whatever your concerns
0:37:16 may be, that in a conscientious,
0:37:17 thorough, and democratic manner,
0:37:19 we are the ones
0:37:19 who dominate
0:37:20 this technology
0:37:21 to ensure deterrence.
0:37:22 Now let’s imagine
0:37:23 two different futures
0:37:24 10 years from now.
0:37:25 One,
0:37:27 where totalitarian states,
0:37:28 China,
0:37:28 Russia,
0:37:29 et cetera,
0:37:30 have dominance
0:37:31 over this technology,
0:37:32 not only
0:37:33 in terms of procurement
0:37:33 and in terms of production,
0:37:34 but also in terms of the technology itself.
0:37:35 And another,
0:37:37 where we maintained
0:37:39 and then extended
0:37:40 our technological lead
0:37:41 over them,
0:37:42 and also achieved
0:37:43 manufacturing parity
0:37:43 at least,
0:37:44 if not superiority.
0:37:46 What do those two futures
0:37:47 look like?
0:37:49 It’s hard to predict exactly,
0:37:50 but I think AI
0:37:51 is the most important technology
0:37:52 really in the history
0:37:52 of humanity.
0:37:54 And a world
0:37:55 where our adversaries
0:37:57 have it and we don’t
0:37:57 is not a good one.
0:37:58 I mean,
0:37:58 a world where like
0:37:59 the Soviet Union
0:38:00 had nuclear weapons
0:38:01 and we didn’t.
0:38:02 A world where
0:38:02 Nazi Germany
0:38:03 had nuclear weapons
0:38:04 and we didn’t.
0:38:04 I mean,
0:38:05 these are not pleasant things
0:38:06 to think about.
0:38:06 And so,
0:38:07 from my standpoint,
0:38:08 I think we should just view that
0:38:10 as an unacceptable outcome,
0:38:12 like one that we cannot allow.
0:38:14 And some of this plays out
0:38:15 like beyond
0:38:16 the military domain,
0:38:17 but I think
0:38:18 in AI,
0:38:19 and I think this is changing rapidly
0:38:20 for the better
0:38:21 with the new administration,
0:38:23 there was a lot of talk
0:38:24 of like safety
0:38:24 and regulation
0:38:25 and we can’t do this
0:38:26 and we can’t have
0:38:26 this many parameters
0:38:27 and if you use more
0:38:28 than this much compute,
0:38:29 it’s not allowed.
0:38:30 If you’re sitting
0:38:31 in Beijing or Moscow,
0:38:32 I mean,
0:38:32 that’s just got to be
0:38:33 music to your ears,
0:38:33 right?
0:38:34 Of please,
0:38:34 yes,
0:38:35 slow down.
0:38:35 It’s not to say
0:38:36 that we shouldn’t be
0:38:36 thoughtful,
0:38:37 but like,
0:38:38 we’ve got to be real
0:38:39 about the game theory
0:38:40 dynamics here
0:38:41 and the implications
0:38:42 for this technology.
0:38:44 So, to close out,
0:38:45 there are, I’m sure,
0:38:46 a number of founders
0:38:48 watching online
0:38:49 and I’m sure many of them
0:38:50 would be interested
0:38:50 in getting into
0:38:51 public safety,
0:38:52 defense tech,
0:38:53 want to make a difference.
0:38:54 They see the mission,
0:38:56 they see the need.
0:38:57 Perhaps they’re scared,
0:38:57 they’re afraid,
0:38:58 they’re afraid of failure,
0:38:59 they don’t know how to start.
0:39:00 So, if you were to give
0:39:02 just one piece of advice
0:39:03 to these would-be founders,
0:39:04 what would it be?
0:39:05 Number one,
0:39:07 the mission is worth it.
0:39:08 I don’t think that there is
0:39:10 a thing in the world
0:39:11 that you can care about
0:39:11 that doesn’t build
0:39:12 from a foundation
0:39:13 of security and stability.
0:39:15 And I wake up every day
0:39:18 just incredibly excited
0:39:19 and honored to have
0:39:19 the opportunity
0:39:20 to contribute
0:39:21 to something that I think
0:39:22 is so important.
0:39:23 The other thing I would say
0:39:23 is this stuff
0:39:25 is just extremely challenging
0:39:25 and I think that
0:39:26 especially if you’re
0:39:27 building hardware
0:39:27 and you’re serving
0:39:28 these critical industries
0:39:29 where the stakes
0:39:30 are really high,
0:39:31 you should just expect
0:39:32 it to be really hard.
0:39:33 And it’s hard
0:39:34 in different ways
0:39:35 at different points in time.
0:39:37 But at some level,
0:39:38 I think to be successful
0:39:39 at it over the long term,
0:39:40 you’ve got to kind of love that.
0:39:41 And I think probably
0:39:41 both of us,
0:39:42 because of different things
0:39:43 in our backgrounds, do.
0:39:44 I love the challenge
0:39:45 of trying to solve
0:39:46 these difficult problems
0:39:47 and I think probably
0:39:48 both our companies
0:39:48 tend to attract people
0:39:50 who want to be really pushed
0:39:51 and challenged
0:39:53 and work on hard problems
0:39:54 and struggle with things
0:39:55 over days, months, years.
0:39:56 And that is definitely
0:39:57 part of the journey here.
0:39:59 If you want to get to something
0:40:00 that is going to have
0:40:00 real impact,
0:40:01 you know, you’ve got to
0:40:02 embrace the struggle.
0:40:03 100%.
0:40:04 Wise words.
0:40:07 Now, if you made it this far,
0:40:09 a reminder that this
0:40:10 was recorded live
0:40:11 at our third annual
0:40:12 American Dynamism Summit
0:40:14 in the heart of Washington, D.C.
0:40:15 And if you’d like to see
0:40:16 more exclusive content
0:40:17 from the summit,
0:40:18 head on over to
0:40:19 a16z.com
0:40:20 slash American
0:40:21 dash dynamism
0:40:22 dash summit
0:40:24 or you can click the link
0:40:25 in our description.

War has always been shaped by technology—from steel and gunpowder to GPS and nuclear weapons. But the decisive technologies of tomorrow aren’t coming—they’re already here.

In this episode, recorded live at our third annual American Dynamism Summit, a16z’s Senior National Security Advisor Matt Cronin sits down with Ryan Tseng (cofounder & CEO, Shield AI) and Adam Bry (cofounder & CEO, Skydio) to discuss the rise of autonomous drones, AI-driven warfare, and the escalating great power competition with China. They cover:

  • Why drones are reshaping the battlefield in Ukraine, Israel, and beyond
  • The asymmetry of $1,000 drones taking out $10M tanks
  • Why U.S. drone production lags China—and how to catch up
  • The ethical and tactical implications of autonomy in combat
  • What it will take to reindustrialize America and maintain deterrence

If the future of warfare is software-defined, who writes that software—and who deploys it first—matters more than ever.

 

Resources: 

Find Adam on X: https://x.com/adampbry

Find Ryan on LinkedIn: https://www.linkedin.com/in/ryantseng/

Find Matt on LinkedIn: https://www.linkedin.com/in/matt-cronin-8b88811/

See more from the American Dynamism Summit: www.a16z.com/american-dynamism-summit

 

Stay Updated: 

Let us know what you think: https://ratethispodcast.com/a16z

Find a16z on Twitter: https://twitter.com/a16z

Find a16z on LinkedIn: https://www.linkedin.com/company/a16z

Subscribe on your favorite podcast app: https://a16z.simplecast.com/

Follow our host: https://twitter.com/stephsmithio

Please note that the content here is for informational purposes only; should NOT be taken as legal, business, tax, or investment advice or be used to evaluate any investment or security; and is not directed at any investors or potential investors in any a16z fund. a16z and its affiliates may maintain investments in the companies discussed. For more details please see a16z.com/disclosures.
